Sample records for defined interoperability criteria

  1. [Lessons learned in the implementation of interoperable National Health Information Systems: a systematic review].

    PubMed

    Ovies-Bernal, Diana Paola; Agudelo-Londoño, Sandra M

    2014-01-01

    Identify shared criteria used throughout the world in the implementation of interoperable National Health Information Systems (NHIS) and provide validated scientific information on the dimensions affecting interoperability. This systematic review sought to identify primary articles on the implementation of interoperable NHIS published in scientific journals in English, Portuguese, or Spanish between 1990 and 2011 through a search of eight databases of electronic journals in the health sciences and informatics: MEDLINE (PubMed), ProQuest, Ovid, EBSCO, MD Consult, Virtual Health Library, Metapress, and SciELO. The full texts of the articles were reviewed, and those that focused on technical computer aspects or on normative issues were excluded, as were those that did not meet the quality criteria for systematic reviews of interventions. Of 291 studies found and reviewed, only five met the inclusion criteria. These articles reported on the process of implementing an interoperable NHIS in Brazil, China, the United States, Turkey, and the semi-autonomous region of Zanzibar, respectively. Five common basic criteria affecting implementation of the NHIS were identified: standards in place to govern the process, availability of trained human talent, financial and structural constraints, definition of standards, and assurance that the information is secure. Four dimensions affecting interoperability were defined: technical, semantic, legal, and organizational. The criteria identified have to be adapted to the actual situation in each country, and a proactive approach should be used to ensure that implementation of the interoperable NHIS is strategic, simple, and reliable.

  2. IHE based interoperability - benefits and challenges.

    PubMed

    Wozak, Florian; Ammenwerth, Elske; Hörbst, Alexander; Sögner, Peter; Mair, Richard; Schabetsberger, Thomas

    2008-01-01

    Optimized workflows and communication between institutions involved in a patient's treatment process can lead to improved quality and efficiency in the healthcare sector. Electronic Health Records (EHRs) provide patient-centered access to clinical data across institutional boundaries, supporting the above-mentioned aspects. Interoperability is regarded as a vital success factor; however, a clear definition of interoperability does not exist. The aim of this work is to define and assess interoperability criteria as required for EHRs. The definition and assessment of interoperability criteria is supported by an analysis of the existing literature and personal experience, as well as by discussions with several domain experts. The criteria for interoperability address the following aspects: interfaces, semantics, legal and organizational aspects, and security. The Integrating the Healthcare Enterprise (IHE) initiative's profiles make a major contribution to these aspects, but they also raise new problems. Flexibility for adaptation to different organizational, regional, or other specific conditions is missing. Regional or national initiatives should be given a way to realize their specific needs within the boundaries of IHE profiles. Security is so far an optional element, which is one of IHE's greatest omissions; an integrated security approach seems preferable. Irrespective of the practical significance the IHE profiles have achieved so far, it appears to be of great importance that the profiles are constantly checked against practical experience and continuously adapted.

  3. ECITE: A Testbed for Assessment of Technology Interoperability and Integration with Architecture Components

    NASA Astrophysics Data System (ADS)

    Graves, S. J.; Keiser, K.; Law, E.; Yang, C. P.; Djorgovski, S. G.

    2016-12-01

    ECITE (EarthCube Integration and Testing Environment) provides both cloud-based computational testing resources and an Assessment Framework for Technology Interoperability and Integration. NSF's EarthCube program is funding the development of cyberinfrastructure building-block components as technologies to address Earth science research problems. These EarthCube building blocks need to support integration and interoperability objectives to work towards a coherent cyberinfrastructure architecture for the program. ECITE is being developed to provide capabilities to test and assess interoperability and integration across funded EarthCube technology projects. EarthCube-defined criteria for interoperability and integration are applied to use cases that coordinate science problems with technology solutions. The Assessment Framework facilitates planning, execution, and documentation of the technology assessments for review by the EarthCube community. This presentation will describe the components of ECITE and examine the methodology of crosswalking between science and technology use cases.

  4. caBIG compatibility review system: software to support the evaluation of applications using defined interoperability criteria.

    PubMed

    Freimuth, Robert R; Schauer, Michael W; Lodha, Preeti; Govindrao, Poornima; Nagarajan, Rakesh; Chute, Christopher G

    2008-11-06

    The caBIG Compatibility Review System (CRS) is a web-based application to support compatibility reviews, which certify that software applications that pass the review meet a specific set of criteria that allow them to interoperate. The CRS contains workflows that support both semantic and syntactic reviews, which are performed by the caBIG Vocabularies and Common Data Elements (VCDE) and Architecture workspaces, respectively. The CRS increases the efficiency of compatibility reviews by reducing administrative overhead and it improves uniformity by ensuring that each review is conducted according to a standard process. The CRS provides metrics that allow the review team to evaluate the level of data element reuse in an application, a first step towards quantifying the extent of harmonization between applications. Finally, functionality is being added that will provide automated validation of checklist criteria, which will further simplify the review process.

  5. Definition of information technology architectures for continuous data management and medical device integration in diabetes.

    PubMed

    Hernando, M Elena; Pascual, Mario; Salvador, Carlos H; García-Sáez, Gema; Rodríguez-Herrero, Agustín; Martínez-Sarriegui, Iñaki; Gómez, Enrique J

    2008-09-01

    The growing availability of continuous data from medical devices in diabetes management makes it crucial to define novel information technology architectures for efficient data storage, data transmission, and data visualization. The new paradigm of care demands the sharing of information in interoperable systems as the only way to support patient care in a continuum of care scenario. The technological platforms should support all the services required by the actors involved in the care process, located in different scenarios and managing diverse information for different purposes. This article presents basic criteria for defining flexible and adaptive architectures that are capable of interoperating with external systems, and integrating medical devices and decision support tools to extract all the relevant knowledge to support diabetes care.

  6. Cross border semantic interoperability for clinical research: the EHR4CR semantic resources and services

    PubMed Central

    Daniel, Christel; Ouagne, David; Sadou, Eric; Forsberg, Kerstin; Gilchrist, Mark Mc; Zapletal, Eric; Paris, Nicolas; Hussain, Sajjad; Jaulent, Marie-Christine; Kalra, Dipak

    2016-01-01

    With the development of platforms enabling the use of routinely collected clinical data in the context of international clinical research, scalable solutions for cross-border semantic interoperability need to be developed. Within the context of the IMI EHR4CR project, we first defined the requirements and evaluation criteria of the EHR4CR semantic interoperability platform and then developed the semantic resources and supportive services and tooling to assist hospital sites in standardizing their data to allow the execution of the project use cases. The experience gained from the evaluation of the EHR4CR platform, which accesses semantically equivalent data elements across 11 participating European EHR systems from 5 countries, demonstrated how far the mediation model and mapping efforts met the expected requirements of the project. Developers of semantic interoperability platforms are beginning to address a core set of requirements in order to reach the goal of cross-border semantic integration of data. PMID:27570649

  7. UHF (Ultra High Frequency) Military Satellite Communications Ground Equipment Interoperability.

    DTIC Science & Technology

    1986-10-06

    crisis management requires interoperability between various services. These short-term crises often arise from unforeseen circumstances in which...Scheduler Qualcomm has prepared an interoperability study for the JTC3A (Reference 15) as a TA/CE for USCINCLANT ROC 5-84 requirements. It has defined a...interoperability is fundamental. A number of operational crises have occurred where interoperable communications or the lack of interoperable

  8. On the formal definition of the systems' interoperability capability: an anthropomorphic approach

    NASA Astrophysics Data System (ADS)

    Zdravković, Milan; Luis-Ferreira, Fernando; Jardim-Goncalves, Ricardo; Trajanović, Miroslav

    2017-03-01

    The extended view of enterprise information systems in the Internet of Things (IoT) introduces additional complexity to the interoperability problems. In response to this, the problem of systems' interoperability is revisited by taking into account different aspects of philosophy, psychology, linguistics and artificial intelligence, namely by analysing the potential analogies between the processes of human and system communication. Then, the capability to interoperate, as a property of the system, is defined as a complex ability to seamlessly sense and perceive a stimulus from its environment (assumedly, a message from any other system), make an informed decision about this perception and, consequently, articulate a meaningful and useful action or response, based on this decision. Although this capability is defined on the basis of the existing interoperability theories, the proposed approach to its definition drops the assumption that the two interoperating systems must be aware of each other's co-existence. Thus, it establishes links between research on the interoperability of systems and on intelligent software agents, as one of the systems' digital identities.

  9. Enabling interoperability in planetary sciences and heliophysics: The case for an information model

    NASA Astrophysics Data System (ADS)

    Hughes, J. Steven; Crichton, Daniel J.; Raugh, Anne C.; Cecconi, Baptiste; Guinness, Edward A.; Isbell, Christopher E.; Mafi, Joseph N.; Gordon, Mitchell K.; Hardman, Sean H.; Joyner, Ronald S.

    2018-01-01

    The Planetary Data System has developed the PDS4 Information Model to enable interoperability across diverse science disciplines. The Information Model is based on an integration of International Organization for Standardization (ISO) level standards for trusted digital archives, information model development, and metadata registries. Whereas controlled vocabularies provide a basic level of interoperability through a common set of terms for communication between both machines and humans, the Information Model improves interoperability by means of an ontology that provides semantic information, or additional related context, for the terms. The Information Model was defined by a team of computer scientists and science experts from each of the diverse disciplines in the Planetary Science community, including Atmospheres, Geosciences, Cartography and Imaging Sciences, Navigation and Ancillary Information, Planetary Plasma Interactions, Ring-Moon Systems, and Small Bodies. The model was designed to be extensible beyond the Planetary Science community; for example, there are overlaps between certain PDS disciplines and the Heliophysics and Astrophysics disciplines. "Interoperability" can apply to many aspects of both the developer and the end-user experience, for example agency-to-agency, semantic-level, and application-level interoperability. We define these types of interoperability and focus on semantic-level interoperability, the type of interoperability most directly enabled by an information model.

  10. A cloud-based approach for interoperable electronic health records (EHRs).

    PubMed

    Bahga, Arshdeep; Madisetti, Vijay K

    2013-09-01

    We present a cloud-based approach for the design of interoperable electronic health record (EHR) systems. Cloud computing environments provide several benefits to all the stakeholders in the healthcare ecosystem (patients, providers, payers, etc.). Lack of data interoperability standards and solutions has been a major obstacle in the exchange of healthcare data between different stakeholders. We propose an EHR system, the cloud health information systems technology architecture (CHISTAR), that achieves semantic interoperability through the use of a generic design methodology, which uses a reference model that defines a general-purpose set of data structures and an archetype model that defines the clinical data attributes. CHISTAR application components are designed using the cloud component model approach, which comprises loosely coupled components that communicate asynchronously. In this paper, we describe the high-level design of CHISTAR and the approaches for semantic interoperability, data integration, and security.

  11. Global Interoperability of Broadband Networks (GIBN): Project Overview

    NASA Technical Reports Server (NTRS)

    DePaula, Ramon P.

    1998-01-01

    Various issues associated with the Global Interoperability of Broadband Networks (GIBN) are presented in viewgraph form. Specific topics include GIBN principles, objectives and goals, and background. GIBN/NASA status, the Transpacific High Definition Video experiment, GIBN experiment selection criteria, satellite industry involvement, and current experiments associated with GIBN are also discussed.

  12. An Interoperable System toward Cardiac Risk Stratification from ECG Monitoring

    PubMed Central

    Mora-Jiménez, Inmaculada; Ramos-López, Javier; Quintanilla Fernández, Teresa; García-García, Antonio; Díez-Mazuela, Daniel; García-Alberola, Arcadi

    2018-01-01

    Many indices have been proposed for cardiovascular risk stratification from electrocardiogram signal processing, yet they still see limited use in clinical practice. We created a system integrating the clinical definition of cardiac risk subdomains from ECGs and the use of diverse signal processing techniques. Three subdomains were defined from the joint analysis of the technical and clinical viewpoints. One subdomain was devoted to demographic and clinical data. The other two subdomains were intended to obtain widely defined risk indices from ECG monitoring: a simple domain (heart rate turbulence (HRT)) and a complex domain (heart rate variability (HRV)). Data provided by the three subdomains allowed for the generation of alerts of differing intensity and nature, as well as for the grouping and scrutiny of patients according to the established processing and risk-thresholding criteria. The implemented system was tested by connecting data from real-world in-hospital electronic health records and ECG monitoring, considering standards for syntactic interoperability (HL7 messages) and semantic interoperability (archetypes based on CEN/ISO EN13606 and SNOMED-CT). The system was able to provide risk indices and to generate alerts in the health records to support decision-making. Overall, the system allows for the agile interaction of research and clinical practice in the Holter-ECG-based cardiac risk domain. PMID:29494497
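
    As a concrete illustration of the syntactic layer this record mentions, the sketch below composes an HL7 v2-style ORU^R01 observation message carrying an HRV index. The segment layout follows the general HL7 v2 pattern (MSH/PID/OBR/OBX); the application names, identifiers, code values, and alert threshold are hypothetical and are not taken from the paper.

    ```python
    # Minimal sketch of an HL7 v2-style ORU^R01 message carrying a heart-rate-
    # variability (HRV) index. Illustrative only: the sending/receiving
    # application names, patient id, and SDNN cutoff are invented.

    def hrv_alert_message(patient_id: str, sdnn_ms: float) -> str:
        """Build a pipe-delimited ORU^R01 message flagging a low SDNN value."""
        abnormal_flag = "L" if sdnn_ms < 50.0 else "N"  # hypothetical cutoff
        segments = [
            "MSH|^~\\&|HOLTER|CARDIO|EHR|HOSPITAL|20180101120000||ORU^R01|MSG0001|P|2.5",
            f"PID|||{patient_id}",
            "OBR|1|||HRV^Heart rate variability panel",
            f"OBX|1|NM|SDNN^SDNN (24h)||{sdnn_ms:.1f}|ms|50-150|{abnormal_flag}|||F",
        ]
        return "\r".join(segments)  # HL7 v2 separates segments with carriage returns

    print(hrv_alert_message("123456", 38.2))
    ```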

  13. Personal Health Records: Is Rapid Adoption Hindering Interoperability?

    PubMed Central

    Studeny, Jana; Coustasse, Alberto

    2014-01-01

    The establishment of the Meaningful Use criteria has created a critical need for robust interoperability of health records. A universal definition of a personal health record (PHR) has not been agreed upon. Standardized code sets have been built for specific entities, but integration between them has not been supported. The purpose of this research study was to explore the hindrance and promotion of interoperability standards in relationship to PHRs to describe interoperability progress in this area. The study was conducted following the basic principles of a systematic review, with 61 articles used in the study. Lagging interoperability has stemmed from slow adoption by patients, creation of disparate systems due to rapid development to meet requirements for the Meaningful Use stages, and rapid early development of PHRs prior to the mandate for integration among multiple systems. Findings of this study suggest that deadlines for implementation to capture Meaningful Use incentive payments are supporting the creation of PHR data silos, thereby hindering the goal of high-level interoperability. PMID:25214822

  14. The Health Service Bus: an architecture and case study in achieving interoperability in healthcare.

    PubMed

    Ryan, Amanda; Eklund, Peter

    2010-01-01

    Interoperability in healthcare is a requirement for effective communication between entities, to ensure timely access to up-to-date patient information and medical knowledge, and thus facilitate consistent patient care. An interoperability framework called the Health Service Bus (HSB), based on the Enterprise Service Bus (ESB) middleware software architecture, is presented here as a solution to all three levels of interoperability as defined by the HL7 EHR Interoperability Work Group in their definitive white paper "Coming to Terms". A prototype HSB system, implemented on the Mule open-source ESB, is outlined and discussed, followed by a clinically based example.

  15. Maturity model for enterprise interoperability

    NASA Astrophysics Data System (ADS)

    Guédria, Wided; Naudet, Yannick; Chen, David

    2015-01-01

    Historically, progress occurs when entities communicate, share information and together create something that no one individually could do alone. Moving beyond people to machines and systems, interoperability is becoming a key factor of success in all domains. In particular, interoperability has become a challenge for enterprises, to exploit market opportunities, to meet their own objectives of cooperation or simply to survive in a growing competitive world where the networked enterprise is becoming a standard. Within this context, many research works have been conducted over the past few years and enterprise interoperability has become an important area of research, ensuring the competitiveness and growth of European enterprises. Among others, enterprises have to control their interoperability strategy and enhance their ability to interoperate. This is the purpose of the interoperability assessment. Assessing interoperability maturity allows a company to know its strengths and weaknesses in terms of interoperability with its current and potential partners, and to prioritise actions for improvement. The objective of this paper is to define a maturity model for enterprise interoperability that takes into account existing maturity models while extending the coverage of the interoperability domain. The assessment methodology is also presented. Both are demonstrated with a real case study.

  16. Chile's National Center for Health Information Systems: A Public-Private Partnership to Foster Health Care Information Interoperability.

    PubMed

    Capurro, Daniel; Echeverry, Aisen; Figueroa, Rosa; Guiñez, Sergio; Taramasco, Carla; Galindo, César; Avendaño, Angélica; García, Alejandra; Härtel, Steffen

    2017-01-01

    Despite the continuous technical advancements around health information standards, a critical component to their widespread adoption involves political agreement between a diverse set of stakeholders. Countries that have addressed this issue have used diverse strategies. In this vision paper we present the path that Chile is taking to establish a national program to implement health information standards and achieve interoperability. The Chilean government established an inter-agency program to define the current interoperability situation, existing gaps, barriers, and facilitators for interoperable health information systems. As an answer to the identified issues, the government decided to fund a consortium of Chilean universities to create the National Center for Health Information Systems. This consortium should encourage the interaction between all health care stakeholders, both public and private, to advance the selection of national standards and define certification procedures for software and human resources in health information technologies.

  17. Interoperability, Scaling, and the Digital Libraries Research Agenda.

    ERIC Educational Resources Information Center

    Lynch, Clifford; Garcia-Molina, Hector

    1996-01-01

    Summarizes reports and activities at the Information Infrastructure Technology and Applications workshop on digital libraries (Reston, Virginia, August 22, 1995). Defines digital library roles and identifies areas of needed research, including: interoperability; protocols for digital objects; collection management; interface design; human-computer…

  18. Interoperability and information discovery

    USGS Publications Warehouse

    Christian, E.

    2001-01-01

    In the context of information systems, there is interoperability when the distinctions between separate information systems are not a barrier to accomplishing a task that spans those systems. Interoperability so defined implies that there are commonalities among the systems involved and that one can exploit such commonalities to achieve interoperability. The challenge of a particular interoperability task is to identify relevant commonalities among the systems involved and to devise mechanisms that exploit those commonalities. The present paper focuses on the particular interoperability task of information discovery. The Global Information Locator Service (GILS) is described as a policy, standards, and technology framework for addressing interoperable information discovery on a global and long-term basis. While there are many mechanisms for people to discover and use all manner of data and information resources, GILS initiatives exploit certain key commonalities that seem to be sufficient to realize useful information discovery interoperability at a global, long-term scale. This paper describes ten of the specific commonalities that are key to GILS initiatives. It presents some of the practical implications for organizations in various roles: content provider, system engineer, intermediary, and searcher. The paper also provides examples of interoperable information discovery as deployed using GILS in four types of information communities: bibliographic, geographic, environmental, and government.

  19. Archive interoperability in the Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Genova, Françoise

    2003-02-01

    The main goals of Virtual Observatory projects are to build interoperability between astronomical on-line services, observatory archives, databases and results published in journals, and to develop tools permitting the best scientific usage of the very large data sets stored in observatory archives and produced by large surveys. The different Virtual Observatory projects collaborate to define common exchange standards, which are the key to a truly International Virtual Observatory: for instance, their first common milestone has been a standard allowing the exchange of tabular data, called VOTable. The Interoperability Work Area of the European Astrophysical Virtual Observatory project aims at networking European archives, by building a prototype using the CDS VizieR and Aladin tools, and at defining basic rules to help archive providers implement interoperability. The prototype is accessible for scientific usage, to get user feedback (and science results!) at an early stage of the project. The ISO archive participates very actively in this endeavour, and more generally in information networking. The on-going inclusion of the ISO log in SIMBAD will allow higher-level links for users.
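
    Since VOTable is the tabular exchange standard highlighted here, the following minimal sketch writes a two-row catalogue as a VOTable using astropy's generic table writer (assuming astropy is installed; the column names and values are invented for illustration).

    ```python
    # Write a tiny catalogue as a VOTable, the Virtual Observatory projects'
    # first common exchange standard. Requires astropy; the columns and
    # values below are illustrative, not from a real archive.
    from astropy.table import Table

    catalogue = Table(
        {
            "source_id": ["M31", "M33"],
            "ra": [10.6847, 23.4621],   # right ascension in degrees
            "dec": [41.2690, 30.6599],  # declination in degrees
        }
    )
    # Any VO-compliant tool (e.g. the VizieR/Aladin ecosystem) can read this.
    catalogue.write("catalogue.vot", format="votable", overwrite=True)
    ```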

  20. Space Network Interoperability Panel (SNIP) study

    NASA Technical Reports Server (NTRS)

    Ryan, Thomas; Lenhart, Klaus; Hara, Hideo

    1991-01-01

    The Space Network Interoperability Panel (SNIP) study is a tripartite study that involves the National Aeronautics and Space Administration (NASA), the European Space Agency (ESA), and the National Space Development Agency (NASDA) of Japan. SNIP involves an ongoing interoperability study of the Data Relay Satellite (DRS) Systems of the three organizations. The study is broken down into two parts: Phase One deals with S-band (2 GHz) interoperability, and Phase Two deals with Ka-band (20/30 GHz) interoperability in addition to S-band. In 1987 the SNIP formed a Working Group to define and study operations concepts and technical subjects to assure compatibility of the international data relay systems. Since that time a number of Panel and Working Group meetings have been held to continue the study. Interoperability is of interest to the three agencies because it offers a number of potential operational and economic benefits. This paper presents the history and status of the SNIP study.

  1. Semantic interoperability--HL7 Version 3 compared to advanced architecture standards.

    PubMed

    Blobel, B G M E; Engel, K; Pharow, P

    2006-01-01

    To meet the challenge of high-quality and efficient care, highly specialized and distributed healthcare establishments have to communicate and co-operate in a semantically interoperable way. Information and communication technology must be open, flexible, scalable, knowledge-based and service-oriented, as well as secure and safe. To enable semantic interoperability, a unified process for defining and implementing the architecture, i.e. the structure and functions of the cooperating systems' components, as well as the approach to knowledge representation, i.e. the information used and its interpretation, algorithms, etc., has to be defined in a harmonized way. Deploying the Generic Component Model, systems and their components, underlying concepts and applied constraints must be formally modeled, strictly separating platform-independent from platform-specific models. As HL7 Version 3 claims to be the most successful standard for semantic interoperability, HL7 has been analyzed regarding the requirements for model-driven, service-oriented design of semantically interoperable information systems, thereby moving from a communication to an architecture paradigm. The approach is compared with advanced architectural approaches for information systems such as OMG's CORBA 3 or EHR systems such as GEHR/openEHR and the CEN EN 13606 Electronic Health Record Communication standard. HL7 Version 3 is maturing towards an architectural approach to semantic interoperability. Despite current differences, there is close collaboration between the teams involved, guaranteeing convergence between the competing approaches.

  2. A loosely coupled framework for terminology controlled distributed EHR search for patient cohort identification in clinical research.

    PubMed

    Zhao, Lei; Lim Choi Keung, Sarah N; Taweel, Adel; Tyler, Edward; Ogunsina, Ire; Rossiter, James; Delaney, Brendan C; Peterson, Kevin A; Hobbs, F D Richard; Arvanitis, Theodoros N

    2012-01-01

    Heterogeneous data models and coding schemes for electronic health records present challenges for automated search across distributed data sources. This paper describes a loosely coupled software framework based on the terminology controlled approach to enable the interoperation between the search interface and heterogeneous data sources. Software components interoperate via common terminology service and abstract criteria model so as to promote component reuse and incremental system evolution.

  3. Generic Educational Knowledge Representation for Adaptive and Cognitive Systems

    ERIC Educational Resources Information Center

    Caravantes, Arturo; Galan, Ramon

    2011-01-01

    The interoperability of educational systems, encouraged by the development of specifications, standards and tools related to the Semantic Web is limited to the exchange of information in domain and student models. High system interoperability requires that a common framework be defined that represents the functional essence of educational systems.…

  4. Interoperability Is the Foundation for Successful Internet Telephony.

    ERIC Educational Resources Information Center

    Fromm, Larry

    1997-01-01

    More than 40 leading computer and telephony companies have united to lead the charge toward open standards and universal interoperability for Internet telephony products. The Voice over IP (VoIP) Forum is working to define technical guidelines for two-party, real-time communications over IP networks, including provisions for compatibility with…

  5. Functional requirements document for NASA/MSFC Earth Science and Applications Division: Data and information system (ESAD-DIS). Interoperability, 1992

    NASA Technical Reports Server (NTRS)

    Stephens, J. Briscoe; Grider, Gary W.

    1992-01-01

    These Earth Science and Applications Division-Data and Information System (ESAD-DIS) interoperability requirements are designed to quantify the Earth Science and Applications Division's hardware and software requirements in terms of communications between personal and visualization workstations and mainframe computers. The electronic mail requirements and local area network (LAN) requirements are addressed. These interoperability requirements are top-level requirements framed around defining the existing ESAD-DIS interoperability and projecting known near-term requirements for both operational support and management planning. Detailed requirements will be submitted on a case-by-case basis. This document is also intended as an overview of ESAD-DIS interoperability for newcomers and for management not familiar with these activities, and as background documentation to support requests for resources and support requirements.

  6. Policy brief on the current status of certification of electronic Health Records in the US and Europe.

    PubMed

    De Moor, Georges; O'Brien, John; Fridsma, Doug; Bean, Carol; Devlies, Jos; Cusack, Caitlin M; Bloomrosen, Meryl; Lorenzi, Nancy; Coorevits, Pascal

    2011-01-01

    If Electronic Health Record systems are to provide an effective contribution to healthcare, a set of benchmarks needs to be established to ensure quality control and interoperability of systems. This paper outlines the prevailing status of EHR certification in the US and the EU, compares and contrasts established schemes, and identifies opportunities for convergence of activity in the domain, designed to advance certification endeavours generally. Several EU Member States have in the past proceeded with EHR systems quality labeling and/or certification, but these efforts differ in scope, in the legal framework under which they operate, in policies (legislation and financial incentives), in organization, and perhaps most importantly in the quality criteria used for benchmarking. Harmonization, therefore, became a must. Now, through EuroRec (with approaches ranging from self-assessment to third-party certification depending on the level of confidence needed) and its Seals, this has become achievable for EHR systems across the whole of Europe. The US HITECH Act also attempts to create incentives for all hospitals and eligible providers to adopt and use electronic information. A centerpiece of the Act is to put in place strong financial incentives to adopt and meaningfully use EHRs. The HHS/EHR Certification Programme makes use of ISO/IEC 170XX standards for accreditation, testing and certification. The approved test method addresses the functional and interoperability requirements defined in the Final Rule criteria and standards. To date, six Authorized Testing and Certification Bodies (ATCBs) are testing and certifying products in the US.

  7. End-to-end interoperability and workflows from building architecture design to one or more simulations

    DOEpatents

    Chao, Tian-Jy; Kim, Younghun

    2015-02-10

    An end-to-end interoperability and workflows from building architecture design to one or more simulations, in one aspect, may comprise establishing a BIM enablement platform architecture. A data model defines data entities and entity relationships for enabling the interoperability and workflows. A data definition language may be implemented that defines and creates a table schema of a database associated with the data model. Data management services and/or application programming interfaces may be implemented for interacting with the data model. Web services may also be provided for interacting with the data model via the Web. A user interface may be implemented that communicates with users and uses the BIM enablement platform architecture, the data model, the data definition language, data management services and application programming interfaces to provide functions to the users to perform work related to building information management.
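
    To make the data-model and data-definition-language idea above concrete, here is a hedged sketch of how building entities and their relationships might be realized as a table schema. The tables, columns, and sample rows are invented for illustration and are not the patented design.

    ```python
    # Illustrative only: a toy "data model" of building entities and entity
    # relationships, created through a SQL data definition language the way
    # the abstract describes. Uses Python's built-in sqlite3.
    import sqlite3

    DDL = """
    CREATE TABLE entity (
        entity_id   INTEGER PRIMARY KEY,
        entity_type TEXT NOT NULL,   -- e.g. 'Zone', 'Wall', 'Sensor'
        name        TEXT
    );
    CREATE TABLE entity_relationship (
        parent_id   INTEGER REFERENCES entity(entity_id),
        child_id    INTEGER REFERENCES entity(entity_id),
        rel_type    TEXT NOT NULL    -- e.g. 'contains', 'adjacent_to'
    );
    """

    conn = sqlite3.connect(":memory:")
    conn.executescript(DDL)
    conn.execute("INSERT INTO entity VALUES (1, 'Zone', 'Lobby')")
    conn.execute("INSERT INTO entity VALUES (2, 'Wall', 'Lobby-North')")
    conn.execute("INSERT INTO entity_relationship VALUES (1, 2, 'contains')")
    print(conn.execute("SELECT * FROM entity_relationship").fetchall())
    ```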

  8. Distributed GIS Systems, Open Specifications and Interoperability: How do They Relate to the Sustainable Management of Natural Resources?

    Treesearch

    Rafael Moreno-Sanchez

    2006-01-01

    The aim of this paper is to provide a conceptual framework for the session: "The role of web-based Geographic Information Systems in supporting sustainable management." The concepts of sustainability, sustainable forest management, Web Services, Distributed Geographic Information Systems, interoperability, Open Specifications, and Open Source Software are defined...

  9. Contribution of Clinical Archetypes, and the Challenges, towards Achieving Semantic Interoperability for EHRs.

    PubMed

    Tapuria, Archana; Kalra, Dipak; Kobayashi, Shinji

    2013-12-01

    The objective is to introduce the 'clinical archetype', which is a formal and agreed way of representing clinical information to ensure interoperability across and within Electronic Health Records (EHRs). The paper also presents the challenges of building quality-labeled clinical archetypes and of achieving semantic interoperability between EHRs. Twenty years of international research, various European healthcare informatics projects and the pioneering work of the openEHR Foundation have led to the following results. The requirements for EHR information architectures have been consolidated within ISO 18308 and adopted within the ISO 13606 EHR interoperability standard. However, a generic EHR architecture cannot ensure that the clinical meaning of information from heterogeneous sources can be reliably interpreted by receiving systems and services. Therefore, clinical models called 'clinical archetypes' are required to formalize the representation of clinical information within the EHR. Part 2 of ISO 13606 defines how archetypes should be formally represented. The current challenge is to grow clinical communities to build a library of clinical archetypes and to identify how evidence of best practice and multi-professional clinical consensus should best be combined to define archetypes at the optimal level of granularity and specificity and to quality-label them for wide adoption. Standardizing clinical terms within EHRs using a clinical terminology such as the Systematized Nomenclature of Medicine Clinical Terms (SNOMED CT) is also a challenge. Clinical archetypes will play an important role in achieving semantic interoperability within EHRs. Attempts are being made to explore the design and adoption challenges for clinical archetypes.

  10. An Interoperability Framework and Capability Profiling for Manufacturing Software

    NASA Astrophysics Data System (ADS)

    Matsuda, M.; Arai, E.; Nakano, N.; Wakai, H.; Takeda, H.; Takata, M.; Sasaki, H.

    ISO/TC184/SC5/WG4 is working on ISO 16100: Manufacturing software capability profiling for interoperability. This paper reports on a manufacturing software interoperability framework and a capability profiling methodology which were proposed and developed through this international standardization activity. Within the context of a manufacturing application, a manufacturing software unit is considered to be capable of performing a specific set of functions defined by a manufacturing software system architecture. A manufacturing software interoperability framework consists of a set of elements and rules for describing the capability of software units to support the requirements of a manufacturing application. The capability profiling methodology makes use of the domain-specific attributes and methods associated with each specific software unit to describe capability profiles in terms of unit name, manufacturing functions, and other needed class properties. In this methodology, manufacturing software requirements are expressed in terms of software unit capability profiles.
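
    The sketch below renders such a capability profile (unit name, manufacturing functions, and class properties) as a simple data structure with a requirement check. It is a schematic reading of the abstract's description, not the normative ISO 16100 profile schema.

    ```python
    # Schematic capability profile for a manufacturing software unit, in the
    # spirit of the abstract. The fields and the example unit are invented.
    from dataclasses import dataclass, field

    @dataclass
    class CapabilityProfile:
        unit_name: str
        manufacturing_functions: list[str]
        properties: dict[str, str] = field(default_factory=dict)

        def supports(self, required: set[str]) -> bool:
            """True if this unit covers the requested set of functions."""
            return required <= set(self.manufacturing_functions)

    scheduler = CapabilityProfile(
        unit_name="CellScheduler",
        manufacturing_functions=["dispatching", "sequencing"],
        properties={"vendor": "ExampleCo", "version": "2.1"},
    )
    # Matching a requirement expressed as a set of needed functions:
    print(scheduler.supports({"sequencing"}))  # True
    ```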

  11. Dynamic Business Networks: A Headache for Sustainable Systems Interoperability

    NASA Astrophysics Data System (ADS)

    Agostinho, Carlos; Jardim-Goncalves, Ricardo

    Collaborative networked environments emerged with the spread of the internet, contributing to overcoming past communication barriers and identifying interoperability as an essential property. When achieved seamlessly, efficiency is increased in the entire product life cycle. Nowadays, most organizations try to attain interoperability by establishing peer-to-peer mappings with their different partners or, in optimized networks, by using international standard models as the core for information exchange. In current industrial practice, mappings are defined only once, and the morphisms that represent them are hardcoded in the enterprise systems. This solution has been effective for static environments, where enterprise and product models are valid for decades. However, with an increasingly complex and dynamic global market, models change frequently to answer new customer requirements. This paper draws concepts from complex systems science and proposes a framework for sustainable systems interoperability in dynamic networks, enabling different organizations to evolve at their own rate.

  12. Large scale healthcare data integration and analysis using the semantic web.

    PubMed

    Timm, John; Renly, Sondra; Farkash, Ariel

    2011-01-01

    Healthcare data interoperability can only be achieved when the semantics of the content is well defined and consistently implemented across heterogeneous data sources. Achieving these objectives of interoperability requires the collaboration of experts from several domains. This paper describes tooling that integrates Semantic Web technologies with common tools to facilitate cross-domain collaborative development for the purposes of data interoperability. Our approach is divided into stages of data harmonization and representation, model transformation, and instance generation. We applied our approach in Hypergenes, an EU-funded project, where we used our method on the Essential Hypertension disease model using a CDA template. Our domain-expert partners include clinical providers, clinical domain researchers, healthcare information technology experts, and a variety of clinical data consumers. We show that bringing Semantic Web technologies into the healthcare interoperability toolkit increases opportunities for beneficial collaboration, thus improving patient care and clinical research outcomes.

  13. An Architecture for Semantically Interoperable Electronic Health Records.

    PubMed

    Toffanello, André; Gonçalves, Ricardo; Kitajima, Adriana; Puttini, Ricardo; Aguiar, Atualpa

    2017-01-01

    Despite the increasing adoption of electronic health records, the challenge of semantic interoperability remains unsolved. The fact that different parties can exchange messages does not mean they can understand the underlying clinical meaning; such understanding cannot simply be assumed. This work introduces an architecture designed to achieve semantic interoperability in a way in which organizations that follow different policies may still share medical information through a common infrastructure comparable to an ecosystem, whose organisms are exemplified within the Brazilian scenario. The proposed approach describes a service-oriented design with modules adaptable to different contexts. We also discuss the establishment of an enterprise service bus to mediate a health infrastructure defined on top of international standards, such as openEHR and IHE. Moreover, we argue that, in order to achieve truly semantic interoperability in a wide sense, a proper profile must be published and maintained.

  14. Contribution of Clinical Archetypes, and the Challenges, towards Achieving Semantic Interoperability for EHRs

    PubMed Central

    Tapuria, Archana; Kalra, Dipak; Kobayashi, Shinji

    2013-01-01

    Objectives The objective is to introduce the 'clinical archetype', which is a formal and agreed way of representing clinical information to ensure interoperability across and within Electronic Health Records (EHRs). The paper also presents the challenges of building quality-labeled clinical archetypes and of achieving semantic interoperability between EHRs. Methods Twenty years of international research, various European healthcare informatics projects and the pioneering work of the openEHR Foundation have led to the following results. Results The requirements for EHR information architectures have been consolidated within ISO 18308 and adopted within the ISO 13606 EHR interoperability standard. However, a generic EHR architecture cannot ensure that the clinical meaning of information from heterogeneous sources can be reliably interpreted by receiving systems and services. Therefore, clinical models called 'clinical archetypes' are required to formalize the representation of clinical information within the EHR. Part 2 of ISO 13606 defines how archetypes should be formally represented. The current challenge is to grow clinical communities to build a library of clinical archetypes and to identify how evidence of best practice and multi-professional clinical consensus should best be combined to define archetypes at the optimal level of granularity and specificity and to quality-label them for wide adoption. Standardizing clinical terms within EHRs using a clinical terminology such as the Systematized Nomenclature of Medicine Clinical Terms (SNOMED CT) is also a challenge. Conclusions Clinical archetypes will play an important role in achieving semantic interoperability within EHRs. Attempts are being made to explore the design and adoption challenges for clinical archetypes. PMID:24523993

  15. Achieving Interoperability Through Base Registries for Governmental Services and Document Management

    NASA Astrophysics Data System (ADS)

    Charalabidis, Yannis; Lampathaki, Fenareti; Askounis, Dimitris

    As digital infrastructures increase their presence worldwide, following the efforts of governments to provide citizens and businesses with high-quality one-stop services, there is a growing need for the systematic management of those newly defined and constantly transforming processes and electronic documents. E-government Interoperability Frameworks usually cater to the technical standards of e-government systems interconnection, but do not address service composition and use by citizens, businesses, or other administrations.

  16. Applying Registry Services to Spaceflight Technologies to Aid in the Assignment of Assigned Numbers to Disparate Systems and Their Technologies to Further Enable Interoperability

    NASA Technical Reports Server (NTRS)

    Bradford, Robert N.; Nichols, Kelvin F.

    2006-01-01

    To date, very little effort has been made to provide interoperability between the projects of the various space agencies. To get to the Moon and beyond, systems must interoperate effectively. Providing interoperability will require standardization and registries of various technologies, and these registries will be created as they relate to space flight. With the new NASA Moon/Mars initiative, a requirement to standardize and control the naming conventions of very disparate systems and technologies is emerging. The need to provide numbering for the many processes, schemas, vehicles, robots, space suits and technologies (e.g. versions), to name a few, in the highly complex Constellation Initiative is imperative. The number of corporations, developer personnel, system interfaces, and people interfaces will require standardization and registries on a scale not currently envisioned. It would take only one exception (stove-piped system development) to weaken, if not destroy, interoperability. To start, a standardized registry process must be defined that allows many differing engineers, organizations and operators to easily access disparate registry information across numerous technological and scientific disciplines. Once registries are standardized, registry support in terms of setup and operations, resolution of conflicts between registries, and other issues will need to be addressed. Registries should not be confused with repositories: no end-user data is "stored" in a registry, nor is a registry a configuration control system. Once a registry standard is created and approved, the technologies that should be registered must be identified and prioritized. In this paper, we identify and define a registry process that is compatible with the Constellation Initiative and other non-related space activities and organizations. We then identify and define the various technologies that should use a registry to provide interoperability. The first set of technologies will be those currently in need of expansion, namely the assignment of satellite designations and the process which controls assignments. Second, we will analyze the technologies currently standardized under the Consultative Committee for Space Data Systems (CCSDS) banner. Third, we will analyze the current CCSDS working group and birds-of-a-feather activities to ascertain registry requirements. Lastly, we will identify technologies that are either currently under the auspices of another

  17. Assessment of Life Cycle Information Exchanges (LCie): Understanding the Value-Added Benefit of a COBie Process

    DTIC Science & Technology

    2013-10-01

    exchange (COBie), Building Information Modeling (BIM), value-added analysis, business processes, project management...equipment. The innovative aspect of Building Information Modeling (BIM) is that it creates a computable building description. The ability to use a...interoperability. In order for the building information to be interoperable, it must also conform to a common data model, or schema, that defines the class

  18. Transforming Collaborative Process Models into Interface Process Models by Applying an MDA Approach

    NASA Astrophysics Data System (ADS)

    Lazarte, Ivanna M.; Chiotti, Omar; Villarreal, Pablo D.

    Collaborative business models among enterprises require defining collaborative business processes. Enterprises implement B2B collaborations to execute these processes. In B2B collaborations the integration and interoperability of processes and systems of the enterprises are required to support the execution of collaborative processes. From a collaborative process model, which describes the global view of the enterprise interactions, each enterprise must define the interface process that represents the role it performs in the collaborative process in order to implement the process in a Business Process Management System. Hence, in this work we propose a method for the automatic generation of the interface process model of each enterprise from a collaborative process model. This method is based on a Model-Driven Architecture to transform collaborative process models into interface process models. By applying this method, interface processes are guaranteed to be interoperable and defined according to a collaborative process.

  19. Current state of the mass storage system reference model

    NASA Technical Reports Server (NTRS)

    Coyne, Robert

    1993-01-01

    The IEEE SSSWG was chartered in May 1990 to abstract the hardware and software components of existing and emerging storage systems and to define the software interfaces between these components. The immediate goal is the decomposition of a storage system into interoperable functional modules which vendors can offer as separate commercial products. The ultimate goal is to develop interoperable standards which define the software interfaces and, in the distributed case, the associated protocols for each of the architectural modules in the model. The topics are presented in viewgraph form and include the following: IEEE SSSWG organization; IEEE SSSWG subcommittees & chairs; IEEE standards activity board; layered view of the reference model; layered access to storage services; IEEE SSSWG emphasis; and features for MSSRM version 5.

  20. Enabling Interoperable Space Robots With the Joint Technical Architecture for Robotic Systems (JTARS)

    NASA Technical Reports Server (NTRS)

    Bradley, Arthur; Dubowsky, Steven; Quinn, Roger; Marzwell, Neville

    2005-01-01

    Robots that operate independently of one another will not be adequate to accomplish the future exploration tasks of long-distance autonomous navigation, habitat construction, resource discovery, and material handling. Such activities will require that systems widely share information, plan and divide complex tasks, share common resources, and physically cooperate to manipulate objects. Recognizing the need for interoperable robots to accomplish the new exploration initiative, NASA's Office of Exploration Systems Research & Technology recently funded the development of the Joint Technical Architecture for Robotic Systems (JTARS). The JTARS charter is to identify the interface standards necessary to achieve interoperability among space robots. A JTARS working group (JTARS-WG) has been established, comprising recognized leaders in the field of space robotics, including representatives from seven NASA centers along with academia and private industry. The working group's early accomplishments include addressing key issues required for interoperability, defining which systems are within the project's scope, and framing the JTARS manuals around classes of robotic systems.

  1. Interoperability in planetary research for geospatial data analysis

    NASA Astrophysics Data System (ADS)

    Hare, Trent M.; Rossi, Angelo P.; Frigeri, Alessandro; Marmo, Chiara

    2018-01-01

    For more than a decade there has been a push in the planetary science community to support interoperable methods for accessing and working with geospatial data. Common geospatial data products for planetary research include image mosaics, digital elevation or terrain models, geologic maps, geographic location databases (e.g., craters, volcanoes) or any data that can be tied to the surface of a planetary body (including moons, comets or asteroids). Several U.S. and international cartographic research institutions have converged on mapping standards that embrace standardized geospatial image formats, geologic mapping conventions, U.S. Federal Geographic Data Committee (FGDC) cartographic and metadata standards, and notably on-line mapping services as defined by the Open Geospatial Consortium (OGC). The latter include defined standards such as the OGC Web Map Service (simple image maps), Web Map Tile Service (cached image tiles), Web Feature Service (feature streaming), Web Coverage Service (rich scientific data streaming), and Catalog Service for the Web (data searching and discoverability). While these standards were developed for application to Earth-based data, they can be just as valuable for the planetary domain. Another initiative, called VESPA (Virtual European Solar and Planetary Access), will marry several of the above geoscience standards with astronomy-based standards as defined by the International Virtual Observatory Alliance (IVOA). This work outlines the current state of interoperability initiatives in use or in the process of being researched within the planetary geospatial community.
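
    As a brief illustration of how the OGC services listed above are invoked in practice, the sketch below builds a standard WMS 1.3.0 GetMap request; the endpoint and layer name are placeholders, not a real planetary service.

    ```python
    # Construct an OGC Web Map Service (WMS) GetMap URL. The endpoint and
    # layer are hypothetical; the query parameters are standard WMS 1.3.0.
    from urllib.parse import urlencode

    WMS_ENDPOINT = "https://maps.example.org/mars/wms"  # placeholder server

    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": "mola_shaded_relief",                 # hypothetical layer
        "CRS": "EPSG:4326",
        "BBOX": "-90,-180,90,180",
        "WIDTH": 1024,
        "HEIGHT": 512,
        "FORMAT": "image/png",
    }
    print(f"{WMS_ENDPOINT}?{urlencode(params)}")
    ```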

  2. Simplifying HL7 Version 3 messages.

    PubMed

    Worden, Robert; Scott, Philip

    2011-01-01

    HL7 Version 3 offers a semantically robust method for healthcare interoperability but has been criticized as overly complex to implement. This paper reviews initiatives to simplify HL7 Version 3 messaging and presents a novel approach based on semantic mapping. From user-supplied mapping definitions, precise transforms between simple and full messages are automatically generated. Systems can be interfaced with the simple messages and achieve interoperability with full Version 3 messages through the transforms. This reduces the cost of HL7 interfacing and should encourage better uptake of HL7 Version 3 and CDA.

  3. Data interoperability between European Environmental Research Infrastructures and their contribution to global data networks

    NASA Astrophysics Data System (ADS)

    Kutsch, W. L.; Zhao, Z.; Hardisty, A.; Hellström, M.; Chin, Y.; Magagna, B.; Asmi, A.; Papale, D.; Pfeil, B.; Atkinson, M.

    2017-12-01

    Environmental Research Infrastructures (ENVRIs) are expected to become important pillars not only for supporting their own scientific communities, but also (a) for inter-disciplinary research and (b) for the European Earth Observation Program Copernicus as a contribution to the Global Earth Observation System of Systems (GEOSS) and global thematic data networks. As such, it is very important that the data-related activities of the ENVRIs are well integrated. This requires common policies, models and e-infrastructure to optimise technological implementation, define workflows, and ensure coordination, harmonisation, integration and interoperability of data, applications and other services. The key is interoperating common metadata systems, utilising a richer metadata model as the 'switchboard' for interoperation, with formal syntax and declared semantics. The metadata characterises data, services, users and ICT resources (including sensors and detectors). The European cluster project ENVRIplus has developed a reference model (ENVRI RM) for a common data infrastructure architecture to promote interoperability among ENVRIs. The presentation will provide an overview of recent progress and give examples of the integration of ENVRI data into global integration networks.

  4. Relevance of eHealth standards for big data interoperability in radiology and beyond.

    PubMed

    Marcheschi, Paolo

    2017-06-01

    The aim of this paper is to report on the implementation of radiology and related information technology standards to feed big data repositories, and so to create a solid substrate on which analysis software can operate. Digital Imaging and Communications in Medicine (DICOM) and Health Level 7 (HL7) are the major standards for radiology and medical information technology. They define formats and protocols to transmit medical images, signals, and patient data inside and outside hospital facilities. These standards can be implemented as they stand, but big data expectations are stimulating a new approach that simplifies data collection and interoperability and seeks to reduce the time to full implementation inside health organizations. The Virtual Medical Record, DICOM Structured Reporting, and HL7 Fast Healthcare Interoperability Resources (FHIR) are changing the way medical data are shared among organizations, and they will be the keys to big data interoperability. Until we find simple and comprehensive methods to store and disseminate detailed information on a patient's health, we will not be able to get optimal results from the analysis of those data.
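
    To show the simpler access pattern FHIR brings to the data-collection problem described above, here is a minimal sketch that searches a FHIR server for one Patient resource over plain REST and JSON. It assumes the 'requests' package and uses a public HAPI FHIR test server; any conformant endpoint behaves the same way.

    ```python
    # Fetch one Patient resource from a FHIR R4 endpoint. The response is a
    # FHIR Bundle in JSON. The base URL is a public test server, used here
    # only as a stand-in for a hospital-side FHIR facade.
    import requests

    FHIR_BASE = "https://hapi.fhir.org/baseR4"

    response = requests.get(
        f"{FHIR_BASE}/Patient",
        params={"_count": 1},
        headers={"Accept": "application/fhir+json"},
        timeout=30,
    )
    response.raise_for_status()
    bundle = response.json()
    for entry in bundle.get("entry", []):
        resource = entry["resource"]
        print(resource["resourceType"], resource["id"])
    ```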

  5. Application-Level Interoperability Across Grids and Clouds

    NASA Astrophysics Data System (ADS)

    Jha, Shantenu; Luckow, Andre; Merzky, Andre; Erdely, Miklos; Sehgal, Saurabh

    Application-level interoperability is defined as the ability of an application to utilize multiple distributed heterogeneous resources. Such interoperability is becoming increasingly important with growing volumes of data, multiple sources of data, and multiple resource types. The primary aim of this chapter is to understand the different ways in which application-level interoperability can be provided across distributed infrastructure. We achieve this by (i) using the canonical wordcount application, based on an enhanced version of MapReduce that scales out across clusters, clouds, and HPC resources, (ii) establishing how SAGA enables the execution of the wordcount application using MapReduce and other programming models such as Sphere concurrently, and (iii) demonstrating the scale-out of ensemble-based biomolecular simulations across multiple resources. We show user-level control of the relative placement of compute and data, and provide simple performance measures and analysis of SAGA-MapReduce when using multiple, heterogeneous infrastructures concurrently for the same problem instance. Finally, we discuss Azure and some of the system-level abstractions that it provides, and show how it is used to support ensemble-based biomolecular simulations.
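
    The wordcount pattern the chapter builds on can be sketched in a few lines; the toy below runs in a single process, whereas SAGA-MapReduce distributes the same map and reduce steps across heterogeneous resources:

    ```python
    from collections import Counter
    from functools import reduce

    # Canonical wordcount as map + reduce: map each chunk to local counts,
    # then merge. A distributed runtime farms these steps out to workers.
    def map_chunk(chunk: str) -> Counter:
        return Counter(chunk.split())

    def reduce_counts(a: Counter, b: Counter) -> Counter:
        a.update(b)
        return a

    chunks = ["the quick brown fox", "the lazy dog", "the fox"]
    total = reduce(reduce_counts, map(map_chunk, chunks), Counter())
    print(total.most_common(3))
    ```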

  6. Rail-CR : railroad cognitive radio.

    DOT National Transportation Integrated Search

    2012-12-01

    Robust, reliable, and interoperable wireless communication devices or technologies are vital to the success of positive train control (PTC) systems. Accordingly, the railway industry has started adopting software-defined radios (SDRs) for packet-data...

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chao, Tian-Jy; Kim, Younghun

    An end-to-end interoperability and workflows from building architecture design to one or more simulations, in one aspect, may comprise establishing a BIM enablement platform architecture. A data model defines data entities and entity relationships for enabling the interoperability and workflows. A data definition language may be implemented that defines and creates a table schema of a database associated with the data model. Data management services and/or application programming interfaces may be implemented for interacting with the data model. Web services may also be provided for interacting with the data model via the Web. A user interface may be implemented that communicates with users and uses the BIM enablement platform architecture, the data model, the data definition language, data management services and application programming interfaces to provide functions to the users to perform work related to building information management.
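
    As a hypothetical illustration of a data definition language creating a table schema for such a data model, the sketch below issues SQL DDL through Python's sqlite3 module; the entity and column names are invented:

    ```python
    import sqlite3

    # Toy BIM-style entity table with a self-referencing relationship,
    # standing in for the data model's entities and entity relationships.
    conn = sqlite3.connect(":memory:")
    conn.execute(
        """CREATE TABLE building_element (
               element_id   INTEGER PRIMARY KEY,
               element_type TEXT NOT NULL,      -- e.g. 'wall', 'window'
               parent_id    INTEGER REFERENCES building_element(element_id)
           )"""
    )
    conn.execute("INSERT INTO building_element VALUES (?, ?, ?)",
                 (1, "wall", None))
    print(conn.execute("SELECT * FROM building_element").fetchall())
    ```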

  8. Combining Archetypes with Fast Health Interoperability Resources in Future-proof Health Information Systems.

    PubMed

    Bosca, Diego; Moner, David; Maldonado, Jose Alberto; Robles, Montserrat

    2015-01-01

    Messaging standards, and specifically HL7 v2, are heavily used for the communication and interoperability of Health Information Systems. HL7 FHIR was created as an evolution of the messaging standards to achieve semantic interoperability. FHIR is in some respects similar to other approaches, such as the dual-model methodology, as both are based on the precise modeling of clinical information. In this paper, we demonstrate how the dual-model methodology can be applied to standards like FHIR. We show the usefulness of this approach for data transformation between FHIR and other specifications such as HL7 CDA, EN ISO 13606, and openEHR. We also discuss the advantages and disadvantages of defining archetypes over FHIR, and the consequences and outcomes of this approach. Finally, we exemplify this approach by creating a testing data server that supports both FHIR resources and archetypes.

  9. Clinical data integration model. Core interoperability ontology for research using primary care data.

    PubMed

    Ethier, J-F; Curcin, V; Barton, A; McGilchrist, M M; Bastiaens, H; Andreasson, A; Rossiter, J; Zhao, L; Arvanitis, T N; Taweel, A; Delaney, B C; Burgun, A

    2015-01-01

    This article is part of the Focus Theme of Methods of Information in Medicine on "Managing Interoperability and Complexity in Health Systems". Primary care data is the single richest source of routine health care data. However, its use, both in research and clinical work, often requires data from multiple clinical sites, clinical trials databases, and registries. Data integration and interoperability are therefore of utmost importance. TRANSFoRm's general approach relies on a unified interoperability framework, described in a previous paper. We developed a core ontology for an interoperability framework based on data mediation. This article presents how such an ontology, the Clinical Data Integration Model (CDIM), can be designed to support, in conjunction with appropriate terminologies, biomedical data federation within TRANSFoRm, an EU FP7 project that aims to develop the digital infrastructure for a learning healthcare system in European primary care. TRANSFoRm utilizes a unified structural/terminological interoperability framework based on the local-as-view mediation paradigm. Such an approach mandates that the global information model describe the domain of interest independently of the data sources to be explored. Following a requirement analysis process, no existing ontology focusing on primary care research was identified, and we therefore designed a realist ontology based on Basic Formal Ontology to support our framework, in collaboration with the various terminologies used in primary care. The resulting ontology has 549 classes and 82 object properties and is used to support data integration for TRANSFoRm's use cases. Concepts identified by researchers were successfully expressed in queries using CDIM and pertinent terminologies. As an example, we illustrate how, in TRANSFoRm, the Query Formulation Workbench can capture eligibility criteria in a computable representation based on CDIM. A unified mediation approach to semantic interoperability provides a flexible and extensible framework for all types of interaction between health record systems and research systems. CDIM, as the core ontology of such an approach, enables simplicity and consistency of design across the heterogeneous software landscape and can support the specific needs of EHR-driven phenotyping research using primary care data.
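
    A minimal sketch of the mediation idea, using the rdflib library with invented ontology terms (not the real CDIM classes): an eligibility criterion is expressed as a query against the mediated model rather than against any particular source database:

    ```python
    from rdflib import Graph, Literal, Namespace, RDF

    # Invented stand-in for a mediated clinical model; terms are illustrative.
    EX = Namespace("http://example.org/cdim#")
    g = Graph()
    g.add((EX.p1, RDF.type, EX.Patient))
    g.add((EX.p1, EX.hasDiagnosisCode, Literal("E11")))  # type 2 diabetes

    # An eligibility criterion phrased against the global model only.
    results = g.query(
        """SELECT ?p WHERE {
               ?p a <http://example.org/cdim#Patient> ;
                  <http://example.org/cdim#hasDiagnosisCode> "E11" .
           }"""
    )
    for row in results:
        print(row.p)
    ```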

  10. Facilitating the openEHR approach - organizational structures for defining high-quality archetypes.

    PubMed

    Kohl, Christian Dominik; Garde, Sebastian; Knaup, Petra

    2008-01-01

    Using openEHR archetypes to establish an electronic patient record promises rapid development and system interoperability by using or adopting existing archetypes. However, internationally accepted, high-quality archetypes that enable comprehensive semantic interoperability require adequate development and maintenance processes. Structures therefore have to be created that involve the different health professions. In the following we present a model which facilitates and governs distributed but cooperative development and adoption of archetypes by different professionals, including peer reviews. Our model consists of a hierarchical structure of professional committees and descriptions of the archetype development process involving these different committees.

  11. An EarthCube Roadmap for Cross-Domain Interoperability in the Geosciences: Governance Aspects

    NASA Astrophysics Data System (ADS)

    Zaslavsky, I.; Couch, A.; Richard, S. M.; Valentine, D. W.; Stocks, K.; Murphy, P.; Lehnert, K. A.

    2012-12-01

    The goal of cross-domain interoperability is to enable reuse of data and models outside the original context in which these data and models are collected and used, and to facilitate analysis and modeling of physical processes that are not confined to disciplinary or jurisdictional boundaries. A new research initiative of the U.S. National Science Foundation, called EarthCube, is developing a roadmap to address challenges of interoperability in the earth sciences and create a blueprint for community-guided cyberinfrastructure accessible to a broad range of geoscience researchers and students. Infrastructure readiness for cross-domain interoperability encompasses the capabilities that need to be in place for such secondary or derivative use of information to be both scientifically sound and technically feasible. In this initial assessment we consider the following four basic infrastructure components that need to be present to enable cross-domain interoperability in the geosciences: metadata catalogs (at the appropriate community-defined granularity) that provide standard discovery services over datasets, data access services, models and other resources of the domain; vocabularies that support unambiguous interpretation of domain resources and metadata; services used to access data repositories and other resources including models, visualizations and workflows; and formal information models that define the structure and semantics of the information returned on service requests. General standards for these components have been proposed; they form the backbone of large-scale integration activities in the geosciences. By utilizing these standards, EarthCube research designs can take advantage of data discovery across disciplines using the commonality in key data characteristics related to shared models of spatial features, time measurements, and observations. Data can be discovered via federated catalogs and linked nomenclatures from neighboring domains, while standard data services can be used to transparently compile composite data products. Key questions addressed in this presentation are: (1) How do we define and assess the readiness of existing domain information systems for cross-domain re-use? (2) How do we determine EarthCube development priorities given a multitude of use cases that involve cross-domain data flows? and (3) How do we involve a wider community of geoscientists in the development and curation of cross-domain resources and incorporate community feedback in the CI design? Answering these questions involves consideration of governance mechanisms for cross-domain interoperability: while individual domain information systems and projects have developed governance mechanisms, managing cross-domain CI resources and supporting cross-domain information re-use has not been a development focus at the scale of the geosciences. We present a cross-domain readiness model as a means of enabling effective communication among scientists, governance bodies, and information providers. We also present an initial readiness assessment and a cross-domain connectivity map for the geosciences, and outline processes for eliciting user requirements, setting priorities, and obtaining community consensus.

  12. Applying Registry Services to Spaceflight Technologies to Aid in the Assignment of Assigned Numbers to Disparate Systems and their Technologies to Further Enable Interoperability

    NASA Technical Reports Server (NTRS)

    Bradford, Robert N.; Nichols, Kelvin F.; Witherspoon, Keith R.

    2006-01-01

    To date, very little effort has been made to provide interoperability between various space agency projects. To get to the Moon and beyond, systems must interoperate effectively. To provide interoperability, standardization and registries of various technologies will be required, and these registries will be created as they relate to space flight. With the new NASA Moon/Mars initiative, a requirement to standardize and control the naming conventions of very disparate systems and technologies is emerging. The need to provide numbering to the many processes, schemas, vehicles, robots, space suits and technologies (e.g., versions), to name a few, in the highly complex Constellation initiative is imperative. The number of corporations, development personnel, system interfaces, and human interfaces will require standardization and registries on a scale not currently envisioned. It would only take one exception (stove-piped system development) to weaken, if not destroy, interoperability. To start, a standardized registry process must be defined that allows many differing engineers, organizations and operators to easily access disparate registry information across numerous technological and scientific disciplines. Once registries are standardized, the need to provide registry support in terms of setup and operations, resolution of conflicts between registries, and other issues will need to be addressed. Registries should not be confused with repositories: no end-user data is "stored" in a registry, nor is it a configuration control system. Once a registry standard is created and approved, the technologies that should be registered must be identified and prioritized. In this paper, we identify and define a registry process that is compatible with the Constellation initiative and other unrelated space activities and organizations. We then identify and define the various technologies that should use a registry to provide interoperability. The first set of technologies will be those currently in need of expansion, namely the assignment of satellite designations and the process which controls assignments. Second, we analyze the technologies currently standardized under the Consultative Committee for Space Data Systems (CCSDS) banner. Third, we analyze the current CCSDS working group and Birds of a Feather (BoF) activities to ascertain registry requirements. Lastly, we identify technologies that are either currently under the auspices of another standards body or currently not standardized. For activities one through three, we provide the analysis by either discipline or technology, with rationale, identification and a brief description of requirements and precedence. For activity four, we provide a list of current standards bodies (e.g., the IETF) and a list of potential candidates.

  13. Understanding dental CAD/CAM for restorations--dental milling machines from a mechanical engineering viewpoint. Part B: labside milling machines.

    PubMed

    Lebon, Nicolas; Tapie, Laurent; Duret, Francois; Attal, Jean-Pierre

    2016-01-01

    Nowadays, dental numerically controlled (NC) milling machines are available for dental laboratories (labside solution) and dental production centers. This article provides a mechanical engineering approach to NC milling machines to help dental technicians understand the involvement of technology in digital dentistry practice. The technical and economic criteria are described for four labside and two production center dental NC milling machines available on the market. The technical criteria focus on the capacities of the embedded technologies of milling machines to mill prosthetic materials and various restoration shapes. The economic criteria focus on investment cost and interoperability with third-party software. The clinical relevance of the technology is discussed through the accuracy and integrity of the restoration. It can be asserted that dental production center milling machines offer a wider range of materials and types of restoration shapes than labside solutions, while labside solutions offer a wider range than chairside solutions. The accuracy and integrity of restorations may be improved as a function of the embedded technologies provided. However, the more complex the technical solutions available, the more skilled the user must be. Investment cost and interoperability with third-party software increase according to the quality of the embedded technologies implemented. Each private dental practice may decide which fabrication option to use depending on the scope of the practice.

  14. An approach to define semantics for BPM systems interoperability

    NASA Astrophysics Data System (ADS)

    Rico, Mariela; Caliusco, María Laura; Chiotti, Omar; Rosa Galli, María

    2015-04-01

    This article proposes defining semantics for Business Process Management systems interoperability through the ontology of Electronic Business Documents (EBD) used to interchange the information required to perform cross-organizational processes. The semantic model generated allows aligning enterprises' business processes to support cross-organizational processes by matching the business ontology of each business partner with the EBD ontology. The result is a flexible software architecture that allows dynamically defining cross-organizational business processes by reusing the EBD ontology. For developing the semantic model, a method is presented, which is based on a strategy for discovering entity features whose interpretation depends on the context, and representing them to enrich the ontology. The proposed method complements ontology learning techniques that cannot infer semantic features not represented in data sources. In order to improve the representation of these entity features, the method proposes using widely accepted ontologies for representing time entities and relations, physical quantities, measurement units, official country names, and currencies and funds, among others. When ontology reuse is not possible, the method proposes identifying whether the feature is simple or complex, and defines a strategy to be followed. An empirical validation of the approach has been performed through a case study.

  15. Across the Atlantic cooperation to address international challenges in eHealth and health IT: managing toward a common goal.

    PubMed

    Friedman, Charles P; Iakovidis, Ilias; Debenedetti, Laurent; Lorenzi, Nancy M

    2009-11-01

    Countries on both sides of the Atlantic Ocean have invested in health information and communication technologies. Since eHealth challenges cross borders, a European Union-United States of America conference on public policies relating to health IT and eHealth was held October 20-21, 2008 in Paris, France. The conference was organized around four themes: (1) privacy and security, (2) health IT interoperability, (3) deployment and adoption of health IT, and (4) Public-Private Collaborative Governance. These four themes framed the discussion over the two days of plenary sessions and workshops, and the key findings of the conference were organized along the same themes. (1) Privacy and security: patients' access to their own data and key elements of a patient identification management framework were discussed. (2) Health IT interoperability: three significant and common interoperability challenges emerged: (a) the need to establish common or compatible standards and clear guidelines for their implementation, (b) the desirability of shared certification criteria, and (c) the need for greater awareness of the importance of interoperability. (3) Deployment and adoption of health IT: three major areas of need emerged: (a) a shared knowledge base and assessment framework, (b) public-private collaboration, and (c) effective organizational change strategies. (4) Public-Private Collaborative Governance: sharing and communication are central to success in this area, and nations can learn from one another about ways to develop harmonious, effective partnerships. Three areas identified as the highest priorities for collaboration were: (1) health data security, (2) developing effective strategies to ensure healthcare professionals' acceptance of health IT tools, and (3) interoperability.

  16. A federated design for a neurobiological simulation engine: the CBI federated software architecture.

    PubMed

    Cornelis, Hugo; Coop, Allan D; Bower, James M

    2012-01-01

    Simulator interoperability and extensibility have become a growing requirement in computational biology. To address this, we have developed a federated software architecture. It is federated by its union of independent disparate systems under a single cohesive view, provides interoperability through its capability to communicate, execute programs, or transfer data among different independent applications, and supports extensibility by enabling simulator expansion or enhancement without the need for major changes to system infrastructure. Historically, simulator interoperability has relied on the development of declarative markup languages such as the neuron modeling language NeuroML, while simulator extension has typically occurred through modification of existing functionality. The software architecture we describe here allows for both of these approaches. However, it is designed to support alternative paradigms of interoperability and extensibility through the provision of logical relationships and defined application programming interfaces. These allow any appropriately configured component or software application to be incorporated into a simulator. The architecture defines independent functional modules that run stand-alone. They are arranged in logical layers that naturally correspond to the occurrence of high-level data (biological concepts) versus low-level data (numerical values) and distinguish data from control functions. The modular nature of the architecture and its independence from a given technology facilitates communication about similar concepts and functions for both users and developers. It provides several advantages for multiple independent contributions to software development. Importantly, these include: (1) Reduction in the complexity of individual simulator components when compared to the complexity of a complete simulator, (2) Documentation of individual components in terms of their inputs and outputs, (3) Easy removal or replacement of unnecessary or obsolete components, (4) Stand-alone testing of components, and (5) Clear delineation of the development scope of new components.

  17. A Federated Design for a Neurobiological Simulation Engine: The CBI Federated Software Architecture

    PubMed Central

    Cornelis, Hugo; Coop, Allan D.; Bower, James M.

    2012-01-01

    Simulator interoperability and extensibility have become a growing requirement in computational biology. To address this, we have developed a federated software architecture. It is federated by its union of independent disparate systems under a single cohesive view, provides interoperability through its capability to communicate, execute programs, or transfer data among different independent applications, and supports extensibility by enabling simulator expansion or enhancement without the need for major changes to system infrastructure. Historically, simulator interoperability has relied on the development of declarative markup languages such as the neuron modeling language NeuroML, while simulator extension has typically occurred through modification of existing functionality. The software architecture we describe here allows for both of these approaches. However, it is designed to support alternative paradigms of interoperability and extensibility through the provision of logical relationships and defined application programming interfaces. These allow any appropriately configured component or software application to be incorporated into a simulator. The architecture defines independent functional modules that run stand-alone. They are arranged in logical layers that naturally correspond to the occurrence of high-level data (biological concepts) versus low-level data (numerical values) and distinguish data from control functions. The modular nature of the architecture and its independence from a given technology facilitates communication about similar concepts and functions for both users and developers. It provides several advantages for multiple independent contributions to software development. Importantly, these include: (1) Reduction in the complexity of individual simulator components when compared to the complexity of a complete simulator, (2) Documentation of individual components in terms of their inputs and outputs, (3) Easy removal or replacement of unnecessary or obsolete components, (4) Stand-alone testing of components, and (5) Clear delineation of the development scope of new components. PMID:22242154

  18. A framework for secure and decentralized sharing of medical imaging data via blockchain consensus.

    PubMed

    Patel, Vishal

    2018-04-01

    The electronic sharing of medical imaging data is an important element of modern healthcare systems, but current infrastructure for cross-site image transfer depends on trust in third-party intermediaries. In this work, we examine the blockchain concept, which enables parties to establish consensus without relying on a central authority. We develop a framework for cross-domain image sharing that uses a blockchain as a distributed data store to establish a ledger of radiological studies and patient-defined access permissions. The blockchain framework is shown to eliminate third-party access to protected health information, satisfy many criteria of an interoperable health system, and readily generalize to domains beyond medical imaging. Relative drawbacks of the framework include the complexity of the privacy and security models and an unclear regulatory environment. Ultimately, the large-scale feasibility of such an approach remains to be demonstrated and will depend on a number of factors which we discuss in detail.
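
    The core ledger idea can be sketched as a hash-chained list of study records carrying patient-defined permissions; the toy below (with invented fields) is a sketch of the concept only, not the paper's framework:

    ```python
    import hashlib
    import json
    import time

    # Each block commits to the previous block's hash, so tampering with
    # any earlier record breaks the chain. Fields are invented for the sketch.
    def make_block(prev_hash: str, record: dict) -> dict:
        block = {"prev": prev_hash, "time": time.time(), "record": record}
        payload = json.dumps(block, sort_keys=True).encode()
        block["hash"] = hashlib.sha256(payload).hexdigest()
        return block

    chain = [make_block("0" * 64,
                        {"study": "CT-123", "grant_access_to": ["dr_a"]})]
    chain.append(make_block(chain[-1]["hash"],
                            {"study": "MR-456", "grant_access_to": []}))

    # Verify the links between consecutive blocks.
    for prev, blk in zip(chain, chain[1:]):
        assert blk["prev"] == prev["hash"]
    print("chain intact, length", len(chain))
    ```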

  19. Peripapillary nerve fiber layer thickness measurement reproducibility using optical coherence tomography.

    PubMed

    Villain, Max A; Greenfield, David S

    2003-01-01

    To assess the reproducibility of quadrantic and clock-hour sectors of retinal nerve fiber layer thickness in normal eyes using optical coherence tomography. Normal eyes of healthy volunteers meeting eligibility criteria were imaged by two inexperienced operators. Six 360° circular scans with a diameter of 3.4 mm centered on the optic disc were obtained during each scanning session, and a baseline image was formed using 3 high-quality images defined by the software. Images were obtained on three different days within a 4-week period. Variance and coefficient of variation (CV) were calculated for quadrantic and retinal nerve fiber layer clock-hour sectors obtained from the baseline image. Five normal eyes were scanned. Intraoperator reproducibility was high. The mean (± SD) CV for total retinal nerve fiber layer thickness was 5.3 ± 3.82% and 4.33 ± 3.7% for operators 1 and 2, respectively. Interoperator reproducibility was good, with statistically similar variance for all quadrantic and clock-hour retinal nerve fiber layer parameters (P = .42 to .99). The nasal retinal nerve fiber layer was the most variable sector for both operators (mean CV: 10.42% and 7.83% for operators 1 and 2, respectively). Differences in mean total, nasal, temporal, and superior retinal nerve fiber layer thickness were not statistically significant between operators for all eyes; however, for inferior retinal nerve fiber layer thickness, there was a significant (P = .0007) difference between operators in one eye. Peripapillary retinal nerve fiber layer thickness assessments using optical coherence tomography have good intraoperator and interoperator reproducibility. Inexperienced operators can generate useful measurement data with acceptable levels of variance.
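
    For reference, the coefficient of variation reported above is simply the standard deviation divided by the mean; a quick sketch with invented repeat-scan values:

    ```python
    import statistics

    # CV = SD / mean, computed over repeated measurements of the same eye.
    # The thickness values below are invented for illustration.
    thickness_um = [101.0, 97.5, 104.2, 99.8, 102.1, 98.6]  # six repeat scans
    cv = statistics.stdev(thickness_um) / statistics.mean(thickness_um)
    print(f"CV = {cv:.1%}")
    ```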

  20. Seeking the Path to Metadata Nirvana

    NASA Astrophysics Data System (ADS)

    Graybeal, J.

    2008-12-01

    Scientists have always found reusing other scientists' data challenging. Computers did not fundamentally change the problem, but enabled more and larger instances of it. In fact, by removing human mediation and time delays from the data sharing process, computers emphasize the contextual information that must be exchanged in order to exchange and reuse data. This requirement for contextual information has two faces: "interoperability" when talking about systems, and "the metadata problem" when talking about data. As much as any single organization, the Marine Metadata Interoperability (MMI) project has been tagged with the mission "Solve the metadata problem." Of course, if that goal is achieved, then sustained, interoperable data systems for interdisciplinary observing networks can be easily built -- pesky metadata differences, like which protocol to use for data exchange, or what the data actually measures, will be a thing of the past. Alas, as you might imagine, there will always be complexities and incompatibilities that are not addressed, and data systems that are not interoperable, even within a science discipline. So should we throw up our hands and surrender to the inevitable? Not at all. Rather, we try to minimize metadata problems as much as we can. In this we increasingly progress, despite natural forces that pull in the other direction. Computer systems let us work with more complexity, build community knowledge and collaborations, and preserve and publish our progress and (dis-)agreements. Funding organizations, science communities, and technologists see the importance interoperable systems and metadata, and direct resources toward them. With the new approaches and resources, projects like IPY and MMI can simultaneously define, display, and promote effective strategies for sustainable, interoperable data systems. This presentation will outline the role metadata plays in durable interoperable data systems, for better or worse. It will describe times when "just choosing a standard" can work, and when it probably won't work. And it will point out signs that suggest a metadata storm is coming to your community project, and how you might avoid it. From these lessons we will seek a path to producing interoperable, interdisciplinary, metadata-enlightened environment observing systems.

  1. Distributed controller clustering in software defined networks.

    PubMed

    Abdelaziz, Ahmed; Fong, Ang Tan; Gani, Abdullah; Garba, Usman; Khan, Suleman; Akhunzada, Adnan; Talebian, Hamid; Choo, Kim-Kwang Raymond

    2017-01-01

    Software Defined Networking (SDN) is an emerging and promising paradigm for network management because of its centralized network intelligence. However, the centralized control architecture of software-defined networks (SDNs) brings novel challenges of reliability, scalability, fault tolerance and interoperability. In this paper, we propose a novel clustered distributed controller architecture in a real SDN setting. The distributed cluster implementation comprises multiple popular SDN controllers. The proposed mechanism is evaluated using a real-world network topology running on top of an emulated SDN environment. The results show that the proposed distributed controller clustering mechanism is able to significantly reduce the average latency from 8.1% to 1.6% and the packet loss from 5.22% to 4.15%, compared to distributed controllers without clustering running on HP Virtual Application Network (VAN) SDN and Open Network Operating System (ONOS) controllers respectively. Moreover, the proposed method also shows reasonable CPU utilization results. Furthermore, the proposed mechanism makes it possible to handle unexpected load fluctuations while maintaining continuous network operation, even when there is a controller failure. The paper is a contribution towards addressing the issues of reliability, scalability, fault tolerance, and interoperability in SDNs.
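
    The clustering-with-failover idea can be sketched abstractly; the toy below (invented names, not the paper's HP VAN/ONOS implementation) assigns switches to controllers round-robin and migrates them when a controller fails:

    ```python
    # Toy model of a distributed controller cluster with failover.
    class ControllerCluster:
        def __init__(self, controllers):
            self.alive = list(controllers)
            self.assignment = {}

        def assign(self, switch: str) -> str:
            # Round-robin over live controllers balances load in the cluster.
            ctrl = self.alive[len(self.assignment) % len(self.alive)]
            self.assignment[switch] = ctrl
            return ctrl

        def fail(self, ctrl: str) -> None:
            # On failure, migrate that controller's switches to survivors,
            # so network operation continues without interruption.
            self.alive.remove(ctrl)
            for sw, c in list(self.assignment.items()):
                if c == ctrl:
                    self.assign(sw)

    cluster = ControllerCluster(["c1", "c2", "c3"])
    for sw in ["s1", "s2", "s3", "s4"]:
        cluster.assign(sw)
    cluster.fail("c2")
    print(cluster.assignment)
    ```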

  2. Railway cognitive radio to enhance safety, security, and performance of positive train control.

    DOT National Transportation Integrated Search

    2013-02-01

    Robust and interoperable wireless communications are vital to Positive Train Control (PTC). The railway industry has started adopting software-defined radios (SDRs) for packet-data transmission. SDR systems realize previously fixed components as reco...

  3. Interoperability Assets for Patient Summary Components: A Gap Analysis.

    PubMed

    Heitmann, Kai U; Cangioli, Giorgio; Melgara, Marcello; Chronaki, Catherine

    2018-01-01

    The International Patient Summary (IPS) standards aim to define the specifications for a minimal and non-exhaustive Patient Summary, which is specialty-agnostic and condition-independent, but still clinically relevant. Meanwhile, health systems are developing and implementing their own variations of a patient summary, while the eHealth Digital Services Infrastructure (eHDSI) initiative is deploying patient summary services across countries in Europe. In the spirit of co-creation, flexible governance, and continuous alignment advocated by eStandards, the Trillium-II initiative promotes adoption of the patient summary by engaging standards organizations and interoperability practitioners in a community of practice for digital health, to share best practices, tools, data, specifications, and experiences. This paper compares operational aspects of patient summaries in 14 case studies in Europe, the United States, and across the world, focusing on how patient summary components are used in practice, to promote alignment and a joint understanding that will improve the quality of standards and lower the costs of interoperability.

  4. Solving Identity Management and Interoperability Problems at Pan-European Level

    NASA Astrophysics Data System (ADS)

    Sánchez García, Sergio; Gómez Oliva, Ana

    In a globalized digital world, it is essential for persons and entities to have a recognized and unambiguous electronic identity that allows them to communicate with one another. The management of this identity by public administrations is an important challenge that becomes even more crucial when interoperability among the public administrations of different countries becomes necessary, as persons and entities have different credentials depending on their own national legal frameworks. More specifically, different credentials and legal frameworks cause interoperability problems that prevent reliable access to public services in cross-border scenarios like today's European Union. The work in this doctoral thesis analyzes the problem in detail by studying existing proposals (primarily in Europe), proposing improvements to the defined architectures, and performing practical work to test the viability of the solutions. Moreover, the thesis also addresses the long-standing security problem of identity delegation, which is especially important in complex and heterogeneous service delivery environments like those mentioned above. This is a position paper.

  5. A common type system for clinical natural language processing

    PubMed Central

    2013-01-01

    Background One challenge in reusing clinical data stored in electronic medical records is that these data are heterogeneous. Clinical Natural Language Processing (NLP) plays an important role in transforming information in clinical text to a standard representation that is comparable and interoperable. Information may be processed and shared when a type system specifies the allowable data structures. Therefore, we aim to define a common type system for clinical NLP that enables interoperability between structured and unstructured data generated in different clinical settings. Results We describe a common type system for clinical NLP that has an end target of deep semantics based on Clinical Element Models (CEMs), thus interoperating with structured data and accommodating diverse NLP approaches. The type system has been implemented in UIMA (Unstructured Information Management Architecture) and is fully functional in a popular open-source clinical NLP system, cTAKES (clinical Text Analysis and Knowledge Extraction System) versions 2.0 and later. Conclusions We have created a type system that targets deep semantics, thereby allowing NLP systems to encapsulate knowledge from text and share it alongside heterogeneous clinical data sources. Rather than surface semantics that are typically the end product of NLP algorithms, CEM-based semantics explicitly build in deep clinical semantics as the point of interoperability with more structured data types. PMID:23286462
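
    To illustrate what a shared type system buys, here is an invented mini type system in the same spirit (illustrative only, not the actual cTAKES/CEM type definitions): NLP output is normalized into common types that can sit alongside structured data:

    ```python
    from dataclasses import dataclass, field
    from typing import List

    # Invented types: a mention found in text carries offsets plus links
    # to coded concepts, so downstream systems need no NLP-specific logic.
    @dataclass
    class OntologyConcept:
        code: str       # e.g. a SNOMED CT or RxNorm code (values illustrative)
        scheme: str

    @dataclass
    class MedicationMention:
        begin: int      # character offsets into the source note
        end: int
        concepts: List[OntologyConcept] = field(default_factory=list)

    mention = MedicationMention(10, 17, [OntologyConcept("1191", "RxNorm")])
    print(mention)
    ```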

  6. Development and application of a real-time testbed for multiagent system interoperability: A case study on hierarchical microgrid control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cintuglu, Mehmet Hazar; Youssef, Tarek; Mohammed, Osama A.

    This article presents the development and application of a real-time testbed for multiagent system interoperability. As utility-independent private microgrids are installed constantly, standardized interoperability frameworks are required to define behavioral models of the individual agents for expandability and plug-and-play operation. In this paper, we propose a comprehensive hybrid agent framework combining the Foundation for Intelligent Physical Agents (FIPA), IEC 61850, and Data Distribution Service (DDS) standards. The IEC 61850 logical node concept is extended using FIPA-based agent communication language (ACL) with application-specific attributes and deliberative behavior modeling capability. The DDS middleware is adopted to enable a real-time publisher-subscriber interoperability mechanism between platforms. The proposed multi-agent framework was validated in a laboratory-based testbed involving developed intelligent electronic device (IED) prototypes and actual microgrid setups. Experimental results were demonstrated for both decentralized and distributed control approaches. Secondary and tertiary control levels of a microgrid were demonstrated for the decentralized hierarchical control case study. A consensus-based economic dispatch case study was demonstrated as a distributed control example. Finally, it was shown that the developed agent platform is industrially applicable for actual smart grid field deployment.

  7. Development and application of a real-time testbed for multiagent system interoperability: A case study on hierarchical microgrid control

    DOE PAGES

    Cintuglu, Mehmet Hazar; Youssef, Tarek; Mohammed, Osama A.

    2016-08-10

    This article presents the development and application of a real-time testbed for multiagent system interoperability. As utility-independent private microgrids are installed constantly, standardized interoperability frameworks are required to define behavioral models of the individual agents for expandability and plug-and-play operation. In this paper, we propose a comprehensive hybrid agent framework combining the Foundation for Intelligent Physical Agents (FIPA), IEC 61850, and Data Distribution Service (DDS) standards. The IEC 61850 logical node concept is extended using FIPA-based agent communication language (ACL) with application-specific attributes and deliberative behavior modeling capability. The DDS middleware is adopted to enable a real-time publisher-subscriber interoperability mechanism between platforms. The proposed multi-agent framework was validated in a laboratory-based testbed involving developed intelligent electronic device (IED) prototypes and actual microgrid setups. Experimental results were demonstrated for both decentralized and distributed control approaches. Secondary and tertiary control levels of a microgrid were demonstrated for the decentralized hierarchical control case study. A consensus-based economic dispatch case study was demonstrated as a distributed control example. Finally, it was shown that the developed agent platform is industrially applicable for actual smart grid field deployment.
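
    The publisher-subscriber mechanism at the heart of the testbed can be sketched with an in-process toy bus; real DDS adds QoS policies, discovery, and network transport, and all names below are invented:

    ```python
    from collections import defaultdict
    from typing import Callable

    # Minimal topic-based pub/sub: publishers and subscribers are decoupled,
    # knowing only the topic name and the shape of the data samples.
    class Bus:
        def __init__(self):
            self.subscribers = defaultdict(list)

        def subscribe(self, topic: str,
                      handler: Callable[[dict], None]) -> None:
            self.subscribers[topic].append(handler)

        def publish(self, topic: str, sample: dict) -> None:
            for handler in self.subscribers[topic]:
                handler(sample)

    bus = Bus()
    # A secondary-control agent subscribes to frequency measurements.
    bus.subscribe("microgrid/frequency", lambda s: print("agent got", s))
    bus.publish("microgrid/frequency", {"hz": 59.98, "bus": "feeder-1"})
    ```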

  8. A common type system for clinical natural language processing.

    PubMed

    Wu, Stephen T; Kaggal, Vinod C; Dligach, Dmitriy; Masanz, James J; Chen, Pei; Becker, Lee; Chapman, Wendy W; Savova, Guergana K; Liu, Hongfang; Chute, Christopher G

    2013-01-03

    One challenge in reusing clinical data stored in electronic medical records is that these data are heterogeneous. Clinical Natural Language Processing (NLP) plays an important role in transforming information in clinical text to a standard representation that is comparable and interoperable. Information may be processed and shared when a type system specifies the allowable data structures. Therefore, we aim to define a common type system for clinical NLP that enables interoperability between structured and unstructured data generated in different clinical settings. We describe a common type system for clinical NLP that has an end target of deep semantics based on Clinical Element Models (CEMs), thus interoperating with structured data and accommodating diverse NLP approaches. The type system has been implemented in UIMA (Unstructured Information Management Architecture) and is fully functional in a popular open-source clinical NLP system, cTAKES (clinical Text Analysis and Knowledge Extraction System) versions 2.0 and later. We have created a type system that targets deep semantics, thereby allowing NLP systems to encapsulate knowledge from text and share it alongside heterogeneous clinical data sources. Rather than surface semantics that are typically the end product of NLP algorithms, CEM-based semantics explicitly build in deep clinical semantics as the point of interoperability with more structured data types.

  9. Best Practices for Preparing Interoperable Geospatial Data

    NASA Astrophysics Data System (ADS)

    Wei, Y.; Santhana Vannan, S.; Cook, R. B.; Wilson, B. E.; Beaty, T. W.

    2010-12-01

    Geospatial data is critically important for a wide range of research and applications: carbon cycle and ecosystem studies, climate change, land use and urban planning, environmental protection, and more. Geospatial data is created by different organizations using different methods, from remote sensing observations to field surveys to model simulations, and is stored in various formats. Geospatial data is therefore diverse and heterogeneous, which creates a major barrier to its sharing and use, especially when targeting a broad user community. Many efforts have been made to address different aspects of using geospatial data by improving its interoperability. For example, the specification for Open Geospatial Consortium (OGC) catalog services defines a standard way for geospatial information discovery, while OGC Web Coverage Services (WCS) and OPeNDAP define interoperable protocols for geospatial data access. But the reality is that standard mechanisms for data discovery and access alone are not enough: the geospatial data content itself has to be organized in standard, easily understandable, and readily usable formats. The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) archives data and information relevant to biogeochemical dynamics, ecological data, and environmental processes. The Modeling and Synthesis Thematic Data Center (MAST-DC) prepares and distributes both input and output data of carbon cycle models and provides data support for synthesis and terrestrial model inter-comparison at multiple scales. Both of these NASA-funded data centers compile and distribute a large amount of diverse geospatial data and have broad user communities, including GIS users, Earth science researchers, and ecosystem modeling teams. The ORNL DAAC and MAST-DC address the geospatial data interoperability issue by standardizing data content and feeding it into a well-designed Spatial Data Infrastructure (SDI) which provides interoperable mechanisms to advertise, visualize, and distribute the standardized geospatial data. In this presentation, we summarize the lessons learned and the best practices for geospatial data standardization. The presentation will describe how diverse and historical data archived in the ORNL DAAC were converted into standard and non-proprietary formats; what tools were used to make the conversion; how spatial and temporal information is properly captured in a consistent manner; how to name a data file or a variable to make it both human-friendly and semantically interoperable; how the NetCDF file format and CF convention can promote data usage in the ecosystem modeling user community; how standardized geospatial data can be fed into OGC Web Services to support on-demand data visualization and access; and how metadata should be collected and organized so that it can be discovered through standard catalog services.
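
    A small sketch of the practice described above, using the netCDF4 library: data is written with CF-convention metadata (units, standard_name) so downstream tools can interpret it without side channels. The file name and values are invented:

    ```python
    from netCDF4 import Dataset
    import numpy as np

    # Write a tiny CF-style time series; the attributes carry the semantics.
    with Dataset("example_cf.nc", "w") as nc:
        nc.createDimension("time", 3)
        t = nc.createVariable("time", "f8", ("time",))
        t.units = "days since 2000-01-01 00:00:00"  # CF-style time encoding
        t[:] = np.arange(3)
        gpp = nc.createVariable("gpp", "f4", ("time",))
        gpp.units = "kg m-2 s-1"
        # Assumed CF standard name for GPP expressed as carbon.
        gpp.standard_name = (
            "gross_primary_productivity_of_biomass_expressed_as_carbon")
        gpp[:] = [1.2e-8, 1.5e-8, 1.1e-8]
    ```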

  10. Heterogeneous Software System Interoperability Through Computer-Aided Resolution of Modeling Differences

    DTIC Science & Technology

    2002-06-01

    techniques for addressing the software component retrieval problem. Steigerwald [Ste91] introduced the use of algebraic specifications for defining the...provided in terms of a specification written using Luqi’s Prototype Specification Description Language (PSDL) [LBY88] augmented with an algebraic

  11. Enabling complex queries to drug information sources through functional composition.

    PubMed

    Peters, Lee; Mortensen, Jonathan; Nguyen, Thang; Bodenreider, Olivier

    2013-01-01

    Our objective was to enable an end user to create complex queries to drug information sources through functional composition, by creating sequences of functions from the application programming interfaces (APIs) of drug terminologies. The development of a functional composition model seeks to link functions from two distinct APIs. An ontology was developed using Protégé to model the functions of the RxNorm and NDF-RT APIs by describing the semantics of their inputs and outputs. A set of rules was developed to define the interoperability conditions for functional composition. The operational definition of interoperability between function pairs is established by executing the rules on the ontology. We illustrate that the functional composition model supports common use cases, including checking interactions for RxNorm drugs and deploying allergy lists defined in reference to drug properties in NDF-RT. This model supports the RxMix application (http://mor.nlm.nih.gov/RxMix/), an application we developed for enabling complex queries to the RxNorm and NDF-RT APIs.
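
    Functional composition in miniature: the output type of one API function feeds the input of the next. The two lookup functions below are hypothetical stand-ins for RxNorm/NDF-RT API calls, not the real RxMix endpoints:

    ```python
    # Hypothetical lookups; real calls would hit the terminology APIs.
    def find_rxcui(drug_name: str) -> str:
        """Pretend RxNorm lookup: drug name -> RxNorm concept identifier."""
        return {"aspirin": "1191"}[drug_name]

    def get_drug_class(rxcui: str) -> str:
        """Pretend NDF-RT lookup: RxNorm identifier -> drug class."""
        return {"1191": "Nonsteroidal Anti-inflammatory Drug"}[rxcui]

    def compose(*funcs):
        # Chaining is legal only when adjacent output/input types match,
        # which is exactly what the paper's ontology rules check.
        def chained(x):
            for f in funcs:
                x = f(x)
            return x
        return chained

    name_to_class = compose(find_rxcui, get_drug_class)
    print(name_to_class("aspirin"))
    ```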

  12. Open Architecture SDR for Space

    NASA Technical Reports Server (NTRS)

    Smith, Carl; Long, Chris; Liebetreu, John; Reinhart, Richard C.

    2005-01-01

    This paper describes an open-architecture SDR (software-defined radio) infrastructure that is suitable for space-based operations (Space-SDR). SDR technologies will endow space and planetary exploration systems with dramatically increased capability, reduced power consumption, and significantly less mass than conventional systems, at costs reduced by vigorous competition, hardware commonality, dense integration, reduced obsolescence, interoperability, and software re-use. Significant progress has been recorded on developments like the Joint Tactical Radio System (JTRS) Software Communication Architecture (SCA), which is oriented toward reconfigurable radios for defense forces operating in multiple theaters of engagement. The JTRS-SCA presents a consistent software interface for waveform development, and facilitates interoperability, waveform portability, software re-use, and technology evolution.

  13. Operational Plan Ontology Model for Interconnection and Interoperability

    NASA Astrophysics Data System (ADS)

    Long, F.; Sun, Y. K.; Shi, H. Q.

    2017-03-01

    To address the bottleneck that assistant decision-making systems face in processing operational plan data and information, this paper starts from an analysis of the shortcomings of traditional representations and the technical advantages of ontologies. It then defines the elements of the operational plan ontology model and determines the basis for its construction, and builds a semi-knowledge-level operational plan ontology model. Finally, it examines the expression of operational plans based on the ontology model and its use by application software. The paper thus has theoretical significance and application value for improving the interconnection and interoperability of operational plans among assistant decision-making systems.

  14. ImageJ-MATLAB: a bidirectional framework for scientific image analysis interoperability.

    PubMed

    Hiner, Mark C; Rueden, Curtis T; Eliceiri, Kevin W

    2017-02-15

    ImageJ-MATLAB is a lightweight Java library facilitating bidirectional interoperability between MATLAB and ImageJ. By defining a standard for translation between matrix and image data structures, researchers are empowered to select the best tool for their image-analysis tasks. Freely available as an extension to ImageJ2 (http://imagej.net/Downloads). Installation and use instructions are available at http://imagej.net/MATLAB_Scripting. Tested with ImageJ 2.0.0-rc-54, Java 1.8.0_66 and MATLAB R2015b. Contact: eliceiri@wisc.edu. Supplementary data are available at Bioinformatics online.

  15. Reference Architecture Model Enabling Standards Interoperability.

    PubMed

    Blobel, Bernd

    2017-01-01

    Advanced health and social services paradigms are supported by a comprehensive set of domains managed by different scientific disciplines. Interoperability has to evolve beyond information and communication technology (ICT) concerns to include the real-world business domains and their processes, as well as the individual context of all actors involved. The system must therefore properly reflect the environment in front of and around the computer as an essential, even defining, part of the health system. This paper introduces an ICT-independent, system-theoretical, ontology-driven reference architecture model allowing the representation and harmonization of all domains involved, including the transformation into an appropriate ICT design and implementation. The entire process is completely formalized and can therefore be fully automated.

  16. Distributed controller clustering in software defined networks

    PubMed Central

    Gani, Abdullah; Akhunzada, Adnan; Talebian, Hamid; Choo, Kim-Kwang Raymond

    2017-01-01

    Software Defined Networking (SDN) is an emerging and promising paradigm for network management because of its centralized network intelligence. However, the centralized control architecture of software-defined networks (SDNs) brings novel challenges of reliability, scalability, fault tolerance and interoperability. In this paper, we propose a novel clustered distributed controller architecture in a real SDN setting. The distributed cluster implementation comprises multiple popular SDN controllers. The proposed mechanism is evaluated using a real-world network topology running on top of an emulated SDN environment. The results show that the proposed distributed controller clustering mechanism is able to significantly reduce the average latency from 8.1% to 1.6% and the packet loss from 5.22% to 4.15%, compared to distributed controllers without clustering running on HP Virtual Application Network (VAN) SDN and Open Network Operating System (ONOS) controllers respectively. Moreover, the proposed method also shows reasonable CPU utilization results. Furthermore, the proposed mechanism makes it possible to handle unexpected load fluctuations while maintaining continuous network operation, even when there is a controller failure. The paper is a contribution towards addressing the issues of reliability, scalability, fault tolerance, and interoperability in SDNs. PMID:28384312

  17. Semantics-Based Interoperability Framework for the Geosciences

    NASA Astrophysics Data System (ADS)

    Sinha, A.; Malik, Z.; Raskin, R.; Barnes, C.; Fox, P.; McGuinness, D.; Lin, K.

    2008-12-01

    Interoperability between heterogeneous data, tools and services is required to transform data to knowledge. To meet geoscience-oriented societal challenges, such as the forcing of climate change induced by volcanic eruptions, we suggest the need to develop semantic interoperability for data, services, and processes. Because such scientific endeavors require the integration of multiple databases associated with global enterprises, implicit semantics-based integration is impossible. Instead, explicit semantics are needed to facilitate interoperability and integration. Although different types of integration models are available (syntactic or semantic), we suggest that semantic interoperability is likely to be the most successful pathway. Clearly, the geoscience community would benefit from the utilization of existing XML-based data models, such as GeoSciML, WaterML, etc., to rapidly advance semantic interoperability and integration. We recognize that such integration will require a "meanings-based search, reasoning and information brokering", which will be facilitated through inter-ontology relationships (ontologies defined for each discipline). We suggest that markup languages (MLs) and ontologies can be seen as "data integration facilitators", working at different abstraction levels. Therefore, we propose to use an ontology-based data registration and discovery approach to complement markup languages through semantic data enrichment. Ontologies allow the use of formal and descriptive logic statements, which permit expressive query capabilities for data integration through reasoning. We have developed domain ontologies (EPONT) to capture the concepts behind data. The EPONT ontologies are associated with existing ontologies such as SUMO, DOLCE and SWEET. Although significant effort has gone into developing data (object) ontologies, we advance the idea of developing semantic frameworks for additional ontologies that deal with processes and services. This evolutionary step will facilitate the integrative capabilities of scientists as we examine the relationships between data and external factors such as processes that may influence our understanding of "why" certain events happen. We emphasize the need to go from the analysis of data to concepts related to the scientific principles of thermodynamics, kinetics, heat flow, mass transfer, etc. Towards meeting these objectives, we report on a pair of related service engines: DIA (Discovery, Integration and Analysis) and SEDRE (Semantically-Enabled Data Registration Engine), which utilize ontologies for semantic interoperability and integration.

  18. Archetype-based electronic health records: a literature review and evaluation of their applicability to health data interoperability and access.

    PubMed

    Wollersheim, Dennis; Sari, Anny; Rahayu, Wenny

    Health Information Managers (HIMs) are responsible for overseeing health information. The change management necessary during the transition to electronic health records (EHRs) is substantial and ongoing. Archetype-based EHRs are a core health information system component which solves many of the problems that arise during this period of change. Archetypes are models of clinical content, and they have many beneficial properties: they are interoperable, both between settings and through time; they are more amenable to change than conventional paradigms; and their design is congruent with clinical practice. This paper is an overview of the current archetype literature relevant to Health Information Managers. The literature was sourced from the English-language sections of ScienceDirect, IEEE Xplore, PubMed, Google Scholar, the ACM Digital Library and other databases on the usage of archetypes for electronic health record storage, looking at the current areas of archetype research, appropriate usage, and future research. We also used reference lists from the cited papers, papers referenced by the openEHR website, and the recommendations of experts in the area. Criteria for inclusion were (a) that studies covered archetype research and (b) that they were studies of archetype use, archetype system design, or archetype effectiveness. The 47 papers included show wide and increasing worldwide archetype usage in a variety of medical domains. Most of the papers noted that archetypes are an appropriate solution for future-proof and interoperable medical data storage. We conclude that archetypes are a suitable solution for the complex problem of electronic health record storage and interoperability.

  19. Health information technology: initial set of standards, implementation specifications, and certification criteria for electronic health record technology. Interim final rule.

    PubMed

    2010-01-13

    The Department of Health and Human Services (HHS) is issuing this interim final rule with a request for comments to adopt an initial set of standards, implementation specifications, and certification criteria, as required by section 3004(b)(1) of the Public Health Service Act. This interim final rule represents the first step in an incremental approach to adopting standards, implementation specifications, and certification criteria to enhance the interoperability, functionality, utility, and security of health information technology and to support its meaningful use. The certification criteria adopted in this initial set establish the capabilities and related standards that certified electronic health record (EHR) technology will need to include in order to, at a minimum, support the achievement of the proposed meaningful use Stage 1 (beginning in 2011) by eligible professionals and eligible hospitals under the Medicare and Medicaid EHR Incentive Programs.

  20. Mediation, Alignment, and Information Services for Semantic interoperability (MAISSI): A Trade Study

    DTIC Science & Technology

    2007-06-01

    Modeling Notation (BPMN) • Business Process Definition Metamodel (BPDM). A Business Process (BP) is a defined sequence of steps to be executed in... enterprise applications, to evaluate the capabilities of suppliers, and to compare against the competition. BPMN standardizes flowchart diagrams that

  1. Extended Relation Metadata for SCORM-Based Learning Content Management Systems

    ERIC Educational Resources Information Center

    Lu, Eric Jui-Lin; Horng, Gwoboa; Yu, Chia-Ssu; Chou, Ling-Ying

    2010-01-01

    To increase the interoperability and reusability of learning objects, Advanced Distributed Learning Initiative developed a model called Content Aggregation Model (CAM) to describe learning objects and express relationships between learning objects. However, the suggested relations defined in the CAM can only describe structure-oriented…

  2. Interdisciplinarity, Interactivity, and Interoperability for Educating the Digerati

    ERIC Educational Resources Information Center

    Coleman, Anita

    2005-01-01

    As networked digital information proliferates and modern society's need to have access to information irrespective of location rises, the education needed for the digerati, defined loosely as the digital intelligentsia, the whole class of "expert" digital information professionals, becomes an important area about which all information…

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, John; Halbgewachs, Ron; Chavez, Adrian

    The manner in which control systems are being designed and operated in the energy sector is undergoing some of the most significant changes in history due to the evolution of technology and the increasing number of interconnections to other systems. With these changes, however, come two significant challenges that the energy sector must face: (1) cyber security is more important than ever before, and (2) cyber security is more complicated than ever before. A key requirement in helping utilities and vendors alike to meet these challenges is interoperability. While interoperability has been present in much of the discussion relating to technology utilized within the energy sector, and especially the Smart Grid, it has been absent in the context of cyber security. The Lemnos project addresses these challenges by focusing on the interoperability of devices utilized within utility control systems which support critical cyber security functions. In theory, interoperability is possible with many of the cyber security solutions available to utilities today. The reality is that the effort required to achieve cyber security interoperability is often a barrier for utilities. For example, consider IPSec, a widely-used Internet Protocol suite for defining Virtual Private Networks, or "tunnels", to communicate securely through untrusted public and private networks. The IPSec protocol suite has a significant number of configuration options and encryption parameters to choose from, which must be agreed upon and adopted by both parties establishing the tunnel. The exercise of getting software or devices from different vendors to interoperate is labor intensive and requires a significant amount of security expertise by the end user. Scale this effort to a significant number of devices operating over a large geographical area and the challenge becomes so overwhelming that it often leads utilities to pursue solutions from a single vendor. These single-vendor solutions may inadvertently lock utilities into proprietary and closed systems.
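
    The configuration-agreement burden the abstract describes can be made concrete with a toy sketch: before two endpoints can negotiate a tunnel, an operator must find the intersection of both vendors' supported proposal options. The parameter names are typical IKE/IPsec options, but the vendor option sets are invented examples.

```python
# Toy illustration of the IPsec interoperability burden: two vendors'
# supported proposal options must be intersected before a tunnel can be
# negotiated. The option sets below are invented examples.

vendor_a = {
    "encryption": {"aes128", "aes256", "3des"},
    "integrity":  {"sha1", "sha256"},
    "dh_group":   {"2", "14"},
}
vendor_b = {
    "encryption": {"aes256", "chacha20"},
    "integrity":  {"sha256", "sha384"},
    "dh_group":   {"14", "19"},
}

def common_proposal(a, b):
    """Return the options both endpoints support, per parameter."""
    return {param: a[param] & b[param] for param in a}

print(common_proposal(vendor_a, vendor_b))
# {'encryption': {'aes256'}, 'integrity': {'sha256'}, 'dh_group': {'14'}}
# Any empty intersection means no tunnel without a firmware update or a
# different product -- the "single vendor" pressure the abstract notes.
```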

  4. Progress of Interoperability in Planetary Research for Geospatial Data Analysis

    NASA Astrophysics Data System (ADS)

    Hare, T. M.; Gaddis, L. R.

    2015-12-01

    For nearly a decade there has been a push in the planetary science community to support interoperable methods of accessing and working with geospatial data. Common geospatial data products for planetary research include image mosaics, digital elevation or terrain models, geologic maps, geographic location databases (i.e., craters, volcanoes) or any data that can be tied to the surface of a planetary body (including moons, comets or asteroids). Several U.S. and international cartographic research institutions have converged on mapping standards that embrace standardized image formats that retain geographic information (e.g., GeoTiff, GeoJpeg2000), digital geologic mapping conventions, planetary extensions for symbols that comply with U.S. Federal Geographic Data Committee cartographic and geospatial metadata standards, and notably on-line mapping services as defined by the Open Geospatial Consortium (OGC). The latter includes defined standards such as the OGC Web Mapping Services (simple image maps), Web Feature Services (feature streaming), Web Coverage Services (rich scientific data streaming), and Catalog Services for the Web (data searching and discoverability). While these standards were developed for application to Earth-based data, they have been modified to support the planetary domain. The motivation to support common, interoperable data format and delivery standards is not only to improve access for higher-level products but also to address the increasingly distributed nature of the rapidly growing volumes of data. The strength of using an OGC approach is that it provides consistent access to data that are distributed across many facilities. While data-streaming standards are well-supported by the more sophisticated tools used in the Geographic Information System (GIS) and remote sensing industries, they are also supported by many light-weight browsers, which facilitates both large and small focused science applications as well as public use. Here we provide an overview of the interoperability initiatives that are currently ongoing in the planetary research community, examples of their successful application, and challenges that remain.
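
    As a concrete sketch of the "simple image maps" service mentioned above, the following Python snippet builds an OGC WMS 1.3.0 GetMap request. The endpoint and layer name are placeholders; any WMS server hosting planetary mosaics would follow the same pattern.

```python
# Sketch of an OGC WMS GetMap request. The endpoint and layer name are
# placeholders, but the parameters follow the WMS 1.3.0 specification.
from urllib.parse import urlencode

base_url = "https://example.org/wms"  # placeholder endpoint
params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "mars_mola_shaded_relief",  # placeholder layer name
    "CRS": "EPSG:4326",
    "BBOX": "-90,-180,90,180",            # lat/lon axis order in WMS 1.3.0
    "WIDTH": "1024",
    "HEIGHT": "512",
    "FORMAT": "image/png",
}
print(f"{base_url}?{urlencode(params)}")
# Fetching this URL (e.g. with urllib.request) returns a rendered PNG map,
# regardless of which facility hosts the underlying mosaic.
```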

  5. Development of NATO's recognized environmental picture

    NASA Astrophysics Data System (ADS)

    Teufert, John F.; Trabelsi, Mourad

    2006-05-01

    An important element for the fielding of a viable, effective NATO Response Force (NRF) is access to meteorological, oceanographic, and geospatial data (GEOMETOC) and imagery. Currently, the available GEOMETOC information suffers from being very fragmented. NATO defines the Recognised Environmental Picture (REP) as a controlled information base for GEOMETOC data. The NATO REP proposes an architecture that is both flexible and open, with a focus on enabling a network-centric approach. The key to achieving this is reliance on open, well-recognized standards that apply to both the data exchange protocols and the data formats. Communication and information exchange based on open standards enables system interoperability: diverse systems, each with unique, specialized contributions to an increased understanding of the battlespace, can cooperate within a manageable information sphere. By clearly defining responsibilities in the generation of information, a reduction in data transfer overhead is achieved. The REP identifies three main stages in the dissemination of GEOMETOC data: Collection, Fusion (and Analysis), and Publication. A REP architecture was successfully deployed during the NATO Coalition Warrior Interoperability Demonstration (CWID) in Lillehammer, Norway, in June 2005. CWID is an annual event to validate and improve the interoperability of NATO and national consultation, command, control, communications, computers, intelligence, surveillance, and reconnaissance (C4ISR) systems. With a test-case success rate of 84%, it was able to provide relevant GEOMETOC support to the main NRF component headquarters. In 2006, the REP architecture will be deployed and validated during the NATO NRF Steadfast live exercises.

  6. Common Data Model for Neuroscience Data and Data Model Exchange

    PubMed Central

    Gardner, Daniel; Knuth, Kevin H.; Abato, Michael; Erde, Steven M.; White, Thomas; DeBellis, Robert; Gardner, Esther P.

    2001-01-01

    Objective: Generalizing the data models underlying two prototype neurophysiology databases, the authors describe and propose the Common Data Model (CDM) as a framework for federating a broad spectrum of disparate neuroscience information resources. Design: Each component of the CDM derives from one of five superclasses—data, site, method, model, and reference—or from relations defined between them. A hierarchic attribute-value scheme for metadata enables interoperability with variable tree depth to serve specific intra- or broad inter-domain queries. To mediate data exchange between disparate systems, the authors propose a set of XML-derived schema for describing not only data sets but data models. These include biophysical description markup language (BDML), which mediates interoperability between data resources by providing a meta-description for the CDM. Results: The set of superclasses potentially spans data needs of contemporary neuroscience. Data elements abstracted from neurophysiology time series and histogram data represent data sets that differ in dimension and concordance. Site elements transcend neurons to describe subcellular compartments, circuits, regions, or slices; non-neuroanatomic sites include sequences to patients. Methods and models are highly domain-dependent. Conclusions: True federation of data resources requires explicit public description, in a metalanguage, of the contents, query methods, data formats, and data models of each data resource. Any data model that can be derived from the defined superclasses is potentially conformant and interoperability can be enabled by recognition of BDML-described compatibilities. Such metadescriptions can buffer technologic changes. PMID:11141510
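
    The hierarchic attribute-value metadata scheme described above, a tree of attribute-value pairs queryable at variable depth, can be sketched directly in Python. The attribute names and values below are invented for illustration; they are not taken from the CDM or BDML.

```python
# Sketch of a hierarchic attribute-value metadata scheme: metadata is a tree,
# and queries may match at any depth, serving both broad inter-domain and
# deep intra-domain searches. Attribute names are invented examples.

metadata = {
    "site": {
        "region": "somatosensory cortex",
        "compartment": {"type": "neuron", "subtype": "pyramidal"},
    },
    "method": {"recording": {"mode": "extracellular", "electrode": "tungsten"}},
    "data": {"kind": "time series", "units": "mV"},
}

def find(tree, attribute, path=()):
    """Yield (path, value) for every occurrence of `attribute`, at any depth."""
    for key, value in tree.items():
        here = path + (key,)
        if key == attribute:
            yield here, value
        if isinstance(value, dict):
            yield from find(value, attribute, here)

# Broad inter-domain query: any "type" attribute, wherever it occurs.
for path, value in find(metadata, "type"):
    print("/".join(path), "=", value)   # site/compartment/type = neuron
```

    A shallow query serves broad cross-domain federation, while deeper paths support the domain-specific queries of a single resource, which is the variable-tree-depth property the abstract highlights.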

  7. Access control and privilege management in electronic health record: a systematic literature review.

    PubMed

    Jayabalan, Manoj; O'Daniel, Thomas

    2016-12-01

    This study presents a systematic literature review of access control for electronic health record systems to protect patients' privacy. Articles from 2006 to 2016 were extracted from the ACM Digital Library, IEEE Xplore Digital Library, Science Direct, MEDLINE, and MetaPress using broad eligibility criteria, and chosen for inclusion based on analysis of ISO 22600. Cryptographic standards and methods were left outside the scope of this review. Three broad classes of models are being actively investigated and developed: access control for electronic health records, access control for interoperability, and access control for risk analysis. Traditional role-based access control models are extended with spatial, temporal, probabilistic, dynamic, and semantic aspects to capture contextual information and provide granular access control. Maintenance of audit trails and facilities for overriding normal roles to allow full access in emergency cases are common features. Access privilege frameworks utilizing ontology-based knowledge representation for defining the rules have attracted considerable interest, due to the higher level of abstraction that makes it possible to model domain knowledge and validate access requests efficiently.

  8. Enabling communication concurrency through flexible MPI endpoints

    DOE PAGES

    Dinan, James; Grant, Ryan E.; Balaji, Pavan; ...

    2014-09-23

    MPI defines a one-to-one relationship between MPI processes and ranks. This model captures many use cases effectively; however, it also limits communication concurrency and interoperability between MPI and programming models that utilize threads. Our paper describes the MPI endpoints extension, which relaxes the longstanding one-to-one relationship between MPI processes and ranks. Using endpoints, an MPI implementation can map separate communication contexts to threads, allowing them to drive communication independently. Also, endpoints enable threads to be addressable in MPI operations, enhancing interoperability between MPI and other programming models. Furthermore, these characteristics are illustrated through several examples and an empirical study that contrasts current multithreaded communication performance with the need for high degrees of communication concurrency to achieve peak communication performance.
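
    The endpoints extension itself is a proposal rather than part of the MPI standard, so the following mpi4py sketch only approximates its motivation with standard MPI: each thread gets its own duplicated communicator (a separate communication context), letting threads drive communication independently without message-matching conflicts. It assumes an MPI library providing MPI_THREAD_MULTIPLE and would be launched with, e.g., `mpiexec -n 2 python endpoints_sketch.py`.

```python
# Approximation of the endpoints motivation using standard MPI via mpi4py:
# one duplicated communicator (separate context) per thread. Requires an MPI
# implementation with MPI_THREAD_MULTIPLE support.
import threading
from mpi4py import MPI

world = MPI.COMM_WORLD
peer = (world.Get_rank() + 1) % world.Get_size()

def worker(comm, tag):
    """Each thread communicates on its own communicator, so its traffic
    cannot be matched against another thread's messages."""
    msg = comm.sendrecv(
        f"hello from rank {world.Get_rank()} thread {tag}",
        dest=peer, sendtag=tag, source=peer, recvtag=tag,
    )
    print(msg)

threads = []
for tag in range(2):
    comm = world.Dup()  # separate communication context per thread
    t = threading.Thread(target=worker, args=(comm, tag))
    t.start()
    threads.append((t, comm))

for t, comm in threads:
    t.join()
    comm.Free()
```

    True endpoints would go further, making each thread individually addressable by its own rank; duplicated communicators only give each thread an isolated context within the existing process rank.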

  9. Enabling communication concurrency through flexible MPI endpoints

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dinan, James; Grant, Ryan E.; Balaji, Pavan

    MPI defines a one-to-one relationship between MPI processes and ranks. This model captures many use cases effectively; however, it also limits communication concurrency and interoperability between MPI and programming models that utilize threads. Our paper describes the MPI endpoints extension, which relaxes the longstanding one-to-one relationship between MPI processes and ranks. Using endpoints, an MPI implementation can map separate communication contexts to threads, allowing them to drive communication independently. Also, endpoints enable threads to be addressable in MPI operations, enhancing interoperability between MPI and other programming models. Furthermore, these characteristics are illustrated through several examples and an empirical study that contrasts current multithreaded communication performance with the need for high degrees of communication concurrency to achieve peak communication performance.

  10. Enabling communication concurrency through flexible MPI endpoints

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dinan, James; Grant, Ryan E.; Balaji, Pavan

    MPI defines a one-to-one relationship between MPI processes and ranks. This model captures many use cases effectively; however, it also limits communication concurrency and interoperability between MPI and programming models that utilize threads. This paper describes the MPI endpoints extension, which relaxes the longstanding one-to-one relationship between MPI processes and ranks. Using endpoints, an MPI implementation can map separate communication contexts to threads, allowing them to drive communication independently. Endpoints also enable threads to be addressable in MPI operations, enhancing interoperability between MPI and other programming models. These characteristics are illustrated through several examples and an empirical study that contrasts current multithreaded communication performance with the need for high degrees of communication concurrency to achieve peak communication performance.

  11. Medicare and state health care programs: fraud and abuse; electronic health records safe harbor under the anti-kickback statute. Final rule.

    PubMed

    2013-12-27

    In this final rule, the Office of Inspector General (OIG) amends the safe harbor regulation concerning electronic health records items and services, which defines certain conduct that is protected from liability under the Federal anti-kickback statute, section 1128B(b) of the Social Security Act (the Act). Amendments include updating the provision under which electronic health records software is deemed interoperable; removing the electronic prescribing capability requirement; extending the sunset provision until December 31, 2021; limiting the scope of protected donors to exclude laboratory companies; and clarifying the condition that prohibits a donor from taking any action to limit or restrict the use, compatibility, or interoperability of the donated items or services.

  12. Maximizing the use of EO products: how to leverage the potential of open geospatial service architectures

    NASA Astrophysics Data System (ADS)

    Usländer, Thomas

    2012-10-01

    The demand for the rapid provision of EO products with well-defined characteristics in terms of temporal, spatial, image-specific, and thematic criteria is increasing. Examples are products to support near real-time damage assessment after a natural disaster event, e.g. an earthquake. However, beyond the organizational and economic questions, there are technological and systemic barriers to a comfortable search, ordering, delivery, or even combination of EO products. Most portals of space agencies and EO product providers require sophisticated satellite and product knowledge and, even worse, are all different and not interoperable. This paper gives an overview of the use cases and the architectural solutions that aim at an open and flexible EO mission infrastructure with application-oriented user interfaces and well-defined service interfaces based upon open standards. It presents corresponding international initiatives such as INSPIRE (Infrastructure for Spatial Information in the European Community), GMES (Global Monitoring for Environment and Security), GEOSS (Global Earth Observation System of Systems), and HMA (Heterogeneous Missions Accessibility) and their associated infrastructure approaches. The paper presents a corresponding analysis and design methodology and two examples of how such architectures are already successfully used in early warning systems for geo-hazards and in toolsets for environmentally induced health risks. Finally, the paper concludes with an outlook on how these ideas relate to the vision of the Future Internet.

  13. The ISO Edi Conceptual Model Activity and Its Relationship to OSI.

    ERIC Educational Resources Information Center

    Fincher, Judith A.

    1990-01-01

    The edi conceptual model is being developed to define common structures, services, and processes that syntax-specific standards like X12 and EDIFACT could adopt. Open Systems Interconnection (OSI) is of interest to edi because of its potential to help enable global interoperability across Electronic Data Interchange (EDI) functional groups. A…

  14. Governance in the Digital Age: A Research and Action Framework for an Uncertain Future

    ERIC Educational Resources Information Center

    Dawes, Sharon S.

    2009-01-01

    Research into relationships among government, society and technology has grown substantially over the past 30 years. However, most research and most advances in practice address narrowly defined categories of concern such as government organization, citizen services, interoperability, or personal privacy. By contrast, the future presents complex…

  15. Provenance-Based Approaches to Semantic Web Service Discovery and Usage

    ERIC Educational Resources Information Center

    Narock, Thomas William

    2012-01-01

    The World Wide Web Consortium defines a Web Service as "a software system designed to support interoperable machine-to-machine interaction over a network." Web Services have become increasingly important both within and across organizational boundaries. With the recent advent of the Semantic Web, web services have evolved into semantic…

  16. GMSEC Interface Specification Document 2016 March

    NASA Technical Reports Server (NTRS)

    Handy, Matthew

    2016-01-01

    The GMSEC Interface Specification Document contains the standard set of defined messages. Each GMSEC standard message contains a GMSEC Information Bus Header section and a Message Contents section. Each message section identifies required fields, optional fields, data type and recommended use of the fields. Additionally, this document includes the message subjects associated with the standard messages. The system design of the operations center should ensure the components that are selected use both the API and the defined standard messages in order to achieve full interoperability from component to component.

  17. XML schemas for common bioinformatic data types and their application in workflow systems

    PubMed Central

    Seibel, Philipp N; Krüger, Jan; Hartmeier, Sven; Schwarzer, Knut; Löwenthal, Kai; Mersch, Henning; Dandekar, Thomas; Giegerich, Robert

    2006-01-01

    Background Today, there is a growing need in bioinformatics to combine available software tools into chains, thus building complex applications from existing single-task tools. To create such workflows, the tools involved have to be able to work with each other's data – therefore, a common set of well-defined data formats is needed. Unfortunately, current bioinformatic tools use a great variety of heterogeneous formats. Results Acknowledging the need for common formats, the Helmholtz Open BioInformatics Technology network (HOBIT) identified several basic data types used in bioinformatics and developed appropriate format descriptions, formally defined by XML schemas, and incorporated them in a Java library (BioDOM). These schemas currently cover sequence, sequence alignment, RNA secondary structure and RNA secondary structure alignment formats in a form that is independent of any specific program, thus enabling seamless interoperation of different tools. All XML formats are available at , the BioDOM library can be obtained at . Conclusion The HOBIT XML schemas and the BioDOM library simplify adding XML support to newly created and existing bioinformatic tools, enabling these tools to interoperate seamlessly in workflow scenarios. PMID:17087823
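
    The schema-based format checking that underpins such workflow interoperability can be sketched with lxml, which ships an XML Schema validator. The tiny sequence schema below is invented for illustration; it is not one of the actual HOBIT schemas.

```python
# Sketch of XML-schema-based format checking in the spirit of the HOBIT
# schemas; this toy sequence schema is invented, not an actual HOBIT format.
from lxml import etree

xsd = etree.XMLSchema(etree.XML("""
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="sequence">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="id" type="xs:string"/>
        <xs:element name="residues" type="xs:string"/>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>
"""))

good = etree.XML("<sequence><id>P12345</id><residues>MKTAYIAK</residues></sequence>")
bad = etree.XML("<sequence><residues>MKTAYIAK</residues></sequence>")

print(xsd.validate(good))  # True  -> safe to hand to the next tool in a chain
print(xsd.validate(bad))   # False -> reject before it corrupts the workflow
print(xsd.error_log)       # explains the missing <id> element
```

    Validating each tool's output against a shared schema is what lets heterogeneous tools be chained without per-pair format converters.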

  18. The Development of Clinical Document Standards for Semantic Interoperability in China

    PubMed Central

    Yang, Peng; Pan, Feng; Wan, Yi; Tu, Haibo; Tang, Xuejun; Hu, Jianping

    2011-01-01

    Objectives This study is aimed at developing a set of data groups (DGs) to be employed as reusable building blocks for the construction of the eight most common clinical documents used in China's general hospitals in order to achieve their structural and semantic standardization. Methods The Diagnostics knowledge framework, the related approaches taken from the Health Level Seven (HL7), the Integrating the Healthcare Enterprise (IHE), and the Healthcare Information Technology Standards Panel (HITSP) and 1,487 original clinical records were considered together to form the DG architecture and data sets. The internal structure, content, and semantics of each DG were then defined by mapping each DG data set to a corresponding Clinical Document Architecture data element and matching each DG data set to the metadata in the Chinese National Health Data Dictionary. By using the DGs as reusable building blocks, standardized structures and semantics regarding the clinical documents for semantic interoperability were able to be constructed. Results Altogether, 5 header DGs, 48 section DGs, and 17 entry DGs were developed. Several issues regarding the DGs, including their internal structure, identifiers, data set names, definitions, length and format, data types, and value sets, were further defined. Standardized structures and semantics regarding the eight clinical documents were structured by the DGs. Conclusions This approach of constructing clinical document standards using DGs is a feasible standard-driven solution useful in preparing documents possessing semantic interoperability among the disparate information systems in China. These standards need to be validated and refined through further study. PMID:22259722

  19. A common layer of interoperability for biomedical ontologies based on OWL EL.

    PubMed

    Hoehndorf, Robert; Dumontier, Michel; Oellrich, Anika; Wimalaratne, Sarala; Rebholz-Schuhmann, Dietrich; Schofield, Paul; Gkoutos, Georgios V

    2011-04-01

    Ontologies are essential in biomedical research due to their ability to semantically integrate content from different scientific databases and resources. Their application improves capabilities for querying and mining biological knowledge. An increasing number of ontologies is being developed for this purpose, and considerable effort is invested into formally defining them in order to represent their semantics explicitly. However, current biomedical ontologies do not facilitate data integration and interoperability yet, since reasoning over these ontologies is very complex and cannot be performed efficiently or is even impossible. We propose the use of less expressive subsets of ontology representation languages to enable efficient reasoning and achieve the goal of genuine interoperability between ontologies. We present and evaluate EL Vira, a framework that transforms OWL ontologies into the OWL EL subset, thereby enabling the use of tractable reasoning. We illustrate which OWL constructs and inferences are kept and lost following the conversion and demonstrate the performance gain of reasoning indicated by the significant reduction of processing time. We applied EL Vira to the open biomedical ontologies and provide a repository of ontologies resulting from this conversion. EL Vira creates a common layer of ontological interoperability that, for the first time, enables the creation of software solutions that can employ biomedical ontologies to perform inferences and answer complex queries to support scientific analyses. The EL Vira software is available from http://el-vira.googlecode.com and converted OBO ontologies and their mappings are available from http://bioonto.gen.cam.ac.uk/el-ont.

  20. Device interoperability and authentication for telemedical appliance based on the ISO/IEEE 11073 Personal Health Device (PHD) Standards.

    PubMed

    Caranguian, Luther Paul R; Pancho-Festin, Susan; Sison, Luis G

    2012-01-01

    In this study, we focused on the interoperability and authentication of medical devices in the context of telemedical systems. A recent standard called the ISO/IEEE 11073 Personal Health Device (X73-PHD) Standards addresses the device interoperability problem by defining common protocols for agent (medical device) and manager (appliance) interface. The X73-PHD standard however has not addressed security and authentication of medical devices which is important in establishing integrity of a telemedical system. We have designed and implemented a security policy within the X73-PHD standards. The policy will enable device authentication using Asymmetric-Key Cryptography and the RSA algorithm as the digital signature scheme. We used two approaches for performing the digital signatures: direct software implementation and use of embedded security modules (ESM). The two approaches were evaluated and compared in terms of execution time and memory requirement. For the standard 2048-bit RSA, ESM calculates digital signatures only 12% of the total time for the direct implementation. Moreover, analysis shows that ESM offers more security advantage such as secure storage of keys compared to using direct implementation. Interoperability with other systems was verified by testing the system with LNI Healthlink, a manager software that implements the X73-PHD standard. Lastly, security analysis was done and the system's response to common attacks on authentication systems was analyzed and several measures were implemented to protect the system against them.
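
    The signature scheme the study benchmarks, 2048-bit RSA used to authenticate a device to its manager, can be sketched with the Python `cryptography` package. This mirrors the cryptographic idea only; it is not the paper's X73-PHD integration or its embedded security module, and the message content is an invented example.

```python
# Sketch of RSA digital signatures for device authentication with the Python
# `cryptography` package, using the 2048-bit keys the study benchmarks.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# Agent (medical device) key pair; in practice the private key would live in
# secure storage such as the embedded security module the paper evaluates.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

message = b"X73-PHD association request: agent-id 0x1234"  # invented payload
signature = private_key.sign(message, padding.PKCS1v15(), hashes.SHA256())

# Manager (appliance) verifies the signature before accepting the agent.
try:
    public_key.verify(signature, message, padding.PKCS1v15(), hashes.SHA256())
    print("device authenticated")
except InvalidSignature:
    print("authentication failed")
```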

  1. Intelligent Mortality Reporting with FHIR

    PubMed Central

    Hoffman, Ryan A.; Wu, Hang; Venugopalan, Janani; Braun, Paula; Wang, May D.

    2017-01-01

    One pressing need in the area of public health is timely, accurate, and complete reporting of deaths and the conditions leading up to them. Fast Healthcare Interoperability Resources (FHIR) is a new HL7 interoperability standard for electronic health records (EHRs), while Sustainable Medical Applications and Reusable Technologies (SMART)-on-FHIR enables third-party app development that can work “out of the box”. This research demonstrates the feasibility of developing SMART-on-FHIR applications that enable medical professionals to perform timely and accurate death reporting within multiple different jurisdictions of the US. We explored how the information on a standard certificate of death can be mapped to resources defined in the FHIR standard (DSTU2). We also demonstrated analytics for potentially improving the accuracy and completeness of mortality reporting data. PMID:28804791
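
    As an illustration of the mapping the abstract describes, the following sketch builds minimal DSTU2-style FHIR resources carrying death information as JSON. The identifiers, the ICD-10 code, and the server URL are illustrative assumptions, not the paper's actual mapping, and a real SMART-on-FHIR app would first obtain an OAuth access token.

```python
# Sketch of death information carried on FHIR DSTU2-style resources:
# a Patient with deceasedDateTime plus a Condition for a cause of death.
# Codes, ids, and the endpoint are illustrative assumptions.
import json

patient = {
    "resourceType": "Patient",
    "id": "example-decedent",
    "deceasedDateTime": "2017-01-15T13:20:00Z",
}

cause_of_death = {
    "resourceType": "Condition",
    "patient": {"reference": "Patient/example-decedent"},  # DSTU2 field name
    "code": {
        "coding": [{"system": "http://hl7.org/fhir/sid/icd-10",
                    "code": "I21.9"}]  # illustrative ICD-10 code
    },
}

print(json.dumps(patient, indent=2))
# POSTing these as JSON to a FHIR server endpoint such as
#   https://fhir.example.org/Patient
# with Content-Type application/json+fhir would create the records.
```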

  2. Interoperability and models for exchange of data between information systems in public administration

    NASA Astrophysics Data System (ADS)

    Glavev, Victor

    2016-12-01

    The types of software applications used by public administrations can be divided into three main groups: document management systems, record management systems, and business process systems. Each of them generates outputs that can be used as input data by the others. This is the main reason why data exchange between these three groups, following well-defined models, is required; many other reasons are also discussed in the paper. Interoperability is a key aspect when those models are implemented, especially when there are different manufacturers of systems in the area of software applications used by public authorities. The report includes examples of the implementation of models for data exchange between software systems deployed in one of the biggest administrations in Bulgaria.

  3. Improving the Interoperability of Disaster Models: a Case Study of Proposing Fireml for Forest Fire Model

    NASA Astrophysics Data System (ADS)

    Jiang, W.; Wang, F.; Meng, Q.; Li, Z.; Liu, B.; Zheng, X.

    2018-04-01

    This paper presents a new standardized data format named Fire Markup Language (FireML), which extends the Geography Markup Language (GML) of the OGC, to elaborate upon the fire hazard model. The proposed FireML standardizes the input and output documents of a fire model so that it can communicate effectively with different disaster management systems and ensure good interoperability. To demonstrate the usage of FireML and verify its feasibility, an adopted forest fire spread model compatible with FireML is described. A 3D GIS disaster management system was developed to simulate the dynamic procedure of forest fire spread with the defined FireML documents. The proposed approach should inform those working on the standardization of other disaster models.

  4. Extending the GI Brokering Suite to Support New Interoperability Specifications

    NASA Astrophysics Data System (ADS)

    Boldrini, E.; Papeschi, F.; Santoro, M.; Nativi, S.

    2014-12-01

    The GI brokering suite provides the discovery, access, and semantic brokers (i.e., GI-cat, GI-axe, GI-sem) that empower a brokering framework for multi-disciplinary and multi-organizational interoperability. The GI suite has been successfully deployed in the framework of several programmes and initiatives, such as European Union funded projects, NSF BCube, and the intergovernmental coordinated effort Global Earth Observation System of Systems (GEOSS). Each GI suite broker facilitates interoperability for a particular functionality (i.e., discovery, access, semantic extension) between a set of brokered resources published by autonomous providers (e.g., data repositories, web services, semantic assets) and a set of heterogeneous consumers (e.g., client applications, portals, apps). A wide set of data models, encoding formats, and service protocols are already supported by the GI suite, such as those defined by international standardization organizations like OGC and ISO (e.g., WxS, CSW, SWE, GML, netCDF) and by community specifications (e.g., THREDDS, OpenSearch, OPeNDAP, ESRI APIs). Using the GI suite, resources published by a particular community or organization through their specific technology (e.g., OPeNDAP/netCDF) can be transparently discovered, accessed, and used by different communities utilizing their preferred tools (e.g., a GIS visualizing WMS layers). Since information technology is a moving target, new standards and technologies continuously emerge and are adopted in the Earth Science context too. Therefore, the GI brokering suite was conceived to be flexible and to accommodate new interoperability protocols and data models. For example, the GI suite has recently added support for widely-used specifications introduced to implement Linked Data, the Semantic Web, and specific community needs. Among others, these include: DCAT, an RDF vocabulary designed to facilitate interoperability between Web data catalogs; CKAN, a data management system for data distribution, particularly used by public administrations; CERIF, used by CRIS (Current Research Information System) instances; and the HYRAX Server, a scientific dataset publishing component. This presentation will discuss these and other recent GI suite extensions implemented to support new interoperability protocols in use by the Earth Science communities.
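
    To make the DCAT support mentioned above concrete, the following sketch describes a catalogued dataset as DCAT RDF using rdflib. The dataset URI, title, and keyword are invented for illustration.

```python
# Sketch of a DCAT dataset description as RDF with rdflib.
# The dataset details are invented for illustration.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF, DCTERMS

DCAT = Namespace("http://www.w3.org/ns/dcat#")

g = Graph()
g.bind("dcat", DCAT)
g.bind("dct", DCTERMS)

dataset = URIRef("http://example.org/datasets/sea-surface-temp")
g.add((dataset, RDF.type, DCAT.Dataset))
g.add((dataset, DCTERMS.title, Literal("Sea surface temperature, daily")))
g.add((dataset, DCAT.keyword, Literal("oceanography")))

print(g.serialize(format="turtle"))
# A broker can harvest records like this and re-expose them through other
# protocols (e.g. CSW or OpenSearch) for consumers that speak those instead.
```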

  5. Development of an electronic claim system based on an integrated electronic health record platform to guarantee interoperability.

    PubMed

    Kim, Hwa Sun; Cho, Hune; Lee, In Keun

    2011-06-01

    We design and develop an electronic claim system based on an integrated electronic health record (EHR) platform. This system is designed to be used for ambulatory care by office-based physicians in the United States. This is achieved by integrating various medical standard technologies for interoperability between heterogeneous information systems. The developed system serves as a simple clinical data repository; it automatically fills out the Centers for Medicare and Medicaid Services (CMS)-1500 form based on information regarding the patients and the physicians' clinical activities. It supports electronic insurance claims by creating reimbursement charges, and it contains an HL7 interface engine to exchange clinical messages between heterogeneous devices. The system partially guards against physician malpractice by suggesting proper treatments according to patient diagnoses, and it supports physicians by easily preparing documents for reimbursement and submitting claim documents to insurance organizations electronically, without additional effort by the user. To show the usability of the developed system, we performed an experiment comparing the time spent filling out the CMS-1500 form directly with the time required to create electronic claim data using the developed system. From the experimental results, we conclude that the system could save considerable time for physicians in preparing claim documents. The developed system might be particularly useful for those who need a reimbursement-specialized EHR system, even though it does not yet satisfy all criteria requested by the CMS and the Office of the National Coordinator for Health Information Technology (ONC); those criteria are a necessary but not a sufficient condition for the implementation of EHR systems. The system will be upgraded continuously to implement the criteria and to offer more stable and transparent transmission of electronic claim data.

  6. Heterogeneous but “Standard” Coding Systems for Adverse Events: Issues in Achieving Interoperability between Apples and Oranges

    PubMed Central

    Richesson, Rachel L.; Fung, Kin Wah; Krischer, Jeffrey P.

    2008-01-01

    Monitoring adverse events (AEs) is an important part of clinical research and a crucial target for data standards. The representation of adverse events themselves requires the use of controlled vocabularies with thousands of needed clinical concepts. Several data standards for adverse events currently exist, each with a strong user base. The structure and features of these current adverse event data standards (including terminologies and classifications) are different, so comparisons and evaluations are not straightforward, nor are strategies for their harmonization. Three different data standards - the Medical Dictionary for Regulatory Activities (MedDRA) and the Systematized Nomenclature of Medicine Clinical Terms (SNOMED CT) terminologies, and Common Terminology Criteria for Adverse Events (CTCAE) classification - are explored as candidate representations for AEs. This paper describes the structural features of each coding system, their content and relationship to the Unified Medical Language System (UMLS), and unsettled issues for future interoperability of these standards. PMID:18406213

  7. Toward a North American Standard for Mobile Data Services

    NASA Technical Reports Server (NTRS)

    Dean, Richard A.; Levesque, Allen H.

    1991-01-01

    The rapid introduction of digital mobile communications systems is an important part of the emerging digital communications scene. These developments pose both a potential problem and a challenge. On one hand, these separate market driven developments can result in an uncontrolled mixture of analog and digital links which inhibit data modem services across the mobile/Public Switched network (PSTN). On the other hand, the near coincidence of schedules for development of some of these systems, i.e., Digital Cellular, Mobile Satellite, Land Mobile Radio, and ISDN, provides an opportunity to address interoperability problems by defining interfaces, control, and service standards that are compatible among these new services. In this paper we address the problem of providing data services interoperation between mobile terminals and data devices on the PSTN. The expected data services include G3 Fax, asynchronous data, and the government's STU-3 secure voice system, and future data services such as ISDN. We address a common architecture and a limited set of issues that are key to interoperable mobile data services. We believe that common mobile data standards will both improve the quality of data service and simplify the systems for manufacturers, data users, and service providers.

  8. Toward a North American standard for mobile data services

    NASA Astrophysics Data System (ADS)

    Dean, Richard A.; Levesque, Allen H.

    1991-09-01

    The rapid introduction of digital mobile communications systems is an important part of the emerging digital communications scene. These developments pose both a potential problem and a challenge. On one hand, these separate market driven developments can result in an uncontrolled mixture of analog and digital links which inhibit data modem services across the mobile/Public Switched network (PSTN). On the other hand, the near coincidence of schedules for development of some of these systems, i.e., Digital Cellular, Mobile Satellite, Land Mobile Radio, and ISDN, provides an opportunity to address interoperability problems by defining interfaces, control, and service standards that are compatible among these new services. In this paper we address the problem of providing data services interoperation between mobile terminals and data devices on the PSTN. The expected data services include G3 Fax, asynchronous data, and the government's STU-3 secure voice system, and future data services such as ISDN. We address a common architecture and a limited set of issues that are key to interoperable mobile data services. We believe that common mobile data standards will both improve the quality of data service and simplify the systems for manufacturers, data users, and service providers.

  9. A Web Service Protocol Realizing Interoperable Internet of Things Tasking Capability.

    PubMed

    Huang, Chih-Yuan; Wu, Cheng-Hung

    2016-08-31

    The Internet of Things (IoT) is an infrastructure that interconnects uniquely-identifiable devices using the Internet. By interconnecting everyday appliances, various monitoring and physical mashup applications can be constructed to improve people's daily lives. In general, IoT devices provide two main capabilities: sensing and tasking. While the sensing capability is similar to the World-Wide Sensor Web, this research focuses on the tasking capability. Currently, however, IoT devices created by different manufacturers follow different proprietary protocols and are locked into many closed ecosystems. This heterogeneity issue impedes the interconnection between IoT devices and diminishes the potential of the IoT. To address this issue, this research proposes an interoperable solution, called the tasking capability description, that allows users to control different IoT devices through a uniform web service interface. This paper demonstrates the contribution of the proposed solution by interconnecting different IoT devices for different applications. In addition, the proposed solution is integrated with the OGC SensorThings API standard, a Web service standard defined for the IoT sensing capability. Consequently, the Extended SensorThings API can realize both IoT sensing and tasking capabilities in an integrated and interoperable manner.
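
    A tasking request in the style of the SensorThings tasking model can be sketched as a plain HTTP POST: a Task referencing a tasking capability advertised by the device, with the desired parameters in the body. The server URL, entity id, and parameter names below are illustrative assumptions rather than the paper's actual interface.

```python
# Sketch of a SensorThings-style tasking request: POST a Task whose
# taskingParameters ask a device to change state. URL, id, and parameter
# names are illustrative assumptions.
import json
import urllib.request

task = {
    "taskingParameters": {"state": "on"},   # e.g. switch a lamp on
    "TaskingCapability": {"@iot.id": 1},    # capability advertised by device
}

req = urllib.request.Request(
    "https://sensorthings.example.org/v1.0/Tasks",  # placeholder endpoint
    data=json.dumps(task).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# with urllib.request.urlopen(req) as resp:        # run against a live server
#     print(resp.status, resp.headers.get("Location"))
```

    The point of such a uniform interface is that the same POST pattern works for a lamp, a valve, or a camera, with only the tasking parameters differing per device.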

  10. 75 FR 2013 - Health Information Technology: Initial Set of Standards, Implementation Specifications, and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-13

    ...The Department of Health and Human Services (HHS) is issuing this interim final rule with a request for comments to adopt an initial set of standards, implementation specifications, and certification criteria, as required by section 3004(b)(1) of the Public Health Service Act. This interim final rule represents the first step in an incremental approach to adopting standards, implementation specifications, and certification criteria to enhance the interoperability, functionality, utility, and security of health information technology and to support its meaningful use. The certification criteria adopted in this initial set establish the capabilities and related standards that certified electronic health record (EHR) technology will need to include in order to, at a minimum, support the achievement of the proposed meaningful use Stage 1 (beginning in 2011) by eligible professionals and eligible hospitals under the Medicare and Medicaid EHR Incentive Programs.

  11. Improving EHR Capabilities to Facilitate Stage 3 Meaningful Use Care Coordination Criteria.

    PubMed

    Cross, Dori A; Cohen, Genna R; Nong, Paige; Day, Anya-Victoria; Vibbert, Danielle; Naraharisetti, Ramya; Adler-Milstein, Julia

    Primary care practices have been limited in their ability to leverage electronic health records (EHRs) and health information exchange (HIE) to improve care coordination, but will soon be incentivized to do so under proposed Stage 3 meaningful use criteria. We use mixed methods to understand how primary care practices manage, share and reconcile electronic patient information across care settings, and identify innovations in EHR design to support enhanced care coordination. Opportunities identified by practices focused on availability and usability of features that facilitate (1) generation of customized summary of care records, (2) team-based care approaches, and (3) management of the increased volume of electronic information generated and exchanged during care transitions. More broadly, vendors and policymakers need to continue to work together to improve interoperability as the key to effective care coordination. If these EHR innovations were widespread, the value of meeting the proposed Stage 3 care coordination criteria would be substantially enhanced.

  12. Understanding dental CAD/CAM for restorations - dental milling machines from a mechanical engineering viewpoint. Part A: chairside milling machines.

    PubMed

    Lebon, Nicolas; Tapie, Laurent; Duret, Francois; Attal, Jean-Pierre

    2016-01-01

    The dental milling machine is an important device in the dental CAD/CAM chain. Nowadays, dental numerical controlled (NC) milling machines are available for dental surgeries (chairside solution). This article provides a mechanical engineering approach to NC milling machines to help dentists understand the involvement of technology in digital dentistry practice. First, some technical concepts and definitions associated with NC milling machines are described from a mechanical engineering viewpoint. The technical and economic criteria of four chairside dental NC milling machines that are available on the market are then described. The technical criteria are focused on the capacities of the embedded technologies of these milling machines to mill both prosthetic materials and types of shape restorations. The economic criteria are focused on investment costs and interoperability with third-party software. The clinical relevance of the technology is assessed in terms of the accuracy and integrity of the restoration.

  13. Multi-disciplinary interoperability challenges (Ian McHarg Medal Lecture)

    NASA Astrophysics Data System (ADS)

    Annoni, Alessandro

    2013-04-01

    Global sustainability research requires multi-disciplinary efforts to address the key research challenges and to increase our understanding of the complex relationships between environment and society. For this reason, dependence on the interoperability of ICT systems is growing rapidly; yet although some relevant technological improvement can be observed, operational interoperable solutions are still lacking in practice. Among the causes is the absence of a generally accepted definition of "interoperability" in all its broader aspects. In fact, the concept of interoperability is just that, a concept, and the more popular definitions do not address all the challenges of realizing operational interoperable solutions. The problem becomes even more complex when multi-disciplinary interoperability is required, because in that case solutions for the interoperability of different interoperable solutions must be envisaged. In this lecture the following definition will be used: "interoperability is the ability to exchange information and to use it". The lecture will present the main challenges in addressing multi-disciplinary interoperability and briefly introduce a set of proposed approaches and solutions.

  14. Towards BioDBcore: a community-defined information specification for biological databases

    PubMed Central

    Gaudet, Pascale; Bairoch, Amos; Field, Dawn; Sansone, Susanna-Assunta; Taylor, Chris; Attwood, Teresa K.; Bateman, Alex; Blake, Judith A.; Bult, Carol J.; Cherry, J. Michael; Chisholm, Rex L.; Cochrane, Guy; Cook, Charles E.; Eppig, Janan T.; Galperin, Michael Y.; Gentleman, Robert; Goble, Carole A.; Gojobori, Takashi; Hancock, John M.; Howe, Douglas G.; Imanishi, Tadashi; Kelso, Janet; Landsman, David; Lewis, Suzanna E.; Mizrachi, Ilene Karsch; Orchard, Sandra; Ouellette, B. F. Francis; Ranganathan, Shoba; Richardson, Lorna; Rocca-Serra, Philippe; Schofield, Paul N.; Smedley, Damian; Southan, Christopher; Tan, Tin Wee; Tatusova, Tatiana; Whetzel, Patricia L.; White, Owen; Yamasaki, Chisato

    2011-01-01

    The present article proposes the adoption of a community-defined, uniform, generic description of the core attributes of biological databases, BioDBCore. The goals of these attributes are to provide a general overview of the database landscape, to encourage consistency and interoperability between resources and to promote the use of semantic and syntactic standards. BioDBCore will make it easier for users to evaluate the scope and relevance of available resources. This new resource will increase the collective impact of the information present in biological databases. PMID:21097465

  15. Towards BioDBcore: a community-defined information specification for biological databases

    PubMed Central

    Gaudet, Pascale; Bairoch, Amos; Field, Dawn; Sansone, Susanna-Assunta; Taylor, Chris; Attwood, Teresa K.; Bateman, Alex; Blake, Judith A.; Bult, Carol J.; Cherry, J. Michael; Chisholm, Rex L.; Cochrane, Guy; Cook, Charles E.; Eppig, Janan T.; Galperin, Michael Y.; Gentleman, Robert; Goble, Carole A.; Gojobori, Takashi; Hancock, John M.; Howe, Douglas G.; Imanishi, Tadashi; Kelso, Janet; Landsman, David; Lewis, Suzanna E.; Karsch Mizrachi, Ilene; Orchard, Sandra; Ouellette, B.F. Francis; Ranganathan, Shoba; Richardson, Lorna; Rocca-Serra, Philippe; Schofield, Paul N.; Smedley, Damian; Southan, Christopher; Tan, Tin W.; Tatusova, Tatiana; Whetzel, Patricia L.; White, Owen; Yamasaki, Chisato

    2011-01-01

    The present article proposes the adoption of a community-defined, uniform, generic description of the core attributes of biological databases, BioDBCore. The goals of these attributes are to provide a general overview of the database landscape, to encourage consistency and interoperability between resources; and to promote the use of semantic and syntactic standards. BioDBCore will make it easier for users to evaluate the scope and relevance of available resources. This new resource will increase the collective impact of the information present in biological databases. PMID:21205783

  16. Future Interoperability of Camp Protection Systems (FICAPS)

    NASA Astrophysics Data System (ADS)

    Caron, Sylvie; Gündisch, Rainer; Marchand, Alain; Stahl, Karl-Hermann

    2013-05-01

    The FICAPS Project has been established as a Project of the European Defence Agency, based on an initiative of Germany and France. The goal of this Project was to derive Guidelines which, when properly implemented in future developments, improve Camp Protection Systems (CPS) by enabling and improving interoperability between the Camp Protection Systems and equipment of the different Nations involved in multinational missions. These Guidelines shall allow for: real-time information exchange between equipment and systems of different suppliers and nations (even via SatCom); quick and easy replacement of equipment (even of different Nations) at run-time in the field by means of a plug-and-play capability, thus lowering operational and logistic costs and making the system highly available; and enhancement of system capabilities (open and modular systems) by adding new equipment with new capabilities (just plug in, with automatic adjustment of the Human Machine Interface, HMI) without costly and time-consuming validation and testing at system level (validation and testing can be done at equipment level). Four scenarios were identified to summarize the interoperability requirements from an operational viewpoint. To prove the definitions given in the Guideline Document, a French and a German demonstration system, based on existing national assets, were realized. Demonstrations showing the capabilities given by the defined interoperability requirements with respect to the operational scenarios were performed; these included remote control of a CPS by another CPS, remote sensor control (Electro-Optic/InfraRed, EO/IR), and remote effector control. This capability can be applied to extend the protection area or to protect distant infrastructural assets. The required interoperability functionality was shown successfully. Even though the focus of the FICAPS project was on camp protection, the solution found is also appropriate for other force protection and ISR (Intelligence, Surveillance, Reconnaissance) tasks, not only due to its flexibility but also due to the chosen interfacing.

  17. Fundamental Data Standards for Science Data System Interoperability and Data Correlation

    NASA Astrophysics Data System (ADS)

    Hughes, J. Steven; Gopala Krishna, Barla; Rye, Elizabeth; Crichton, Daniel

    The advent of the Web and languages such as XML have brought an explosion of online science data repositories and the promise of correlated data and interoperable systems. However, there have been relatively few successes in meeting the expectations of science users in the internet age. For example, a Google-like search for images of Mars will return many highly-derived and appropriately tagged images but largely ignore the majority of images in most online image repositories. Once data are retrieved, users are further frustrated by poor data descriptions, arcane formats, and badly organized ancillary information. A wealth of research indicates that shared information models are needed to enable system interoperability and data correlation. At a more fundamental level, however, data correlation and system interoperability depend on a relatively small number of shared data standards. A common data dictionary standard, for example, allows the controlled vocabulary used in a science repository to be shared with potential collaborators. Common data registry and product identification standards enable systems to efficiently find, locate, and retrieve data products and their metadata from remote repositories. Information content standards define categories of descriptive data that help make the data products scientifically useful to users who were not part of the original team that produced the data. The Planetary Data System (PDS) has a plan to move the PDS to a fully online, federated system. This plan addresses new demands on the system, including increasing data volume, numbers of missions, and complexity of missions. A key component of this plan is the upgrade of the PDS data standards. The adoption of the core PDS data standards by the International Planetary Data Alliance (IPDA) adds the element of international cooperation to the plan. This presentation will provide an overview of the fundamental data standards being adopted by the PDS that transcend science domains and that will help to meet the PDS's and IPDA's system interoperability and data correlation requirements.

  18. Operator Variability in Scan Positioning is a Major Component of HR-pQCT Precision Error and is Reduced by Standardized Training

    PubMed Central

    Bonaretti, Serena; Vilayphiou, Nicolas; Chan, Caroline Mai; Yu, Andrew; Nishiyama, Kyle; Liu, Danmei; Boutroy, Stephanie; Ghasem-Zadeh, Ali; Boyd, Steven K.; Chapurlat, Roland; McKay, Heather; Shane, Elizabeth; Bouxsein, Mary L.; Black, Dennis M.; Majumdar, Sharmila; Orwoll, Eric S.; Lang, Thomas F.; Khosla, Sundeep; Burghardt, Andrew J.

    2017-01-01

    Introduction HR-pQCT is increasingly used to assess bone quality, fracture risk and anti-fracture interventions. The contribution of the operator has not been adequately accounted for in measurement precision. Operators acquire a 2D projection (“scout view image”) and define the region to be scanned by positioning a “reference line” on a standard anatomical landmark. In this study, we (i) evaluated the contribution of positioning variability to in vivo measurement precision, (ii) measured intra- and inter-operator positioning variability, and (iii) tested whether custom training software led to superior reproducibility in new operators compared to experienced operators. Methods To evaluate the operator's contribution to in vivo measurement precision, we compared precision errors calculated in 64 co-registered and non-co-registered scan-rescan images. To quantify operator variability, we developed software that simulates the positioning process of the scanner’s software. Eight experienced operators positioned reference lines on scout view images designed to test intra- and inter-operator reproducibility. Finally, we developed modules for training and evaluation of reference line positioning. We enrolled 6 new operators to participate in a common training, followed by the same reproducibility experiments performed by the experienced group. Results In vivo precision errors were up to three-fold greater (Tt.BMD and Ct.Th) when variability in scan positioning was included. Inter-operator precision errors were significantly greater than short-term intra-operator precision (p<0.001). Newly trained operators achieved intra-operator reproducibility comparable to experienced operators, and lower inter-operator variability (p<0.001). Precision errors were significantly greater for the radius than for the tibia. Conclusion Operator reference line positioning contributes significantly to in vivo measurement precision and is significantly greater for multi-operator datasets. Inter-operator variability can be significantly reduced using a systematic training platform, now available online (http://webapps.radiology.ucsf.edu/refline/). PMID:27475931

  19. e-Infrastructures for Astronomy: An Integrated View

    NASA Astrophysics Data System (ADS)

    Pasian, F.; Longo, G.

    2010-12-01

    As in other disciplines, the capability of performing “Big Science” in astrophysics requires the availability of large facilities. In the field of ICT, computational resources (e.g. HPC) are important, but are far from being enough for the community: as a matter of fact, the whole set of e-infrastructures (network, computing nodes, data repositories, applications) needs to work in an interoperable way. This implies the development of common (or at least compatible) user interfaces to computing resources, transparent access to observations and numerical simulations through the Virtual Observatory, integrated data processing pipelines, data mining, and semantic web applications. Achieving this interoperability goal is a must for building a real “Knowledge Infrastructure” in the astrophysical domain. The emergence of new professional profiles (e.g. the “astro-informatician”) is also necessary to allow this conceptual schema to be properly defined and implemented.

  20. Cohort Selection and Management Application Leveraging Standards-based Semantic Interoperability and a Groovy DSL

    PubMed Central

    Bucur, Anca; van Leeuwen, Jasper; Chen, Njin-Zu; Claerhout, Brecht; de Schepper, Kristof; Perez-Rey, David; Paraiso-Medina, Sergio; Alonso-Calvo, Raul; Mehta, Keyur; Krykwinski, Cyril

    2016-01-01

    This paper describes a new Cohort Selection application implemented to streamline the definition phase of multi-centric clinical research in oncology. Our approach aims at both ease of use and precision in defining the selection filters that express the characteristics of the desired population. The application leverages our standards-based Semantic Interoperability Solution and a Groovy DSL to provide high expressiveness in the definition of filters and flexibility in their composition into complex selection graphs, including splits and merges. Widely adopted ontologies such as SNOMED-CT are used to represent the semantics of the data and to express concepts in the application filters, facilitating data sharing and collaboration on joint research questions in large communities of clinical users. The application supports patient data exploration and efficient collaboration in multi-site, heterogeneous and distributed data environments. PMID:27570644
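
    As a concrete illustration of the filter-composition idea in this record (the paper's actual DSL is written in Groovy; the Python sketch below and all names and concept codes in it are hypothetical):

        # Minimal sketch of composable, concept-based cohort filters with
        # "split" and "merge" nodes. Not the paper's implementation.
        from dataclasses import dataclass
        from typing import Callable

        @dataclass
        class Patient:
            codes: set   # SNOMED-CT concept IDs recorded for the patient
            age: int

        Filter = Callable[[Patient], bool]

        def has_concept(code: str) -> Filter:
            return lambda p: code in p.codes

        def age_at_least(years: int) -> Filter:
            return lambda p: p.age >= years

        def all_of(*fs: Filter) -> Filter:   # merge: every branch must hold
            return lambda p: all(f(p) for f in fs)

        def any_of(*fs: Filter) -> Filter:   # split: any branch may hold
            return lambda p: any(f(p) for f in fs)

        # Adults having either of two (illustrative) diagnosis codes.
        cohort_filter = all_of(age_at_least(18),
                               any_of(has_concept("C-1"), has_concept("C-2")))
        patients = [Patient({"C-1"}, 54), Patient(set(), 40)]
        print([p for p in patients if cohort_filter(p)])  # first patient only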

  1. Clinical information modeling processes for semantic interoperability of electronic health records: systematic review and inductive analysis.

    PubMed

    Moreno-Conde, Alberto; Moner, David; Cruz, Wellington Dimas da; Santos, Marcelo R; Maldonado, José Alberto; Robles, Montserrat; Kalra, Dipak

    2015-07-01

    This systematic review aims to identify and compare the existing processes and methodologies that have been published in the literature for defining clinical information models (CIMs) that support the semantic interoperability of electronic health record (EHR) systems. Following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) methodology, the authors reviewed papers published between 2000 and 2013 that covered the semantic interoperability of EHRs, found by searching the PubMed, IEEE Xplore, and ScienceDirect databases. Additionally, after selection of a final group of articles, an inductive content analysis was done to summarize the steps and methodologies followed in order to build the CIMs described in those articles. Three hundred and seventy-eight articles were screened and thirty-six were selected for full review. The articles selected for full review were analyzed to extract relevant information for the analysis and characterized according to the steps the authors had followed for clinical information modeling. Most of the reviewed papers lack a detailed description of the modeling methodologies used to create CIMs. A representative example is the lack of description related to the definition of terminology bindings and the publication of the generated models. However, this systematic review confirms that most clinical information modeling activities follow very similar steps for the definition of CIMs. Having a robust and shared methodology could improve their correctness, reliability, and quality. Independently of implementation technologies and standards, it is possible to find common patterns in methods for developing CIMs, suggesting the viability of defining a unified good-practice methodology to be used by any clinical information modeler.

  2. Tool and data interoperability in the SSE system

    NASA Technical Reports Server (NTRS)

    Shotton, Chuck

    1988-01-01

    Information is given in viewgraph form on tool and data interoperability in the Software Support Environment (SSE). Topics include industry problems, SSE system interoperability issues, SSE solutions to tool and data interoperability, and the attainment of heterogeneous tool/data interoperability.

  3. Transformation of standardized clinical models based on OWL technologies: from CEM to OpenEHR archetypes

    PubMed Central

    Legaz-García, María del Carmen; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás; Chute, Christopher G; Tao, Cui

    2015-01-01

    Introduction The semantic interoperability of electronic healthcare record (EHR) systems is a major challenge in the medical informatics area. International initiatives pursue the use of semantically interoperable clinical models, and ontologies have frequently been used in semantic interoperability efforts. The objective of this paper is to propose a generic, ontology-based, flexible approach for supporting the automatic transformation of clinical models, which is illustrated for the transformation of Clinical Element Models (CEMs) into openEHR archetypes. Methods Our transformation method exploits the fact that the information models of the most relevant EHR specifications are available in the Web Ontology Language (OWL). The transformation approach is based on defining mappings between those ontological structures. We propose a way in which CEM entities can be transformed into openEHR by using transformation templates and OWL as common representation formalism. The transformation architecture exploits the reasoning and inferencing capabilities of OWL technologies. Results We have devised a generic, flexible approach for the transformation of clinical models, implemented for the unidirectional transformation from CEM to openEHR, a series of reusable transformation templates, a proof-of-concept implementation, and a set of openEHR archetypes that validate the methodological approach. Conclusions We have been able to transform CEM into archetypes in an automatic, flexible, reusable transformation approach that could be extended to other clinical model specifications. We exploit the potential of OWL technologies for supporting the transformation process. We believe that our approach could be useful for international efforts in the area of semantic interoperability of EHR systems. PMID:25670753
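
    The transformation described above hinges on having both information models in OWL and defining mappings between their ontological structures. A minimal sketch of that step, using rdflib and entirely hypothetical namespace URIs and class names (the authors' actual templates and ontologies are richer):

        # Express a CEM class, an openEHR class, and an explicit mapping
        # between them as OWL triples; transformation templates can then be
        # driven by such mappings. All URIs are placeholders.
        from rdflib import Graph, Namespace
        from rdflib.namespace import OWL, RDF

        CEM = Namespace("http://example.org/cem#")
        EHR = Namespace("http://example.org/openehr#")

        g = Graph()
        g.bind("cem", CEM)
        g.bind("ehr", EHR)

        g.add((CEM.SystolicBloodPressure, RDF.type, OWL.Class))
        g.add((EHR.BloodPressureObservation, RDF.type, OWL.Class))
        g.add((CEM.SystolicBloodPressure, OWL.equivalentClass,
               EHR.BloodPressureObservation))

        print(g.serialize(format="turtle"))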

  4. Towards technical interoperability in telemedicine.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Craft, Richard Layne, II

    2004-05-01

    For telemedicine to realize the vision of anywhere, anytime access to care, the question of how to create a fully interoperable technical infrastructure must be addressed. After briefly discussing how 'technical interoperability' compares with other types of interoperability being addressed in the telemedicine community today, this paper describes reasons for pursuing technical interoperability, presents a proposed framework for realizing technical interoperability, identifies key issues that will need to be addressed if technical interoperability is to be achieved, and suggests a course of action that the telemedicine community might follow to accomplish this goal.

  5. The interoperability force in the ERP field

    NASA Astrophysics Data System (ADS)

    Boza, Andrés; Cuenca, Llanos; Poler, Raúl; Michaelides, Zenon

    2015-04-01

    Enterprise resource planning (ERP) systems participate in interoperability projects, and this participation sometimes leads to new proposals for the ERP field. The aim of this paper is to identify the role that interoperability plays in the evolution of ERP systems. To this end, ERP systems were first identified within interoperability frameworks. Second, initiatives in the ERP field driven by interoperability requirements were identified from two perspectives: technological and business. The ERP field is evolving from classical ERP systems acting as information system integrators to a new generation of fully interoperable ERP. Interoperability is changing the way business is run, and ERP systems are changing to adapt to the current stream of interoperability.

  6. Interoperable Architecture for Command and Control

    DTIC Science & Technology

    2014-06-01

    defined objective. Elements can include other systems, people, processes, technology and other support elements (Adapted from [9]). Enterprise System...An enterprise is an intentionally created entity of human endeavour with a certain purpose. An enterprise could be considered a type of system [7]. In...this case the enterprise is a Defence Enterprise System required by government as a tool to maintain national sovereignty. Capability

  7. XML schemas for common bioinformatic data types and their application in workflow systems.

    PubMed

    Seibel, Philipp N; Krüger, Jan; Hartmeier, Sven; Schwarzer, Knut; Löwenthal, Kai; Mersch, Henning; Dandekar, Thomas; Giegerich, Robert

    2006-11-06

    Today, there is a growing need in bioinformatics to combine available software tools into chains, thus building complex applications from existing single-task tools. To create such workflows, the tools involved have to be able to work with each other's data; therefore, a common set of well-defined data formats is needed. Unfortunately, current bioinformatic tools use a great variety of heterogeneous formats. Acknowledging the need for common formats, the Helmholtz Open BioInformatics Technology network (HOBIT) identified several basic data types used in bioinformatics, developed appropriate format descriptions formally defined by XML schemas, and incorporated them in a Java library (BioDOM). These schemas currently cover sequence, sequence alignment, RNA secondary structure and RNA secondary structure alignment formats in a form that is independent of any specific program, thus enabling seamless interoperation of different tools. All XML formats are available at http://bioschemas.sourceforge.net, and the BioDOM library can be obtained at http://biodom.sourceforge.net. The HOBIT XML schemas and the BioDOM library simplify adding XML support to newly created and existing bioinformatic tools, enabling these tools to interoperate seamlessly in workflow scenarios.
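
    To make the idea of a program-independent XML sequence format concrete, here is a small sketch; the element names are invented for illustration (the normative schemas are the ones published at http://bioschemas.sourceforge.net):

        # Build and print a toy XML sequence record that any workflow tool
        # could parse, instead of a tool-specific flat file.
        import xml.etree.ElementTree as ET

        root = ET.Element("sequenceSet")
        seq = ET.SubElement(root, "sequence", id="example_1", type="dna")
        ET.SubElement(seq, "name").text = "toy fragment"
        ET.SubElement(seq, "residues").text = "ACGTACGTTAGC"

        print(ET.tostring(root, encoding="unicode"))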

  8. Architecting the Communication and Navigation Networks for NASA's Space Exploration Systems

    NASA Technical Reports Server (NTRS)

    Bhassin, Kul B.; Putt, Chuck; Hayden, Jeffrey; Tseng, Shirley; Biswas, Abi; Kennedy, Brian; Jennings, Esther H.; Miller, Ron A.; Hudiburg, John; Miller, Dave

    2007-01-01

    NASA is planning a series of short- and long-duration human and robotic missions to explore the Moon and then Mars. A key objective of the missions is to grow, through a series of launches, a system-of-systems communication, navigation, and timing infrastructure at minimum cost, while providing a network-centric infrastructure that maximizes the exploration capabilities and science return. There is a strong need to use architecting processes in the mission pre-formulation stage to describe the systems, interfaces, and interoperability needed to implement multiple space communication systems that are deployed over time, yet support interoperability with each deployment phase and with 20 years of legacy systems. In this paper we present a process for defining the architecture of the communications, navigation, and networks needed to support future space explorers with an adaptable and evolvable network-centric space exploration infrastructure. The process steps presented are: 1) architecture decomposition, 2) defining mission systems and their interfaces, 3) developing the communication, navigation, and networking architecture, and 4) integrating systems, operational and technical views and viewpoints. We demonstrate the process through the architecture development of the communication network for upcoming NASA space exploration missions.

  9. Application of Coalition Battle Management Language (C-BML) and C-BML Services to Live, Virtual, and Constructive (LVC) Simulation Environments

    DTIC Science & Technology

    2011-12-01

    Task Based Approach to Planning.” Paper 08F- SIW -033. In Proceed- ings of the Fall Simulation Interoperability Workshop. Simulation Interoperability...Paper 06F- SIW -003. In Proceed- 2597 Blais ings of the Fall Simulation Interoperability Workshop. Simulation Interoperability Standards Organi...MSDL).” Paper 10S- SIW -003. In Proceedings of the Spring Simulation Interoperability Workshop. Simulation Interoperability Standards Organization

  10. The Semantic Management of Environmental Resources within the Interoperable Context of the EuroGEOSS: Alignment of GEMET and the GEOSS SBAs

    NASA Astrophysics Data System (ADS)

    Cialone, Claudia; Stock, Kristin

    2010-05-01

    EuroGEOSS is a European Commission funded project. It aims at improving scientific understanding of the complex mechanisms which drive changes affecting our planet, identifying and establishing interoperable arrangements between environmental information systems. These systems would be sustained and operated by organizations with a clear mandate and resources, and rendered available following the specifications of already existing frameworks such as GEOSS (the Global Earth Observation System of Systems) and INSPIRE (the Infrastructure for Spatial Information in the European Community). The EuroGEOSS project's infrastructure focuses on three thematic areas: forestry, drought and biodiversity. One of the important activities in the project is the retrieval, parsing and harmonization of the large amount of heterogeneous environmental data available at local, regional and global levels between these strategic areas. The challenge is to render it semantically and technically interoperable in a simple way. An initial step in achieving this semantic and technical interoperability involves the selection of appropriate classification schemes (for example, thesauri, ontologies and controlled vocabularies) to describe the resources in the EuroGEOSS framework. These classifications become a crucial part of the interoperable framework scaffolding because they allow data providers to describe their resources and thus support resource discovery, execution and orchestration of varying levels of complexity. However, at present, given the diverse range of environmental thesauri, controlled vocabularies and ontologies and the large number of resources provided by project participants, the selection of appropriate classification schemes involves a number of considerations. First of all, there is the semantic difficulty of selecting classification schemes that contain concepts relevant to each thematic area. Secondly, EuroGEOSS is intended to accommodate a number of existing environmental projects (for example, GEOSS and INSPIRE). This requirement imposes constraints on the selection. Thirdly, the selected classification scheme or group of schemes (if more than one) must be capable of alignment (establishing different kinds of mappings between concepts, hence preserving the original knowledge schemes intact) or merging (the creation of another unique ontology from the original ontological sources) (Gómez-Pérez et al., 2004). Last but not least, there is the issue of including multilingual schemes that are based on free, open standards (non-proprietary). Using these selection criteria, we aim to support open and convenient data discovery and exchange for users who speak different languages (particularly the European ones, given the broad scope of EuroGEOSS). In order to support the project, we have developed a solution that employs two classification schemes: the Societal Benefit Areas (SBAs), the upper-level environmental categorization developed for the GEOSS project, and the GEneral Multilingual Environmental Thesaurus (GEMET), a general environmental thesaurus whose conceptual structure has already been integrated with the spatial data themes proposed by the INSPIRE project. The latter provides the spatial data keywords relevant to the INSPIRE Directive (JRC, 2008). In this way, we provide users with a basic set of concepts to support resource description and discovery in the thematic areas while supporting the requirements of INSPIRE and GEOSS.
Furthermore, the use of only two classification schemes, together with the fact that the SBAs are very general categories while GEMET includes much more detailed, yet still top-level, concepts, makes alignment an achievable task. Alignment was selected over merging because it leaves the existing classification schemes intact and requires only the simple activity of defining mappings from GEMET to the SBAs. In order to accomplish this task we are developing a simple, automated, open-source application to assist thematic experts in defining the mappings between concepts in the two classification schemes. The application will then generate SKOS mappings (exactMatch, closeMatch, broadMatch, narrowMatch, relatedMatch) based on thematic experts' selections linking the concepts in GEMET with the SBAs (including both the general Societal Benefit Areas and their subcategories). Once these mappings are defined and the SKOS files generated, resource providers will be able to select concepts from either GEMET or the SBAs (or a mixture) to describe their resources, and discovery approaches will support selection of concepts from either classification scheme, also returning results classified using the other scheme. While the focus of our work has been on the SBAs and GEMET, we also plan to provide a method for resource providers to further extend the semantic infrastructure by defining alignments to new classification schemes if these are required to support particular specialized thematic areas that are not covered by GEMET. In this way, the approach is flexible and suited to the general scope of EuroGEOSS, allowing specialists to increase at will the level of semantic quality and specificity of data in the initial infrastructural skeleton of the project. References: Joint Research Centre (JRC), 2008. INSPIRE Metadata Editor User Guide. Gómez-Pérez, A., Fernández-López, M., Corcho, O. Ontological Engineering: With Examples from the Areas of Knowledge Management, e-Commerce and the Semantic Web. Springer: London, 2004.
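
    The SKOS output described in this record is easy to picture. A minimal sketch of such mapping triples, assuming placeholder concept URIs rather than the real GEMET and SBA identifiers:

        # Emit expert-selected SKOS mappings from GEMET concepts to GEOSS
        # Societal Benefit Areas. URIs below are placeholders.
        from rdflib import Graph, Namespace
        from rdflib.namespace import SKOS

        GEMET = Namespace("http://example.org/gemet/concept/")
        SBA = Namespace("http://example.org/geoss/sba/")

        g = Graph()
        g.bind("skos", SKOS)

        g.add((GEMET["biodiversity"], SKOS.exactMatch, SBA["Biodiversity"]))
        # "drought" is narrower than the Disasters SBA, so broadMatch points
        # from the narrower concept to its broader match.
        g.add((GEMET["drought"], SKOS.broadMatch, SBA["Disasters"]))
        g.add((GEMET["forestry"], SKOS.relatedMatch, SBA["Ecosystems"]))

        print(g.serialize(format="turtle"))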

  11. A Web Service Protocol Realizing Interoperable Internet of Things Tasking Capability

    PubMed Central

    Huang, Chih-Yuan; Wu, Cheng-Hung

    2016-01-01

    The Internet of Things (IoT) is an infrastructure that interconnects uniquely-identifiable devices using the Internet. By interconnecting everyday appliances, various monitoring and physical mashup applications can be constructed to improve people's daily lives. In general, IoT devices provide two main capabilities: sensing and tasking. While the sensing capability is similar to the World-Wide Sensor Web, this research focuses on the tasking capability. Currently, however, IoT devices created by different manufacturers follow different proprietary protocols and are locked into many closed ecosystems. This heterogeneity impedes the interconnection between IoT devices and undermines the potential of the IoT. To address this issue, this research proposes an interoperable solution called the tasking capability description, which allows users to control different IoT devices through a uniform web service interface. This paper demonstrates the contribution of the proposed solution by interconnecting different IoT devices for different applications. In addition, the proposed solution is integrated with the OGC SensorThings API standard, a Web service standard defined for the IoT sensing capability. Consequently, the Extended SensorThings API can realize both IoT sensing and tasking capabilities in an integrated and interoperable manner. PMID:27589759
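
    As a hedged illustration of the uniform web service interface this record describes, a task might be submitted to a SensorThings-style endpoint roughly as follows; the service URL, entity id, and parameter names are placeholders, and the OGC specification remains the normative reference for the payload shape:

        # POST a tasking request to a hypothetical SensorThings-style service.
        import requests

        BASE = "http://example.org/v1.0"  # placeholder service root

        task = {
            "taskingParameters": {"power": "on"},  # device-specific parameters
            "TaskingCapability": {"@iot.id": 1},   # capability for the device
        }

        resp = requests.post(f"{BASE}/Tasks", json=task, timeout=10)
        resp.raise_for_status()
        print("task accepted:", resp.headers.get("Location"))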

  12. Promoting A-Priori Interoperability of HLA-Based Simulations in the Space Domain: The SISO Space Reference FOM Initiative

    NASA Technical Reports Server (NTRS)

    Moller, Bjorn; Garro, Alfredo; Falcone, Alberto; Crues, Edwin Z.; Dexter, Daniel E.

    2016-01-01

    Distributed and real-time simulation plays a key role in the Space domain, being exploited for mission and system analysis and engineering as well as for crew training and operational support. One of the most popular standards is the IEEE 1516-2010 Standard for Modeling and Simulation (M&S) High Level Architecture (HLA). HLA supports the implementation of distributed simulations (called Federations) in which a set of simulation entities (called Federates) interact using a Run-Time Infrastructure (RTI). In a given Federation, a Federate can publish and/or subscribe to objects and interactions on the RTI only in accordance with their structures as defined in a FOM (Federation Object Model). Currently, the Space domain is characterized by a set of incompatible FOMs that, although they meet the specific needs of different organizations and projects, increase the long-term cost of interoperability. In this context, the availability of a reference FOM for the Space domain will enable the development of interoperable HLA-based simulators for related joint projects and collaborations among worldwide organizations involved in the Space domain (e.g. NASA, ESA, Roscosmos, and JAXA). The paper presents a first set of results achieved by a SISO standardization effort that aims at providing a Space Reference FOM for international collaboration on Space system simulations.

  13. A step-by-step methodology for enterprise interoperability projects

    NASA Astrophysics Data System (ADS)

    Chalmeta, Ricardo; Pazos, Verónica

    2015-05-01

    Enterprise interoperability is one of the key factors for enhancing enterprise competitiveness. Achieving enterprise interoperability is an extremely complex process which involves different technological, human and organisational elements. In this paper we present a framework to support enterprise interoperability. The framework has been developed taking into account the three domains of interoperability: Enterprise Modelling, Architecture and Platform, and Ontologies. The main novelty of the framework in comparison to existing ones is that it includes a step-by-step methodology explaining how to carry out an enterprise interoperability project that takes into account different interoperability views, such as business, process, human resources, technology, knowledge and semantics.

  14. OAI and NASA's Scientific and Technical Information

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Rocker, JoAnne; Harrison, Terry L.

    2002-01-01

    The Open Archives Initiative Protocol for Metadata Harvesting (OAI-PMH) is an evolving protocol and philosophy regarding interoperability for digital libraries (DLs). Previously, "distributed searching" models were popular for DL interoperability. However, experience has shown distributed searching systems across large numbers of DLs to be difficult to maintain in an Internet environment. The OAI-PMH is a move away from distributed searching, focusing on the arguably simpler model of "metadata harvesting". We detail NASA's involvement in defining and testing the OAI-PMH and experience to date with adapting existing NASA distributed-searching DLs (such as the NASA Technical Report Server) to use the OAI-PMH and metadata harvesting. We discuss some of the entirely new DL projects that the OAI-PMH has made possible, such as the Technical Report Interchange project. We explain the strategic importance of the OAI-PMH to the mission of NASA's Scientific and Technical Information Program.
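
    The harvesting model is simple enough to sketch: a harvester issues one of six protocol verbs and parses the XML response. A minimal example against a hypothetical repository URL (the verb, metadataPrefix parameter, and namespaces are standard OAI-PMH):

        # Fetch one page of Dublin Core records via OAI-PMH ListRecords.
        import requests
        import xml.etree.ElementTree as ET

        BASE = "http://example.org/oai"  # placeholder repository base URL
        NS = {
            "oai": "http://www.openarchives.org/OAI/2.0/",
            "dc": "http://purl.org/dc/elements/1.1/",
        }

        resp = requests.get(BASE, params={"verb": "ListRecords",
                                          "metadataPrefix": "oai_dc"},
                            timeout=30)
        root = ET.fromstring(resp.content)

        for record in root.iterfind(".//oai:record", NS):
            title = record.find(".//dc:title", NS)
            print(title.text if title is not None else "(no title)")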

  15. Experience with abstract notation one

    NASA Technical Reports Server (NTRS)

    Harvey, James D.; Weaver, Alfred C.

    1990-01-01

    The development of computer science has produced a vast number of machine architectures, programming languages, and compiler technologies. The cross product of these three characteristics defines the spectrum of previous and present data representation methodologies. With regard to computer networks, the uniqueness of these methodologies presents an obstacle when disparate host environments are to be interconnected. Interoperability within a heterogeneous network relies upon the establishment of data representation commonality. The International Organization for Standardization (ISO) is currently developing the Abstract Syntax Notation One (ASN.1) standard and the Basic Encoding Rules (BER) standard, which collectively address this problem. When used within the presentation layer of the Open Systems Interconnection reference model, these two standards provide the data representation commonality required to facilitate interoperability. The details of a compiler that was built to automate the use of ASN.1 and BER are described. From this experience, insights into both standards are given and potential problems relating to this development effort are discussed.
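
    The essence of BER is that every value travels as a tag-length-contents triple, independent of any machine's native representation. A toy encoder for small non-negative INTEGERs (short-form lengths only) makes this concrete:

        # BER/DER encoding of a non-negative INTEGER: tag 0x02, then length,
        # then big-endian two's-complement contents.
        def ber_encode_integer(value: int) -> bytes:
            if value < 0:
                raise ValueError("sketch handles non-negative values only")
            # +8 instead of +7 keeps a leading 0x00 byte when the high bit is
            # set, so the value stays non-negative in two's complement.
            content = value.to_bytes(max(1, (value.bit_length() + 8) // 8),
                                     "big")
            return bytes([0x02, len(content)]) + content

        assert ber_encode_integer(5) == b"\x02\x01\x05"
        assert ber_encode_integer(128) == b"\x02\x02\x00\x80"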

  16. CAD Services: an Industry Standard Interface for Mechanical CAD Interoperability

    NASA Technical Reports Server (NTRS)

    Claus, Russell; Weitzer, Ilan

    2002-01-01

    Most organizations seek to design and develop new products in increasingly shorter time periods. At the same time, increased performance demands require a team-based multidisciplinary design process that may span several organizations. One approach to meeting these demands is 'Geometry Centric' design. In this approach, design engineers coordinate their efforts through one unified representation of the design that is usually captured in a CAD system. Standards-based interfaces are critical to provide uniform, simple, distributed services that enable the 'Geometry Centric' design approach. This paper describes an industry-wide effort, under the Object Management Group's (OMG) Manufacturing Domain Task Force, to define interfaces that enable the interoperability of CAD, Computer Aided Manufacturing (CAM), and Computer Aided Engineering (CAE) tools. This critical link enabling 'Geometry Centric' design is called CAD Services V1.0. This paper discusses the features of this standard and its proposed application.

  17. Digital document imaging systems: An overview and guide

    NASA Technical Reports Server (NTRS)

    1990-01-01

    This is an aid to NASA managers in planning the selection of a Digital Document Imaging System (DDIS) as a possible solution for document information processing and storage. Intended to serve as a manager's guide, this document contains basic information on digital imaging systems, technology, equipment standards, issues of interoperability and interconnectivity, and issues related to selecting appropriate imaging equipment based upon well-defined needs.

  18. Federating Cyber and Physical Models for Event-Driven Situational Awareness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stephan, Eric G.; Pawlowski, Ronald A.; Sridhar, Siddharth

    The purpose of this paper is to describe a novel method for improving the interoperability of electric power system monitoring and control software applications. This method employs the concept of federation, defined here as the use of existing models that represent aspects of a system in specific domains (such as the physical and cyber security domains) and the building of interfaces to link all of the domain models.

  19. Content analysis of physical examination templates in electronic health records using SNOMED CT.

    PubMed

    Gøeg, Kirstine Rosenbeck; Chen, Rong; Højen, Anne Randorff; Elberg, Pia

    2014-10-01

    Most electronic health record (EHR) systems are built on proprietary information models and terminology, which makes achieving semantic interoperability a challenge. Solving interoperability problems requires well-defined standards. In contrast, the need to support clinical work practice requires local customization of EHR systems. Consequently, contrasting goals may be evident in EHR template design, because customization means that local EHR organizations can define their own templates, whereas standardization implies consensus at some level. To explore the complexity of balancing these two goals, this study analyzes the differences and similarities between templates in use today. A similarity analysis was developed on the basis of SNOMED CT. The analysis was performed on four physical examination templates from Denmark and Sweden. The semantic relationships in SNOMED CT were used to quantify similarities and differences. Moreover, the analysis used these identified similarities to investigate the common content of a physical examination template. The analysis showed that there were both similarities and differences in physical examination templates, and the size of the templates varied from 18 to 49 fields. In the SNOMED CT analysis, exact matches and terminology similarities were represented in all template pairs. The number of exact matches ranged from 7 to 24. Moreover, the proportion of unrelated fields varied widely, from 1/18 to 22/35. Cross-country comparisons tended to have more unrelated content than within-country comparisons. On the basis of the identified similarities, it was possible to define the common content of a physical examination. Nevertheless, a complete view of the physical examination required the inclusion of both exact matches and terminology similarities. This study revealed that a core set of items representing the physical examination templates can be generated when the analysis takes into account not only exact matches but also terminology similarities. This core set of items could be a starting point for standardization and semantic interoperability. However, both unmatched terms and terminology-matched terms pose a challenge for standardization. Future work will include using local templates as a point of departure in standardization, to see if local requirements can be maintained in a standardized framework.
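
    The comparison logic in this record can be sketched in a few lines: exact matches are shared concept codes, while terminology similarities can be approximated by related concepts in the hierarchy. The codes and the toy is-a fragment below are placeholders (a real analysis would traverse SNOMED CT itself):

        # Compare two physical-examination templates by SNOMED-style codes.
        toy_parent = {  # child concept -> parent concept (hypothetical)
            "A1": "EXAM", "A2": "EXAM", "B1": "CARDIO", "B2": "CARDIO",
        }

        template_dk = {"A1", "B1", "C9"}
        template_se = {"A1", "B2", "D4"}

        exact = template_dk & template_se
        related = {
            (a, b)
            for a in template_dk - exact
            for b in template_se - exact
            if toy_parent.get(a) is not None
            and toy_parent.get(a) == toy_parent.get(b)
        }

        print("exact matches:", sorted(exact))        # ['A1']
        print("terminology similarities:", related)   # {('B1', 'B2')}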

  20. Constellation's Command, Control, Communications and Information (C3I) Architecture

    NASA Technical Reports Server (NTRS)

    Breidenthal, Julian C.

    2007-01-01

    Operations concepts are highly effective for: 1) developing consensus; 2) discovering stakeholder needs, goals, and objectives; 3) defining the behavior of system components (especially emergent behaviors). An interoperability standard can provide an excellent lever to define the capabilities needed for system evolution. Two categories of architectures are needed in a program of this size: 1) Generic - needed for planning, design and construction standards; 2) Specific - needed for detailed requirement allocations and interface specs. A wide variety of architectural views are needed to address stakeholder concerns, including: 1) physical; 2) information (structure, flow, evolution); 3) processes (design, manufacturing, operations); 4) performance; 5) risk.

  1. Open system environment procurement

    NASA Technical Reports Server (NTRS)

    Fisher, Gary

    1994-01-01

    Relationships between the request for procurement (RFP) process and open system environment (OSE) standards are described. A guide was prepared to help Federal agency personnel overcome problems in writing an adequate statement of work and developing realistic evaluation criteria when transitioning to an OSE. The guide contains appropriate decision points and transition strategies for developing applications that are affordable, scalable, and interoperable across a broad range of computing environments. While useful, the guide does not eliminate the requirement that agencies possess in-depth expertise in software development, communications, and database technology in order to evaluate open systems.

  2. Transformation of standardized clinical models based on OWL technologies: from CEM to OpenEHR archetypes.

    PubMed

    Legaz-García, María del Carmen; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás; Chute, Christopher G; Tao, Cui

    2015-05-01

    The semantic interoperability of electronic healthcare record (EHR) systems is a major challenge in the medical informatics area. International initiatives pursue the use of semantically interoperable clinical models, and ontologies have frequently been used in semantic interoperability efforts. The objective of this paper is to propose a generic, ontology-based, flexible approach for supporting the automatic transformation of clinical models, which is illustrated for the transformation of Clinical Element Models (CEMs) into openEHR archetypes. Our transformation method exploits the fact that the information models of the most relevant EHR specifications are available in the Web Ontology Language (OWL). The transformation approach is based on defining mappings between those ontological structures. We propose a way in which CEM entities can be transformed into openEHR by using transformation templates and OWL as common representation formalism. The transformation architecture exploits the reasoning and inferencing capabilities of OWL technologies. We have devised a generic, flexible approach for the transformation of clinical models, implemented for the unidirectional transformation from CEM to openEHR, a series of reusable transformation templates, a proof-of-concept implementation, and a set of openEHR archetypes that validate the methodological approach. We have been able to transform CEM into archetypes in an automatic, flexible, reusable transformation approach that could be extended to other clinical model specifications. We exploit the potential of OWL technologies for supporting the transformation process. We believe that our approach could be useful for international efforts in the area of semantic interoperability of EHR systems.

  3. Design and Implementation of e-Health System Based on Semantic Sensor Network Using IETF YANG.

    PubMed

    Jin, Wenquan; Kim, Do Hyeun

    2018-02-20

    Today, healthcare services can be delivered effectively to patients anytime and anywhere using e-Health systems. e-Health systems are developed through Information and Communication Technologies (ICT) that involve sensors, mobiles, and web-based applications for the delivery of healthcare services and information. Remote healthcare is an important purpose of the e-Health system. Usually, the e-Health system includes heterogeneous sensors from diverse manufacturers producing data in different formats. Device interoperability and data normalization are challenging tasks that need research attention. Several solutions are proposed in the literature based on manual interpretation through explicit programming. However, programmatically implementing the interpretation of the data sender and data receiver in the e-Health system for data transmission is counterproductive, as modification will be required for each new device added into the system. In this paper, an e-Health system with a Semantic Sensor Network (SSN) is proposed to address the device interoperability issue. In the proposed system, we have used IETF YANG for modeling the semantic e-Health data to represent the information of e-Health sensors. This modeling scheme helps in provisioning semantic interoperability between devices and expressing the sensing data in a user-friendly manner. For this purpose, we have developed an ontology for e-Health data that supports different styles of data formats. The ontology is defined in YANG for provisioning semantic interpretation of sensing data in the system by constructing meta-models of e-Health sensors. The proposed approach assists in the auto-configuration of e-Health sensors and querying the sensor network with semantic interoperability support for the e-Health system.

  4. Design and Implementation of e-Health System Based on Semantic Sensor Network Using IETF YANG

    PubMed Central

    Jin, Wenquan; Kim, Do Hyeun

    2018-01-01

    Today, healthcare services can be delivered effectively to patients anytime and anywhere using e-Health systems. e-Health systems are developed through Information and Communication Technologies (ICT) that involve sensors, mobiles, and web-based applications for the delivery of healthcare services and information. Remote healthcare is an important purpose of the e-Health system. Usually, the e-Health system includes heterogeneous sensors from diverse manufacturers producing data in different formats. Device interoperability and data normalization are challenging tasks that need research attention. Several solutions are proposed in the literature based on manual interpretation through explicit programming. However, programmatically implementing the interpretation of the data sender and data receiver in the e-Health system for data transmission is counterproductive, as modification will be required for each new device added into the system. In this paper, an e-Health system with a Semantic Sensor Network (SSN) is proposed to address the device interoperability issue. In the proposed system, we have used IETF YANG for modeling the semantic e-Health data to represent the information of e-Health sensors. This modeling scheme helps in provisioning semantic interoperability between devices and expressing the sensing data in a user-friendly manner. For this purpose, we have developed an ontology for e-Health data that supports different styles of data formats. The ontology is defined in YANG for provisioning semantic interpretation of sensing data in the system by constructing meta-models of e-Health sensors. The proposed approach assists in the auto-configuration of e-Health sensors and querying the sensor network with semantic interoperability support for the e-Health system. PMID:29461493

  5. Data interoperability software solution for emergency reaction in the Europe Union

    NASA Astrophysics Data System (ADS)

    Casado, R.; Rubiera, E.; Sacristan, M.; Schütte, F.; Peters, R.

    2015-07-01

    Emergency management becomes more challenging in international crisis episodes because of cultural, semantic and linguistic differences between all stakeholders, especially first responders. Misunderstandings between first responders make decision-making slower and more difficult. However, the spread and development of networks and IT-based emergency management systems (EMSs) have improved emergency responses, which have become more coordinated. Despite improvements made in recent years, EMSs still have not solved the problems related to cultural, semantic and linguistic differences which are the real cause of slower decision-making. In addition, from a technical perspective, the consolidation of current EMSs and the different formats used to exchange information pose another problem to be solved in any solution proposed for information interoperability between heterogeneous EMSs in different contexts. To overcome these problems, we present a software solution based on semantic and mediation technologies. EMERGency ELements (EMERGEL) (Fundacion CTIC and AntwortING Ingenieurbüro PartG, 2013), a common and modular ontology shared by all the stakeholders, has been defined. It offers the best solution to gather all stakeholders' knowledge in a unique and flexible data model, taking into account different countries' cultural and linguistic issues. To deal with the diversity of data protocols and formats, we have designed a service-oriented architecture for data interoperability (named DISASTER: Data Interoperability Solution At STakeholders Emergency Reaction), providing a flexible, extensible solution to the mediation issues. Web services have been adopted as the specific technology to implement this paradigm, as they have the most significant academic and industrial visibility and attraction. The contributions of this work have been validated through the design and development of a realistic cross-border prototype scenario, actively involving both emergency managers and first responders: a fire at the Netherlands-Germany border.

  6. The National Opportunity for Interoperability and its Benefits for a Reliable, Robust, and Future Grid Realized Through Buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None, None

    Today, increasing numbers of intermittent generation sources (e.g., wind and photovoltaic) and new mobile intermittent loads (e.g., electric vehicles) can significantly affect traditional utility business practices and operations. At the same time, a growing number of technologies and devices, from appliances to lighting systems, are being deployed at consumer premises that have more sophisticated controls and information that remain underused for anything beyond basic building equipment operations. The intersection of these two drivers is an untapped opportunity and underused resource that, if appropriately configured and realized in open standards, can provide significant energy efficiency and commensurate savings on utility bills, enhanced and lower-cost reliability to utilities, and national economic benefits in the creation of new markets, sectors, and businesses being fueled by the seamless coordination of energy and information through device and technology interoperability. Or, as the Quadrennial Energy Review puts it, “A plethora of both consumer-level and grid-level devices are either in the market, under development, or at the conceptual stage. When tied together through the information technology that is increasingly being deployed on electric utilities’ distribution grids, they can be an important enabling part of the emerging grid of the future. However, what is missing is the ability for all of these devices to coordinate and communicate their operations with the grid, and among themselves, in a common language — an open standard.” In this paper, we define interoperability as the ability to exchange actionable information between two or more systems within a home or building, or across and within organizational boundaries. Interoperability relies on the shared meaning of the exchanged information, with agreed-upon expectations and consequences, for the response to the information exchange.

  7. Designing learning management system interoperability in semantic web

    NASA Astrophysics Data System (ADS)

    Anistyasari, Y.; Sarno, R.; Rochmawati, N.

    2018-01-01

    The extensive adoption of learning management systems (LMS) has put the focus on the interoperability requirement. Interoperability is the ability of different computer systems, applications or services to communicate, share and exchange data, information, and knowledge in a precise, effective and consistent way. Semantic web technology and the use of ontologies are able to provide the required computational semantics and interoperability for the automation of tasks in an LMS. The purpose of this study is to design learning management system interoperability in the Semantic Web, which so far has not been investigated in depth. Moodle is utilized to design the interoperability: several database tables of Moodle are enhanced and some features are added. Semantic web interoperability is provided by exploiting an ontology of the content materials. The ontology is further utilized as a search tool to match users' queries with available courses. It is concluded that LMS interoperability in the Semantic Web is feasible.

  8. Medical Device Plug-and-Play Interoperability Standards and Technology Leadership

    DTIC Science & Technology

    2017-10-01

    Award Number: W81XWH-09-1-0705 TITLE: “Medical Device Plug-and-Play Interoperability Standards and Technology Leadership” PRINCIPAL INVESTIGATOR...Sept 2016 – 20 Sept 2017 4. TITLE AND SUBTITLE “Medical Device Plug-and-Play Interoperability 5a. CONTRACT NUMBER Standards and Technology ...efficiency through interoperable medical technologies . We played a leadership role on interoperability safety standards (AAMI, AAMI/UL Joint

  9. Towards a Quantitative Performance Measurement Framework to Assess the Impact of Geographic Information Standards

    NASA Astrophysics Data System (ADS)

    Vandenbroucke, D.; Van Orshoven, J.; Vancauwenberghe, G.

    2012-12-01

    Over the last decennia, the use of Geographic Information (GI) has gained importance, in the public as well as the private sector. But even if many spatial data and related information exist, data sets are scattered over many organizations and departments. In practice it remains difficult to find the spatial data sets needed, and to access, obtain and prepare them for use in applications. Therefore, Spatial Data Infrastructures (SDIs) have been developed to enhance the access, use and sharing of GI. SDIs consist of a set of technological and non-technological components to reach this goal. Since the nineties, many SDI initiatives have seen the light. Ultimately, all these initiatives aim to enhance the flow of spatial data between organizations (users as well as producers) involved in intra- and inter-organizational and even cross-country business processes. However, the flow of information and its re-use in different business processes requires technical and semantic interoperability: the first should guarantee that system components can interoperate and use the data, while the second should guarantee that data content is understood by all users in the same way. GI-standards within the SDI are necessary to make this happen. However, it is not known whether this is realized in practice. Therefore, the objective of the research is to develop a quantitative framework to assess the impact of GI-standards on the performance of business processes. For that purpose, indicators are defined and tested in several cases throughout Europe. The proposed research will build upon previous work carried out in the SPATIALIST project, which analyzed the impact of different technological and non-technological factors on the SDI-performance of business processes (Dessers et al., 2011). The current research aims to apply quantitative performance measurement techniques, which are frequently used to measure the performance of production processes (Anupindi et al., 2006). Key to reaching the research objectives is a correct design of the test cases. The major challenges are to set up the analytical framework for analyzing the impact of GI-standards on process performance, to define appropriate indicators, and to choose the right test cases. In order to do so, it is proposed to define the test cases as 8 pairs of organizations. The paper will present the state of the art of performance measurement in the context of work processes, propose a series of SMART indicators for describing the set-up and measuring the performance, define the test case set-up, and suggest criteria for the selection of the test cases, i.e. the organizational pairs. References: Anupindi, R., Chopra, S., Deshmukh, S.D., Van Mieghem, J.A., & Zemel, E. (2006). Managing Business Process Flows: Principles of Operations Management. New Jersey, USA: Prentice Hall. Dessers, D., Crompvoets, J., Janssen, K., Vancauwenberghe, G., Vandenbroucke, D., & Vanhaverbeke, L. (2011). SDI at Work: The Spatial Zoning Plans Case. Leuven, Belgium: Katholieke Universiteit Leuven.

  10. Structured electronic physiotherapy records.

    PubMed

    Buyl, Ronald; Nyssen, Marc

    2009-07-01

    With the introduction of the electronic health record, physiotherapists too are encouraged to store their patient records in a structured digital format. The typical nature of a physiotherapy treatment requires a specific record structure to be implemented, with special attention to user-friendliness and communication with other healthcare providers. The objective of this study was to establish a framework for the electronic physiotherapy record and to define a model for interoperability with the other healthcare providers involved in the patient's care. Although we started from the Belgian context, we used a generic approach so that the results can easily be extrapolated to other countries. The framework we establish here not only defines the different building blocks of the electronic physiotherapy record, but also describes the structure and the content of the exchanged data elements. Through a combined effort by all involved parties, we elaborated an eight-level structure for the electronic physiotherapy record. Furthermore, we designed a server-based model for the exchange of data between electronic record systems held by physicians and those held by physiotherapists. Two newly defined XML messages enable data interchange: the physiotherapy prescription and the physiotherapy report. We succeeded in defining a solid structural model for electronic physiotherapy record systems. Recent wide-scale implementation of operational elements such as the electronic registry has proven to make administrative work easier for the physiotherapist. Moreover, within the proposed framework all the necessary building blocks are present for further data exchange and communication with other healthcare parties in the future. Although we completed the design of the structure and have already implemented some new aspects of the electronic physiotherapy record, the real challenge lies in persuading the end-users to start using these electronic record systems. Via a quality-label certification procedure, based on adequate criteria, the Ministry of Health tries to promote the use of electronic physiotherapy records. We must keep in mind that physiotherapists will show an interest in electronic record keeping only if it leads to a positive return for them.
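
    For flavor, a physiotherapy prescription message of the general kind the framework defines might look like the following; the element names here are invented, and the actual Belgian message definitions specify the normative structure:

        # Build and print a minimal, hypothetical prescription message.
        import xml.etree.ElementTree as ET

        rx = ET.Element("physiotherapyPrescription")
        ET.SubElement(rx, "patientId").text = "12345678901"
        ET.SubElement(rx, "prescriber").text = "Dr. Example"
        ET.SubElement(rx, "diagnosis").text = "lumbar strain"
        ET.SubElement(rx, "sessions").text = "9"

        print(ET.tostring(rx, encoding="unicode"))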

  11. Defense Modeling and Simulation Initiative

    DTIC Science & Technology

    1992-05-01

    project solicitation and priority ranking process, and reviewing policy issues. The activities of the DMSO and MSWG are also supported by a series of... issues have been raised for discussion, including: promulgation of standards for the interoperability of models and simulations; modeling and...have been completed or will be completed in the near term. The policy issues should be defined at a high level in the near term, although their

  12. Low-Cost Manufacturing, Usability, and Security: An Analysis of Bluetooth Simple Pairing and Wi-Fi Protected Setup

    NASA Astrophysics Data System (ADS)

    Kuo, Cynthia; Walker, Jesse; Perrig, Adrian

    Bluetooth Simple Pairing and Wi-Fi Protected Setup specify mechanisms for exchanging authentication credentials in wireless networks. Both Simple Pairing and Protected Setup support multiple setup mechanisms, which increases security risks and hurts the user experience. To improve the security and usability of these specifications, we suggest defining a common baseline for hardware features and a consistent, interoperable user experience across devices.

  13. Secure E-Business applications based on the European Citizen Card

    NASA Astrophysics Data System (ADS)

    Zipfel, Christian; Daum, Henning; Meister, Gisela

    The introduction of ID cards enhanced with electronic authentication services opens up the possibility to use these for identification and authentication in e-business applications. To avoid incompatible national solutions, the specification of the European Citizen Card aims at defining interoperable services for such use cases. Especially the given device authentication methods can help to eliminate security problems with current e-business and online banking applications.

  14. Semantic Integration for Marine Science Interoperability Using Web Technologies

    NASA Astrophysics Data System (ADS)

    Rueda, C.; Bermudez, L.; Graybeal, J.; Isenor, A. W.

    2008-12-01

    The Marine Metadata Interoperability Project, MMI (http://marinemetadata.org) promotes the exchange, integration, and use of marine data through enhanced data publishing, discovery, documentation, and accessibility. A key effort is the definition of an Architectural Framework and Operational Concept for Semantic Interoperability (http://marinemetadata.org/sfc), which is complemented by the development of tools that realize critical use cases in semantic interoperability. In this presentation, we describe a set of such Semantic Web tools that allow performing important interoperability tasks, ranging from the creation of controlled vocabularies and the mapping of terms across multiple ontologies, to the online registration, storage, and search services needed to work with the ontologies (http://mmisw.org). This set of services uses Web standards and technologies, including the Resource Description Framework (RDF), the Web Ontology Language (OWL), Web services, and toolkits for Rich Internet Application development. We will describe the following components. MMI Ontology Registry: The MMI Ontology Registry and Repository provides registry and storage services for ontologies. Entries in the registry are associated with projects defined by the registered users. Sophisticated search functions, for example according to metadata items and vocabulary terms, are also provided. Client applications can submit search requests using the W3C SPARQL Query Language for RDF. Voc2RDF: This component converts an ASCII comma-delimited set of terms and definitions into an RDF file. Voc2RDF facilitates the creation of controlled vocabularies by using a simple form-based user interface. Created vocabularies and their descriptive metadata can be submitted to the MMI Ontology Registry for versioning and community access. VINE: The Vocabulary Integration Environment component allows the user to map vocabulary terms across multiple ontologies. Various relationships can be established, for example exactMatch, narrowerThan, and subClassOf. VINE can compute inferred mappings based on the given associations. Attributes of each mapping, such as comments and a confidence level, can also be included. VINE also supports registering and storing the resulting mapping files in the Ontology Registry. The presentation will describe the application of semantic technologies in general, and our planned applications in particular, to solve data management problems in the marine and environmental sciences.
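
    The Voc2RDF idea, converting a comma-delimited term list into RDF so it can be registered and searched, can be sketched as follows; the column layout and vocabulary URI are assumptions for illustration:

        # Turn a tiny CSV vocabulary into SKOS concepts with rdflib.
        import csv
        import io

        from rdflib import Graph, Literal, Namespace
        from rdflib.namespace import RDF, SKOS

        VOCAB = Namespace("http://example.org/vocab/")
        csv_text = "term,definition\nsalinity,Amount of dissolved salt in water\n"

        g = Graph()
        g.bind("skos", SKOS)
        for row in csv.DictReader(io.StringIO(csv_text)):
            concept = VOCAB[row["term"]]
            g.add((concept, RDF.type, SKOS.Concept))
            g.add((concept, SKOS.prefLabel, Literal(row["term"])))
            g.add((concept, SKOS.definition, Literal(row["definition"])))

        print(g.serialize(format="turtle"))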

  15. Facilitating Semantic Interoperability Among Ocean Data Systems: ODIP-R2R Student Outcomes

    NASA Astrophysics Data System (ADS)

    Stocks, K. I.; Chen, Y.; Shepherd, A.; Chandler, C. L.; Dockery, N.; Elya, J. L.; Smith, S. R.; Ferreira, R.; Fu, L.; Arko, R. A.

    2014-12-01

    With informatics providing an increasingly important set of tools for geoscientists, it is critical to train the next generation of scientists in information and data techniques. The NSF-supported Rolling Deck to Repository (R2R) Program works with the academic fleet community to routinely document, assess, and preserve the underway sensor data from U.S. research vessels. The Ocean Data Interoperability Platform (ODIP) is an EU-US-Australian collaboration fostering interoperability among regional e-infrastructures through workshops and joint prototype development. The need to align terminology between systems is a common challenge across all of the ODIP prototypes. Five R2R students were supported to address aspects of semantic interoperability within ODIP. The first developed a vocabulary matching service that links terms from different vocabularies with similar concepts. The service implements the Google Refine reconciliation service interface, so users can leverage the Google Refine application as a friendly user interface while linking different vocabulary terms. The second developed Resource Description Framework (RDF) resources that map Shipboard Automated Meteorological Oceanographic System (SAMOS) vocabularies to internationally served vocabularies. Each SAMOS vocabulary term (data parameter and quality control flag) will be described as an RDF resource page. These RDF resources allow for enhanced discoverability and retrieval of SAMOS data by enabling data searches based on parameter. The third improved data retrieval and interoperability by exposing data and mapped vocabularies using Semantic Web technologies, collaborating with ODIP participating organizations to build a generalized data model used to populate a SPARQL endpoint that provides expressive querying over our data files. The fourth mapped local and regional vocabularies used by R2R to those used by ODIP partners; this work is described more fully in a companion poster. The fifth made published Linked Data developer-friendly with a RESTful service, achieved by defining a proxy layer on top of the existing SPARQL endpoint that 1) translates HTTP requests into SPARQL queries, and 2) renders the returned results as required by the request sender, using content negotiation, suffixes, and parameters.

  16. Supply Chain Interoperability Measurement

    DTIC Science & Technology

    2015-06-19

    Supply Chain Interoperability Measurement DISSERTATION June 2015 Christos E. Chalyvidis, Major, Hellenic Air...ENS-DS-15-J-001 SUPPLY CHAIN INTEROPERABILITY MEASUREMENT DISSERTATION Presented to the Faculty Department of Operational Sciences...INTEROPERABILITY MEASUREMENT Christos E. Chalyvidis, BS, MSc. Major, Hellenic Air Force Committee Membership: Dr. A.W. Johnson Chair

  17. 80 FR 46010 - Promoting Semantic Interoperability of Laboratory Data; Public Workshop; Request for Comments

    Federal Register 2010, 2011, 2012, 2013, 2014

    2015-08-03

    ...] Promoting Semantic Interoperability of Laboratory Data; Public Workshop; Request for Comments AGENCY: Food... workshop entitled ``FDA/CDC/NLM Workshop on Promoting Semantic Interoperability of Laboratory Data.'' The... to promoting the semantic interoperability of laboratory data between in vitro diagnostic devices and...

  18. 75 FR 63462 - Smart Grid Interoperability Standards; Notice of Docket Designation for Smart Grid...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-15

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. RM11-2-000] Smart Grid Interoperability Standards; Notice of Docket Designation for Smart Grid Interoperability Standards October 7, 2010... directs the development of a framework to achieve interoperability of smart grid devices and systems...

  19. 81 FR 68435 - Workshop on Promoting Semantic Interoperability of Laboratory Data; Public Workshop; Request for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2016-10-04

    ...] Workshop on Promoting Semantic Interoperability of Laboratory Data; Public Workshop; Request for Comments... Semantic Interoperability of Laboratory Data.'' The purpose of this public workshop is to receive and... Semantic Interoperability of Laboratory Data.'' Received comments will be placed in the docket and, except...

  20. Ensuring Interoperability between UAS Detect-and-Avoid and Manned Aircraft Collision Avoidance

    NASA Technical Reports Server (NTRS)

    Thipphavong, David; Cone, Andrew; Lee, Seung Man; Santiago, Confesor

    2017-01-01

    The UAS community in the United States has identified the need for a collision avoidance region in which UAS Detect-and-Avoid (DAA) vertical guidance is restricted to preclude interoperability issues with manned aircraft collision avoidance system vertical resolution advisories (RAs). This paper documents the process by which the collision avoidance region was defined. Three candidate definitions were evaluated on 1.3 million simulated pairwise encounters between UAS and manned aircraft covering a wide range of horizontal and vertical closure rates, angles, and miss distances. They were evaluated with regard to UAS DAA interoperability with manned aircraft collision avoidance systems in terms of: 1) the primary objective of restricting DAA vertical guidance before RAs when the aircraft are close, and 2) the secondary objective of avoiding unnecessary restrictions of DAA vertical guidance at a DAA alert when the aircraft are further apart. The collision avoidance region definition that fully achieves the primary objective and best achieves the secondary objective was recommended to and accepted by the UAS community in the United States. By this definition, UAS and manned aircraft are in the collision avoidance region--during which DAA vertical guidance is restricted--when the time to closest point of approach is less than 50 seconds and either the time to co-altitude is less than 50 seconds or the current vertical separation is less than 800 feet.

  1. Ensuring Interoperability Between Unmanned Aircraft Detect-and-Avoid and Manned Aircraft Collision Avoidance

    NASA Technical Reports Server (NTRS)

    Thipphavong, David; Cone, Andrew; Lee, Seungman

    2017-01-01

    The Unmanned Aircraft Systems (UAS) community in the United States has identified the need for a collision avoidance region in which UAS Detect-and-Avoid (DAA) vertical guidance is restricted to preclude interoperability issues with manned aircraft collision avoidance system vertical resolution advisories (RAs). This paper documents the process by which the collision avoidance region was defined. Three candidate definitions were evaluated on 1.3 million simulated pairwise encounters between UAS and manned aircraft covering a wide range of horizontal and vertical closure rates, angles, and miss distances. Each definition was evaluated with regard to UAS DAA interoperability with manned aircraft collision avoidance in terms of how well it achieved: 1) the primary objective of restricting DAA vertical guidance prior to RAs when the aircraft are close, and 2) the secondary objective of avoiding unnecessary restrictions of DAA vertical guidance at DAA alerts when the aircraft are further apart. The collision avoidance region definition that fully achieves the primary objective and best achieves the secondary objective was recommended to and accepted by the UAS community in the United States. By this definition, UAS and manned aircraft are in the collision avoidance region where DAA vertical guidance is restricted when the time to closest point of approach (CPA) is less than 50 seconds and either the time to co-altitude is less than 50 seconds or the current vertical separation is less than 800 feet.
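
    The recommended definition translates directly into a predicate; the sketch below simply transcribes the thresholds stated in the abstract (times in seconds, separation in feet):

    ```python
    # Direct transcription of the recommended collision avoidance region
    # definition: DAA vertical guidance is restricted when time to CPA is
    # under 50 s and either time to co-altitude is under 50 s or vertical
    # separation is under 800 ft.
    def in_collision_avoidance_region(time_to_cpa_s: float,
                                      time_to_coalt_s: float,
                                      vertical_sep_ft: float) -> bool:
        return time_to_cpa_s < 50 and (time_to_coalt_s < 50
                                       or vertical_sep_ft < 800)

    # Example: 40 s to CPA, 70 s to co-altitude, 500 ft apart -> restricted.
    print(in_collision_avoidance_region(40, 70, 500))   # True
    print(in_collision_avoidance_region(60, 10, 500))   # False (tCPA >= 50 s)
    ```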

  2. Resource allocation in shared spectrum access communications for operators with diverse service requirements

    NASA Astrophysics Data System (ADS)

    Kibria, Mirza Golam; Villardi, Gabriel Porto; Ishizu, Kentaro; Kojima, Fumihide; Yano, Hiroyuki

    2016-12-01

    In this paper, we study inter-operator spectrum sharing and intra-operator resource allocation in shared spectrum access communication systems and propose efficient dynamic solutions to address both inter-operator and intra-operator resource allocation optimization problems. For inter-operator spectrum sharing, we present two effective approaches, namely subcarrier gain-based sharing and fragmentation-based sharing, which carry out fair and flexible allocation of the available shareable spectrum among the operators subject to certain well-defined sharing rules, traffic demands, and channel propagation characteristics. The subcarrier gain-based spectrum sharing scheme has been found to be more efficient in terms of achieved throughput, while fragmentation-based sharing is more attractive in terms of computational complexity. For intra-operator resource allocation, we consider the resource allocation problem with users' dissimilar service requirements, where the operator simultaneously supports users with delay-constrained and non-delay-constrained service requirements. This optimization problem is a mixed-integer non-linear programming problem and non-convex, which is computationally very expensive, and the complexity grows exponentially with the number of integer variables. We propose a less complex and efficient suboptimal solution based on formulating exact linearization, linear approximation, and convexification techniques for the non-linear and/or non-convex objective functions and constraints. Extensive simulation performance analysis has been carried out to validate the efficiency of the proposed solution.
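
    As a toy illustration of gain-based sharing (not the authors' algorithm), the sketch below greedily assigns each shareable subcarrier to the operator with the highest channel gain that still has unmet demand; the gains and demands are invented:

    ```python
    # Illustrative greedy sketch of subcarrier gain-based sharing: each
    # shareable subcarrier goes to the operator with the highest channel
    # gain that still has unmet demand. An assumption-laden toy, not the
    # allocation algorithm of the paper above.
    import numpy as np

    rng = np.random.default_rng(0)
    n_subcarriers, n_operators = 16, 3
    gains = rng.rayleigh(size=(n_subcarriers, n_operators))  # channel gains
    demand = np.array([6, 5, 5])  # subcarriers each operator still needs

    allocation = {}
    for sc in np.argsort(gains.max(axis=1))[::-1]:  # best subcarriers first
        for op in np.argsort(gains[sc])[::-1]:      # operators by gain on sc
            if demand[op] > 0:
                allocation[int(sc)] = int(op)
                demand[op] -= 1
                break

    print(allocation)  # subcarrier index -> operator index
    ```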

  3. Robotics Systems Joint Project Office (RSJPO) Interoperability Profiles (IOPS) 101

    DTIC Science & Technology

    2012-07-01

    interoperability, although they are supported by some interoperability attributes. For example, stair climbing » Stair climbing is not something that...IOPs need to specify » However, the mobility & actuation related interoperable messages can be used to provide stair climbing » Also...interoperability can enable management of different poses or modes, one of which may be stair climbing

  4. Software-Defined Radio for Space-to-Space Communications

    NASA Technical Reports Server (NTRS)

    Fisher, Ken; Jih, Cindy; Moore, Michael S.; Price, Jeremy C.; Abbott, Ben A.; Fritz, Justin A.

    2011-01-01

    A paper describes the Space-to-Space Communications System (SSCS) Software-Defined Radio (SDR) research project to determine the most appropriate method for creating flexible and reconfigurable radios to implement wireless communications channels for space vehicles so that fewer radios are required, and commonality in hardware and software architecture can be leveraged for future missions. The ability to reconfigure the SDR through software enables one radio platform to be reconfigured to interoperate with many different waveforms. This means a reduction in the number of physical radio platforms necessary to support a space mission's communication requirements, thus decreasing the total size, weight, and power needed for a mission.

  5. [Overview of the US policies for health information technology and lessons learned for Israel].

    PubMed

    Topaz, Maxim; Ash, Nachman

    2013-05-01

    The healthcare system in the United States (U.S.) faces a number of significant changes aimed at improving the quality and availability of medical services and reducing costs. Implementation of health information technologies, especially Electronic Health Records (EHR), is central to achieving these goals. Several recent legislative efforts in the U.S. aim at defining standards and promoting wide-scale "Meaningful Use" of the novel technologies. In Israel, the majority of healthcare providers adopted EHR throughout the last decade. Unlike the U.S., the process of EHR adoption occurred spontaneously, without governmental control or the definition of standards. In this article, we review the U.S. health information technology policies and standards and suggest potential lessons learned for Israel. First, we present the three-staged Meaningful Use regulations that require eligible healthcare practitioners to use EHR in their practice. We also describe the standards for EHR certification and national efforts to create interoperable health information technology networks. Finally, we provide a brief overview of the Israeli regulation in the field of EHR. Although the adoption of health information technology is wider in Israel, the lack of technology standards and governmental control has led to large technology gaps between providers. The example of the U.S. legislation urges the adoption of several critical steps to further enhance the quality and efficiency of the Israeli healthcare system, in particular: strengthening health information technology regulation; developing licensure criteria for health information technology; bridging the digital gap between healthcare organizations; defining quality measures; and improving the accessibility of health information for patients.

  6. Criteria for Evaluating Alternative Network and Link Layer Protocols for the NASA Constellation Program Communication Architecture

    NASA Technical Reports Server (NTRS)

    Benbenek, Daniel; Soloff, Jason; Lieb, Erica

    2010-01-01

    Selecting a communications and network architecture for future manned space flight requires an evaluation of the varying goals and objectives of the program, development of communications and network architecture evaluation criteria, and assessment of critical architecture trades. This paper uses Cx Program proposed exploration activities as a guideline; lunar sortie, outpost, Mars, and flexible path options are described. A set of communications network architecture evaluation criteria is proposed and described, including: interoperability, security, reliability, and ease of automating topology changes. Finally, a key set of architecture options is traded, including (1) multiplexing data at a common network layer vs. at the data link layer, (2) implementing multiple network layers vs. a single network layer, and (3) the use of a particular network layer protocol, primarily IPv6 vs. Delay Tolerant Networking (DTN). In summary, the protocol options are evaluated against the proposed exploration activities, and their relative performance with respect to the criteria is assessed. An architectural approach is recommended which includes (a) the capability of multiplexing at both the network layer and the data link layer and (b) a single network layer for operations at each program phase, as these solutions are best suited to respond to the widest array of program needs and meet each of the evaluation criteria.

  7. 76 FR 66040 - NIST Framework and Roadmap for Smart Grid Interoperability Standards, Release 2.0 (Draft...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-25

    ...-01] NIST Framework and Roadmap for Smart Grid Interoperability Standards, Release 2.0 (Draft... draft version of the NIST Framework and Roadmap for Smart Grid Interoperability Standards, Release 2.0... Roadmap for Smart Grid Interoperability Standards, Release 2.0 (Release 2.0) (Draft) for public review and...

  8. Warfighter IT Interoperability Standards Study

    DTIC Science & Technology

    2012-07-22

    data (e.g. messages) between systems? ii) What process did you use to validate and certify semantic interoperability between your...other systems at this time There was no requirement to validate and certify semantic interoperability The DLS program exchanges data with... semantics Testing for System Compliance with Data Models Verify and Certify Interoperability Using Data

  9. Applying A Formal Language of Command and Control For Interoperability Between Systems

    DTIC Science & Technology

    2008-05-21

    Initially, it must be determined which type of grammar is to be used. The Chomsky hierarchy specifies that grammars can be Type 0 (unrestricted...future research. 2. Development of Formal Grammars In his book "Syntactic Structures" [5], published in 1957, Noam Chomsky answered the question...be finite because recursion is allowed. 2.2 Types of Grammars Chomsky defines four types of grammar. They are ordered within what is
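
    As a minimal illustration of a Type-2 (context-free) grammar for command-and-control utterances, the sketch below uses NLTK with an invented vocabulary; the paper's actual C2 grammar is not reproduced here:

    ```python
    # Toy context-free (Chomsky Type-2) grammar for simple command-and-
    # control utterances, sketched with NLTK. Vocabulary is invented.
    import nltk

    grammar = nltk.CFG.fromstring("""
        S      -> UNIT ACTION
        UNIT   -> 'alpha' | 'bravo'
        ACTION -> 'advance' DIR | 'hold'
        DIR    -> 'north' | 'south'
    """)

    parser = nltk.ChartParser(grammar)
    for tree in parser.parse("alpha advance north".split()):
        print(tree)  # prints the parse tree for the command
    ```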

  10. Concept of information technology of monitoring and decision-making support

    NASA Astrophysics Data System (ADS)

    Kovalenko, Aleksandr S.; Tymchyk, Sergey V.; Kostyshyn, Sergey V.; Zlepko, Sergey M.; Wójcik, Waldemar; Kalizhanova, Aliya; Burlibay, Aron; Kozbekova, Ainur

    2017-08-01

    The paper presents a concept of information technology for monitoring and decision-making support to assess the health of students. The preconditions of the concept are formulated, along with its goal and purpose. The subject area of the concept is proposed to be considered as a set of problems grouped into 8 categories, which in turn necessitates applying basic principles when creating the technology, ranging from the principles of "first head" and "systems approach" to the principles of "interoperability" and "system integration". The content of the information support of the IT, its position in the segment of a single information space, and the stages of its creation are described. Criteria are proposed to evaluate the efficiency of the developed IT system.

  11. 76 FR 51271 - Implementing a Nationwide, Broadband, Interoperable Public Safety Network in the 700 MHz Band

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-18

    ... Docket 07-100; FCC 11-6] Implementing a Nationwide, Broadband, Interoperable Public Safety Network in the... interoperable public safety broadband network. The establishment of a common air interface for 700 MHz public safety broadband networks will create a foundation for interoperability and provide a clear path for the...

  12. Political, policy and social barriers to health system interoperability: emerging opportunities of Web 2.0 and 3.0.

    PubMed

    Juzwishin, Donald W M

    2009-01-01

    Achieving effective health informatics interoperability in a fragmented and uncoordinated health system is by definition not possible. Interoperability requires the simultaneous integration of health care processes and information across different types and levels of care (systems thinking). The fundamental argument of this paper is that information system interoperability will remain an unfulfilled hope until health reforms effectively address the governance (accountability), structural and process barriers to interoperability of health care delivery. The ascendancy of Web 2.0 and 3.0, although still unproven, signals the opportunity to accelerate patients' access to health information and their health record. Policy suggestions for simultaneously advancing health system delivery and information system interoperability are posited.

  13. D-ATM, a working example of health care interoperability: From dirt path to gravel road.

    PubMed

    DeClaris, John-William

    2009-01-01

    For many years, there have been calls for interoperability within health care systems. The technology currently exists and is being used in business areas like banking and commerce, to name a few. Yet the question remains, why has interoperability not been achieved in health care? This paper examines issues encountered and success achieved with interoperability during the development of the Digital Access To Medication (D-ATM) project, sponsored by the Substance Abuse and Mental Health Services Administration (SAMHSA). D-ATM is the first government funded interoperable patient management system. The goal of this paper is to provide lessons learned and propose one possible road map for health care interoperability within private industry and how government can help.

  14. Establishing semantic interoperability of biomedical metadata registries using extended semantic relationships.

    PubMed

    Park, Yu Rang; Yoon, Young Jo; Kim, Hye Hyeon; Kim, Ju Han

    2013-01-01

    Achieving semantic interoperability is critical for biomedical data sharing between individuals, organizations and systems. The ISO/IEC 11179 MetaData Registry (MDR) standard has been recognized as one of the solutions for this purpose. The standard model, however, is limited: representing concepts that consist of two or more values, for instance blood pressure with its systolic and diastolic values, is not allowed. We addressed the structural limitations of ISO/IEC 11179 with an integrated metadata object model in our previous research. In the present study, we introduce semantic extensions for the model by defining three new types of semantic relationships: dependency, composite and variable relationships. To evaluate our extensions in a real-world setting, we measured the efficiency of metadata reduction by means of mapping to existing metadata. We extracted metadata from the College of American Pathologists Cancer Protocols and then evaluated our extensions. With no semantic loss, one third of the extracted metadata could be successfully eliminated, suggesting a better strategy for implementing clinical MDRs with improved efficiency and utility.
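
    A composite relationship of the kind the authors describe can be sketched as one concept aggregating component data elements; the class and field names below are illustrative assumptions, not the paper's model:

    ```python
    # Sketch of a "composite" semantic relationship extending an
    # ISO/IEC 11179-style registry: one data element concept aggregates
    # component elements (blood pressure = systolic + diastolic).
    # Names are invented for illustration.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class DataElement:
        name: str
        unit: str

    @dataclass
    class CompositeElement:
        name: str
        components: List[DataElement] = field(default_factory=list)

    blood_pressure = CompositeElement(
        name="blood_pressure",
        components=[DataElement("systolic", "mmHg"),
                    DataElement("diastolic", "mmHg")])
    print([c.name for c in blood_pressure.components])
    ```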

  15. Adapting the design of Anesthesia Information Management Systems to innovations depicted in Industrial Property documents.

    PubMed

    Spyropoulos, B; Tzavaras, A; Zogogianni, D; Botsivaly, M

    2013-01-01

    The purpose of this paper is to present the design and the current development status of an Anesthesia Information Management System (AIMS). For this system, the physical and technical advances depicted in relevant, recently published Industrial Property documents have been taken into account. Additional innovative sensors create a further data load to be managed. Novel wireless data-transmission modes eventually demand compliance with further standards, so that interoperability between the AIMS and the existing Hospital Information Systems is sustained. We attempted to define the state of the art concerning the functions, the design prerequisites, and the relevant standards of an "emerging" AIMS that combines hardware innovation, real-time data acquisition, processing, and display, and lastly enables the necessary interoperability with the other components of the existing Hospital Information Systems. Finally, we report, based on this approach, on the design and implementation status of our "real-world" system under development and discuss the multifarious obstacles encountered during this still ongoing project.

  16. Request redirection paradigm in medical image archive implementation.

    PubMed

    Dragan, Dinu; Ivetić, Dragan

    2012-08-01

    It is widely recognized that JPEG2000 addresses key issues in medical imaging: storage, communication, sharing, remote access, interoperability, and presentation scalability. Therefore, JPEG2000 support was added to the DICOM standard in Supplement 61. Two approaches to supporting JPEG2000 medical images are explicitly defined by the DICOM standard: replacing the DICOM image format with the corresponding JPEG2000 codestream, or using the Pixel Data Provider service, DICOM Supplement 106. The latter supposes a two-step retrieval of the medical image: a DICOM request and response from a DICOM server, followed by a JPIP request and response from a JPEG2000 server. We propose a novel strategy for transmission of scalable JPEG2000 images extracted from a single codestream over a DICOM network using the DICOM Private Data Element, without sacrificing system interoperability. It employs the request redirection paradigm: DICOM request and response from the JPEG2000 server through the DICOM server. The paper presents a programming solution for implementation of the request redirection paradigm in a DICOM-transparent manner. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  17. A Framework for a Decision Support System in a Hierarchical Extended Enterprise Decision Context

    NASA Astrophysics Data System (ADS)

    Boza, Andrés; Ortiz, Angel; Vicens, Eduardo; Poler, Raul

    Decision Support System (DSS) tools provide useful information to decision makers. In an Extended Enterprise, a new goal, changes in the current objectives, or small changes in the extended enterprise configuration require an adjustment of its decision system. A DSS in this context must be flexible and agile to enable easy and quick adaptation to the new context. This paper proposes extending the Hierarchical Production Planning (HPP) structure to an Extended Enterprise decision-making context. In this way, a framework for DSS in the Extended Enterprise context is defined using components of HPP. Interoperability details have been reviewed to identify their impact on this framework. The proposed framework allows overcoming some interoperability barriers, identifying and organizing components for a DSS in the Extended Enterprise context, and working toward the definition of an architecture to be used in the design process of a flexible DSS in the Extended Enterprise context which can reuse components for future Extended Enterprise configurations.

  18. Emergence of a Common Modeling Architecture for Earth System Science (Invited)

    NASA Astrophysics Data System (ADS)

    Deluca, C.

    2010-12-01

    Common modeling architecture can be viewed as a natural outcome of common modeling infrastructure. The development of model utility and coupling packages (ESMF, MCT, OpenMI, etc.) over the last decade represents the realization of a community vision for common model infrastructure. The adoption of these packages has led to increased technical communication among modeling centers and newly coupled modeling systems. However, adoption has also exposed aspects of interoperability that must be addressed before easy exchange of model components among different groups can be achieved. These aspects include common physical architecture (how a model is divided into components) and model metadata and usage conventions. The National Unified Operational Prediction Capability (NUOPC), an operational weather prediction consortium, is collaborating with weather and climate researchers to define a common model architecture that encompasses these advanced aspects of interoperability and looks to future needs. The nature and structure of the emergent common modeling architecture will be discussed along with its implications for future model development.

  19. Towards Standardized Patient Data Exchange: Integrating a FHIR Based API for the Open Medical Record System.

    PubMed

    Kasthurirathne, Suranga N; Mamlin, Burke; Grieve, Grahame; Biondich, Paul

    2015-01-01

    Interoperability is essential to address limitations caused by the ad hoc implementation of clinical information systems and the distributed nature of modern medical care. The HL7 V2 and V3 standards have played a significant role in ensuring interoperability for healthcare. FHIR is a next generation standard created to address fundamental limitations in HL7 V2 and V3. FHIR is particularly relevant to OpenMRS, an Open Source Medical Record System widely used across emerging economies. FHIR has the potential to allow OpenMRS to move away from a bespoke, application-specific API to a standards-based API. We describe efforts to design and implement a FHIR based API for the OpenMRS platform. Lessons learned from this effort were used to define long term plans to transition from the legacy OpenMRS API to a FHIR based API that greatly reduces the learning curve for developers and helps enhance adherence to standards.
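
    Consuming such a FHIR-based API reduces, in the simplest case, to an HTTP search that returns a Bundle of resources. The sketch below assumes a hypothetical base URL and the standard FHIR R4 JSON layout; it is illustrative, not the OpenMRS module's documented interface:

    ```python
    # Minimal sketch of consuming a FHIR-based API: fetch Patient
    # resources over HTTP and read fields via the standard FHIR JSON
    # layout. The base URL is a placeholder, not a real deployment.
    import requests

    FHIR_BASE = "http://example.org/openmrs/ws/fhir2/R4"  # hypothetical

    resp = requests.get(f"{FHIR_BASE}/Patient",
                        params={"given": "John"},
                        headers={"Accept": "application/fhir+json"})
    resp.raise_for_status()
    bundle = resp.json()

    # A FHIR search returns a Bundle; each entry wraps one resource.
    for entry in bundle.get("entry", []):
        patient = entry["resource"]
        print(patient["id"], patient.get("name", [{}])[0].get("family"))
    ```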

  20. The Gaia Archive at ESAC: a VO-inside archive

    NASA Astrophysics Data System (ADS)

    Gonzalez-Nunez, J.

    2015-12-01

    The ESDC (ESAC Science Data Center) is one of the active members of the IVOA (International Virtual Observatory Alliance), which has defined a set of standards, libraries, and concepts that allow the creation of flexible, scalable, and interoperable architectures for data archive development. In the case of astronomy involving the use of big catalogues, as in Gaia or Euclid, the TAP, UWS, and VOSpace standards can be used to create an architecture that allows the exploitation of this valuable data by the community. New challenges also arise, like the implementation of the new paradigm "move code close to the data", which can be partially achieved by extending the protocols (TAP+, UWS+, etc.) or the languages (ADQL). We explain how we have used VO standards and libraries for the Gaia Archive, which has not only produced an open and interoperable archive but also minimized the development effort in certain areas. We also explain how we have extended these protocols, and outline future plans.
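
    A TAP/ADQL query of the kind the archive serves can be sketched with the astroquery package; the table and column names follow the public Gaia releases, but treat the snippet as illustrative rather than a guaranteed-current interface:

    ```python
    # Sketch of a TAP/ADQL query against the Gaia archive via astroquery.
    # Table and column names follow the public Gaia data releases.
    from astroquery.gaia import Gaia

    job = Gaia.launch_job("""
        SELECT TOP 5 source_id, ra, dec, phot_g_mean_mag
        FROM gaiadr3.gaia_source
        WHERE phot_g_mean_mag < 10
    """)
    print(job.get_results())  # an astropy Table of the five rows
    ```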

  1. EPOS Thematic Core Service Anthropogenic Hazards: Implementation Plan

    NASA Astrophysics Data System (ADS)

    Orlecka-Sikora, Beata; Lasocki, Stanislaw; Grasso, Jean Robert; Schmittbuhl, Jean; Styles, Peter; Kwiatek, Grzegorz; Sterzel, Mariusz; Garcia, Alexander

    2015-04-01

    EPOS Thematic Core Service ANTHROPOGENIC HAZARDS (TCS AH) aims to integrate distributed research infrastructures (RI) to facilitate and stimulate research on anthropogenic hazards (AH), especially those associated with the exploration and exploitation of geo-resources. The innovative element is the uniqueness of the integrated RI, which comprises two main deliverables: (1) Exceptional datasets, called "episodes", which comprehensively describe a geophysical process, induced or triggered by human technological activity, posing a hazard to populations, infrastructure and the environment; (2) Problem-oriented, bespoke services uniquely designed for the discrimination and analysis of correlations between technology, geophysical response and resulting hazard. These objectives will be achieved through the Science-Industry Synergy (SIS) built by EPOS WG10, ensuring bi-directional information exchange, including unique and previously unavailable data furnished by industrial partners. The episodes and services to be integrated have been selected using strict criteria during the EPOS PP. The data are related to a wide spectrum of inducing technologies, with seismic/aseismic deformation and production history as a minimum data set requirement, and the quality of the software services is confirmed and referenced in the literature. Implementation of TCS AH is planned for four years and requires five major activities: (1) Strategic Activities and Governance will define and establish the governance structure to ensure the long-term sustainability of these research infrastructures for data provision through EPOS. (2) Coordination and Interaction with the Community will establish robust communication channels within the whole TCS AH community while supporting the global EPOS communication strategy. (3) Interoperability with EPOS Integrated Core Service (ICS) and Testing Activities will coordinate and ensure interoperability between the RIs and the ICS. Within this activity a functional e-research environment with access to High-Performance Computing will be built; a prototype for such an environment is already under construction and will become operational in mid-2015 (is-epos.eu). (4) Integration of AH Episodes will address at least 20 global episodes related to conventional hydrocarbon extraction, reservoir treatment, underground mining and geothermal energy production, which will be integrated into the e-environment of TCS AH. All the multi-disciplinary heterogeneous data from these particular episodes will be transformed into unified structures to form integrated data sets articulated with the defined standards of ICS and other TCSs. (5) Implementation of Services for Analyzing Episodes will deliver the protocols and methodologies for analysis of the seismic/deformation response to time-varying georesource exploitation technologies on long and short time scales, and the related time- and technology-dependent seismic hazard issues.

  2. Interoperability of Information Systems Managed and Used by the Local Health Departments.

    PubMed

    Shah, Gulzar H; Leider, Jonathon P; Luo, Huabin; Kaur, Ravneet

    2016-01-01

    In the post-Affordable Care Act era marked by interorganizational collaborations and availability of large amounts of electronic data from other community partners, it is imperative to assess the interoperability of information systems used by the local health departments (LHDs). To describe the level of interoperability of LHD information systems and identify factors associated with lack of interoperability. This mixed-methods research uses data from the 2015 Informatics Capacity and Needs Assessment Survey, with a target population of all LHDs in the United States. A representative sample of 650 LHDs was drawn using a stratified random sampling design. A total of 324 completed responses were received (50% response rate). Qualitative data were used from a key informant interview study of LHD informatics staff from across the United States. Qualitative data were independently coded by 2 researchers and analyzed thematically. Survey data were cleaned, bivariate comparisons were conducted, and a multivariable logistic regression was run to characterize factors associated with interoperability. For 30% of LHDs, no systems were interoperable, and 38% of LHD respondents indicated some of the systems were interoperable. Significant determinants of interoperability included LHDs having leadership support (adjusted odds ratio [AOR] = 3.54), control of information technology budget allocation (AOR = 2.48), control of data systems (AOR = 2.31), having a strategic plan for information systems (AOR = 1.92), and existence of business process analysis and redesign (AOR = 1.49). Interoperability of all systems may be an informatics goal, but only a small proportion of LHDs reported having interoperable systems, pointing to a substantial need among LHDs nationwide.

  3. Evolving Frameworks for Different Communities of Scientists and End Users

    NASA Astrophysics Data System (ADS)

    Graves, S. J.; Keiser, K.

    2016-12-01

    Two evolving frameworks for interdisciplinary science will be described in the context of the Common Data Framework for Earth-Observation Data and the importance of standards and protocols. The Event Data Driven Delivery (ED3) Framework, funded by NASA Applied Sciences, provides the delivery of data based on predetermined subscriptions and associated workflows to various communities of end users. ED3's capabilities are used by scientists, as well as policy and resource managers, when event alerts are triggered to respond to their needs. The EarthCube Integration and Testing Environment (ECITE) Assessment Framework for Technology Interoperability and Integration is being developed to facilitate the EarthCube community's assessment of NSF-funded technologies addressing Earth science problems. ECITE is addressing the translation of geoscience researchers' use cases into technology use cases that apply EarthCube-funded building block technologies (and other existing technologies) to solving science problems. EarthCube criteria for technology assessment include the use of data, metadata and service standards to improve interoperability and integration across program components. The long-range benefit will be the growth of a cyberinfrastructure with technology components that have been shown to work together to solve known science objectives.

  4. A systematic literature review on security and privacy of electronic health record systems: technical perspectives.

    PubMed

    Rezaeibagha, Fatemeh; Win, Khin Than; Susilo, Willy

    Even though many safeguards and policies for electronic health record (EHR) security have been implemented, barriers to the privacy and security protection of EHR systems persist. This article presents the results of a systematic literature review regarding frequently adopted security and privacy technical features of EHR systems. Our inclusion criteria were full articles that dealt with the security and privacy of technical implementations of EHR systems published in English in peer-reviewed journals and conference proceedings between 1998 and 2013; 55 selected studies were reviewed in detail. We analysed the review results using two International Organization for Standardization (ISO) standards (29100 and 27002) in order to consolidate the study findings. Using this process, we identified 13 features that are essential to security and privacy in EHRs. These included system and application access control, compliance with security requirements, interoperability, integration and sharing, consent and choice mechanisms, policies and regulation, applicability and scalability, and cryptography techniques. This review highlights the importance of technical features, including mandated access control policies and consent mechanisms to provide patients' consent, scalability through proper architecture and frameworks, and interoperability of health information systems, in meeting EHR security and privacy requirements.

  5. National electronic health record interoperability chronology.

    PubMed

    Hufnagel, Stephen P

    2009-05-01

    The federal initiative for electronic health record (EHR) interoperability began in 2000 and set the stage for the establishment of the 2004 Executive Order for EHR interoperability by 2014. This article discusses the chronology from the 2001 e-Government Consolidated Health Informatics (CHI) initiative through the current congressional mandates for an aligned, interoperable, and agile DoD AHLTA and VA VistA.

  6. Employing Semantic Technologies for the Orchestration of Government Services

    NASA Astrophysics Data System (ADS)

    Sabol, Tomáš; Furdík, Karol; Mach, Marián

    The main aim of eGovernment is to provide efficient, secure, inclusive services for citizens and businesses. The necessity to integrate services and information resources, to increase accessibility, and to reduce the administrative burden on citizens and enterprises: these are only a few reasons why the eGovernment paradigm has shifted from a supply-driven approach toward connected governance, emphasizing the concept of interoperability (Archmann and Nielsen 2008). On the EU level, interoperability is explicitly addressed as one of the four main challenges, including in the i2010 strategy (i2010 2005). The Commission's Communication (Interoperability for Pan-European eGovernment Services 2006) strongly emphasizes the necessity of interoperable eGovernment services, based on standards, open specifications, and open interfaces. The Pan-European interoperability initiatives, such as the European Interoperability Framework (2004) and IDABC, as well as many projects supported by the European Commission within the IST Program and the Competitiveness and Innovation Program (CIP), illustrate the importance of interoperability on the EU level.

  7. Empowering open systems through cross-platform interoperability

    NASA Astrophysics Data System (ADS)

    Lyke, James C.

    2014-06-01

    Most of the motivations for open systems lie in the expectation of interoperability, sometimes referred to as "plug-and-play". Nothing in the notion of "open-ness", however, guarantees this outcome, which makes the increased interest in open architecture more perplexing. In this paper, we explore certain themes of open architecture. We introduce the concept of "windows of interoperability", which can be used to align disparate portions of architecture. Such "windows of interoperability", which concentrate on a reduced set of protocol and interface features, might achieve many of the broader purposes assigned as benefits in open architecture. Since it is possible to engineer proprietary systems that interoperate effectively, this nuanced definition of interoperability may in fact be a more important concept to understand and nurture for effective systems engineering and maintenance.

  8. Multi-model-based interactive authoring environment for creating shareable medical knowledge.

    PubMed

    Ali, Taqdir; Hussain, Maqbool; Ali Khan, Wajahat; Afzal, Muhammad; Hussain, Jamil; Ali, Rahman; Hassan, Waseem; Jamshed, Arif; Kang, Byeong Ho; Lee, Sungyoung

    2017-10-01

    Technologically integrated healthcare environments can be realized if physicians are encouraged to use smart systems for the creation and sharing of knowledge used in clinical decision support systems (CDSS). While CDSSs are heading toward smart environments, they lack support for abstraction of technology-oriented knowledge from physicians. Therefore, abstraction in the form of a user-friendly and flexible authoring environment is required in order for physicians to create shareable and interoperable knowledge for CDSS workflows. Our proposed system provides a user-friendly authoring environment to create Arden Syntax MLMs (Medical Logic Modules) as shareable knowledge rules for intelligent decision-making by CDSS. Existing systems are not physician friendly and lack interoperability and shareability of knowledge. In this paper, we propose the Intelligent-Knowledge Authoring Tool (I-KAT), a knowledge authoring environment that overcomes the above-mentioned limitations. Shareability is achieved by creating a knowledge base from MLMs using Arden Syntax. Interoperability is enhanced using standard data models and terminologies. However, creation of shareable and interoperable knowledge using Arden Syntax without abstraction increases complexity, which ultimately makes it difficult for physicians to use the authoring environment. Therefore, physician friendliness is provided by abstraction at the application layer to reduce complexity. This abstraction is regulated by mappings created between legacy system concepts, which are modeled as a domain clinical model (DCM), and decision support standards such as the virtual medical record (vMR) and the Systematized Nomenclature of Medicine - Clinical Terms (SNOMED CT). We represent these mappings with a semantic reconciliation model (SRM). The objective of the study is the creation of shareable and interoperable knowledge using a user-friendly and flexible I-KAT. We therefore evaluated our system using completeness and user satisfaction criteria, which we assessed through system- and user-centric evaluation processes. For the system-centric evaluation, we compared the implementation of clinical information modelling system requirements in our proposed system and in existing systems. The results suggested that 82.05% of the requirements were fully supported, 7.69% were partially supported, and 10.25% were not supported by our system. In the existing systems, 35.89% of requirements were fully supported, 28.20% were partially supported, and 35.89% were not supported. For the user-centric evaluation, the assessment criterion was 'ease of use'. Our proposed system showed 15 times better results with respect to MLM creation time than the existing systems. Moreover, on average, the participants made only one error in MLM creation using our proposed system, but 13 errors per MLM using the existing systems. We provide a user-friendly authoring environment for the creation of shareable and interoperable knowledge for CDSS to overcome knowledge acquisition complexity. The authoring environment uses state-of-the-art decision support-related clinical standards with increased ease of use. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Electronic Health Records: VA and DOD Need to Support Cost and Schedule Claims, Develop Interoperability Plans, and Improve Collaboration

    DTIC Science & Technology

    2014-02-01

    Page 13 GAO-14-302 Electronic Health Records known as the Captain James A. Lovell Federal Health Care Center ( FHCC ). The FHCC is unique in...Healthcare Management System Modernization DOD Department of Defense FHCC Federal Health Care Center FHIE Federal Health Information Exchange GCPR... FHCC and, in accordance with the fiscal year 2010 NDAA, defined the relationship between the two departments for operating the new, integrated

  10. Managing Interoperability for GEOSS - A Report from the SIF

    NASA Astrophysics Data System (ADS)

    Khalsa, S. J.; Actur, D.; Nativi, S.; Browdy, S.; Eglitis, P.

    2009-04-01

    The Global Earth Observation System of Systems (GEOSS) is a coordinating and integrating framework for Earth observing and information systems, which are contributed on a voluntary basis by Members and Participating Organizations of the intergovernmental Group on Earth Observations (GEO). GEOSS exists to support informed decision making for the benefit of society, including the implementation of international environmental treaty obligations. GEO Members and Participating organizations use the GEOSS Common Infrastructure (GCI) to register their Earth observation resources, thereby making them discoverable and consumable by both humans and client applications. Essential to meeting GEO user needs is a process for supporting interoperability of observing, processing, modeling and dissemination capabilities. The GEO Standards and Interoperability Forum (SIF) was created to develop, implement and oversee this process. The SIF supports GEO organizations contributing resources to the GEOSS by helping them understand and work with the GEOSS interoperability guidelines and encouraging them to register their "interoperability arrangements" (standards or other ad hoc arrangements for interoperability) in the GEOSS standards registry, which is part of the GCI. These registered interoperability arrangements support the actual services used to achieve interoperability of systems. By making information about these interoperability arrangements available to users of the GEOSS the SIF enhances the understanding and utility of contributed resources. We describe the procedures that the SIF has enacted to carry out its work. To operate effectively the SIF uses a workflow system and is establishing a set of regional teams and domain experts. In the near term our work has focused on population and review of the GEOSS Standards Registry, but we are also developing approaches to achieving progressive convergence on, and uptake of, an optimal set of interoperability arrangements for all of GEOSS.

  11. Interoperability of Information Systems Managed and Used by the Local Health Departments

    PubMed Central

    Leider, Jonathon P.; Luo, Huabin; Kaur, Ravneet

    2016-01-01

    Background: In the post-Affordable Care Act era marked by interorganizational collaborations and availability of large amounts of electronic data from other community partners, it is imperative to assess the interoperability of information systems used by the local health departments (LHDs). Objectives: To describe the level of interoperability of LHD information systems and identify factors associated with lack of interoperability. Data and Methods: This mixed-methods research uses data from the 2015 Informatics Capacity and Needs Assessment Survey, with a target population of all LHDs in the United States. A representative sample of 650 LHDs was drawn using a stratified random sampling design. A total of 324 completed responses were received (50% response rate). Qualitative data were used from a key informant interview study of LHD informatics staff from across the United States. Qualitative data were independently coded by 2 researchers and analyzed thematically. Survey data were cleaned, bivariate comparisons were conducted, and a multivariable logistic regression was run to characterize factors associated with interoperability. Results: For 30% of LHDs, no systems were interoperable, and 38% of LHD respondents indicated some of the systems were interoperable. Significant determinants of interoperability included LHDs having leadership support (adjusted odds ratio [AOR] = 3.54), control of information technology budget allocation (AOR = 2.48), control of data systems (AOR = 2.31), having a strategic plan for information systems (AOR = 1.92), and existence of business process analysis and redesign (AOR = 1.49). Conclusion: Interoperability of all systems may be an informatics goal, but only a small proportion of LHDs reported having interoperable systems, pointing to a substantial need among LHDs nationwide. PMID:27684616

  12. System architecture of communication infrastructures for PPDR organisations

    NASA Astrophysics Data System (ADS)

    Müller, Wilmuth

    2017-04-01

    The growing number of events affecting public safety and security (PS and S) on a regional scale, with the potential to grow into large-scale cross-border disasters, puts increased pressure on organizations responsible for PS and S. In order to respond to such events in a timely and adequate manner, Public Protection and Disaster Relief (PPDR) organizations need to cooperate, align their procedures and activities, share the needed information and be interoperable. Existing PPDR/PMR technologies do not provide broadband capability, which is a major limitation in supporting new services and hence new information flows, and they currently have no successor. There is also no known standard that addresses the interoperability of these technologies. The paper at hand provides an approach to tackle the above-mentioned aspects by defining an Enterprise Architecture (EA) of PPDR organizations and a System Architecture of next-generation PPDR communication networks for a variety of applications and services on broadband networks, including the ability of inter-system, inter-agency and cross-border operations. The Open Safety and Security Architecture Framework (OSSAF) provides a framework and approach to coordinate the perspectives of different types of stakeholders within a PS and S organization. It aims at bridging the silos in the chain of command and at leveraging interoperability between PPDR organizations. The framework incorporates concepts of several mature enterprise architecture frameworks, including the NATO Architecture Framework (NAF). However, OSSAF does not provide details on how NAF should be used for describing the OSSAF perspectives and views. In this contribution, a mapping of the NAF elements to the OSSAF views is provided. Based on this mapping, an EA of PPDR organizations with a focus on communication infrastructure related capabilities is presented. Following the capability modeling, a system architecture for secure and interoperable communication infrastructures for PPDR organizations is presented. This architecture was implemented within a project sponsored by the European Union and successfully demonstrated in a live validation exercise in June 2016.

  13. Approaching semantic interoperability in Health Level Seven

    PubMed Central

    Alschuler, Liora

    2010-01-01

    ‘Semantic Interoperability’ is a driving objective behind many of Health Level Seven's standards. The objective in this paper is to take a step back, and consider what semantic interoperability means, assess whether or not it has been achieved, and, if not, determine what concrete next steps can be taken to get closer. A framework for measuring semantic interoperability is proposed, using a technique called the ‘Single Logical Information Model’ framework, which relies on an operational definition of semantic interoperability and an understanding that interoperability improves incrementally. Whether semantic interoperability tomorrow will enable one computer to talk to another, much as one person can talk to another person, is a matter for speculation. It is assumed, however, that what gets measured gets improved, and in that spirit this framework is offered as a means to improvement. PMID:21106995

  14. Standardized exchange of clinical documents--towards a shared care paradigm in glaucoma treatment.

    PubMed

    Gerdsen, F; Müller, S; Jablonski, S; Prokosch, H-U

    2006-01-01

    The exchange of medical data from research and clinical routine across institutional borders is essential to establish an integrated healthcare platform. In this project we want to realize the standardized exchange of medical data between different healthcare institutions to implement an integrated and interoperable information system supporting clinical treatment and research of glaucoma. The central point of our concept is a standardized communication model based on the Clinical Document Architecture (CDA). Further, a communication concept between different health care institutions applying the developed document model has been defined. With our project we have been able to prove that standardized communication between an Electronic Medical Record (EMR), an Electronic Health Record (EHR) and the Erlanger Glaucoma Register (EGR) based on the established conceptual models, which rely on CDA rel.1 level 1 and SCIPHOX, could be implemented. The HL7-tool-based deduction of a suitable CDA rel.2 compliant schema showed significant differences when compared with the manually created schema. Finally fundamental requirements, which have to be implemented for an integrated health care platform, have been identified. An interoperable information system can enhance both clinical treatment and research projects. By automatically transferring screening findings from a glaucoma research project to the electronic medical record of our ophthalmology clinic, clinicians could benefit from the availability of a longitudinal patient record. The CDA as a standard for exchanging clinical documents has demonstrated its potential to enhance interoperability within a future shared care paradigm.

  15. Integrating reasoning and clinical archetypes using OWL ontologies and SWRL rules.

    PubMed

    Lezcano, Leonardo; Sicilia, Miguel-Angel; Rodríguez-Solano, Carlos

    2011-04-01

    Semantic interoperability is essential to facilitate the computerized support for alerts, workflow management and evidence-based healthcare across heterogeneous electronic health record (EHR) systems. Clinical archetypes, which are formal definitions of specific clinical concepts defined as specializations of a generic reference (information) model, provide a mechanism to express data structures in a shared and interoperable way. However, currently available archetype languages do not provide direct support for mapping to formal ontologies and then exploiting reasoning on clinical knowledge, which are key ingredients of full semantic interoperability, as stated in the SemanticHEALTH report [1]. This paper reports on an approach to translate definitions expressed in the openEHR Archetype Definition Language (ADL) to a formal representation expressed using the Ontology Web Language (OWL). The formal representations are then integrated with rules expressed with Semantic Web Rule Language (SWRL) expressions, providing an approach to apply the SWRL rules to concrete instances of clinical data. Sharing the knowledge expressed in the form of rules is consistent with the philosophy of open sharing, encouraged by archetypes. Our approach also allows the reuse of formal knowledge, expressed through ontologies, and extends reuse to propositions of declarative knowledge, such as those encoded in clinical guidelines. This paper describes the ADL-to-OWL translation approach, describes the techniques to map archetypes to formal ontologies, and demonstrates how rules can be applied to the resulting representation. We provide examples taken from a patient safety alerting system to illustrate our approach. Copyright © 2010 Elsevier Inc. All rights reserved.
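
    The flavor of the approach (OWL classes plus a SWRL rule applied to concrete instances) can be sketched with the owlready2 package; the ontology IRI, classes, property, and threshold below are invented, and running the rule requires the Pellet reasoner (Java). This is a sketch of the general technique, not the authors' ADL-to-OWL translator:

    ```python
    # Sketch of attaching a SWRL rule to an OWL ontology with owlready2:
    # an alert class is inferred when systolic blood pressure exceeds a
    # threshold. All names are invented for illustration.
    from owlready2 import Thing, Imp, get_ontology, sync_reasoner_pellet

    onto = get_ontology("http://example.org/ehr.owl")  # hypothetical IRI

    with onto:
        class Patient(Thing): pass
        class HighBPAlert(Thing): pass
        class has_systolic_bp(Patient >> float): pass

        rule = Imp()
        rule.set_as_rule(
            "Patient(?p), has_systolic_bp(?p, ?v), greaterThan(?v, 140.0) "
            "-> HighBPAlert(?p)")

    p = onto.Patient("p1")
    p.has_systolic_bp = [150.0]

    # Requires Java and the bundled Pellet reasoner to execute the rule.
    sync_reasoner_pellet(infer_property_values=True,
                         infer_data_property_values=True)
    print(onto.HighBPAlert.instances())  # expect [ehr.p1]
    ```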

  16. Towards semantic interoperability for electronic health records.

    PubMed

    Garde, Sebastian; Knaup, Petra; Hovenga, Evelyn; Heard, Sam

    2007-01-01

    In the field of open electronic health records (EHRs), openEHR as an archetype-based approach is being increasingly recognised. It is the objective of this paper to shortly describe this approach, and to analyse how openEHR archetypes impact on health professionals and semantic interoperability. Analysis of current approaches to EHR systems, terminology and standards developments. In addition to literature reviews, we organised face-to-face and additional telephone interviews and tele-conferences with members of relevant organisations and committees. The openEHR archetypes approach enables syntactic interoperability and semantic interpretability -- both important prerequisites for semantic interoperability. Archetypes enable the formal definition of clinical content by clinicians. To enable comprehensive semantic interoperability, the development and maintenance of archetypes needs to be coordinated internationally and across health professions. Domain knowledge governance comprises a set of processes that enable the creation, development, organisation, sharing, dissemination, use and continuous maintenance of archetypes. It needs to be supported by information technology. To enable EHRs, semantic interoperability is essential. The openEHR archetypes approach enables syntactic interoperability and semantic interpretability. However, without coordinated archetype development and maintenance, 'rank growth' of archetypes would jeopardize semantic interoperability. We therefore believe that openEHR archetypes and domain knowledge governance together create the knowledge environment required to adopt EHRs.

  17. PACS/information systems interoperability using Enterprise Communication Framework.

    PubMed

    alSafadi, Y; Lord, W P; Mankovich, N J

    1998-06-01

    Interoperability among healthcare applications goes beyond connectivity to allow components to exchange structured information and work together in a predictable, coordinated fashion. To facilitate building an interoperability infrastructure, an Enterprise Communication Framework (ECF) was developed by the members of the Andover Working Group for Healthcare Interoperability (AWG-OHI). The ECF consists of four models: 1) Use Case Model, 2) Domain Information Model (DIM), 3) Interaction Model, and 4) Message Model. To realize this framework, a software component called the Enterprise Communicator (EC) is used. In this paper, we will demonstrate the use of the framework in interoperating a picture archiving and communication system (PACS) with a radiology information system (RIS).

  18. The role of architecture and ontology for interoperability.

    PubMed

    Blobel, Bernd; González, Carolina; Oemig, Frank; Lopéz, Diego; Nykänen, Pirkko; Ruotsalainen, Pekka

    2010-01-01

    Turning from organization-centric to process-controlled or even personalized approaches, advanced healthcare settings have to meet special interoperability challenges. eHealth and pHealth solutions must assure interoperability between actors cooperating to achieve common business objectives. Hereby, the interoperability chain includes not only individually tailored technical systems, but also sensors and actuators. For enabling corresponding pervasive computing and even autonomic computing, individualized systems have to be based on an architecture framework covering many domains, scientifically managed by specialized disciplines using their specific ontologies in a formalized way. Therefore, interoperability has to advance from a communication protocol to an architecture-centric approach, mastering ontology coordination challenges.

  19. Evolution of System Architectures: Where Do We Need to Fail Next?

    NASA Astrophysics Data System (ADS)

    Bermudez, Luis; Alameh, Nadine; Percivall, George

    2013-04-01

    Innovation requires testing and failing. Thomas Edison was right when he said "I have not failed. I've just found 10,000 ways that won't work". For innovation and improvement of standards to happen, service architectures have to be tested again and again. Within the Open Geospatial Consortium (OGC), testing of service architectures has occurred for the last 15 years. This talk will present the evolution of these service architectures and a possible future path. OGC is a global forum for the collaboration of developers and users of spatial data products and services, and for the advancement and development of international standards for geospatial interoperability. The OGC Interoperability Program is a series of hands-on, fast-paced engineering initiatives to accelerate the development and acceptance of OGC standards. Each initiative is organized in threads that provide focus under a particular theme. The first testbed, OGC Web Services phase 1, completed in 2003, had four threads: Common Architecture, Web Mapping, Sensor Web and Web Imagery Enablement. The Common Architecture was a cross-thread theme, ensuring that the Web Mapping and Sensor Web experiments built on a common base architecture. The architecture was based on the three main SOA components: Broker, Requestor and Provider. It proposed a general service model defining service interactions and dependencies; a categorization of service types; registries to allow discovery and access of services; data models and encodings; and common services (WMS, WFS, WCS). For the latter, it drew a clear distinction between kinds of services: data services (e.g. WMS), application services (e.g. coordinate transformation) and server-side client applications (e.g. image exploitation). The latest testbed, OGC Web Services phase 9, completed in 2012, had five threads: Aviation, Cross-Community Interoperability (CCI), Security and Services Interoperability (SSI), OWS Innovations, and Compliance & Interoperability Testing & Evaluation (CITE). Compared to the first testbed, OWS-9 did not have a separate common architecture thread. Instead the emphasis was on brokering information models, securing them, and making data available efficiently on mobile devices. The outcome is an architecture based on usability and non-intrusiveness while leveraging mediation of information models from different communities. This talk will use lessons learned from the evolution from OGC Testbed phase 1 to phase 9 to better understand how global and complex infrastructures evolve to support many communities, including the Earth System Science community.
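
    OGC web services share a common key-value request pattern; for instance, a WMS capabilities document can be fetched as below (the endpoint URL is a placeholder, not a real server):

    # Build a standard OGC KVP request for a WMS capabilities document.
    # The endpoint URL is a placeholder; substitute any live WMS server.
    from urllib.parse import urlencode
    from urllib.request import urlopen

    endpoint = "https://example.org/wms"  # placeholder endpoint
    params = {
        "service": "WMS",              # OGC service type
        "version": "1.3.0",            # WMS protocol version
        "request": "GetCapabilities",  # standard operation name
    }
    url = f"{endpoint}?{urlencode(params)}"
    print(url)
    # with urlopen(url) as resp:       # uncomment against a real server
    #     capabilities_xml = resp.read()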

  20. tmBioC: improving interoperability of text-mining tools with BioC.

    PubMed

    Khare, Ritu; Wei, Chih-Hsuan; Mao, Yuqing; Leaman, Robert; Lu, Zhiyong

    2014-01-01

    The lack of interoperability among biomedical text-mining tools is a major bottleneck in creating more complex applications. Despite the availability of numerous methods and techniques for various text-mining tasks, combining different tools requires substantial effort and time owing to heterogeneity and variety in data formats. In response, BioC is a recent proposal that offers a minimalistic approach to tool interoperability by stipulating minimal changes to existing tools and applications. BioC is a family of XML formats that define how to present text documents and annotations, and it also provides easy-to-use functions to read/write documents in the BioC format. In this study, we introduce our text-mining toolkit, which is designed to perform several challenging and significant tasks in the biomedical domain, and repackage the toolkit into BioC to enhance its interoperability. Our toolkit consists of six state-of-the-art tools for named-entity recognition, normalization and annotation (PubTator) of genes (GenNorm), diseases (DNorm), mutations (tmVar), species (SR4GN) and chemicals (tmChem). Although developed within the same group, each tool is designed to process input articles and output annotations in a different format. We modify these tools and enable them to read/write data in the proposed BioC format. We find that, with the BioC family of formats and functions, only minimal changes were required to build the newer versions of the tools. The resulting BioC-wrapped toolkit, which we have named tmBioC, consists of our tools in BioC, an annotated full-text corpus in BioC, and a format detection and conversion tool. Furthermore, through participation in the 2013 BioCreative IV Interoperability Track, we empirically demonstrate that the tools in tmBioC can be more efficiently integrated with each other as well as with external tools: our experimental results show that using BioC reduces the lines of code required for text-mining tool integration by more than 60%. The tmBioC toolkit is publicly available at http://www.ncbi.nlm.nih.gov/CBBresearch/Lu/Demo/tmTools/. Database URL: http://www.ncbi.nlm.nih.gov/CBBresearch/Lu/Demo/tmTools/. Published by Oxford University Press 2014. This work is written by US Government employees and is in the public domain in the US.
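
    A minimal sketch of the BioC XML shape described above (collection, document, passage, annotation), built with Python's standard library; the sample text and annotation are invented, and the infon keys used by the actual tmBioC tools may differ.

    # Build a minimal BioC-shaped XML document. The nesting follows the
    # BioC format; the text and the gene annotation are invented.
    import xml.etree.ElementTree as ET

    collection = ET.Element("collection")
    ET.SubElement(collection, "source").text = "example"
    doc = ET.SubElement(collection, "document")
    ET.SubElement(doc, "id").text = "PMC0000001"
    passage = ET.SubElement(doc, "passage")
    ET.SubElement(passage, "offset").text = "0"
    ET.SubElement(passage, "text").text = "BRCA1 mutations raise cancer risk."
    ann = ET.SubElement(passage, "annotation", id="T1")
    ET.SubElement(ann, "infon", key="type").text = "Gene"
    ET.SubElement(ann, "location", offset="0", length="5")
    ET.SubElement(ann, "text").text = "BRCA1"

    print(ET.tostring(collection, encoding="unicode"))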

  1. Building a portable data and information interoperability infrastructure-framework for a standard Taiwan Electronic Medical Record Template.

    PubMed

    Jian, Wen-Shan; Hsu, Chien-Yeh; Hao, Te-Hui; Wen, Hsyien-Chia; Hsu, Min-Huei; Lee, Yen-Liang; Li, Yu-Chuan; Chang, Polun

    2007-11-01

    Traditional electronic health record (EHR) data are produced by various hospital information systems and, until the advent of XML technology, could not exist independently of the systems that produced them. The interoperability of a healthcare system can be divided into two dimensions: functional interoperability and semantic interoperability. Currently, no single EHR standard exists that provides complete EHR interoperability. In order to establish a national EHR standard, we developed a set of local EHR templates. The Taiwan Electronic Medical Record Template (TMT) is a standard that aims to achieve semantic interoperability in EHR exchanges nationally. The TMT architecture is composed of forms, components, sections, and elements. Data are stored in elements, which can be referenced by code set, data type, and narrative block. The TMT was established with the following requirements in mind: (1) transformable to international standards; (2) having a minimal impact on the existing healthcare system; (3) easy to implement and deploy; and (4) compliant with Taiwan's current laws and regulations. The TMT provides a basis for building a portable, interoperable information infrastructure for EHR exchange in Taiwan.
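
    The form/component/section/element hierarchy described above might be instantiated along the following lines; the tag and attribute names below are invented for illustration and are not the published TMT schema.

    # Hypothetical illustration of the TMT hierarchy (form -> component ->
    # section -> element); tag and attribute names are invented.
    import xml.etree.ElementTree as ET

    form = ET.Element("form", id="discharge-summary")
    component = ET.SubElement(form, "component", name="diagnosis")
    section = ET.SubElement(component, "section", name="primary")
    element = ET.SubElement(section, "element",
                            code="ICD-10:E11.9",  # reference into a code set
                            dataType="CD")        # coded-data element
    element.text = "Type 2 diabetes mellitus without complications"

    print(ET.tostring(form, encoding="unicode"))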

  2. System and methods of resource usage using an interoperable management framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heileman, Gregory L.; Jamkhedkar, Pramod A.; Lamb, Christopher C.

    A generic rights expression language allows interoperability across different computing environments, including resource usage by different applications. A formal framework for usage management provides scaffolding upon which interoperable usage management systems can be built. Certain features of the framework, such as the operational semantics, are standardized, while other areas are left free of standards to allow the choice and innovation needed to achieve a balance of flexibility and usability for interoperability in usage management systems.

  3. Information Management Challenges in Achieving Coalition Interoperability

    DTIC Science & Technology

    2001-12-01

    by J. Dyer. SESSION I: ARCHITECTURES AND STANDARDS: FUNDAMENTAL ISSUES. Chairman: Dr I. White (UK). Planning for Interoperability, by W.M. Gentleman ... framework – a crucial step toward achieving coalition C4I interoperability. Topics to be covered: 1) maintaining secure interoperability; 2) command system interfaces ...

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Widergren, Steven E.; Knight, Mark R.; Melton, Ronald B.

    The Interoperability Strategic Vision whitepaper aims to promote a common understanding of the meaning and characteristics of interoperability and to provide a strategy to advance the state of interoperability as applied to integration challenges facing grid modernization. This includes addressing the quality of integrating devices and systems and the discipline to improve the process of successfully integrating these components as business models and information technology improve over time. The strategic vision for interoperability described in this document applies throughout the electric energy generation, delivery, and end-use supply chain. Its scope includes interactive technologies and business processes from bulk energy levels to lower voltage level equipment and the millions of appliances that are becoming equipped with processing power and communication interfaces. A transformational aspect of a vision for interoperability in the future electric system is the coordinated operation of intelligent devices and systems at the edges of grid infrastructure. This challenge offers an example for addressing interoperability concerns throughout the electric system.

  5. Implementing Interoperability in the Seafood Industry: Learning from Experiences in Other Sectors.

    PubMed

    Bhatt, Tejas; Gooch, Martin; Dent, Benjamin; Sylvia, Gilbert

    2017-08-01

    Interoperability of communication and information technologies within and between businesses operating along supply chains is being pursued and implemented in numerous industries worldwide to increase the efficiency and effectiveness of operations. The desire for greater interoperability is also driven by the need to reduce business risk through more informed management decisions. Interoperability is achieved by the development of a technology architecture that guides the design and implementation of communication systems existing within individual businesses and between businesses comprising the supply chain. Technology architectures are developed through a purposeful dialogue about why the architecture is required, the benefits and opportunities that the architecture offers the industry, and how the architecture will translate into practical results. An assessment of how the finance, travel, and health industries and a sector of the food industry-fresh produce-have implemented interoperability was conducted to identify lessons learned that can aid the development of interoperability in the seafood industry. The findings include identification of the need for strong, effective governance during the establishment and operation of an interoperability initiative to ensure the existence of common protocols and standards. The resulting insights were distilled into a series of principles for enabling syntactic and semantic interoperability in any industry, which we summarize in this article. Categorized as "structural," "operational," and "integrative," the principles describe requirements and solutions that are pivotal to enabling businesses to create and capture value from full chain interoperability. The principles are also fundamental to allowing governments and advocacy groups to use traceability for public good. © 2017 Institute of Food Technologists®.

  6. Modified criteria for carotid sinus hypersensitivity are associated with increased mortality in a population-based study

    PubMed Central

    McDonald, Claire; Pearce, Mark S.; Newton, Julia L.; Kerr, Simon R.J.

    2016-01-01

    Abstract Aims Carotid sinus hypersensitivity (CSH) is arbitrarily defined as ≥3 s asystole or vasodepression of ≥50 mmHg in response to carotid sinus massage (CSM). Using this definition, 39% of older people meet the criteria for CSH. It has been suggested that current criteria are too sensitive. Krediet et al. [The history of diagnosing carotid sinus hypersensitivity: why are the current criteria too sensitive? Europace 2011;13:14–22] and Kerr et al. [Carotid sinus hypersensitivity in asymptomatic older persons: implications for diagnosis of syncope and falls. Arch Intern Med 2006;166:515–20] have proposed modified criteria. This population-based study aimed to compare the prevalence of CSH defined according to standard, Krediet, and Kerr criteria, and to establish whether CSH defined according to these criteria is associated with all-cause mortality. Methods and results A total of 272 community-dwelling people aged ≥65 were recruited at random. Carotid sinus massage was performed for 5 s in supine and head-up positions. Heart rate and blood pressure responses were recorded using an electrocardiogram and photoplethysmography. Cox regression analysis was used to examine the association between each definition of CSH and all-cause mortality. The prevalence of CSH defined according to standard, Krediet, and Kerr criteria was 39, 52, and 10%, respectively. Seventy-one participants died over a mean follow-up of 8.6 years (SD 2.1). Carotid sinus hypersensitivity defined according to standard or Krediet criteria was not associated with survival. Carotid sinus hypersensitivity defined according to Kerr criteria was associated with all-cause mortality independent of age and sex [hazard ratio (HR) 2.023, 95% confidence interval (CI) 1.131–3.618, P = 0.018]. This remained significant after adjusting for cardiovascular risk factors [HR 2.174, 95% CI 1.075–3.900, P = 0.009]. Conclusion Carotid sinus hypersensitivity defined according to Kerr criteria is associated with increased mortality. This raises an interesting question as to the suitability of the current criteria used to define CSH. PMID:27139698

  7. Developing a Coalition Battle Management Language to Facilitate Interoperability Between Operation CIS, and Simulations in Support of Training and Mission Rehearsal

    DTIC Science & Technology

    2005-06-01

    virtualisation of distributed computing and data resources such as processing, network bandwidth, and storage capacity, to create a single system ... and Simulation (M&S) will be integrated into this heterogeneous SOA. M&S functionality will be available in the form of operational M&S services. One ... documents defining net centric warfare, the use of M&S functionality is a common theme. Alberts and Hayes give a good overview on net centric operations

  8. Impact of coalition interoperability on PKI

    NASA Astrophysics Data System (ADS)

    Krall, Edward J.

    2003-07-01

    This paper examines methods for providing PKI interoperability among units of a coalition of armed forces drawn from different nations. The area in question is tactical identity management, for the purposes of confidentiality, integrity and non-repudiation in such a dynamic coalition. The interoperating applications under consideration range from email and other forms of store-and-forward messaging to TLS and IPSEC-protected real-time communications. Six interoperability architectures are examined with advantages and disadvantages of each described in the paper.

  9. Telemedicine system interoperability architecture: concept description and architecture overview.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Craft, Richard Layne, II

    2004-05-01

    In order for telemedicine to realize the vision of anywhere, anytime access to care, it must address the question of how to create a fully interoperable infrastructure. This paper describes the reasons for pursuing interoperability, outlines operational requirements that any interoperability approach needs to consider, proposes an abstract architecture for meeting these needs, identifies candidate technologies that might be used for rendering this architecture, and suggests a path forward that the telemedicine community might follow.

  10. Towards interoperable and reproducible QSAR analyses: Exchange of datasets.

    PubMed

    Spjuth, Ola; Willighagen, Egon L; Guha, Rajarshi; Eklund, Martin; Wikberg, Jarl Es

    2010-06-30

    QSAR is a widely used method to relate chemical structures to responses or properties based on experimental observations. Much effort has been made to evaluate and validate the statistical modeling in QSAR, but these analyses treat the dataset as fixed. An overlooked but highly important issue is the validation of the setup of the dataset, which comprises addition of chemical structures as well as selection of descriptors and software implementations prior to calculations. This process is hampered by the lack of standards and exchange formats in the field, making it virtually impossible to reproduce and validate analyses and drastically constraining collaboration and the re-use of data. We present a step towards standardizing QSAR analyses by defining interoperable and reproducible QSAR datasets, consisting of an open XML format (QSAR-ML) which builds on an open and extensible descriptor ontology. The ontology provides an extensible way of uniquely defining descriptors for use in QSAR experiments, and the exchange format supports multiple versioned implementations of these descriptors. Hence, a dataset described by QSAR-ML makes its setup completely reproducible. We also provide a reference implementation as a set of plugins for Bioclipse which simplifies setup of QSAR datasets, and allows for exporting in QSAR-ML as well as old-fashioned CSV formats. The implementation facilitates addition of new descriptor implementations from locally installed software and remote Web services; the latter is demonstrated with REST and XMPP Web services. Standardized QSAR datasets open up new ways to store, query, and exchange data for subsequent analyses. QSAR-ML supports completely reproducible creation of datasets, solving the problems of defining which software components were used and their versions, and the descriptor ontology eliminates confusion regarding descriptors by defining them crisply. This makes it easy to join, extend, and combine datasets and hence to work collectively, but it also allows for analyzing the effect descriptors have on the statistical model's performance. The presented Bioclipse plugins equip scientists with graphical tools that make QSAR-ML easily accessible for the community.
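
    The key idea -- a dataset that records exactly which descriptor, implementation, and version produced each value -- can be sketched as follows. Element names here are illustrative, not the published QSAR-ML schema.

    # Illustrative sketch of a QSAR-ML-style dataset entry: each descriptor
    # records an ontology identifier, the implementing software, and its
    # exact version, making the setup reproducible. Names are invented.
    import xml.etree.ElementTree as ET

    dataset = ET.Element("dataset", name="logP-training-set")
    ET.SubElement(dataset, "descriptor",
                  ontologyID="descriptor:xlogp",  # ontology reference
                  implementation="CDK",           # software implementation
                  version="1.3.5")                # exact version used
    structure = ET.SubElement(dataset, "structure", id="mol-001")
    ET.SubElement(structure, "value",
                  descriptorRef="descriptor:xlogp").text = "2.17"

    print(ET.tostring(dataset, encoding="unicode"))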

  11. Towards interoperable and reproducible QSAR analyses: Exchange of datasets

    PubMed Central

    2010-01-01

    Background QSAR is a widely used method to relate chemical structures to responses or properties based on experimental observations. Much effort has been made to evaluate and validate the statistical modeling in QSAR, but these analyses treat the dataset as fixed. An overlooked but highly important issue is the validation of the setup of the dataset, which comprises addition of chemical structures as well as selection of descriptors and software implementations prior to calculations. This process is hampered by the lack of standards and exchange formats in the field, making it virtually impossible to reproduce and validate analyses and drastically constraining collaboration and the re-use of data. Results We present a step towards standardizing QSAR analyses by defining interoperable and reproducible QSAR datasets, consisting of an open XML format (QSAR-ML) which builds on an open and extensible descriptor ontology. The ontology provides an extensible way of uniquely defining descriptors for use in QSAR experiments, and the exchange format supports multiple versioned implementations of these descriptors. Hence, a dataset described by QSAR-ML makes its setup completely reproducible. We also provide a reference implementation as a set of plugins for Bioclipse which simplifies setup of QSAR datasets, and allows for exporting in QSAR-ML as well as old-fashioned CSV formats. The implementation facilitates addition of new descriptor implementations from locally installed software and remote Web services; the latter is demonstrated with REST and XMPP Web services. Conclusions Standardized QSAR datasets open up new ways to store, query, and exchange data for subsequent analyses. QSAR-ML supports completely reproducible creation of datasets, solving the problems of defining which software components were used and their versions, and the descriptor ontology eliminates confusion regarding descriptors by defining them crisply. This makes it easy to join, extend, and combine datasets and hence to work collectively, but it also allows for analyzing the effect descriptors have on the statistical model's performance. The presented Bioclipse plugins equip scientists with graphical tools that make QSAR-ML easily accessible for the community. PMID:20591161

  12. 47 CFR 0.192 - Emergency Response Interoperability Center.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 1 2014-10-01 2014-10-01 false Emergency Response Interoperability Center. 0.192 Section 0.192 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL COMMISSION ORGANIZATION Organization Public Safety and Homeland Security Bureau § 0.192 Emergency Response Interoperability Center. (a...

  13. 47 CFR 0.192 - Emergency Response Interoperability Center.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 1 2013-10-01 2013-10-01 false Emergency Response Interoperability Center. 0.192 Section 0.192 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL COMMISSION ORGANIZATION Organization Public Safety and Homeland Security Bureau § 0.192 Emergency Response Interoperability Center. (a...

  14. 47 CFR 0.192 - Emergency Response Interoperability Center.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 1 2011-10-01 2011-10-01 false Emergency Response Interoperability Center. 0.192 Section 0.192 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL COMMISSION ORGANIZATION Organization Public Safety and Homeland Security Bureau § 0.192 Emergency Response Interoperability Center. (a...

  15. Environmental Models as a Service: Enabling Interoperability through RESTful Endpoints and API Documentation

    EPA Science Inventory

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...

  16. Environmental Models as a Service: Enabling Interoperability through RESTful Endpoints and API Documentation.

    EPA Science Inventory

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...

  17. caCORE version 3: Implementation of a model driven, service-oriented architecture for semantic interoperability.

    PubMed

    Komatsoulis, George A; Warzel, Denise B; Hartel, Francis W; Shanbhag, Krishnakant; Chilukuri, Ram; Fragoso, Gilberto; Coronado, Sherri de; Reeves, Dianne M; Hadfield, Jillaine B; Ludet, Christophe; Covitz, Peter A

    2008-02-01

    One of the requirements for a federated information system is interoperability, the ability of one computer system to access and use the resources of another system. This feature is particularly important in biomedical research systems, which need to coordinate a variety of disparate types of data. In order to meet this need, the National Cancer Institute Center for Bioinformatics (NCICB) has created the cancer Common Ontologic Representation Environment (caCORE), an interoperability infrastructure based on Model Driven Architecture. The caCORE infrastructure provides a mechanism to create interoperable biomedical information systems. Systems built using the caCORE paradigm address both aspects of interoperability: the ability to access data (syntactic interoperability) and understand the data once retrieved (semantic interoperability). This infrastructure consists of an integrated set of three major components: a controlled terminology service (Enterprise Vocabulary Services), a standards-based metadata repository (the cancer Data Standards Repository) and an information system with an Application Programming Interface (API) based on Domain Model Driven Architecture. This infrastructure is being leveraged to create a Semantic Service-Oriented Architecture (SSOA) for cancer research by the National Cancer Institute's cancer Biomedical Informatics Grid (caBIG).
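
    The semantic half of this -- values constrained to a controlled terminology registered as metadata -- can be illustrated with the hedged sketch below; the data element and its permitted values are invented, not actual caDSR or EVS content.

    # Hedged sketch: a metadata-registry-style common data element whose
    # permitted values come from a controlled vocabulary. Content invented.
    from dataclasses import dataclass, field

    @dataclass
    class CommonDataElement:
        public_id: str                       # registry identifier
        name: str                            # human-readable name
        permitted_values: set = field(default_factory=set)

        def is_valid(self, value: str) -> bool:
            # Semantic check: only terminology-sanctioned values pass.
            return value in self.permitted_values

    tumor_grade = CommonDataElement(
        public_id="CDE-0001",
        name="Tumor Grade",
        permitted_values={"G1", "G2", "G3", "G4", "GX"},
    )
    print(tumor_grade.is_valid("G2"))    # True
    print(tumor_grade.is_valid("high"))  # False: not in the vocabulary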

  18. caCORE version 3: Implementation of a model driven, service-oriented architecture for semantic interoperability

    PubMed Central

    Komatsoulis, George A.; Warzel, Denise B.; Hartel, Frank W.; Shanbhag, Krishnakant; Chilukuri, Ram; Fragoso, Gilberto; de Coronado, Sherri; Reeves, Dianne M.; Hadfield, Jillaine B.; Ludet, Christophe; Covitz, Peter A.

    2008-01-01

    One of the requirements for a federated information system is interoperability, the ability of one computer system to access and use the resources of another system. This feature is particularly important in biomedical research systems, which need to coordinate a variety of disparate types of data. In order to meet this need, the National Cancer Institute Center for Bioinformatics (NCICB) has created the cancer Common Ontologic Representation Environment (caCORE), an interoperability infrastructure based on Model Driven Architecture. The caCORE infrastructure provides a mechanism to create interoperable biomedical information systems. Systems built using the caCORE paradigm address both aspects of interoperability: the ability to access data (syntactic interoperability) and understand the data once retrieved (semantic interoperability). This infrastructure consists of an integrated set of three major components: a controlled terminology service (Enterprise Vocabulary Services), a standards-based metadata repository (the cancer Data Standards Repository) and an information system with an Application Programming Interface (API) based on Domain Model Driven Architecture. This infrastructure is being leveraged to create a Semantic Service Oriented Architecture (SSOA) for cancer research by the National Cancer Institute’s cancer Biomedical Informatics Grid (caBIG™). PMID:17512259

  19. Maturity Model for Advancing Smart Grid Interoperability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Knight, Mark; Widergren, Steven E.; Mater, J.

    2013-10-28

    Abstract—Interoperability is about the properties of devices and systems to connect and work properly. Advancing interoperability eases integration and maintenance of the resulting interconnection. This leads to faster integration, lower labor and component costs, predictability of projects and the resulting performance, and evolutionary paths for upgrade. When specifications are shared and standardized, competition and novel solutions can bring new value streams to the community of stakeholders involved. Advancing interoperability involves reaching agreement for how things join at their interfaces. The quality of the agreements and the alignment of parties involved in the agreement present challenges that are best met with process improvement techniques. The GridWise® Architecture Council (GWAC) sponsored by the United States Department of Energy is supporting an effort to use concepts from capability maturity models used in the software industry to advance interoperability of smart grid technology. An interoperability maturity model has been drafted and experience is being gained through trials on various types of projects and community efforts. This paper describes the value and objectives of maturity models, the nature of the interoperability maturity model and how it compares with other maturity models, and experiences gained with its use.

  20. Hearing Device Manufacturers Call for Interoperability and Standardization of Internet and Audiology.

    PubMed

    Laplante-Lévesque, Ariane; Abrams, Harvey; Bülow, Maja; Lunner, Thomas; Nelson, John; Riis, Søren Kamaric; Vanpoucke, Filiep

    2016-10-01

    This article describes the perspectives of hearing device manufacturers regarding the exciting developments that the Internet makes possible. Specifically, it proposes to join forces toward interoperability and standardization of Internet and audiology. A summary of why such a collaborative effort is required is provided from historical and scientific perspectives. A roadmap toward interoperability and standardization is proposed. Information and communication technologies improve the flow of health care data and pave the way to better health care. However, hearing-related products, features, and services are notoriously heterogeneous and incompatible with other health care systems (no interoperability). Standardization is the process of developing and implementing technical standards (e.g., Noah hearing database). All parties involved in interoperability and standardization realize mutual gains by making mutually consistent decisions. De jure (officially endorsed) standards can be developed in collaboration with large national health care systems as well as spokespeople for hearing care professionals and hearing device users. The roadmap covers mutual collaboration; data privacy, security, and ownership; compliance with current regulations; scalability and modularity; and the scope of interoperability and standards. We propose to join forces to pave the way to the interoperable Internet and audiology products, features, and services that the world needs.

  1. Reflections on the role of open source in health information system interoperability.

    PubMed

    Sfakianakis, S; Chronaki, C E; Chiarugi, F; Conforti, F; Katehakis, D G

    2007-01-01

    This paper reflects on the role of open source in health information system interoperability. Open source is a driving force in computer science research and the development of information systems. It facilitates the sharing of information and ideas, enables evolutionary development and open collaborative testing of code, and broadens the adoption of interoperability standards. In health care, information systems have been developed largely ad hoc following proprietary specifications and customized design. However, the wide deployment of integrated services such as Electronic Health Records (EHRs) over regional health information networks (RHINs) relies on interoperability of the underlying information systems and medical devices. This reflection is built on the experiences of the PICNIC project that developed shared software infrastructure components in open source for RHINs and the OpenECG network that offers open source components to lower the implementation cost of interoperability standards such as SCP-ECG in electrocardiography. Open source components implementing standards and a community providing feedback from real-world use are key enablers of health care information system interoperability. Investing in open source is investing in interoperability and a vital aspect of a long-term strategy towards comprehensive health services and clinical research.

  2. 77 FR 67815 - Federal Advisory Committee Act; Communications Security, Reliability, and Interoperability Council

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-14

    ..., Reliability, and Interoperability Council AGENCY: Federal Communications Commission. ACTION: Notice of public... persons that the Federal Communications Commission's (FCC) Communications Security, Reliability, and... the security, reliability, and interoperability of communications systems. On March 19, 2011, the FCC...

  3. Environmental Models as a Service: Enabling Interoperability through RESTful Endpoints and API Documentation (presentation)

    EPA Science Inventory

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...

  4. UAS Integration in the NAS Project: DAA-TCAS Interoperability "mini" HITL Primary Results

    NASA Technical Reports Server (NTRS)

    Rorie, Conrad; Fern, Lisa; Shively, Jay; Santiago, Confesor

    2016-01-01

    At the May 2015 SC-228 meeting, requirements for TCAS II interoperability became elevated in priority. A TCAS interoperability workgroup was formed to identify and address key issues and questions. The TCAS workgroup came up with an initial list of questions and a plan to address them. As part of that plan, NASA proposed to run a mini HITL to address display, alerting, and guidance issues. A TCAS Interoperability Workshop was held to determine potential display/alerting/guidance issues that could be explored in future NASA mini HITLs. Outcomes included consensus on the main functionality of DAA guidance when a TCAS II RA occurs, a prioritized list of independent variables for the experimental design, and a set of use cases to stress TCAS interoperability.

  5. An Ontological Solution to Support Interoperability in the Textile Industry

    NASA Astrophysics Data System (ADS)

    Duque, Arantxa; Campos, Cristina; Jiménez-Ruiz, Ernesto; Chalmeta, Ricardo

    Significant developments in information and communication technologies and challenging market conditions have forced enterprises to adapt their way of doing business. In this context, providing mechanisms to guarantee interoperability among heterogeneous organisations has become a critical issue. Even though prolific research has already been conducted in the area of enterprise interoperability, we have found that enterprises still struggle to introduce fully interoperable solutions, especially, in terms of the development and application of ontologies. Thus, the aim of this paper is to introduce basic ontology concepts in a simple manner and to explain the advantages of the use of ontologies to improve interoperability. We will also present a case study showing the implementation of an application ontology for an enterprise in the textile/clothing sector.

  6. 77 FR 37001 - Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-20

    ... of the Interoperability Services Layer, Attn: Ron Chen, 400 Gigling Road, Seaside, CA 93955. Title; Associated Form; and OMB Number: Interoperability Services Layer; OMB Control Number 0704-TBD. Needs and Uses... INFORMATION: Summary of Information Collection IoLS (Interoperability Layer Services) is an application in a...

  7. The eXtensible ontology development (XOD) principles and tool implementation to support ontology interoperability.

    PubMed

    He, Yongqun; Xiang, Zuoshuang; Zheng, Jie; Lin, Yu; Overton, James A; Ong, Edison

    2018-01-12

    Ontologies are critical to data/metadata and knowledge standardization, sharing, and analysis. With hundreds of biological and biomedical ontologies developed, it has become critical to ensure ontology interoperability and the usage of interoperable ontologies for standardized data representation and integration. The suite of web-based Ontoanimal tools (e.g., Ontofox, Ontorat, and Ontobee) supports different aspects of extensible ontology development. By summarizing the common features of Ontoanimal and other similar tools, we identified and proposed an "eXtensible Ontology Development" (XOD) strategy and its four associated principles. These include reusing existing terms and semantic relations from reliable ontologies, developing and applying well-established ontology design patterns (ODPs), and involving community efforts to support new ontology development, promoting standardized and interoperable data and knowledge representation and integration. The adoption of the XOD strategy, together with robust XOD tool development, will greatly support ontology interoperability and robust ontology applications that help make data Findable, Accessible, Interoperable and Reusable (i.e., FAIR).
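
    One XOD principle -- reusing existing terms by IRI rather than minting duplicates -- might look like the following with the rdflib library; the new ontology's namespace is hypothetical, and the reused identifier is assumed here to be OBI's "assay" term.

    # Sketch of the XOD "reuse existing terms" principle with rdflib: a new
    # ontology subclasses a term referenced by IRI from an existing ontology
    # instead of redefining the concept. The ex: namespace is hypothetical.
    from rdflib import Graph, Namespace
    from rdflib.namespace import OWL, RDF, RDFS

    EX = Namespace("http://example.org/myonto/")    # hypothetical IRI
    OBO = Namespace("http://purl.obolibrary.org/obo/")

    g = Graph()
    g.bind("ex", EX)
    g.bind("obo", OBO)

    # Reuse the existing term by IRI (assumed here: OBI_0000070, "assay").
    g.add((EX.ImagingAssay, RDF.type, OWL.Class))
    g.add((EX.ImagingAssay, RDFS.subClassOf, OBO.OBI_0000070))

    print(g.serialize(format="turtle"))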

  8. Achieving Interoperability in GEOSS - How Close Are We?

    NASA Astrophysics Data System (ADS)

    Arctur, D. K.; Khalsa, S. S.; Browdy, S. F.

    2010-12-01

    A primary goal of the Global Earth Observation System of Systems (GEOSS) is improving the interoperability between the observational, modelling, data assimilation, and prediction systems contributed by member countries. The GEOSS Common Infrastructure (GCI) comprises the elements designed to enable discovery and access to these diverse data and information sources. But to what degree can the mechanisms for accessing these data, and the data themselves, be considered interoperable? Will the separate efforts by Communities of Practice within GEO to build their own portals, such as for Energy, Biodiversity, and Air Quality, lead to fragmentation or synergy? What communication and leadership do we need with these communities to improve interoperability both within and across such communities? The Standards and Interoperability Forum (SIF) of GEO's Architecture and Data Committee has assessed progress towards achieving the goal of global interoperability and made recommendations regarding evolution of the architecture and overall data strategy to ensure fulfillment of the GEOSS vision. This presentation will highlight the results of this study and directions for further work.

  9. Leverage and Delegation in Developing an Information Model for Geology

    NASA Astrophysics Data System (ADS)

    Cox, S. J.

    2007-12-01

    GeoSciML is an information model and XML encoding developed by a group of primarily geologic survey organizations under the auspices of the IUGS CGI. The scope of the core model broadly corresponds with information traditionally portrayed on a geologic map, viz. interpreted geology, some observations, the map legend and accompanying memoir. The development of GeoSciML has followed the methodology specified for an Application Schema defined by OGC and ISO 19100 series standards. This requires agreement within a community concerning their domain model, its formal representation using UML, documentation as a Feature Type Catalogue, with an XML Schema implementation generated from the model by applying a rule-based transformation. The framework and technology support a modular governance process. Standard datatypes and GI components (geometry, the feature and coverage metamodels, metadata) are imported from the ISO framework. The observation and sampling model (including boreholes) is imported from OGC. The scale used for most scalar literal values (terms, codes, measures) allows for localization where necessary. Wildcards and abstract base-classes provide explicit extensibility points. Link attributes appear in a regular way in the encodings, allowing reference to external resources using URIs. The encoding is compatible with generic GI data-service interfaces (WFS, WMS, SOS). For maximum interoperability within a community, the interfaces may be specialised through domain-specified constraints (e.g. feature-types, scale and vocabulary bindings, query-models). Formalization using UML and XML allows use of standard validation and processing tools. Use of upper-level elements defined for generic GI applications reduces the development effort and governance responsibility, while maximising cross-domain interoperability. On the other hand, enabling specialization to be delegated in a controlled manner is essential for adoption across a range of subdisciplines and jurisdictions. The GeoSciML design team is responsible only for the part of the model that is unique to geology but for which general agreement can be reached within the domain. This paper is presented on behalf of the Interoperability Working Group of the IUGS Commission for Geoscience Information (CGI) - follow the web-link for details of the membership.

  10. An HL7-CDA wrapper for facilitating semantic interoperability to rule-based Clinical Decision Support Systems.

    PubMed

    Sáez, Carlos; Bresó, Adrián; Vicente, Javier; Robles, Montserrat; García-Gómez, Juan Miguel

    2013-03-01

    The success of Clinical Decision Support Systems (CDSS) greatly depends on their capability of being integrated into Health Information Systems (HIS). Several proposals have been published to date to permit CDSS to gather patient data from HIS. Some base the CDSS data input on the HL7 reference model; however, they are tailored to specific CDSS or clinical guideline technologies, or do not focus on standardizing the CDSS's resultant knowledge. We propose a solution for facilitating semantic interoperability for rule-based CDSS, focusing on standardized input and output documents conforming to an HL7-CDA wrapper. We define the HL7-CDA restrictions in an HL7-CDA implementation guide. Patient data and rule inference results are mapped to and from the CDSS, respectively, by means of a binding method based on an XML binding file. As an independent clinical document, the results of a CDSS can have clinical and legal validity. The proposed solution is being applied in a CDSS that provides patient-specific recommendations for the care management of outpatients with diabetes mellitus. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
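
    A heavily reduced sketch of wrapping a CDSS recommendation in a CDA-style document: only the urn:hl7-org:v3 namespace is real; the element subset and content are simplified inventions, not the implementation guide from the paper.

    # Heavily simplified CDA-style wrapper around a CDSS recommendation.
    # Only the HL7 v3 namespace is real; elements and content are invented.
    import xml.etree.ElementTree as ET

    NS = "urn:hl7-org:v3"
    ET.register_namespace("", NS)

    doc = ET.Element(f"{{{NS}}}ClinicalDocument")
    ET.SubElement(doc, f"{{{NS}}}title").text = "CDSS recommendation"
    component = ET.SubElement(doc, f"{{{NS}}}component")
    section = ET.SubElement(component, f"{{{NS}}}section")
    ET.SubElement(section, f"{{{NS}}}text").text = (
        "Recommend HbA1c test: last result is older than 6 months."
    )

    print(ET.tostring(doc, encoding="unicode"))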

  11. Collection and Processing of Data from Wrist Wearable Devices in Heterogeneous and Multiple-User Scenarios.

    PubMed

    de Arriba-Pérez, Francisco; Caeiro-Rodríguez, Manuel; Santos-Gago, Juan M

    2016-09-21

    Over recent years, we have witnessed the development of mobile and wearable technologies to collect data from human vital signs and activities. Nowadays, wrist wearables including sensors (e.g., heart rate, accelerometer, pedometer) that provide valuable data are common in the market. We are working on the analytic exploitation of this kind of data to support learners and teachers in educational contexts. More precisely, sleep and stress indicators are defined to assist teachers and learners in the regulation of their activities. During this development, we have identified interoperability challenges related to the collection and processing of data from wearable devices. Different vendors adopt their own approaches to the way data can be collected from wearables into third-party systems, which hinders developments such as the one we are carrying out. This paper contributes to identifying key interoperability issues in this kind of scenario and proposes guidelines to solve them. Taking these topics into account, this work is situated in the context of the standardization activities being carried out in the Internet of Things and Machine to Machine domains.
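
    The interoperability problem described here -- the same signal arriving in vendor-specific shapes -- can be illustrated with a small normalization layer; both vendor payload formats below are invented.

    # Two invented vendor payloads for the same heart-rate sample, mapped
    # into one common record shape by a thin normalization layer.
    from datetime import datetime, timezone

    def from_vendor_a(payload: dict) -> dict:
        return {
            "metric": "heart_rate",
            "bpm": payload["hr"],
            "time": datetime.fromtimestamp(payload["ts"], tz=timezone.utc),
        }

    def from_vendor_b(payload: dict) -> dict:
        return {
            "metric": "heart_rate",
            "bpm": payload["heartRateBpm"],
            "time": datetime.fromisoformat(payload["timestamp"]),
        }

    a = from_vendor_a({"hr": 62, "ts": 1474440000})
    b = from_vendor_b({"heartRateBpm": 64,
                       "timestamp": "2016-09-21T08:00:00+00:00"})
    print(a["bpm"], b["bpm"])  # both records now share one schema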

  12. Defining the Collision Avoidance Region for DAA Systems

    NASA Technical Reports Server (NTRS)

    Thipphavong, David; Cone, Andrew; Park, Chunki; Lee, Seung Man; Santiago, Confesor

    2016-01-01

    Unmanned aircraft systems (UAS) will be required to equip with a detect-and-avoid (DAA) system in order to satisfy the federal aviation regulations to maintain well clear of other aircraft, some of which may be equipped with a Traffic Collision Avoidance System (TCAS) to mitigate the possibility of mid-air collisions. As such, the minimum operational performance standards (MOPS) for UAS DAA systems are being designed with TCAS interoperability in mind by a group of industry, government, and academic institutions named RTCA Special Committee 228 (SC-228). This document will discuss the development of the spatial-temporal volume known as the collision avoidance region, in which the DAA system is not allowed to provide vertical guidance to maintain or regain DAA well clear that could conflict with resolution advisories (RAs) issued by the intruder aircraft's TCAS system. Three collision avoidance region definition candidates were developed based on the existing TCAS RA and DAA alerting definitions. They were evaluated against each other in terms of their interoperability with TCAS RAs and DAA alerts in an unmitigated factorial encounter analysis of 1.3 million simulated pairs.
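
    For intuition only: a spatial-temporal region of this kind reduces to a predicate over predicted range, relative altitude, and time to closest approach. The thresholds below are invented placeholders, not values from the MOPS or from the candidates studied in the paper.

    # Toy predicate for a collision-avoidance-style region: suppress
    # conflicting vertical DAA guidance when an intruder is predicted
    # inside the region. Thresholds are invented placeholders.
    def in_collision_avoidance_region(range_nmi: float,
                                      rel_alt_ft: float,
                                      tau_sec: float) -> bool:
        HORIZ_NMI = 1.0   # placeholder horizontal extent
        VERT_FT = 800.0   # placeholder vertical extent
        TAU_SEC = 35.0    # placeholder time-to-closest-approach threshold
        return (range_nmi <= HORIZ_NMI
                and abs(rel_alt_ft) <= VERT_FT
                and tau_sec <= TAU_SEC)

    print(in_collision_avoidance_region(0.8, 300.0, 20.0))  # True
    print(in_collision_avoidance_region(2.5, 300.0, 20.0))  # False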

  13. Towards public health decision support: a systematic review of bidirectional communication approaches.

    PubMed

    Dixon, Brian E; Gamache, Roland E; Grannis, Shaun J

    2013-05-01

    To summarize the literature describing computer-based interventions aimed at improving bidirectional communication between clinical and public health. A systematic review of English articles using MEDLINE and Google Scholar. Search terms included public health, epidemiology, electronic health records, decision support, expert systems, and decision-making. Only articles that described the communication of information regarding emerging health threats from public health agencies to clinicians or provider organizations were included. Each article was independently reviewed by two authors. Ten peer-reviewed articles highlight a nascent but promising area of research and practice related to alerting clinicians about emerging threats. Current literature suggests that additional research and development in bidirectional communication infrastructure should focus on defining a coherent architecture, improving interoperability, establishing clear governance, and creating usable systems that will effectively deliver targeted, specific information to clinicians in support of patient and population decision-making. Increasingly available clinical information systems make it possible to deliver timely, relevant knowledge to frontline clinicians in support of population health. Future work should focus on developing a flexible, interoperable infrastructure for bidirectional communications capable of integrating public health knowledge into clinical systems and workflows.

  14. Collection and Processing of Data from Wrist Wearable Devices in Heterogeneous and Multiple-User Scenarios

    PubMed Central

    de Arriba-Pérez, Francisco; Caeiro-Rodríguez, Manuel; Santos-Gago, Juan M.

    2016-01-01

    Over recent years, we have witnessed the development of mobile and wearable technologies to collect data from human vital signs and activities. Nowadays, wrist wearables including sensors (e.g., heart rate, accelerometer, pedometer) that provide valuable data are common in the market. We are working on the analytic exploitation of this kind of data to support learners and teachers in educational contexts. More precisely, sleep and stress indicators are defined to assist teachers and learners in the regulation of their activities. During this development, we have identified interoperability challenges related to the collection and processing of data from wearable devices. Different vendors adopt their own approaches to the way data can be collected from wearables into third-party systems, which hinders developments such as the one we are carrying out. This paper contributes to identifying key interoperability issues in this kind of scenario and proposes guidelines to solve them. Taking these topics into account, this work is situated in the context of the standardization activities being carried out in the Internet of Things and Machine to Machine domains. PMID:27657081

  15. Unmanned Aircraft Systems (UAS) Integration in the National Airspace System (NAS) Project: Terminal Operations HITL 1: Primary Results

    NASA Technical Reports Server (NTRS)

    Rorie, Conrad; Fern, Lisa; Monk, Kevin; Roberts, Zach; Brandt, Summer

    2017-01-01

    This presentation covers the primary results of the Unmanned Aircraft Systems (UAS) Integration in the National Airspace System (NAS) Project Terminal Operations Foundational Human-in-the-Loop (HITL) simulation. The study tasked 16 pilots (half with manned piloting experience, and the other half with unmanned piloting experience) with maintaining "well clear" from other traffic while performing three different types of approaches into the Santa Rosa airport. A detect and avoid (DAA) system was provided to pilots to assist their ability to manage separation. The DAA system used in this test conformed to the criteria defined by RTCA Special Committee 228 (SC-228) in their Phase 1 Minimum Operational Performance Standards (MOPS) for UAS intending to operate in the NAS. The Phase 1 system was not designed to account for terminal operations, focusing instead on en route operations. To account for this, three different alerting and guidance configurations were tested in order to determine their effect on pilots operating the system in the terminal area. Results indicated that pilots in the alerting and guidance condition that provided the least assistance (fewer alert levels and guidance types) experienced slightly increased response times and rates of losses of separation. Additional data are presented on the effects of approach type, along with descriptive data on pilot maneuver preferences and ATC interoperability.

  16. Critical evaluation of connectivity-based point of care testing systems of glucose in a hospital environment.

    PubMed

    Floré, Katelijne M J; Fiers, Tom; Delanghe, Joris R

    2008-01-01

    In recent years a number of point of care testing (POCT) glucometers were introduced on the market. We investigated the analytical variability (lot-to-lot variation, calibration error, inter-instrument and inter-operator variability) of glucose POCT systems in a university hospital environment and compared these results with the analytical needs required for tight glucose monitoring. The reference hexokinase method was compared to different POCT systems based on glucose oxidase (blood gas instruments) or glucose dehydrogenase (handheld glucometers). Based upon daily internal quality control data, total errors were calculated for the various glucose methods and the analytical variability of the glucometers was estimated. The total error of the glucometers exceeded by far the desirable analytical specifications (based on a biological variability model). Lot-to-lot variation, inter-instrument variation, and inter-operator variability contributed approximately equally to the total variance. Because the distribution of hematocrit values in a hospital environment is broad, converting blood glucose into plasma values using a fixed factor further increases variance. The percentage of outliers exceeded the ISO 15197 criteria over a broad glucose concentration range. The total analytical variation of handheld glucometers is larger than expected. Clinicians should be aware that the variability of glucose measurements obtained by blood gas instruments is lower than that of results obtained with handheld glucometers on capillary blood.
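
    One widely used formulation of total analytical error from internal QC data is TE = |bias| + z * CV (with z = 1.65 for a one-sided 95% limit); the sketch below applies it to invented numbers and is not the calculation protocol of this particular study.

    # Total analytical error from QC data: TE = |bias%| + z * CV%.
    # Input values are invented for illustration.
    def total_error_pct(mean: float, target: float, sd: float,
                        z: float = 1.65) -> float:
        bias_pct = abs(mean - target) / target * 100.0
        cv_pct = sd / mean * 100.0
        return bias_pct + z * cv_pct

    # Example: one glucometer QC level with a 100 mg/dL target.
    te = total_error_pct(mean=104.0, target=100.0, sd=5.2)
    print(f"total error = {te:.1f}%")  # compare against the quality spec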

  17. Organisational Interoperability: Evaluation and Further Development of the OIM Model

    DTIC Science & Technology

    2003-06-01

    an Organizational Interoperability Maturity Model (OIM) to evaluate interoperability at the organizational level. The OIM considers the human activity aspects of military operations, which are not covered in other models. This paper describes how the model has been used to identify problems and to

  18. First Experiences Using XACML for Access Control in Distributed Systems

    NASA Technical Reports Server (NTRS)

    Lorch, Marcus; Proctor, Seth; Lepro, Rebekah; Kafura, Dennis; Shah, Sumit

    2003-01-01

    Authorization systems today are increasingly complex. They span domains of administration, rely on many different authentication sources, and manage permissions that can be as complex as the system itself. Worse still, while there are many standards that define authentication mechanisms, the standards that address authorization are less well defined and tend to work only within homogeneous systems. This paper presents XACML, a standard access control language, as one component of a distributed and inter-operable authorization framework. Several emerging systems which incorporate XACML are discussed. These discussions illustrate how authorization can be deployed in distributed, decentralized systems. Finally, some new and future topics are presented to show where this work is heading and how it will help connect the general components of an authorization system.
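
    A toy decision sketch of the XACML model -- rules that match a target (subject, resource, action) and yield an effect, combined first-applicable -- is shown below. This is not an XACML engine, and the attribute names and policy are invented.

    # Toy policy decision in the spirit of XACML: each rule matches a
    # target and yields an effect; rules combine first-applicable.
    RULES = [
        {"subject-role": "clinician", "resource": "ehr-record",
         "action": "read", "effect": "Permit"},
        {"subject-role": "*", "resource": "ehr-record",
         "action": "*", "effect": "Deny"},   # default-deny fallback
    ]

    def decide(request: dict) -> str:
        for rule in RULES:
            if all(rule[k] in ("*", request.get(k))
                   for k in ("subject-role", "resource", "action")):
                return rule["effect"]
        return "NotApplicable"

    print(decide({"subject-role": "clinician",
                  "resource": "ehr-record", "action": "read"}))  # Permit
    print(decide({"subject-role": "billing",
                  "resource": "ehr-record", "action": "read"}))  # Deny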

  19. Clinical knowledge governance: the international perspective.

    PubMed

    Garde, Sebastian

    2013-01-01

    As a basis for semantic interoperability, a Clinical Knowledge Resource for a clinical concept should ideally be defined formally, and defined once, in a way that all clinical professions and all countries can agree on. Clinical Knowledge Governance is required to create high-quality, reusable Clinical Knowledge Resources and achieve this aim. Traditionally, this is a time-consuming and cumbersome process, relying heavily on face-to-face meetings and on being able to get sufficient input from clinicians. In a national or even international space, however, the processes involved in creating Clinical Knowledge Resources need to be streamlined. For this, a Web 2.0 tool that supports online collaboration of clinicians during the creation and publishing of Clinical Knowledge Resources has been developed. This tool is named the Clinical Knowledge Manager (CKM) and supports the development, review and publication of Clinical Knowledge Resources. Post-publication activities such as adding terminology bindings, translating the Clinical Knowledge Resource into another language and republishing it are also supported. The acceptance of Clinical Knowledge Resources depends on their quality and on being able to determine that quality; for example, it is important to know that a broad number of reviewers from various clinical disciplines have been involved in the development of the Clinical Knowledge Resource. We are still far from realizing the vision of a global repository of a great number of reusable, high-quality Clinical Knowledge Resources, which can provide the basis for broad semantic interoperability between systems. However, progress towards this aim is being made around the world.

  20. An ontology for component-based models of water resource systems

    NASA Astrophysics Data System (ADS)

    Elag, Mostafa; Goodall, Jonathan L.

    2013-08-01

    Component-based modeling is an approach for simulating water resource systems where a model is composed of a set of components, each with a defined modeling objective, interlinked through data exchanges. Component-based modeling frameworks are used within the hydrologic, atmospheric, and earth surface dynamics modeling communities. While these efforts have been advancing, it has become clear that the water resources modeling community in particular, and arguably the larger earth science modeling community as well, faces a challenge of fully and precisely defining the metadata for model components. The lack of a unified framework for model component metadata limits interoperability between modeling communities and the reuse of models across modeling frameworks due to ambiguity about the model and its capabilities. To address this need, we propose an ontology for water resources model components that describes core concepts and relationships using the Web Ontology Language (OWL). The ontology that we present, which is termed the Water Resources Component (WRC) ontology, is meant to serve as a starting point that can be refined over time through engagement by the larger community until a robust knowledge framework for water resource model components is achieved. This paper presents the methodology used to arrive at the WRC ontology, the WRC ontology itself, and examples of how the ontology can aid in component-based water resources modeling by (i) assisting in identifying relevant models, (ii) encouraging proper model coupling, and (iii) facilitating interoperability across earth science modeling frameworks.
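
    Component metadata of the kind the paper describes might be expressed in OWL as sketched below with rdflib; the class and property names are illustrative, not the published WRC ontology.

    # Illustrative OWL sketch of water-resources component metadata in the
    # spirit of the WRC ontology; names are invented.
    from rdflib import Graph, Literal, Namespace
    from rdflib.namespace import OWL, RDF, RDFS

    WRC = Namespace("http://example.org/wrc#")  # hypothetical namespace
    g = Graph()
    g.bind("wrc", WRC)

    g.add((WRC.ModelComponent, RDF.type, OWL.Class))
    g.add((WRC.hasOutputVariable, RDF.type, OWL.ObjectProperty))

    # A runoff component declares what it can exchange with other
    # components, aiding discovery and proper coupling.
    g.add((WRC.RunoffComponent, RDFS.subClassOf, WRC.ModelComponent))
    g.add((WRC.RunoffComponent, WRC.hasOutputVariable, WRC.Streamflow))
    g.add((WRC.Streamflow, RDFS.label, Literal("streamflow")))

    print(g.serialize(format="turtle"))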

  1. Enterprise tools to promote interoperability: MonitoringResources.org supports design and documentation of large-scale, long-term monitoringprograms

    NASA Astrophysics Data System (ADS)

    Weltzin, J. F.; Scully, R. A.; Bayer, J.

    2016-12-01

    Individual natural resource monitoring programs have evolved in response to different organizational mandates, jurisdictional needs, issues and questions. We are establishing a collaborative forum for large-scale, long-term monitoring programs to identify opportunities where collaboration could yield efficiency in monitoring design, implementation, analyses, and data sharing. We anticipate these monitoring programs will have similar requirements - e.g. survey design, standardization of protocols and methods, information management and delivery - that could be met by enterprise tools to promote sustainability, efficiency and interoperability of information across geopolitical boundaries or organizational cultures. MonitoringResources.org, a project of the Pacific Northwest Aquatic Monitoring Partnership, provides an on-line suite of enterprise tools focused on aquatic systems in the Pacific Northwest Region of the United States. We will leverage and expand this existing capacity to support continental-scale monitoring of both aquatic and terrestrial systems. The current stakeholder group is focused on programs led by bureaus within the Department of the Interior, but the tools will be readily and freely available to a broad variety of other stakeholders. Here, we report the results of two initial stakeholder workshops focused on (1) establishing a collaborative forum of large-scale monitoring programs, (2) identifying and prioritizing shared needs, (3) evaluating existing enterprise resources, (4) defining priorities for development of enhanced capacity for MonitoringResources.org, and (5) identifying a small number of pilot projects that can be used to define and test development requirements for specific monitoring programs.

  2. Using ontologies to improve semantic interoperability in health data.

    PubMed

    Liyanage, Harshana; Krause, Paul; De Lusignan, Simon

    2015-07-10

    The present-day health data ecosystem comprises a wide array of complex heterogeneous data sources. A wide range of clinical, health care, social and other clinically relevant information is stored in these data sources. These data exist either as structured data or as free-text. These data are generally individual person-based records, but social care data are generally case based and less formal data sources may be shared by groups. The structured data may be organised in a proprietary way or be coded using one of many coding, classification or terminology systems that have often evolved in isolation and were designed to meet the needs of the context in which they were developed. This has resulted in a wide range of semantic interoperability issues that make the integration of data held on these different systems challenging. We present semantic interoperability challenges and describe a classification of these. We propose a four-step process and a toolkit for those wishing to work more ontologically, progressing from the identification and specification of concepts to validating a final ontology. The four steps are: (1) the identification and specification of data sources; (2) the conceptualisation of semantic meaning; (3) defining to what extent routine data can be used as a measure of the process or outcome of care required in a particular study or audit; and (4) the formalisation and validation of the final ontology. The toolkit is an extension of a previous schema created to formalise the development of ontologies related to chronic disease management. The extensions are focused on facilitating rapid building of ontologies for time-critical research studies.

  3. 77 FR 48153 - Federal Advisory Committee Act; Communications Security, Reliability, and Interoperability Council

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-13

    ..., Reliability, and Interoperability Council AGENCY: Federal Communications Commission. ACTION: Notice of public..., Reliability, and Interoperability Council (CSRIC) will hold its fifth meeting. The CSRIC will vote on... to the FCC regarding best practices and actions the FCC can take to ensure the security, reliability...

  4. Evaluation of Interoperability Protocols in Repositories of Electronic Theses and Dissertations

    ERIC Educational Resources Information Center

    Hakimjavadi, Hesamedin; Masrek, Mohamad Noorman

    2013-01-01

    Purpose: The purpose of this study is to evaluate the status of eight interoperability protocols within repositories of electronic theses and dissertations (ETDs) as an introduction to further studies on the feasibility of deploying these protocols in upcoming areas of interoperability. Design/methodology/approach: Three surveys of 266 ETD…

  5. Examining the Relationship between Electronic Health Record Interoperability and Quality Management

    ERIC Educational Resources Information Center

    Purcell, Bernice M.

    2013-01-01

    A lack of interoperability impairs data quality among health care providers' electronic health record (EHR) systems. The research problem is whether International Organization for Standardization (ISO) 9000 principles relate to interoperability in the implementation of EHR systems. The purpose of the nonexperimental quantitative research…

  6. Interoperability of Demand Response Resources Demonstration in NY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wellington, Andre

    2014-03-31

    The Interoperability of Demand Response Resources Demonstration in NY (Interoperability Project) was awarded to Con Edison in 2009. The objective of the project was to develop and demonstrate methodologies to enhance the ability of customer-sited Demand Response resources to integrate more effectively with electric delivery companies and regional transmission organizations.

  7. Watershed and Economic Data InterOperability (WEDO): Facilitating Discovery, Evaluation and Integration through the Sharing of Watershed Modeling Data

    EPA Science Inventory

    Watershed and Economic Data InterOperability (WEDO) is a system of information technologies designed to publish watershed modeling studies for reuse. WEDO facilitates three aspects of interoperability: discovery, evaluation and integration of data. This increased level of interop...

  8. Reminiscing about 15 years of interoperability efforts

    DOE PAGES

    Van de Sompel, Herbert; Nelson, Michael L.

    2015-11-01

    Over the past fifteen years, our perspective on tackling information interoperability problems for web-based scholarship has evolved significantly. In this opinion piece, we look back at three efforts that we have been involved in that aptly illustrate this evolution: OAI-PMH, OAI-ORE, and Memento. Understanding that no interoperability specification is neutral, we attempt to characterize the perspectives and technical toolkits that provided the basis for these endeavors. In that regard, we consider repository-centric and web-centric interoperability perspectives, and the use of a Linked Data or a REST/HATEOAS technology stack, respectively. In addition, we lament the lack of interoperability across nodes that play a role in web-based scholarship, but end on a constructive note with some ideas regarding a possible path forward.

  9. The HDF Product Designer - Interoperability in the First Mile

    NASA Astrophysics Data System (ADS)

    Lee, H.; Jelenak, A.; Habermann, T.

    2014-12-01

    Interoperable data have been a long-time goal in many scientific communities. The recent growth in analysis, visualization and mash-up applications that expect data stored in a standardized manner has brought the interoperability issue to the fore. On the other hand, producing interoperable data is often regarded as a sideline task in a typical research team for which resources are not readily available. The HDF Group is developing a software tool aimed at lessening the burden of creating data in standards-compliant, interoperable HDF5 files. The tool, named HDF Product Designer, lowers the threshold needed to design such files by providing a user interface that combines the rich HDF5 feature set with applicable metadata conventions. Users can quickly devise new HDF5 files while at the same time seamlessly incorporating the latest best practices and conventions from their community. That is what the term interoperability in the first mile means: enabling generation of interoperable data in HDF5 files from the onset of their production. The tool also incorporates collaborative features, allowing a team approach to file design, as well as easy transfer of best practices as they are being developed. The current state of the tool and the plans for future development will be presented. Constructive input from interested parties is always welcome.
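
    "First mile" interoperability amounts to writing community conventions into the HDF5 file at creation time. Below is a minimal sketch with the h5py library, assuming a CF-style convention tag and illustrative attribute values; this is not output of HDF Product Designer itself.

        import h5py
        import numpy as np

        with h5py.File("example.h5", "w") as f:
            # File-level convention identifier that downstream tools look for.
            f.attrs["Conventions"] = "CF-1.7"

            # A dataset annotated with the attributes the convention expects.
            dset = f.create_dataset("temperature", data=np.zeros(10, dtype="f4"))
            dset.attrs["units"] = "K"
            dset.attrs["long_name"] = "surface temperature"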

  10. Potential interoperability problems facing multi-site radiation oncology centers in The Netherlands

    NASA Astrophysics Data System (ADS)

    Scheurleer, J.; Koken, Ph; Wessel, R.

    2014-03-01

    Aim: To identify potential interoperability problems facing multi-site Radiation Oncology (RO) departments in the Netherlands and solutions for unambiguous multi-system workflows. Specific challenges confronting the RO department of VUmc (RO-VUmc), which is soon to open a satellite department, were characterized. Methods: A nationwide questionnaire survey was conducted to identify possible interoperability problems and solutions. Further detailed information was obtained by in-depth interviews at 3 Dutch RO institutes that already operate in more than one site. Results: The survey had a 100% response rate (n=21). Altogether 95 interoperability problems were described. Most reported problems were on a strategic and semantic level. The majority were DICOM(-RT) and HL7 related (n=65), primarily between treatment planning and verification systems or between departmental and hospital systems. Seven were identified as being relevant for RO-VUmc. Departments have overcome interoperability problems with their own, or with tailor-made vendor solutions. There was little knowledge about or utilization of solutions developed by Integrating the Healthcare Enterprise Radiation Oncology (IHE-RO). Conclusions: Although interoperability problems are still common, solutions have been identified. Awareness of IHE-RO needs to be raised. No major new interoperability problems are predicted as RO-VUmc develops into a multi-site department.

  11. Is There Evidence of Cost Benefits of Electronic Medical Records, Standards, or Interoperability in Hospital Information Systems? Overview of Systematic Reviews

    PubMed Central

    2017-01-01

    Background Electronic health (eHealth) interventions may improve the quality of care by providing timely, accessible information about one patient or an entire population. Electronic patient care information forms the nucleus of computerized health information systems. However, interoperability among systems depends on the adoption of information standards. Additionally, investing in technology systems requires cost-effectiveness studies to ensure the sustainability of processes for stakeholders. Objective The objective of this study was to assess the cost-effectiveness of the use of electronically available inpatient data systems, health information exchange, or standards to support interoperability among systems. Methods An overview of systematic reviews was conducted, assessing the MEDLINE, Cochrane Library, LILACS, and IEEE Library databases to identify relevant studies published through February 2016. The search was supplemented by citations from the selected papers. The primary outcome was cost-effectiveness, and the secondary outcome was the impact on quality of care. Independent reviewers selected studies, and disagreement was resolved by consensus. The quality of the included studies was evaluated using a measurement tool to assess systematic reviews (AMSTAR). Results The primary search identified 286 papers, and two papers were manually included. A total of 211 were systematic reviews. From the 20 studies that were selected after screening the title and abstract, 14 were deemed ineligible, and six met the inclusion criteria. The interventions did not show a measurable effect on cost-effectiveness. Despite the limited number of studies, the heterogeneity of electronic systems reported, and the types of intervention in hospital routines, it was possible to identify some preliminary benefits in quality of care. Hospital information systems, along with information sharing, had the potential to improve clinical practice by reducing staff errors or incidents, improving automated harm detection, monitoring infections more effectively, and enhancing the continuity of care during physician handoffs. Conclusions This review identified some benefits in the quality of care but did not provide evidence that the implementation of eHealth interventions had a measurable impact on cost-effectiveness in hospital settings. However, further evidence is needed to infer the impact of standards adoption or interoperability on the cost benefits of health care; this in turn requires further research. PMID:28851681

  12. Is There Evidence of Cost Benefits of Electronic Medical Records, Standards, or Interoperability in Hospital Information Systems? Overview of Systematic Reviews.

    PubMed

    Reis, Zilma Silveira Nogueira; Maia, Thais Abreu; Marcolino, Milena Soriano; Becerra-Posada, Francisco; Novillo-Ortiz, David; Ribeiro, Antonio Luiz Pinho

    2017-08-29

    Electronic health (eHealth) interventions may improve the quality of care by providing timely, accessible information about one patient or an entire population. Electronic patient care information forms the nucleus of computerized health information systems. However, interoperability among systems depends on the adoption of information standards. Additionally, investing in technology systems requires cost-effectiveness studies to ensure the sustainability of processes for stakeholders. The objective of this study was to assess the cost-effectiveness of the use of electronically available inpatient data systems, health information exchange, or standards to support interoperability among systems. An overview of systematic reviews was conducted, assessing the MEDLINE, Cochrane Library, LILACS, and IEEE Library databases to identify relevant studies published through February 2016. The search was supplemented by citations from the selected papers. The primary outcome was cost-effectiveness, and the secondary outcome was the impact on quality of care. Independent reviewers selected studies, and disagreement was resolved by consensus. The quality of the included studies was evaluated using a measurement tool to assess systematic reviews (AMSTAR). The primary search identified 286 papers, and two papers were manually included. A total of 211 were systematic reviews. From the 20 studies that were selected after screening the title and abstract, 14 were deemed ineligible, and six met the inclusion criteria. The interventions did not show a measurable effect on cost-effectiveness. Despite the limited number of studies, the heterogeneity of electronic systems reported, and the types of intervention in hospital routines, it was possible to identify some preliminary benefits in quality of care. Hospital information systems, along with information sharing, had the potential to improve clinical practice by reducing staff errors or incidents, improving automated harm detection, monitoring infections more effectively, and enhancing the continuity of care during physician handoffs. This review identified some benefits in the quality of care but did not provide evidence that the implementation of eHealth interventions had a measurable impact on cost-effectiveness in hospital settings. However, further evidence is needed to infer the impact of standards adoption or interoperability on the cost benefits of health care; this in turn requires further research. ©Zilma Silveira Nogueira Reis, Thais Abreu Maia, Milena Soriano Marcolino, Francisco Becerra-Posada, David Novillo-Ortiz, Antonio Luiz Pinho Ribeiro. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 29.08.2017.

  13. Connected Lighting System Interoperability Study Part 1: Application Programming Interfaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gaidon, Clement; Poplawski, Michael

    This report is the first in a series of studies that focuses on interoperability as realized by the use of Application Programming Interfaces (APIs). It explores the diversity of such interfaces in several connected lighting systems; characterizes the extent of interoperability that they provide; and illustrates challenges, limitations, and tradeoffs that were encountered during this exploration.

  14. Smart Grid Interoperability Maturity Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Widergren, Steven E.; Levinson, Alex; Mater, J.

    2010-04-28

    The integration of automation associated with electricity resources (including transmission and distribution automation and demand-side resources operated by end-users) is key to supporting greater efficiencies and incorporating variable renewable resources and electric vehicles into the power system. The integration problems faced by this community are analogous to those faced in the health industry, emergency services, and other complex communities with many stakeholders. To highlight this issue and encourage communication and the development of a smart grid interoperability community, the GridWise Architecture Council (GWAC) created an Interoperability Context-Setting Framework. This "conceptual model" has been helpful to explain the importance of organizational alignment in addition to technical and informational interface specifications for "smart grid" devices and systems. As a next step to building a community sensitive to interoperability, the GWAC is investigating an interoperability maturity model (IMM) based on work done by others to address similar circumstances. The objective is to create a tool or set of tools that encourages a culture of interoperability in this emerging community. The tools would measure status and progress, analyze gaps, and prioritize efforts to improve the situation.

  15. Enabling model checking for collaborative process analysis: from BPMN to `Network of Timed Automata'

    NASA Astrophysics Data System (ADS)

    Mallek, Sihem; Daclin, Nicolas; Chapurlat, Vincent; Vallespir, Bruno

    2015-04-01

    Interoperability is a prerequisite for partners involved in a collaboration. As a consequence, the lack of interoperability is now considered a major obstacle. The research work presented in this paper aims to develop an approach that allows specifying and verifying a set of interoperability requirements to be satisfied by each partner in the collaborative process prior to process implementation. To enable the verification of these interoperability requirements, it is necessary first and foremost to generate a model of the targeted collaborative process; for this research effort, the standardised language BPMN 2.0 is used. Afterwards, a verification technique must be introduced, and model checking is the preferred option herein. This paper focuses on the application of the model checker UPPAAL to verify interoperability requirements for the given collaborative process model. First, this entails translating the collaborative process model from BPMN into a UPPAAL modelling language called 'Network of Timed Automata'. Second, it becomes necessary to formalise the interoperability requirements into properties in the dedicated UPPAAL query language, i.e. the temporal logic TCTL.
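
    To give a flavour of the formalisation step, interoperability requirements of the kind described might be written as UPPAAL queries such as the following; the process, state, and clock names are hypothetical and stand in for elements of the translated BPMN model:

        // Safety: the translated collaborative process can never deadlock.
        A[] not deadlock

        // Liveness (leads-to): a message sent by one partner is eventually received.
        PartnerA.sent --> PartnerB.received

        // Bounded response: the receiving partner reacts within 30 time units
        // (clock x is reset when the message is sent).
        A[] (PartnerB.received imply x <= 30)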

  16. An open repositories network development for medical teaching resources.

    PubMed

    Soula, Gérard; Darmoni, Stefan; Le Beux, Pierre; Renard, Jean-Marie; Dahamna, Badisse; Fieschi, Marius

    2010-01-01

    The lack of interoperability between repositories of heterogeneous and geographically widespread data is an obstacle to the diffusion, sharing and reutilization of those data. We present the development of an open repositories network taking into account both the syntactic and semantic interoperability of the different repositories and based on international standards in this field. The network is used by the medical community in France for the diffusion and sharing of digital teaching resources. The syntactic interoperability of the repositories is managed using the OAI-PMH protocol for the exchange of metadata describing the resources. Semantic interoperability is based, on one hand, on the LOM standard for the description of resources and on MESH for the indexing of the latter and, on the other hand, on semantic interoperability management designed to optimize compliance with standards and the quality of the metadata.
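
    The syntactic layer described above rests on OAI-PMH, which is a plain HTTP request-response protocol. A minimal harvesting sketch in Python follows; the repository base URL is hypothetical, while the verb, metadata prefix, and XML namespaces are those defined by OAI-PMH 2.0:

        import requests
        import xml.etree.ElementTree as ET

        BASE = "https://repository.example.org/oai"  # hypothetical endpoint

        # ListRecords with the mandatory Dublin Core metadata format.
        resp = requests.get(BASE, params={"verb": "ListRecords",
                                          "metadataPrefix": "oai_dc"})
        resp.raise_for_status()

        ns = {"oai": "http://www.openarchives.org/OAI/2.0/",
              "dc": "http://purl.org/dc/elements/1.1/"}
        root = ET.fromstring(resp.content)
        for record in root.findall(".//oai:record", ns):
            title = record.find(".//dc:title", ns)
            print(title.text if title is not None else "(no title)")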

  17. ARGOS Policy Brief on Semantic Interoperability

    PubMed Central

    KALRA, Dipak; MUSEN, Mark; SMITH, Barry; CEUSTERS, Werner

    2016-01-01

    Semantic interoperability is one of the priority themes of the ARGOS Trans-Atlantic Observatory. This topic represents a globally recognised challenge that must be addressed if electronic health records are to be shared among heterogeneous systems, and the information in them exploited to the maximum benefit of patients, professionals, health services, research, and industry. Progress in this multi-faceted challenge has been piecemeal, and valuable lessons have been learned, and approaches discovered, in Europe and in the US that can be shared and combined. Experts from both continents have met at three ARGOS workshops during 2010 and 2011 to share understanding of these issues and how they might be tackled collectively from both sides of the Atlantic. This policy brief summarises the problems and the reasons why they are important to tackle, and also why they are so difficult. It outlines the major areas of semantic innovation that exist and that are available to help address this challenge. It proposes a series of next steps that need to be championed on both sides of the Atlantic if further progress is to be made in sharing and analysing electronic health records meaningfully. Semantic interoperability requires the use of standards, not only for EHR data to be transferred and structurally mapped into a receiving repository, but also for the clinical content of the EHR to be interpreted in conformity with the original meanings intended by its authors. Wide-scale engagement with professional bodies, globally, is needed to develop these clinical information standards. Accurate and complete clinical documentation, faithful to the patient’s situation, and interoperability between systems, require widespread and dependable access to published and maintained collections of coherent and quality-assured semantic resources, including models such as archetypes and templates that would (1) provide clinical context, (2) be mapped to interoperability standards for EHR data, (3) be linked to well specified multi-lingual terminology value sets, and (4) be derived from high quality ontologies. There is a need to gain greater experience in how semantic resources should be defined, validated, and disseminated, and in how users (who increasingly will include patients) should be educated to improve the quality and consistency of EHR documentation and to make full use of it. There are urgent needs to scale up the authorship, acceptance, and adoption of clinical information standards, to leverage and harmonise the islands of standardisation optimally, to assure the quality of the artefacts produced, and to organise end-to-end governance of the development and adoption of solutions. PMID:21893897

  18. Special Topic Interoperability and EHR: Combining openEHR, SNOMED, IHE, and Continua as approaches to interoperability on national eHealth.

    PubMed

    Beštek, Mate; Stanimirović, Dalibor

    2017-08-09

    The main aims of the paper comprise the characterization and examination of the potential approaches regarding interoperability. This includes openEHR, SNOMED, IHE, and Continua as combined interoperability approaches, possibilities for their incorporation into the eHealth environment, and identification of the main success factors in the field which are necessary for achieving the required interoperability and, consequently, for the successful implementation of eHealth projects in general. The paper presents an in-depth analysis of the potential application of the openEHR, SNOMED, IHE and Continua approaches in the development and implementation process of eHealth in Slovenia. The research method used is both exploratory and deductive in nature. The methodological framework is grounded in information retrieval with a special focus on research and charting of existing experience in the field, and on sources, both electronic and written, which include interoperability concepts and related implementation issues. The paper tries to answer the following inquiries, which complement each other: 1. Scrutiny of the potential approaches that could alleviate the pertinent interoperability issues in the Slovenian eHealth context. 2. Analysis of the possibilities (requirements) for their inclusion in the construction process for individual eHealth solutions. 3. Identification and charting of the main success factors in the interoperability field that critically influence the development and implementation of eHealth projects in an efficient manner. The insights provided and the success factors identified could serve as strategic starting points for the continuous integration of interoperability principles into the healthcare domain. Moreover, the general implementation of the identified success factors could facilitate better penetration of ICT into the healthcare environment and enable the eHealth-based transformation of the health system, especially in countries which are still in an early phase of eHealth planning and development and are often confronted with differing interests, requirements, and contending strategies.

  19. Ensuring Sustainable Data Interoperability Across the Natural and Social Sciences

    NASA Astrophysics Data System (ADS)

    Downs, R. R.; Chen, R. S.

    2015-12-01

    Both the natural and social science data communities are attempting to address the long-term sustainability of their data infrastructures in rapidly changing research, technological, and policy environments. Many parts of these communities are also considering how to improve the interoperability and integration of their data and systems across natural, social, health, and other domains. However, these efforts have generally been undertaken in parallel, with little thought about how different sustainability approaches may impact long-term interoperability from scientific, legal, or economic perspectives, or vice versa, i.e., how improved interoperability could enhance—or threaten—infrastructure sustainability. Scientific progress depends substantially on the ability to learn from the legacy of previous work available for current and future scientists to study, often by integrating disparate data not previously assembled. Digital data are less likely than scientific publications to be usable in the future unless they are managed by science-oriented repositories that can support long-term data access with the documentation and services needed for future interoperability. We summarize recent discussions in the social and natural science communities on emerging approaches to sustainability and relevant interoperability activities, including efforts by the Belmont Forum E-Infrastructures project to address global change data infrastructure needs; the Group on Earth Observations to further implement data sharing and improve data management across diverse societal benefit areas; and the Research Data Alliance to develop legal interoperability principles and guidelines and to address challenges faced by domain repositories. We also examine emerging needs for data interoperability in the context of the post-2015 development agenda and the expected set of Sustainable Development Goals (SDGs), which set ambitious targets for sustainable development, poverty reduction, and environmental stewardship by 2030. These efforts suggest the need for a holistic approach towards improving and implementing strategies, policies, and practices that will ensure long-term sustainability and interoperability of scientific data repositories and networks across multiple scientific domains.

  20. A Pragmatic Approach to Sustainable Interoperability for the Web 2.0 World

    NASA Astrophysics Data System (ADS)

    Wright, D. J.; Sankaran, S.

    2015-12-01

    In the geosciences, interoperability is a fundamental requirement. Members of various standards organizations such as the OGC and ISO-TC 211 have done yeoman service to promote a standards-centric approach to managing the interoperability challenges that organizations face today. The specific challenges that organizations face when adopting interoperability patterns are many. One approach, that of mandating the use of specific standards, has been reasonably successful. But scientific communities, as with all others, ultimately want their solutions to be widely accepted and used. And to this end there is a crying need to explore all possible interoperability patterns without restricting the choices to mandated standards. Standards are created by a slow and deliberative process that sometimes takes a long time to come to fruition, and they can therefore fall short of user expectations. It seems therefore that organizations are left with a series of seemingly orthogonal requirements when they want to pursue interoperability. They want a robust but agile solution, a mature approach that also satisfies the latest technology trends, and so on. Sustainable interoperability patterns need to be forward looking and should choose the patterns and paradigms of the Web 2.0 generation. To this end, the key is to choose platform technologies that embrace multiple interoperability mechanisms that are built on fundamental "open" principles and which align with popular mainstream patterns. We seek to explore data-, metadata- and web service-related interoperability patterns through the prism of building solutions that encourage strong implementer and end-user engagement, improved usability and scalability considerations, and appealing developer frameworks that can grow the audience. The path to tread is not new, and the geocommunity only needs to observe and align its end goals with current Web 2.0 patterns to realize all the benefits that today we all take for granted as part of our everyday use of technology.

  1. Web Services Security - Implementation and Evaluation Issues

    NASA Astrophysics Data System (ADS)

    Pimenidis, Elias; Georgiadis, Christos K.; Bako, Peter; Zorkadis, Vassilis

    Web services development is a key theme in the utilization and commercial exploitation of the semantic web. Paramount to the development and offering of such services is the issue of security features and the way these are applied in instituting trust amongst participants and recipients of the service. Implementing such security features is a major challenge to developers as they need to balance these with performance and interoperability requirements. Being able to evaluate the level of security offered is a desirable feature for any prospective participant. The authors attempt to address the issues of security requirements and evaluation criteria, while they discuss the challenges of security implementation through a simple web service application case.

  2. Groundwater data network interoperability

    USGS Publications Warehouse

    Brodaric, Boyan; Booth, Nathaniel; Boisvert, Eric; Lucido, Jessica M.

    2016-01-01

    Water data networks are increasingly being integrated to answer complex scientific questions that often span large geographical areas and cross political borders. Data heterogeneity is a major obstacle that impedes interoperability within and between such networks. It is resolved here for groundwater data at five levels of interoperability, within a Spatial Data Infrastructure architecture. The result is a pair of distinct national groundwater data networks for the United States and Canada, and a combined data network in which they are interoperable. This combined data network enables, for the first time, transparent public access to harmonized groundwater data from both sides of the shared international border.

  3. Improving Patient Safety with X-Ray and Anesthesia Machine Ventilator Synchronization: A Medical Device Interoperability Case Study

    NASA Astrophysics Data System (ADS)

    Arney, David; Goldman, Julian M.; Whitehead, Susan F.; Lee, Insup

    When an x-ray image is needed during surgery, clinicians may stop the anesthesia machine ventilator while the exposure is made. If the ventilator is not restarted promptly, the patient may experience severe complications. This paper explores the interconnection of a ventilator and a simulated x-ray machine into a prototype plug-and-play medical device system. This work supports ongoing standards efforts on interoperability frameworks by helping to develop functional and non-functional requirements, and it illustrates the potential patient safety benefits of interoperable medical device systems by implementing a solution to a clinical use case requiring interoperability.

  4. Report on the Second Catalog Interoperability Workshop

    NASA Technical Reports Server (NTRS)

    Thieman, James R.; James, Mary E.

    1988-01-01

    The events, resolutions, and recommendations of the Second Catalog Interoperability Workshop, held at JPL in January 1988, are discussed. This workshop dealt with the issues of standardization and communication among directories, catalogs, and inventories in the earth and space science data management environment. The Directory Interchange Format, being constructed as a standard for the exchange of directory information among participating data systems, is discussed. Involvement in the interoperability effort by NASA, NOAA, USGS, and NSF is described, and plans for future interoperability are considered. The NASA Master Directory prototype is presented and critiqued, and options for additional capabilities are debated.

  5. A logical approach to semantic interoperability in healthcare.

    PubMed

    Bird, Linda; Brooks, Colleen; Cheong, Yu Chye; Tun, Nwe Ni

    2011-01-01

    Singapore is in the process of rolling out a number of national e-health initiatives, including the National Electronic Health Record (NEHR). A critical enabler in the journey towards semantic interoperability is a Logical Information Model (LIM) that harmonises the semantics of the information structure with the terminology. The Singapore LIM uses a combination of international standards, including ISO 13606-1 (a reference model for electronic health record communication), ISO 21090 (healthcare datatypes), and SNOMED CT (healthcare terminology). The LIM is accompanied by a logical design approach, used to generate interoperability artifacts, and incorporates mechanisms for achieving unidirectional and bidirectional semantic interoperability.
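
    To make the terminology binding concrete: ISO 21090 defines coded datatypes such as CD, which carry a code together with its code system. Below is a minimal Python sketch of such a structure, with field names chosen to mirror the standard's spirit rather than its normative definition:

        from dataclasses import dataclass

        @dataclass
        class CodedValue:
            code: str           # e.g. a SNOMED CT concept identifier
            code_system: str    # OID of the terminology the code comes from
            display_name: str   # human-readable label

        # SNOMED CT (OID 2.16.840.1.113883.6.96) concept for blood pressure.
        bp = CodedValue(code="75367002",
                        code_system="2.16.840.1.113883.6.96",
                        display_name="Blood pressure")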

  6. Architecture for Survivable Systems Processing (ASSP). Technology benefits for Open System Interconnects

    NASA Technical Reports Server (NTRS)

    Wood, Richard J.

    1992-01-01

    The Architecture for Survivable Systems Processing (ASSP) program is a two phase program whose objective is the derivation, specification, development and validation of an open system architecture capable of supporting advanced processing needs of space, ground, and launch vehicle operations. The output of the first phase is a set of hardware and software standards and specifications defining this architecture at three levels. The second phase will validate these standards and develop the technology necessary to achieve strategic hardness, packaging density, throughput requirements, and interoperability/interchangeability.

  7. Self-defining as sexually abused and adult sexual risk behavior: Results from a cross-sectional survey of women attending an STD clinic

    PubMed Central

    Carey, Michael P.; Coury-Doniger, Patricia

    2011-01-01

    Objective Childhood sexual abuse (CSA) is associated with increased sexual risk behavior in adulthood, and this association may be mediated by traumagenic dynamics constructs (i.e., traumatic sexualization, trust, guilt, and powerlessness). However, few studies have investigated whether such relationships hold for women who do not identify as having experienced CSA despite meeting objective criteria that CSA occurred. This study sought to determine whether individuals who met research criteria for CSA and who self-defined as sexually abused differed on traumagenic dynamics constructs and current sexual risk behavior from individuals who met research criteria for CSA and who did not self-define as sexually abused. Methods Participants were 481 women recruited from a publicly-funded STD clinic. Participants completed a computerized survey assessing childhood sexual experiences and adult sexual risk behavior. Results Of the total sample, 206 (43%) met research criteria for CSA. Of the women meeting research criteria for CSA, 142 (69%) self-defined as sexually abused. Women who met research criteria for CSA reported more traumatic sexualization, more trust of a partner, more powerlessness, less sexual guilt, more episodes of unprotected sex, more sex partners, and greater likelihood of sex trading, compared to women who did not meet research criteria for CSA. Among women meeting research criteria, those who self-defined as sexually abused did not differ from those who did not self-define on any of the traumagenic dynamics constructs or on current sexual risk behavior, controlling for CSA characteristics. Conclusions Individuals who were sexually abused as children by behavioral research criteria are at risk for engaging in sexual risk behavior as adults, regardless of whether or not they perceive the experience to be CSA. Future research is needed to understand how non-definers perceive these childhood sexual experiences. PMID:21620162

  8. Collaborative development of predictive toxicology applications

    PubMed Central

    2010-01-01

    OpenTox provides an interoperable, standards-based Framework for the support of predictive toxicology data management, algorithms, modelling, validation and reporting. It is relevant to satisfying the chemical safety assessment requirements of the REACH legislation as it supports access to experimental data, (Quantitative) Structure-Activity Relationship models, and toxicological information through an integrating platform that adheres to regulatory requirements and OECD validation principles. Initial research defined the essential components of the Framework including the approach to data access, schema and management, use of controlled vocabularies and ontologies, architecture, web service and communications protocols, and selection and integration of algorithms for predictive modelling. OpenTox provides end-user oriented tools to non-computational specialists, risk assessors, and toxicological experts in addition to Application Programming Interfaces (APIs) for developers of new applications. OpenTox actively supports public standards for data representation, interfaces, vocabularies and ontologies, Open Source approaches to core platform components, and community-based collaboration approaches, so as to progress system interoperability goals. The OpenTox Framework includes APIs and services for compounds, datasets, features, algorithms, models, ontologies, tasks, validation, and reporting which may be combined into multiple applications satisfying a variety of different user needs. OpenTox applications are based on a set of distributed, interoperable OpenTox API-compliant REST web services. The OpenTox approach to ontology allows for efficient mapping of complementary data coming from different datasets into a unifying structure having a shared terminology and representation. Two initial OpenTox applications are presented as an illustration of the potential impact of OpenTox for high-quality and consistent structure-activity relationship modelling of REACH-relevant endpoints: ToxPredict which predicts and reports on toxicities for endpoints for an input chemical structure, and ToxCreate which builds and validates a predictive toxicity model based on an input toxicology dataset. Because of the extensible nature of the standardised Framework design, barriers of interoperability between applications and content are removed, as the user may combine data, models and validation from multiple sources in a dependable and time-effective way. PMID:20807436

  9. Collaborative development of predictive toxicology applications.

    PubMed

    Hardy, Barry; Douglas, Nicki; Helma, Christoph; Rautenberg, Micha; Jeliazkova, Nina; Jeliazkov, Vedrin; Nikolova, Ivelina; Benigni, Romualdo; Tcheremenskaia, Olga; Kramer, Stefan; Girschick, Tobias; Buchwald, Fabian; Wicker, Joerg; Karwath, Andreas; Gütlein, Martin; Maunz, Andreas; Sarimveis, Haralambos; Melagraki, Georgia; Afantitis, Antreas; Sopasakis, Pantelis; Gallagher, David; Poroikov, Vladimir; Filimonov, Dmitry; Zakharov, Alexey; Lagunin, Alexey; Gloriozova, Tatyana; Novikov, Sergey; Skvortsova, Natalia; Druzhilovsky, Dmitry; Chawla, Sunil; Ghosh, Indira; Ray, Surajit; Patel, Hitesh; Escher, Sylvia

    2010-08-31

    OpenTox provides an interoperable, standards-based Framework for the support of predictive toxicology data management, algorithms, modelling, validation and reporting. It is relevant to satisfying the chemical safety assessment requirements of the REACH legislation as it supports access to experimental data, (Quantitative) Structure-Activity Relationship models, and toxicological information through an integrating platform that adheres to regulatory requirements and OECD validation principles. Initial research defined the essential components of the Framework including the approach to data access, schema and management, use of controlled vocabularies and ontologies, architecture, web service and communications protocols, and selection and integration of algorithms for predictive modelling. OpenTox provides end-user oriented tools to non-computational specialists, risk assessors, and toxicological experts in addition to Application Programming Interfaces (APIs) for developers of new applications. OpenTox actively supports public standards for data representation, interfaces, vocabularies and ontologies, Open Source approaches to core platform components, and community-based collaboration approaches, so as to progress system interoperability goals. The OpenTox Framework includes APIs and services for compounds, datasets, features, algorithms, models, ontologies, tasks, validation, and reporting which may be combined into multiple applications satisfying a variety of different user needs. OpenTox applications are based on a set of distributed, interoperable OpenTox API-compliant REST web services. The OpenTox approach to ontology allows for efficient mapping of complementary data coming from different datasets into a unifying structure having a shared terminology and representation. Two initial OpenTox applications are presented as an illustration of the potential impact of OpenTox for high-quality and consistent structure-activity relationship modelling of REACH-relevant endpoints: ToxPredict which predicts and reports on toxicities for endpoints for an input chemical structure, and ToxCreate which builds and validates a predictive toxicity model based on an input toxicology dataset. Because of the extensible nature of the standardised Framework design, barriers of interoperability between applications and content are removed, as the user may combine data, models and validation from multiple sources in a dependable and time-effective way.
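
    The REST style described above can be illustrated with a short sketch; the service root below is hypothetical, and the media type is one commonly used for chemical structure representations:

        import requests

        SERVICE = "https://opentox.example.org"  # hypothetical service root

        # Every compound is a plain URI; content negotiation selects the
        # representation returned for it.
        resp = requests.get(f"{SERVICE}/compound/123",
                            headers={"Accept": "chemical/x-mdl-sdfile"})
        print(resp.status_code, resp.headers.get("Content-Type"))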

  10. Unmanned Aircraft Systems (UAS) Integration in the National Airspace System (NAS) Project: Advanced Collision Avoidance System for UAS (ACAS Xu) Interoperability White Paper Presentation

    NASA Technical Reports Server (NTRS)

    Fern, Lisa

    2017-01-01

    The Phase 1 DAA Minimum Operational Performance Standards (MOPS) provided requirements for two classes of DAA equipment: equipment Class 1 contains the basic DAA equipment required to assist a pilot in remaining well clear, while equipment Class 2 integrates the Traffic Alert and Collision Avoidance (TCAS) II system. Thus, the Class 1 system provides RWC functionality only, while the Class 2 system is intended to provide both RWC and Collision Avoidance (CA) functionality, in compliance with the Minimum Aviation System Performance (MASPS) for the Interoperability of Airborne Collision Avoidance Systems. The FAAs TCAS Program Office is currently developing Airborne Collision Avoidance System X (ACAS X) to support the objectives of the Federal Aviation Administrations (FAA) Next Generation Air Transportation System Program (NextGen). ACAS X has a suite of variants with a common underlying design that are intended to be optimized for their intended airframes and operations. ACAS Xu being is designed for UAS and allows for new surveillance technologies and tailored logic for platforms with different performance characteristics. In addition to Collision Avoidance (CA) alerting and guidance, ACAS Xu is being tuned to provide RWC alerting and guidance in compliance with the SC 228 DAA MOPS. With a single logic performing both RWC and CA functions, ACAS Xu will provide industry with an integrated DAA solution that addresses many of the interoperability shortcomings of Phase I systems. While the MOPS for ACAS Xu will specify an integrated DAA system, it will need to show compliance with the RWC alerting thresholds and alerting requirements defined in the DAA Phase 2 MOPS. Further, some functional components of the ACAS Xu system such as the remote pilots displayed guidance might be mostly references to the corresponding requirements in the DAA MOPS. To provide a seamless, integrated, RWC-CA system to assist the pilot in remaining well clear and avoiding collisions, several issues need to be addressed within the Phase 2 SC-228 DAA efforts. Interoperability of the RWC and CA alerting and guidance, and ensuring pilot comprehension, compliance and performance, will be a primary research area.

  11. Publication, discovery and interoperability of Clinical Decision Support Systems: A Linked Data approach.

    PubMed

    Marco-Ruiz, Luis; Pedrinaci, Carlos; Maldonado, J A; Panziera, Luca; Chen, Rong; Bellika, J Gustav

    2016-08-01

    The high costs involved in the development of Clinical Decision Support Systems (CDSS) make it necessary to share their functionality across different systems and organizations. Service Oriented Architectures (SOA) have been proposed to allow reusing CDSS by encapsulating them in a Web service. However, strong barriers to sharing CDS functionality are still present as a consequence of the lack of expressiveness of services' interfaces. Linked Services are the evolution of the Semantic Web Services paradigm to process Linked Data. They aim to provide semantic descriptions over SOA implementations to overcome the limitations derived from the syntactic nature of Web services technologies. Our objective is to facilitate the publication, discovery and interoperability of CDS services by evolving them into Linked Services that expose their interfaces as Linked Data. We developed methods and models to enhance CDS SOA as Linked Services that define a rich semantic layer based on machine-interpretable ontologies that power their interoperability and reuse. These ontologies provided unambiguous descriptions of CDS service properties to expose them to the Web of Data. We developed models compliant with Linked Data principles to create a semantic representation of the components that compose CDS services. To evaluate our approach, we implemented a set of CDS Linked Services using a Web service definition ontology. The definitions of Web services were linked to the models developed in order to attach unambiguous semantics to the service components. All models were bound to SNOMED-CT and public ontologies (e.g. Dublin Core) in order to provide a lingua franca for exploring them. Discovery and analysis of CDS services based on machine-interpretable models was performed by reasoning over the ontologies built. Linked Services can be used effectively to expose CDS services to the Web of Data by building on current CDS standards. This allows building shared Linked Knowledge Bases to provide machine-interpretable semantics for the CDS service description, alleviating the challenges of interoperability and reuse. Linked Services allow for building 'digital libraries' of distributed CDS services that can be hosted and maintained in different organizations. Copyright © 2016 Elsevier Inc. All rights reserved.
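
    What exposing a CDS service description as Linked Data might look like can be sketched with rdflib; the service IRI and property names below are hypothetical, while the SNOMED CT concept shown (75367002, blood pressure) and the Dublin Core terms are real vocabularies of the kind the paper mentions:

        from rdflib import Graph, Literal, Namespace
        from rdflib.namespace import DCTERMS, RDF

        EX = Namespace("http://example.org/cds/")   # hypothetical
        SCT = Namespace("http://snomed.info/id/")   # SNOMED CT URI scheme

        g = Graph()
        g.bind("dcterms", DCTERMS)

        service = EX["bp-alert-service"]
        g.add((service, RDF.type, EX.CDSService))
        g.add((service, DCTERMS.title, Literal("Blood pressure alerting service")))

        # Binding the input to a clinical concept lets clients discover the
        # service by meaning rather than by syntactic interface alone.
        g.add((service, EX.consumesConcept, SCT["75367002"]))

        print(g.serialize(format="turtle"))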

  12. Reuse and Interoperability of Avionics for Space Systems

    NASA Technical Reports Server (NTRS)

    Hodson, Robert F.

    2007-01-01

    The space environment presents unique challenges for avionics. Launch survivability, thermal management, radiation protection, and other factors are important for successful space designs. Many existing avionics designs use custom hardware and software to meet the requirements of space systems. Although some space vendors have moved more towards a standard product line approach to avionics, the space industry still lacks similar standards and common practices for avionics development. This lack of commonality manifests itself in limited reuse and a lack of interoperability. To address NASA's need for interoperable avionics that facilitate reuse, several hardware and software approaches are discussed. Experiences with existing space boards and the application of terrestrial standards are outlined. Enhancements and extensions to these standards are considered. A modular stack-based approach to space avionics is presented. Software and reconfigurable logic cores are considered for extending interoperability and reuse. Finally, some of the issues associated with the design of reusable interoperable avionics are discussed.

  13. Lessons Learned for Cx PRACA. Constellation Program Problem Reporting, Analysis and Corrective Action Process and System

    NASA Technical Reports Server (NTRS)

    Kelle, Pido I.; Ratterman, Christian; Gibbs, Cecil

    2009-01-01

    This slide presentation reviews the Constellation Program Problem Reporting, Analysis and Corrective Action Process and System (Cx PRACA). The goal of Cx PRACA is to incorporate lessons learned from the Shuttle, ISS, and Orbiter programs by creating a single tool for managing the PRACA process that clearly defines the scope of PRACA applicability and what must be reported, and that defines the ownership and responsibility for managing the PRACA process, including disposition approval authority. CxP PRACA is a process, supported by a single information-gathering data module which will be integrated with a single CxP information system, providing interoperability and import and export capability, making CxP PRACA a more effective and user-friendly technical and management tool.

  14. Using archetypes for defining CDA templates.

    PubMed

    Moner, David; Moreno, Alberto; Maldonado, José A; Robles, Montserrat; Parra, Carlos

    2012-01-01

    While HL7 CDA is a widely adopted standard for the documentation of clinical information, the archetype approach proposed by CEN/ISO 13606 and openEHR is gaining recognition as a means of describing domain models and medical knowledge. This paper describes our efforts in combining both standards. Using archetypes as an alternative for defining CDA templates permits new possibilities, all based on the formal nature of archetypes and their ability to merge medical knowledge and technical requirements for semantic interoperability of electronic health records into the same artifact. We describe the process followed for the normalization of existing legacy data in a hospital environment, from the importation of the HL7 CDA model into an archetype editor, to the definition of CDA archetypes and the application of those archetypes to obtain normalized CDA data instances.

  15. Interoperability Measurement

    DTIC Science & Technology

    2008-08-01

    facilitate the use of existing architecture descriptions in performing interoperability measurement. Noting that “everything in the world can be expressed as...biological, botanical, and genetic research, it has also been used with great success in the fields of ecology, medicine, the social sciences, the...appropriate for at least three reasons. First, systems perform different interoperations in different scenarios (i.e., they are used differently); second

  16. Commanding Heterogeneous Multi-Robot Teams

    DTIC Science & Technology

    2014-06-01

    Coalition Battle Management Language (C-BML) Study Group Report. 2005 Fall Simulation Interoperability Workshop (05F- SIW - 041), Orlando, FL, September...NMSG-085 CIG Land Operation Demonstration. 2013 Spring Simulation Interoperability Workshop (13S- SIW -031), San Diego, CA. April 2013. [4] K...Simulation Interoperability Workshop (10F- SIW -039), Orlando, FL, September 2010. [5] M. Langerwisch, M. Ax, S. Thamke, T. Remmersmann, A. Tiderko

  17. DeoR repression at-a-distance only weakly responds to changes in interoperator separation and DNA topology.

    PubMed Central

    Dandanell, G

    1992-01-01

    The interoperator distance between a synthetic operator Os and the deoP2O2-galK fusion was varied between 46 and 176 bp. The repression of deoP2-directed galK expression as a function of the interoperator distance (center-to-center) was measured in vivo in a single-copy system. The results show that the DeoR repressor can efficiently repress transcription at all the interoperator distances tested. The degree of repression depends very little on the spacing between the operators; however, a weak periodic dependency of 8-11 bp may exist. PMID:1437558

  18. Photovoltaic system criteria documents. Volume 3: Environmental issues and evaluation criteria for photovoltaic applications

    NASA Technical Reports Server (NTRS)

    Koenig, John C.; Billitti, Joseph W.; Tallon, John M.

    1979-01-01

    The environmental issues and evaluation criteria relating to the suitability of sites proposed for photovoltaic (PV) system deployment are identified. The important issues are defined, briefly discussed and then developed into evaluation criteria. System designers are provided with information on the environmental sensitivity of PV systems in realistic applications and with background material indicating the applicability of the siting issues identified; evaluation criteria are defined to facilitate the selection of sites that maximize PV system operation.

  19. A federated semantic metadata registry framework for enabling interoperability across clinical research and care domains.

    PubMed

    Sinaci, A Anil; Laleci Erturkmen, Gokce B

    2013-10-01

    In order to enable secondary use of Electronic Health Records (EHRs) by bridging the interoperability gap between the clinical care and research domains, this paper introduces a unified methodology and the supporting framework, which bring together the power of metadata registries (MDR) and semantic web technologies. We introduce a federated semantic metadata registry framework by extending the ISO/IEC 11179 standard, and enable integration of data element registries through Linked Open Data (LOD) principles, whereby each Common Data Element (CDE) can be uniquely referenced, queried and processed to enable syntactic and semantic interoperability. Each CDE and its components are maintained as LOD resources enabling semantic links with other CDEs, terminology systems and implementation-dependent content models; hence facilitating semantic search, more effective reuse and semantic interoperability across different application domains. There are several important efforts addressing semantic interoperability in the healthcare domain, such as the IHE DEX profile proposal, CDISC SHARE and CDISC2RDF. Our architecture complements these by providing a framework to interlink existing data element registries and repositories, multiplying their potential for semantic interoperability to a greater extent. The open source implementation of the federated semantic MDR framework presented in this paper is the core of the semantic interoperability layer of the SALUS project, which enables the execution of post-marketing safety analysis studies on top of existing EHR systems. Copyright © 2013 Elsevier Inc. All rights reserved.
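
    The federation idea (each CDE a dereferenceable Linked Open Data resource, semantically linked across registries) can be sketched minimally with rdflib; both registry namespaces and the element names below are hypothetical:

        from rdflib import Graph, Literal, Namespace
        from rdflib.namespace import RDF, SKOS

        CARE = Namespace("http://ehr.example.org/mdr/")        # care domain
        RESEARCH = Namespace("http://cdisc.example.org/mdr/")  # research domain

        g = Graph()
        g.bind("skos", SKOS)

        # A CDE published by the care-domain registry.
        g.add((CARE.SystolicBP, RDF.type, CARE.DataElement))
        g.add((CARE.SystolicBP, SKOS.prefLabel,
               Literal("Systolic blood pressure", lang="en")))

        # A semantic link to the equivalent research-domain element makes the
        # CDE queryable across the federation.
        g.add((CARE.SystolicBP, SKOS.exactMatch, RESEARCH.SYSBP))

        print(g.serialize(format="turtle"))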

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hardin, Dave; Stephan, Eric G.; Wang, Weimin

    Through its Building Technologies Office (BTO), the United States Department of Energy’s Office of Energy Efficiency and Renewable Energy (DOE-EERE) is sponsoring an effort to advance interoperability for the integration of intelligent buildings equipment and automation systems, understanding the importance of integration frameworks and product ecosystems to this cause. This is important to BTO’s mission to enhance energy efficiency and save energy for economic and environmental purposes. For connected buildings ecosystems of products and services from various manufacturers to flourish, the ICT aspects of the equipment need to integrate and operate simply and reliably. Within the concepts of interoperability lie the specification, development, and certification of equipment with standards-based interfaces that connect and work. Beyond this, a healthy community of stakeholders that contribute to and use interoperability work products must be developed. On May 1, 2014, the DOE convened a technical meeting to take stock of the current state of interoperability of connected equipment and systems in buildings. Several insights from that meeting helped facilitate a draft description of the landscape of interoperability for connected buildings, which focuses mainly on small and medium commercial buildings. This document revises the February 2015 landscape document to address reviewer comments, incorporate important insights from the Buildings Interoperability Vision technical meeting, and capture thoughts from that meeting about the topics to be addressed in a buildings interoperability vision. In particular, greater attention is paid to the state of information modeling in buildings and the great potential for near-term benefits in this area from progress and community alignment.

  1. Towards an ontology for data quality in integrated chronic disease management: a realist review of the literature.

    PubMed

    Liaw, S T; Rahimi, A; Ray, P; Taggart, J; Dennis, S; de Lusignan, S; Jalaludin, B; Yeo, A E T; Talaei-Khoei, A

    2013-01-01

    Effective use of routine data to support integrated chronic disease management (CDM) and population health is dependent on underlying data quality (DQ) and, for cross-system use of data, semantic interoperability. An ontological approach to DQ is a potential solution, but research in this area is limited and fragmented. The aims were to identify mechanisms, including ontologies, to manage DQ in integrated CDM, and to determine whether improved DQ will better measure health outcomes. We conducted a realist review of English language studies (January 2001-March 2011) which addressed data quality, used ontology-based approaches and were relevant to CDM. We screened 245 papers, excluded 26 duplicates, 135 on abstract review and 31 on full-text review, leaving 61 papers for critical appraisal. Of the 33 papers that examined ontologies in chronic disease management, 13 defined data quality and 15 used ontologies for DQ. Most saw DQ as a multidimensional construct, the most used dimensions being completeness, accuracy, correctness, consistency and timeliness. The majority of studies reported tool design and development (80%), implementation (23%), and descriptive evaluations (15%). Ontological approaches were used to address semantic interoperability, decision support, flexibility of information management and integration/linkage, and complexity of information models. DQ lacks a consensus conceptual framework and definition. DQ and ontological research are relatively immature, with few rigorous evaluation studies published. Ontology-based applications could support automated processes to address DQ and semantic interoperability in repositories of routinely collected data to deliver integrated CDM. We advocate moving to ontology-based design of information systems to enable more reliable use of routine data to measure health mechanisms and impacts. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
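Two of the DQ dimensions the review identifies, completeness and timeliness, lend themselves to a brief illustration. The sketch below uses invented field names, records and thresholds; the reviewed studies' actual rules are not reproduced.

```python
# Illustrative check of two data-quality dimensions named in the review:
# completeness (non-missing required fields) and timeliness (recency of
# the last observation). Field names and thresholds are invented.
from datetime import date

REQUIRED = ("patient_id", "hba1c", "recorded_on")

records = [
    {"patient_id": "p1", "hba1c": 7.2, "recorded_on": date(2011, 2, 1)},
    {"patient_id": "p2", "hba1c": None, "recorded_on": date(2009, 6, 3)},
]

def completeness(rec):
    """Fraction of required fields that are present and non-null."""
    return sum(rec.get(f) is not None for f in REQUIRED) / len(REQUIRED)

def timely(rec, as_of=date(2011, 3, 31), max_age_days=365):
    """True if the record was captured within the allowed window."""
    return (as_of - rec["recorded_on"]).days <= max_age_days

for rec in records:
    print(rec["patient_id"], round(completeness(rec), 2), timely(rec))
```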

  2. Performance analysis of a proposed tightly-coupled medical instrument network based on CAN protocol.

    PubMed

    Mujumdar, Shantanu; Thongpithoonrat, Pongnarin; Gurkan, D; McKneely, Paul K; Chapman, Frank M; Merchant, Fatima

    2010-01-01

    Advances in medical devices and health care have been phenomenal in recent years. Although medical device manufacturers have been improving their instruments, network connection of these instruments still relies on proprietary technologies. Even when an interface is provided by the manufacturer (e.g., RS-232, USB, or Ethernet coupled with a proprietary API), there is no widely accepted uniform data model for accessing the data of various bedside instruments. There is a need for a common standard that allows internetworking with medical devices from different manufacturers. ISO/IEEE 11073 (X73) is a standard attempting to unify the interfaces of all medical devices. X73 defines a client access mechanism that would be implemented in the communication controllers (residing between an instrument and the network) in order to access and network patient data. On the other hand, the MediCAN™ technology suite has been demonstrated with various medical instruments to achieve interfacing and networking with a similar goal in its open standardization approach. However, it provides a more generic definition of medical data to achieve flexibility in networking and client access mechanisms. The instruments, in turn, are becoming more sophisticated; however, the operation of an instrument is still expected to be performed locally by authorized medical personnel. Unfortunately, each medical instrument has its own proprietary API (application programming interface), if any, to provide automated and electronic access to monitoring data. Integration of these APIs requires an agreement with the manufacturers towards the realization of interoperable health care networking. As long as interoperability of instruments with a network is not possible, ubiquitous access to patient status is limited to manual-entry-based systems. This paper demonstrates an attempt to realize an interoperable medical instrument interface for networking using the MediCAN technology suite as an open standard.
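As a rough illustration of the kind of compact, uniform measurement encoding a CAN-based instrument network requires, the following sketch packs a reading into an 8-byte payload. The frame layout is an invented assumption for illustration; it is not the MediCAN or X73 wire format.

```python
# Sketch of a generic, vendor-neutral measurement frame such as a
# tightly-coupled instrument network might carry. The layout below is
# invented for illustration; it is not the MediCAN or X73 wire format.
import struct

def pack_measurement(device_id: int, metric_code: int, value: float) -> bytes:
    # Classic CAN 2.0 data fields are limited to 8 bytes: here a 2-byte
    # device id, 2-byte metric code and 4-byte IEEE-754 float, big-endian.
    return struct.pack(">HHf", device_id, metric_code, value)

def unpack_measurement(frame: bytes):
    return struct.unpack(">HHf", frame)

frame = pack_measurement(device_id=0x0102, metric_code=0x4A05, value=98.6)
assert len(frame) == 8          # fits a single classic CAN payload
print(unpack_measurement(frame))
```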

  3. Analysis of ISO/IEEE 11073 built-in security and its potential IHE-based extensibility.

    PubMed

    Rubio, Óscar J; Trigo, Jesús D; Alesanco, Álvaro; Serrano, Luis; García, José

    2016-04-01

    The ISO/IEEE 11073 standard for Personal Health Devices (X73PHD) aims to ensure interoperability between Personal Health Devices and aggregators (e.g., health appliances, routers) in ambulatory setups. The Integrating the Healthcare Enterprise (IHE) initiative promotes the coordinated use of different standards in healthcare systems (e.g. Personal/Electronic Health Records, alert managers, Clinical Decision Support Systems) by defining profiles intended for medical use cases. X73PHD provides a robust syntactic model and a comprehensive terminology, but it places limited emphasis on security and on interoperability with IHE-compliant systems and frameworks. However, the implementation of eHealth/mHealth applications in environments such as health and fitness monitoring, independent living and disease management (i.e. the X73PHD domains) increasingly requires features such as secure connections to mobile aggregators (e.g., smartphones, tablets), the sharing of devices among different users with privacy, and interoperability with certain IHE-compliant healthcare systems. This work proposes a comprehensive IHE-based X73PHD extension consisting of additive layers adapted to different eHealth/mHealth applications, after having analyzed the features of X73PHD (especially its built-in security), the IHE profiles related to these applications and other research works. Both the new features proposed for each layer and the procedures to support them have been carefully chosen to minimize the impact on X73PHD, on its architecture (in terms of delays and overhead) and on its framework. Such implications are thoroughly analyzed in this paper. As a result, an extended model of X73PHD is proposed, preserving its essential features while extending them with added value. Copyright © 2016 Elsevier Inc. All rights reserved.

  4. Reproducibility and interoperator reliability of obtaining images and measurements of the cervix and uterus with brachytherapy treatment applicators in situ using transabdominal ultrasound.

    PubMed

    van Dyk, Sylvia; Garth, Margaret; Oates, Amanda; Kondalsamy-Chennakesavan, Srinivas; Schneider, Michal; Bernshaw, David; Narayan, Kailash

    2016-01-01

    To validate interoperator reliability of brachytherapy radiation therapists (RTs) in obtaining an ultrasound image and measuring the cervix and uterine dimensions using transabdominal ultrasound. Patients who underwent MRI with applicators in situ after the first insertion were included in the study. Imaging was performed by three RTs (RT1, RT2, and RT3) with varying degrees of ultrasound experience. All RTs were required to obtain a longitudinal planning image depicting the applicator in the uterine canal and measure the cervix and uterus. The MRI scan, taken 1 hour after the ultrasound, was used as the reference standard against which all measurements were compared. Measurements were analyzed with intraclass correlation coefficient and Bland-Altman plots. All RTs were able to obtain a suitable longitudinal image for each patient in the study. Mean differences (SD) between MRI and ultrasound measurements obtained by RTs ranged from 3.5 (3.6) to 4.4 (4.23) mm and 0 (3.0) to 0.9 (2.5) mm on the anterior and posterior surface of the cervix, respectively. Intraclass correlation coefficient for absolute agreement between MRI and RTs was >0.9 for all posterior measurement points in the cervix and ranged from 0.41 to 0.92 on the anterior surface. Measurements were not statistically different between RTs at any measurement point. RTs with variable training attained high levels of interoperator reliability when using transabdominal ultrasound to obtain images and measurements of the uterus and cervix with brachytherapy applicators in situ. Access to training and use of a well-defined protocol assist in achieving these high levels of reliability. Copyright © 2016 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.
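The agreement statistics reported here, mean difference and SD against the MRI reference, are those of a Bland-Altman analysis. A worked sketch with made-up measurements follows; the numbers are not from the study.

```python
# Worked illustration of the agreement statistics reported in the study:
# mean difference (bias) and SD between MRI (reference) and ultrasound
# values, as used in Bland-Altman analysis. The numbers below are made up.
import statistics

mri        = [21.0, 18.5, 24.2, 19.8, 22.1]   # reference measurements (mm)
ultrasound = [20.1, 18.9, 23.0, 19.2, 21.4]   # one RT's measurements (mm)

diffs = [m - u for m, u in zip(mri, ultrasound)]
mean_diff = statistics.mean(diffs)
sd_diff = statistics.stdev(diffs)

# 95% limits of agreement, the usual Bland-Altman summary
loa = (mean_diff - 1.96 * sd_diff, mean_diff + 1.96 * sd_diff)
print(f"bias={mean_diff:.2f} mm, SD={sd_diff:.2f} mm, "
      f"LoA={loa[0]:.2f}..{loa[1]:.2f} mm")
```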

  5. Comparison of H.323 and SIP for IP telephony signaling

    NASA Astrophysics Data System (ADS)

    Dalgic, Ismail; Fang, Hanlin

    1999-11-01

    Two standards currently compete for the dominance of IP telephony signaling: the H.323 protocol suite by ITU-T, and the Session Initiation Protocol (SIP) by IETF. Both of these signaling protocols provide mechanisms for call establishment and teardown, call control and supplementary services, and capability exchange. We investigate and compare these two protocols in terms of Functionality, Quality of Service (QoS), Scalability, Flexibility, Interoperability, and Ease of Implementation. For fairness of comparison, we consider similar scenarios for both protocols. In particular, we focus on scenarios that involve a gatekeeper for H.323, and a Proxy/Redirect server for SIP. The reason is that medium-to-large IP Telephony systems are not manageable without a gatekeeper or proxy server. We consider all three versions of H.323. In terms of functionality and services that can be supported, H.323 version 2 and SIP are very similar. However, supplementary services in H.323 are more rigorously defined, and therefore fewer interoperability issues are expected among its implementations. Furthermore, H.323 has taken more steps to ensure compatibility among its different versions, and to interoperate with PSTN. The two protocols are comparable in their QoS support [similar call setup delays, no support for resource reservation or class of service (CoS) setting], but H.323 version 3 will allow signaling of the requested CoS. SIP's primary advantages are (1) flexibility to add new features, and (2) relative ease of implementation and debugging. Finally, we note that H.323 and SIP are improving themselves by learning from each other, and the differences between them are diminishing with each new version.
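SIP's cited ease of implementation and debugging stems partly from its human-readable, text-based messages. A minimal, illustrative INVITE is sketched below; the addresses, tags and Call-ID are fictitious, and a real implementation would add further mandatory headers and SDP content.

```python
# Minimal, illustrative SIP INVITE as plain text, showing why SIP is often
# considered easy to implement and debug: the protocol is human-readable.
# Addresses and tags are fictitious; this is not a complete dialog.
def sip_invite(caller: str, callee: str, proxy_host: str, call_id: str) -> str:
    lines = [
        f"INVITE sip:{callee} SIP/2.0",
        f"Via: SIP/2.0/UDP {proxy_host};branch=z9hG4bK776asdhds",
        f"From: <sip:{caller}>;tag=1928301774",
        f"To: <sip:{callee}>",
        f"Call-ID: {call_id}",
        "CSeq: 1 INVITE",
        "Content-Length: 0",
    ]
    return "\r\n".join(lines) + "\r\n\r\n"

print(sip_invite("alice@example.com", "bob@example.net",
                 "proxy.example.com:5060", "a84b4c76e66710"))
```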

  6. Data interoperability software solution for emergency reaction in the Europe Union

    NASA Astrophysics Data System (ADS)

    Casado, R.; Rubiera, E.; Sacristan, M.; Schütte, F.; Peters, R.

    2014-09-01

    Emergency management becomes more challenging in international crisis episodes because of cultural, semantic and linguistic differences between all stakeholders, especially first responders. Misunderstandings between first responders make decision-making slower and more difficult. However, the spread and development of networks and IT-based Emergency Management Systems (EMS) have improved emergency responses, making them more coordinated. Despite improvements made in recent years, EMS have still not solved the problems related to cultural, semantic and linguistic differences, which are the real cause of slower decision-making. In addition, from a technical perspective, the consolidation of current EMS and the different formats used to exchange information pose another problem to be solved by any solution proposed for information interoperability between heterogeneous EMS in different contexts. To overcome these problems we present a software solution based on semantic and mediation technologies. EMERGency ELements (EMERGEL) (Fundacion CTIC and AntwortING Ingenieurbüro PartG 2013), a common and modular ontology shared by all the stakeholders, has been defined. It offers a solution to gather all stakeholders' knowledge in a unique and flexible data model, taking into account different countries' cultural and linguistic issues. To deal with the diversity of data protocols and formats, we have designed a Service Oriented Architecture for Data Interoperability (named DISASTER), providing a flexible, extensible solution to the mediation issues. Web Services have been adopted as the specific technology to implement this paradigm, as they have the most significant academic and industrial visibility and attraction. Contributions of this work have been validated through the design and development of a realistic cross-border prototype scenario, actively involving both emergency managers and emergency first responders: the Netherlands-Germany border fire.
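The mediation idea can be sketched as a mapping from stakeholder-specific terms onto shared ontology concepts, so responders with different languages see consistent semantics. The concept IDs and vocabulary below are invented; they are not the actual EMERGEL ontology.

```python
# Hedged sketch of ontology-based term mediation for cross-border response.
# Concept IDs and local vocabularies are invented for illustration.
ONTOLOGY_CONCEPTS = {"emergel:FireEngine": {"kind": "vehicle", "role": "fire-fighting"}}

LOCAL_TERMS = {
    ("nl", "brandweerwagen"): "emergel:FireEngine",   # Dutch responder vocabulary
    ("de", "Löschfahrzeug"): "emergel:FireEngine",    # German responder vocabulary
}

def mediate(lang: str, term: str) -> str:
    """Resolve a local-language term to its shared ontology concept."""
    try:
        return LOCAL_TERMS[(lang, term)]
    except KeyError:
        raise ValueError(f"no mapping for {term!r} ({lang})")

# Both sides of the border resolve to the same shared concept.
assert mediate("nl", "brandweerwagen") == mediate("de", "Löschfahrzeug")
```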

  7. Investigating the capabilities of semantic enrichment of 3D CityEngine data

    NASA Astrophysics Data System (ADS)

    Solou, Dimitra; Dimopoulou, Efi

    2016-08-01

    In recent years the development of technology and the lifting of several technical limitations have brought the third dimension to the fore. The complexity of urban environments and the strong need for land administration intensify the need for a three-dimensional cadastral system. Despite the progress in the field of geographic information systems and 3D modeling techniques, there is no fully digital 3D cadastre. Existing geographic information systems and the different methods of three-dimensional modeling allow for better management, visualization and dissemination of information. Nevertheless, these opportunities cannot be fully exploited because of deficiencies in standardization and interoperability in these systems. Within this context, CityGML was developed as an international standard of the Open Geospatial Consortium (OGC) for the representation and exchange of 3D city models. CityGML defines geometry and topology for city modeling, also focusing on semantic aspects of 3D city information. The scope of CityGML is to reach common terminology, also addressing the imperative need for interoperability and data integration, given the number of available geographic information systems and modeling techniques. The aim of this paper is to develop an application for managing the semantic information of a model generated through procedural modeling. The model was initially implemented in ESRI's CityEngine software and then imported into the ArcGIS environment. The final goal was the original model's semantic enrichment and its conversion to CityGML format. Semantic information management and interoperability proved feasible using the ESRI 3DCities Project tools, since their database structure supports adding semantic information to the CityEngine model and automatically converting it to CityGML for advanced analysis and visualization in different application areas.

  8. Modular open RF architecture: extending VICTORY to RF systems

    NASA Astrophysics Data System (ADS)

    Melber, Adam; Dirner, Jason; Johnson, Michael

    2015-05-01

    Radio frequency products spanning multiple functions have become increasingly critical to the warfighter. Military use of the electromagnetic spectrum now includes communications, electronic warfare (EW), intelligence, and mission command systems. Due to the urgent needs of counterinsurgency operations, various quick reaction capabilities (QRCs) have been fielded to enhance warfighter capability. Although these QRCs were highly successful in their respective missions, they were designed independently, resulting in significant challenges when integrated on a common platform. This paper discusses how the Modular Open RF Architecture (MORA) addresses these challenges by defining an open architecture for multifunction missions that decomposes monolithic radio systems into high-level components with well-defined functions and interfaces. The functional decomposition maximizes hardware sharing while minimizing added complexity and cost due to modularization. MORA achieves significant size, weight and power (SWaP) savings by allowing hardware such as power amplifiers and antennas to be shared across systems. By separating signal conditioning from the processing that implements the actual radio application, MORA exposes previously inaccessible architecture points, providing system integrators with the flexibility to insert third-party capabilities to address technical challenges and emerging requirements. MORA leverages the Vehicular Integration for Command, Control, Communication, Computers, Intelligence, Surveillance, and Reconnaissance (C4ISR)/EW Interoperability (VICTORY) framework. This paper concludes by discussing how MORA, VICTORY and other standards such as OpenVPX are being leveraged by the U.S. Army Research, Development, and Engineering Command (RDECOM) Communications Electronics Research, Development, and Engineering Center (CERDEC) to define a converged architecture enabling rapid technology insertion, interoperability and reduced SWaP.

  9. LVC Architecture Roadmap Implementation - Results of the First Two Years

    DTIC Science & Technology

    2012-03-01

    NOTES: Presented at the Simulation Interoperability Standards Organization's (SISO) Spring Simulation Interoperability Workshop (SIW), 26-30 March...presented at the semi-annual Simulation Interoperability Workshops (SIWs) and the annual Interservice/Industry Training, Simulation & Education Conference (I/ITSEC), as well as other venues. For example, a full-day workshop on the initial progress of the effort was conducted at the 2010 Spring SIW [2

  10. Towards a cross-domain interoperable framework for natural hazards and disaster risk reduction information

    NASA Astrophysics Data System (ADS)

    Tomas, Robert; Harrison, Matthew; Barredo, José I.; Thomas, Florian; Llorente Isidro, Miguel; Cerba, Otakar; Pfeiffer, Manuela

    2014-05-01

    The vast amount of information and data necessary for comprehensive hazard and risk assessment presents many challenges regarding the accessibility, comparability, quality, organisation and dissemination of natural hazards spatial data. In order to mitigate these limitations, an interoperable framework has been developed as part of the legally binding Implementing Rules of the EU INSPIRE Directive1*, which aims at the establishment of the European Spatial Data Infrastructure. The interoperability framework is described in the Data Specification on Natural risk zones - Technical Guidelines (DS) document2*, finalized and published on 10 December 2013. This framework provides means for facilitating access, integration, harmonisation and dissemination of natural hazard data from different domains and sources. The objective of this paper is twofold. Firstly, the paper demonstrates the applicability of the interoperable framework developed in the DS and highlights the key aspects of interoperability for the various natural hazards communities. Secondly, the paper "translates" into common language the main features and potential of the interoperable framework of the DS for a wider audience of scientists and practitioners in the natural hazards domain. The paper then presents the five main aspects of the interoperable framework. First, the issue of a common terminology for the natural hazards domain is addressed. Second, a common data model to facilitate cross-domain data integration is described. Third, the common methodology developed to provide qualitative or quantitative assessments of natural hazards is presented. Fourth, the extensible classification schema for natural hazards, developed from a literature review and key reference documents from the contributing community of practice, is shown. Finally, the applicability of the interoperable framework for the various stakeholder groups is presented. The paper closes by discussing open issues and next steps regarding the sustainability and evolution of the interoperable framework, as well as missing aspects such as multi-hazard and multi-risk. --------------- 1*INSPIRE - Infrastructure for spatial information in Europe, http://inspire.ec.europa.eu 2*http://inspire.jrc.ec.europa.eu/documents/Data_Specifications/INSPIRE_DataSpecification_NZ_v3.0.pdf

  11. Advanced Communications Technology Satellite Now Operating in an Inclined Orbit

    NASA Technical Reports Server (NTRS)

    Bauer, Robert A.

    1999-01-01

    The Advanced Communications Technology Satellite (ACTS) system has been modified to support operation in an inclined orbit that is virtually transparent to users, and plans are to continue this final phase of its operation through September 2000. The next 2 years of ACTS will provide a new opportunity for using the technologies that this system brought online over 5 years ago and that are still being used to resolve the technical issues that face NASA and the satellite industry in the area of seamless networking and interoperability with terrestrial systems. New goals for ACTS have been defined that align the program with recent changes in NASA and industry. ACTS will be used as a testbed to: (1) show how NASA and other Government agencies can use commercial systems for future support of their operations; (2) test, characterize, and resolve technical issues in using advanced communications protocols such as asynchronous transfer mode (ATM) and transmission control protocol/Internet protocol (TCP/IP) over the long latency links found when interoperating satellites with terrestrial systems; (3) evaluate narrow-spot-beam Ka-band satellite operation in an inclined orbit; and (4) verify Ka-band satellite technologies, since no other Ka-band system is yet available in the United States.

  12. Quality requirements for EHR archetypes.

    PubMed

    Kalra, Dipak; Tapuria, Archana; Austin, Tony; De Moor, Georges

    2012-01-01

    The realisation of semantic interoperability, in which any EHR data may be communicated between heterogeneous systems and fully understood by computers as well as people on receipt, is a challenging goal. Despite the use of standardised generic models for the EHR and standard terminology systems, too much optionality and variability exists in how particular clinical entries may be represented. Clinical archetypes provide a means of defining how generic models should be shaped and bound to terminology for specific kinds of clinical data. However, these will only contribute to semantic interoperability if libraries of archetypes can be built up consistently. This requires the establishment of design principles, editorial and governance policies, and further research to develop ways for archetype authors to structure clinical data and to use terminology consistently. Drawing on several years of work within communities of practice developing archetypes and implementing systems from them, this paper presents quality requirements for the development of archetypes. Clinical engagement on a wide scale is also needed to help grow libraries of good quality archetypes that can be certified. Vendor and eHealth programme engagement is needed to validate such archetypes and achieve safe, meaningful exchange of EHR data between systems.

  13. Next-generation digital camera integration and software development issues

    NASA Astrophysics Data System (ADS)

    Venkataraman, Shyam; Peters, Ken; Hecht, Richard

    1998-04-01

    This paper investigates the complexities associated with the development of next-generation digital cameras due to requirements in connectivity and interoperability. Each successive generation of digital camera improves drastically in cost, performance, resolution, image quality and interoperability features. This is being accomplished by advancements in a number of areas: research, silicon, standards, etc. As the capabilities of these cameras increase, so do the requirements for both hardware and software. Today, there are two single-chip camera solutions on the market, including the Motorola MPC 823 and LSI DCAM-101. Real-time constraints for a digital camera may be defined by the maximum time allowable between capture of images. Constraints in the design of an embedded digital camera include processor architecture, memory, processing speed and the real-time operating system. This paper will present the LSI DCAM-101, a single-chip digital camera solution. It will present an overview of the architecture and the challenges in hardware and software for supporting streaming video in such a complex device. Issues presented include the development of the data-flow software architecture, testing and integration on this complex silicon device. The strategy for optimizing performance on the architecture will also be presented.

  14. Towards public health decision support: a systematic review of bidirectional communication approaches

    PubMed Central

    Dixon, Brian E; Gamache, Roland E; Grannis, Shaun J

    2013-01-01

    Objective: To summarize the literature describing computer-based interventions aimed at improving bidirectional communication between clinical and public health. Materials and Methods: A systematic review of English articles using MEDLINE and Google Scholar. Search terms included public health, epidemiology, electronic health records, decision support, expert systems, and decision-making. Only articles that described the communication of information regarding emerging health threats from public health agencies to clinicians or provider organizations were included. Each article was independently reviewed by two authors. Results: Ten peer-reviewed articles highlight a nascent but promising area of research and practice related to alerting clinicians about emerging threats. Current literature suggests that additional research and development in bidirectional communication infrastructure should focus on defining a coherent architecture, improving interoperability, establishing clear governance, and creating usable systems that will effectively deliver targeted, specific information to clinicians in support of patient and population decision-making. Conclusions: Increasingly available clinical information systems make it possible to deliver timely, relevant knowledge to frontline clinicians in support of population health. Future work should focus on developing a flexible, interoperable infrastructure for bidirectional communications capable of integrating public health knowledge into clinical systems and workflows. PMID:23467470

  15. Socially Aware Heterogeneous Wireless Networks

    PubMed Central

    Kosmides, Pavlos; Adamopoulou, Evgenia; Demestichas, Konstantinos; Theologou, Michael; Anagnostou, Miltiades; Rouskas, Angelos

    2015-01-01

    The development of smart cities has been the epicentre of many researchers' efforts during the past decade. One of the key requirements for smart city networks is mobility, which is why stable, reliable and high-quality wireless communications are needed to connect people and devices. Most research efforts so far have used different kinds of wireless and sensor networks, making interoperability rather difficult to accomplish in smart cities. One common solution proposed in the recent literature is the use of software defined networks (SDNs) to enhance interoperability among the various heterogeneous wireless networks. In addition, SDNs can take advantage of the data retrieved from available sensors and use them as part of the intelligent decision-making process conducted during the resource allocation procedure. In this paper, we propose an architecture combining heterogeneous wireless networks with social networks using SDNs. Specifically, we exploit the information retrieved from location-based social networks regarding users' locations and attempt to predict areas that will be crowded by using specially designed machine learning techniques. By recognizing possible crowded areas, we can provide mobile operators with recommendations about areas requiring datacell activation or deactivation. PMID:26110402
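As a hedged sketch of the prediction step, the toy model below learns which cells become crowded from synthetic check-in counts. The features, data and classifier choice are assumptions for illustration; the paper's actual machine learning techniques are not specified here.

```python
# Toy illustration: predict crowded network cells from location-based
# social-network activity, then recommend cell (de)activation.
# Features, data and model are synthetic and illustrative.
from sklearn.linear_model import LogisticRegression

# features per cell: [check-ins last hour, check-ins same hour last week]
X = [[5, 8], [40, 35], [3, 2], [55, 60], [12, 9], [70, 66]]
y = [0, 1, 0, 1, 0, 1]   # 1 = cell became crowded

model = LogisticRegression().fit(X, y)

for cell, feats in {"cell-A": [48, 50], "cell-B": [4, 6]}.items():
    crowded = model.predict([feats])[0]
    print(cell, "-> activate extra capacity" if crowded else "-> may deactivate")
```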

  16. A Framework for Integration of Heterogeneous Medical Imaging Networks

    PubMed Central

    Viana-Ferreira, Carlos; Ribeiro, Luís S; Costa, Carlos

    2014-01-01

    Medical imaging is increasing its importance in medical diagnosis and treatment support. Much is due to computers, which have revolutionized medical imaging not only in the acquisition process but also in the way images are visualized, stored, exchanged and managed. Picture Archiving and Communication Systems (PACS) are an example of how medical imaging takes advantage of computers. To solve problems of interoperability between PACS and medical imaging equipment, the Digital Imaging and Communications in Medicine (DICOM) standard was defined and widely implemented in current solutions. More recently, the need to exchange medical data between distinct institutions resulted in the Integrating the Healthcare Enterprise (IHE) initiative, which contains a content profile especially conceived for medical imaging exchange: Cross Enterprise Document Sharing for imaging (XDS-i). Moreover, due to application requirements, many solutions developed private networks to support their services. For instance, some applications support enhanced query and retrieve over DICOM object metadata. This paper proposes an integration framework for medical imaging networks that provides protocol interoperability and data federation services. It is an extensible plugin system that supports standard approaches (DICOM and XDS-I), but is also capable of supporting private protocols. The framework is being used in the Dicoogle Open Source PACS. PMID:25279021

  17. A framework for integration of heterogeneous medical imaging networks.

    PubMed

    Viana-Ferreira, Carlos; Ribeiro, Luís S; Costa, Carlos

    2014-01-01

    Medical imaging is increasing its importance in medical diagnosis and treatment support. Much is due to computers, which have revolutionized medical imaging not only in the acquisition process but also in the way images are visualized, stored, exchanged and managed. Picture Archiving and Communication Systems (PACS) are an example of how medical imaging takes advantage of computers. To solve problems of interoperability between PACS and medical imaging equipment, the Digital Imaging and Communications in Medicine (DICOM) standard was defined and widely implemented in current solutions. More recently, the need to exchange medical data between distinct institutions resulted in the Integrating the Healthcare Enterprise (IHE) initiative, which contains a content profile especially conceived for medical imaging exchange: Cross Enterprise Document Sharing for imaging (XDS-i). Moreover, due to application requirements, many solutions developed private networks to support their services. For instance, some applications support enhanced query and retrieve over DICOM object metadata. This paper proposes an integration framework for medical imaging networks that provides protocol interoperability and data federation services. It is an extensible plugin system that supports standard approaches (DICOM and XDS-I), but is also capable of supporting private protocols. The framework is being used in the Dicoogle Open Source PACS.
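The extensible plugin pattern described can be sketched as a small registry behind a common interface. The class and method names below are illustrative assumptions, not the Dicoogle API.

```python
# Minimal sketch of a protocol-plugin registry with data federation:
# standard protocols (DICOM, XDS-I) and private protocols sit behind a
# common interface. Names are illustrative, not the Dicoogle API.
from abc import ABC, abstractmethod

class ProtocolPlugin(ABC):
    name: str
    @abstractmethod
    def query(self, criteria: dict) -> list:
        ...

class DicomPlugin(ProtocolPlugin):
    name = "dicom"
    def query(self, criteria):
        return [f"DICOM C-FIND result for {criteria}"]     # placeholder behaviour

class PrivateMetadataPlugin(ProtocolPlugin):
    name = "private-metadata"
    def query(self, criteria):
        return [f"enhanced metadata search for {criteria}"]

REGISTRY = {p.name: p for p in (DicomPlugin(), PrivateMetadataPlugin())}

def federated_query(criteria: dict) -> list:
    """Fan a query out over every registered protocol (data federation)."""
    return [hit for plugin in REGISTRY.values() for hit in plugin.query(criteria)]

print(federated_query({"PatientID": "123"}))
```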

  18. The Next Generation of Interoperability Agents in Healthcare

    PubMed Central

    Cardoso, Luciana; Marins, Fernando; Portela, Filipe; Santos, Manuel; Abelha, António; Machado, José

    2014-01-01

    Interoperability in health information systems is increasingly a requirement rather than an option. Standards and technologies, such as multi-agent systems, have proven to be powerful tools in interoperability issues. In the last few years, the authors have worked on developing the Agency for Integration, Diffusion and Archive of Medical Information (AIDA), which is an intelligent, agent-based platform to ensure interoperability in healthcare units. It is increasingly important to ensure the high availability and reliability of systems; the functions provided by the systems that handle interoperability cannot fail. This paper shows the importance of monitoring and controlling intelligent agents as a tool to anticipate problems in health information systems. The interaction between humans and agents through an interface that allows the user to create new agents easily and to monitor their activities in real time is also an important feature, as health systems evolve by adopting more features and solving new problems. A module was installed in Centro Hospitalar do Porto, increasing the functionality and the overall usability of AIDA. PMID:24840351
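A minimal sketch of the monitoring idea follows: agents report heartbeats, and silent agents are flagged before the interoperability functions they provide fail. The agent names and timeout are invented; this is not AIDA's actual mechanism.

```python
# Illustrative watchdog for interoperability agents: each agent reports a
# heartbeat, and agents that go silent are flagged for intervention.
# Agent names and the timeout are invented.
import time

heartbeats = {"hl7-inbound": time.time(), "archive-writer": time.time() - 120}

def stale_agents(beats, timeout_s=60, now=None):
    """Return the agents whose last heartbeat is older than the timeout."""
    now = now or time.time()
    return [name for name, last in beats.items() if now - last > timeout_s]

for agent in stale_agents(heartbeats):
    print(f"ALERT: agent {agent!r} missed its heartbeat; restart or escalate")
```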

  19. The role of markup for enabling interoperability in health informatics.

    PubMed

    McKeever, Steve; Johnson, David

    2015-01-01

    Interoperability is the faculty of making information systems work together. In this paper we distinguish a number of different forms that interoperability can take and show how they are realized in a variety of physiological and health care use cases. The last 15 years have seen the rise of very cheap digital storage, both on and off site. With the advent of the Internet of Things, people's expectations are for greater interconnectivity and seamless interoperability. The potential impact these technologies have on healthcare is dramatic: from improved diagnoses through immediate access to a patient's electronic health record, to in silico modeling of organs and early-stage drug trials, to predictive medicine based on top-down modeling of disease progression and treatment. We begin by looking at the underlying technology, classify the various kinds of interoperability that exist in the field, and discuss how they are realized. We conclude with a discussion of future possibilities that big data and further standardization will enable.

  20. A comparison of data interoperability approaches of fusion codes with application to synthetic diagnostics

    NASA Astrophysics Data System (ADS)

    Kruger, Scott; Shasharina, S.; Vadlamani, S.; McCune, D.; Holland, C.; Jenkins, T. G.; Candy, J.; Cary, J. R.; Hakim, A.; Miah, M.; Pletzer, A.

    2010-11-01

    As various efforts to integrate fusion codes proceed worldwide, standards for sharing data have emerged. In the U.S., the SWIM project has pioneered the development of the Plasma State, which has a flat hierarchy and is dominated by its use within 1.5D transport codes. The European Integrated Tokamak Modeling effort has developed a more ambitious data interoperability effort organized around the concept of Consistent Physical Objects (CPOs). CPOs have deep hierarchies, as needed by an effort that seeks to encompass all of fusion computing. Here, we discuss ideas for implementing data interoperability in a way that is complementary to both the Plasma State and CPOs. By making use of attributes within the netcdf and HDF5 binary file formats, the goals of data interoperability can be achieved with a more informal approach. In addition, a file can be simultaneously interoperable with several standards at once. As an illustration of this approach, we discuss its application to the development of synthetic diagnostics that can be used by multiple codes.
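The attribute-based approach can be sketched with h5py: a single dataset carries metadata aimed at more than one standard at once, so one file serves several consumers. The attribute names below are illustrative assumptions, not the actual Plasma State or CPO schemas.

```python
# Sketch of attribute-based interoperability: tag one HDF5 dataset with
# metadata for more than one community standard simultaneously.
# Attribute names are illustrative, not actual Plasma State/CPO schemas.
import h5py
import numpy as np

with h5py.File("profile.h5", "w") as f:
    te = f.create_dataset("Te", data=np.linspace(4.0, 0.1, 50))  # made-up profile
    te.attrs["units"] = "keV"
    # The same dataset can carry attributes aimed at different standards,
    # making the file simultaneously interoperable with several consumers.
    te.attrs["plasma_state:name"] = "electron_temperature"
    te.attrs["cpo:path"] = "coreprof/te/value"
```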

  1. IT Labs Proof-of-Concept Project: Technical Data Interoperability (TDI) Pathfinder Via Emerging Standards

    NASA Technical Reports Server (NTRS)

    Conroy, Mike; Gill, Paul; Ingalls, John; Bengtsson, Kjell

    2014-01-01

    No known system is in place to allow NASA technical data interoperability throughout the whole life cycle. Life Cycle Cost (LCC) will be higher on many developing programs if action isn't taken soon to join disparate systems efficiently. Disparate technical data also increases safety risks from poorly integrated elements. NASA requires interoperability and industry standards, but breaking legacy ways is a challenge.

  2. Interacting with Multi-Robot Systems Using BML

    DTIC Science & Technology

    2013-06-01

    Pullen, U. Schade, J. Simonsen & R. Gomez-Veiga, NATO MSG-048 C-BML Final Report Summary. 2010 Fall Simulation Interoperability Workshop (10F-SIW-039...NATO MSG-085. 2012 Spring Simulation Interoperability Workshop (12S-SIW-045), Orlando, FL, March 2012. [3] T. Remmersmann, U. Schade, L. Khimeche...B. Grautreau & R. El Abdouni Khayari, Lessons Recognized: How to Combine BML and MSDL. 2012 Spring Simulation Interoperability Workshop (12S-SIW-012

  3. A Linguistic Foundation for Communicating Geo-Information in the context of BML and geoBML

    DTIC Science & Technology

    2010-03-23

    BML Standard. 2009 Spring Simulation Interoperability Workshop (09S-SIW-046). San Diego, CA. Rein, K., Schade, U. & Hieb, M.R. (2009). Battle...Formalizing Battle Management Language: A Grammar for Specifying Orders. 2006 Spring Simulation Interoperability Workshop (06S-SIW-068). Huntsville...Hieb, M.R. (2007). Battle Management Language: A Grammar for Specifying Reports. 2007 Spring Simulation Interoperability Workshop (07S-SIW-036

  4. Semantic and syntactic interoperability in online processing of big Earth observation data.

    PubMed

    Sudmanns, Martin; Tiede, Dirk; Lang, Stefan; Baraldi, Andrea

    2018-01-01

    The challenge of enabling syntactic and semantic interoperability for comprehensive and reproducible online processing of big Earth observation (EO) data is still unsolved. Supporting both types of interoperability is one of the requirements for efficiently extracting valuable information from the large amount of available multi-temporal gridded data sets. The proposed system wraps world models (semantic interoperability) into OGC Web Processing Services (syntactic interoperability) for semantic online analyses. World models describe spatio-temporal entities and their relationships in a formal way. The proposed system serves as an enabler for (1) technical interoperability using a standardised interface to be used by all types of clients and (2) allowing experts from different domains to develop complex analyses together as a collaborative effort. Users connect the world models online to the data, which are maintained in centralised storage as 3D spatio-temporal data cubes. This also allows non-experts to extract valuable information from EO data, because data management, low-level interactions and specific software issues can be ignored. We discuss the concept of the proposed system, provide a technical implementation example, describe three use cases for extracting changes from EO images, and demonstrate the usability also for non-EO, gridded, multi-temporal data sets (CORINE land cover).
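A world-model style rule over a spatio-temporal data cube can be sketched with plain numpy. The band, threshold and change expression below are invented for illustration; the actual system's formal world models are richer than this.

```python
# Minimal sketch of a formal rule applied to a (time, y, x) data cube,
# in the spirit of semantic EO analysis. Thresholds and data are invented.
import numpy as np

rng = np.random.default_rng(0)
ndvi = rng.uniform(-0.2, 0.9, size=(3, 4, 4))   # synthetic NDVI cube

# world-model style definition: "vegetation" is NDVI above 0.4
vegetation = ndvi > 0.4

# change expression over the cube: vegetation lost between t0 and t2
loss = vegetation[0] & ~vegetation[2]
print(f"{int(loss.sum())} of {loss.size} cells lost vegetation")
```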

  5. Semantic and syntactic interoperability in online processing of big Earth observation data

    PubMed Central

    Sudmanns, Martin; Tiede, Dirk; Lang, Stefan; Baraldi, Andrea

    2018-01-01

    The challenge of enabling syntactic and semantic interoperability for comprehensive and reproducible online processing of big Earth observation (EO) data is still unsolved. Supporting both types of interoperability is one of the requirements for efficiently extracting valuable information from the large amount of available multi-temporal gridded data sets. The proposed system wraps world models (semantic interoperability) into OGC Web Processing Services (syntactic interoperability) for semantic online analyses. World models describe spatio-temporal entities and their relationships in a formal way. The proposed system serves as an enabler for (1) technical interoperability using a standardised interface to be used by all types of clients and (2) allowing experts from different domains to develop complex analyses together as a collaborative effort. Users connect the world models online to the data, which are maintained in centralised storage as 3D spatio-temporal data cubes. This also allows non-experts to extract valuable information from EO data, because data management, low-level interactions and specific software issues can be ignored. We discuss the concept of the proposed system, provide a technical implementation example, describe three use cases for extracting changes from EO images, and demonstrate the usability also for non-EO, gridded, multi-temporal data sets (CORINE land cover). PMID:29387171

  6. The diagnosis of PCOS in young infertile women according to different diagnostic criteria: the role of serum anti-Müllerian hormone.

    PubMed

    Casadei, Luisa; Fanisio, Francesca; Sorge, Roberto Pietro; Collamarini, Matteo; Piccolo, Eleonora; Piccione, Emilio

    2018-07-01

    To diagnose polycystic ovary syndrome (PCOS) in young infertile women using different diagnostic criteria, to define serum anti-Müllerian hormone (AMH) cutoff values for PCOS definition, and to investigate the correlation between AMH and body mass index (BMI). Retrospective case-control study. A total of 140 infertile women (age 21-35 years) were enrolled. PCOS was defined according to the National Institutes of Health (NIH) criteria, the Rotterdam consensus criteria and the Androgen Excess and PCOS Society (AE-PCOS) criteria. ROC curve analysis was performed to define AMH thresholds for PCOS definition according to the three different diagnostic criteria. The correlation between AMH and BMI was investigated. The prevalence of PCOS under the NIH criteria, the Rotterdam criteria and the AE-PCOS criteria was 27.1, 40 and 29.3%, respectively. The optimal threshold of AMH to distinguish NIH PCOS from infertile controls was 5.20 ng/ml (AUC = 0.86, sensitivity 79%, specificity 80%); the best cutoff to detect Rotterdam PCOS was 4.57 ng/ml (AUC = 0.85, sensitivity 78%, specificity 81%); a cutoff of 4.85 ng/ml (AUC = 0.85, sensitivity 80%, specificity 78%) defined PCOS women according to the AE-PCOS criteria. The prevalence of the syndrome became 37.1, 44.3 and 39.2% according to the three criteria, respectively, when using an AMH threshold between 4.57 and 5.20 ng/ml as an alternative to antral follicle count and/or hyperandrogenism. Anti-Müllerian hormone may reconcile the three diagnostic criteria and allow PCOS diagnosis in women with mild symptoms. No significant correlation was found between AMH and BMI in PCOS women or controls.
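Cutoffs like those reported are typically chosen from the ROC curve. The sketch below picks the threshold maximising Youden's J (sensitivity + specificity - 1) on synthetic AMH values; the data and the resulting cutoff are illustrative, not the study's.

```python
# Worked sketch: derive an AMH cutoff from a ROC curve by maximising
# Youden's J. The AMH values and labels below are synthetic.
import numpy as np
from sklearn.metrics import roc_curve

amh  = np.array([1.2, 2.0, 3.1, 4.0, 4.6, 5.3, 6.8, 7.5, 8.9, 2.8])
pcos = np.array([0,   0,   0,   0,   1,   1,   1,   1,   1,   0])

fpr, tpr, thresholds = roc_curve(pcos, amh)
j = tpr - fpr                                   # Youden's index at each cutoff
best = int(np.argmax(j))
print(f"optimal AMH cutoff ~ {thresholds[best]:.2f} ng/ml "
      f"(sens={tpr[best]:.2f}, spec={1 - fpr[best]:.2f})")
```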

  7. Defining and reconstructing clinical processes based on IHE and BPMN 2.0.

    PubMed

    Strasser, Melanie; Pfeifer, Franz; Helm, Emmanuel; Schuler, Andreas; Altmann, Josef

    2011-01-01

    This paper describes the current status and results of our process management system for defining and reconstructing clinical care processes, which makes it possible to compare, analyze and evaluate clinical processes and to identify high-cost tasks or stays. The system is founded on IHE, which guarantees standardized interfaces and interoperability between clinical information systems. At the heart of the system is BPMN, a modeling notation and specification language which allows the definition and execution of clinical processes. The system provides functionality to define clinical core processes independent of the healthcare information system and to execute the processes in a workflow engine. Furthermore, the reconstruction of clinical processes is done by evaluating an IHE audit log database, which records patient movements within a health care facility. The main goal of the system is to assist hospital operators and clinical process managers in detecting discrepancies between defined and actual clinical processes, as well as in identifying the main causes of high medical costs. Beyond that, the system can potentially contribute to reconstructing and improving clinical processes and to enhancing cost control and patient care quality.
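Process reconstruction from an audit trail essentially reduces to grouping events per patient and ordering them in time, then comparing the recovered path with the defined model. The sketch below uses invented event fields and a toy process path; it is not the paper's implementation.

```python
# Sketch: reconstruct a patient's actual care path from an IHE-style audit
# log and compare it with the defined (BPMN-modelled) process.
# Event fields and the modelled path are illustrative.
from collections import defaultdict
from datetime import datetime

audit_log = [
    {"patient": "p1", "station": "Admission", "at": datetime(2011, 1, 5, 8, 0)},
    {"patient": "p1", "station": "Radiology", "at": datetime(2011, 1, 5, 9, 30)},
    {"patient": "p1", "station": "Ward",      "at": datetime(2011, 1, 5, 11, 0)},
]

def reconstruct(log):
    """Group events by patient and order them in time to recover traces."""
    traces = defaultdict(list)
    for event in sorted(log, key=lambda e: e["at"]):
        traces[event["patient"]].append(event["station"])
    return dict(traces)

defined = ["Admission", "Lab", "Radiology", "Ward"]   # modelled process path
actual = reconstruct(audit_log)["p1"]
print("deviation detected" if actual != defined else "process followed")
```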

  8. A General Quality Classification System for eIDs and e-Signatures

    NASA Astrophysics Data System (ADS)

    Ølnes, Jon; Buene, Leif; Andresen, Anette; Grindheim, Håvard; Apitzsch, Jörg; Rossi, Adriano

    The PEPPOL (Pan-European Public Procurement On-Line) project is a large-scale pilot under the CIP programme of the EU, exploring electronic public procurement in a unified European market. Interoperability of electronic signatures across borders is identified as a major obstacle to cross-border procurement. PEPPOL suggests specifying signature acceptance criteria in the form of signature policies that must be transparent and non-discriminatory. Validation solutions must then assess not only signature correctness but also signature policy adherence. This paper addresses perhaps the most important topic of a signature policy: the quality of eIDs and e-signatures. Discrete levels are suggested for eID quality, the assurance level for this quality, and the cryptographic quality of signatures.
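The discrete-level idea can be sketched as a transparent, mechanical policy check. The scales and minimum levels below are invented assumptions, not PEPPOL's actual values.

```python
# Sketch of a signature acceptance policy built on discrete quality levels:
# eID quality, assurance of that quality, and cryptographic quality.
# The 0..4 scales and the policy minimums are invented.
from dataclasses import dataclass

@dataclass
class SignatureAssessment:
    eid_quality: int        # e.g. 0..4, strength of issuer registration
    eid_assurance: int      # e.g. 0..4, confidence in the stated quality
    crypto_quality: int     # e.g. 0..4, algorithm and key strength

POLICY_MINIMUMS = SignatureAssessment(eid_quality=3, eid_assurance=2, crypto_quality=3)

def acceptable(sig: SignatureAssessment, policy=POLICY_MINIMUMS) -> bool:
    """Transparent, non-discriminatory check: every level meets the policy."""
    return (sig.eid_quality >= policy.eid_quality
            and sig.eid_assurance >= policy.eid_assurance
            and sig.crypto_quality >= policy.crypto_quality)

print(acceptable(SignatureAssessment(4, 3, 3)))   # True
print(acceptable(SignatureAssessment(2, 4, 4)))   # False: eID quality too low
```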

  9. Electronic Health Records: VAs Efforts Raise Concerns about Interoperability Goals and Measures, Duplication with DOD, and Future Plans

    DTIC Science & Technology

    2016-07-13

    ELECTRONIC HEALTH RECORDS: VA's Efforts Raise Concerns about Interoperability Goals and Measures, Duplication with DOD...Agencies, Committee on Appropriations, U.S. Senate, July 13, 2016...initiatives with the Department of Defense (DOD) that were intended to advance the ability of the two departments to share electronic health records, the

  10. Enabling Medical Device Interoperability for the Integrated Clinical Environment

    DTIC Science & Technology

    2016-12-01

    else who is eager to work together to mature the healthcare technology ecosystem to enable the next generation of safe and intelligent medical device...Award Number: W81XWH-12-C-0154. Title: "Enabling Medical Device Interoperability for the Integrated Clinical Environment".

  11. An Air Operations Division Live, Virtual, and Constructive (LVC) Corporate Interoperability Standards Development Strategy

    DTIC Science & Technology

    2011-07-01

    Orlando, Florida, September 2009, 09F-SIW-090. [HLA (2000) - 1] - Modeling and Simulation Standard - High Level Architecture (HLA) - Framework and...Simulation Interoperability Workshop, Orlando, FL, USA, September 2009, 09F-SIW-023. [MaK] - www.mak.com [MIL-STD-3011] - MIL-STD-3011...Spring Simulation Interoperability Workshop, Norfolk, VA, USA, March 2007, 07S-SIW-072. [Ross] - Ross, P. and Clark, P. (2005), "Recommended

  12. An HLA-Based Approach to Quantify Achievable Performance for Tactical Edge Applications

    DTIC Science & Technology

    2011-05-01

    in: Proceedings of the 2002 Fall Simulation Interoperability Workshop, 02F-SIW-068, Nov 2002. [16] P. Knight, et al. "WBT RTI Independent...Benchmark Tests: Design, Implementation, and Updated Results", in: Proceedings of the 2002 Spring Simulation Interoperability Workshop, 02S-SIW-081, March...Interoperability Workshop, 98F-SIW-085, Nov 1998. [18] S. Ferenci and R. Fujimoto. "RTI Performance on Shared Memory and Message Passing Architectures", in

  13. Interoperable Open-Source Sensor-Net Frameworks with Sensor-Package Workbench Capabilities: Motivation, Survey of Resources, and Exploratory Results

    DTIC Science & Technology

    2010-06-01

    Military Scenario Definition Language (MSDL) for Nontraditional Warfare Scenarios," Paper 09S-SIW-001, Proceedings of the Spring Simulation...Update to the M&S Community," Paper 09S-SIW-002, Proceedings of the Spring Simulation Interoperability Workshop, Simulation Interoperability...Multiple Simulations: An Application of the Military Scenario Definition Language (MSDL)," Paper 09S-SIW-003, Proc. of the Spring Simulation

  14. Planetary Sciences Interoperability at VO Paris Data Centre

    NASA Astrophysics Data System (ADS)

    Le Sidaner, P.; Aboudarham, J.; Birlan, M.; Briot, D.; Bonnin, X.; Cecconi, B.; Chauvin, C.; Erard, S.; Henry, F.; Lamy, L.; Mancini, M.; Normand, J.; Popescu, F.; Roques, F.; Savalle, R.; Schneider, J.; Shih, A.; Thuillot, W.; Vinatier, S.

    2015-10-01

    The Astronomy community has been developing interoperability for more than 10 years by standardizing data access, data formats, and metadata. This international action is led by the International Virtual Observatory Alliance (IVOA). Observatoire de Paris is an active participant in this project. All actions on interoperability, data and service provision are centralized in and managed by VOParis Data Centre (VOPDC). VOPDC is a coordinated project of all scientific departments of Observatoire de Paris.

  15. Space Telecommunications Radio System STRS Cognitive Radio

    NASA Technical Reports Server (NTRS)

    Briones, Janette C.; Handler, Louis M.

    2013-01-01

    Radios today are evolving from awareness toward cognition. A software defined radio (SDR) provides the most capability for integrating autonomic decision-making ability and allows the incremental evolution toward a cognitive radio. This cognitive radio technology will impact NASA space communications in areas such as spectrum utilization, interoperability, network operations, and radio resource management over a wide range of operating conditions. NASA's cognitive radio will build upon the infrastructure being developed by Space Telecommunication Radio System (STRS) SDR technology. This paper explores the feasibility of inserting cognitive capabilities in the NASA STRS architecture and the interfaces between the cognitive engine and the STRS radio. The STRS architecture defines methods that can inform the cognitive engine about the radio environment so that the cognitive engine can learn autonomously from experience, and take appropriate actions to adapt the radio operating characteristics and optimize performance.

  16. Interoperable and standard e-Health solution over Bluetooth.

    PubMed

    Martinez, I; Del Valle, P; Munoz, P; Trigo, J D; Escayola, J; Martínez-Espronceda, M; Muñoz, A; Serrano, L; Garcia, J

    2010-01-01

    The new paradigm of e-Health demands open sensors and middleware components that permit transparent integration and end-to-end interoperability of new personal health devices. The use of standards seems to be the internationally adopted way to solve these problems. This paper presents the implementation of an end-to-end standards-based e-Health solution, comprising the ISO/IEEE 11073 standard for the interoperability of medical devices in the patient environment and the EN13606 standard for the interoperable exchange of the Electronic Healthcare Record. The design strictly fulfills all the technical features of the most recent versions of both standards. The implemented prototype has been tested in a laboratory environment to demonstrate its feasibility for further transfer to the healthcare system.

  17. UAS Integration into the NAS: Detect and Avoid Display Evaluations in Support of SC-228 MOPS Development

    NASA Technical Reports Server (NTRS)

    Fern, Lisa; Rorie, Conrad; Shively, Jay

    2016-01-01

    At the May 2015 SC-228 meeting, requirements for TCAS II interoperability became elevated in priority. A TCAS interoperability work group was formed to identify and address key issues and questions. The work group came up with an initial list of questions and a plan to address them. As part of that plan, NASA proposed to run a mini HITL to address display, alerting, and guidance issues. A TCAS Interoperability Workshop was held to determine potential display/alerting/guidance issues that could be explored in future NASA mini HITLs. Outcomes included consensus on the main functionality of DAA guidance when a TCAS II RA occurs, a prioritized list of independent variables for the experimental design, and a set of use cases to stress TCAS interoperability.

  18. Interoperability Barriers in NASA Earth Science Data Systems from the Perspective of a Science User (Invited)

    NASA Astrophysics Data System (ADS)

    Kuo, K.

    2010-12-01

    As a practitioner in the field of atmospheric remote sensing, the author, like many similar science users, depends on and makes heavy use of NASA Earth Science remote sensing data. The author was therefore asked by the NASA Earth Science Data and Information System (ESDIS) Project to assess the capabilities of the Earth Observing System Data and Information System (EOSDIS) in order to provide suggestions and recommendations for the evolution of EOSDIS on the path towards its 2015 Vision Tenets. As NASA's Earth science data system, EOSDIS provides data processing, archiving, and distribution services for EOS missions. The science operations of EOSDIS, i.e. data archiving and distribution, are the focus of this report; they are performed within a distributed system of many interconnected nodes, namely the Science Investigator-led Processing Systems (SIPS) and distributed data centers. Since its inception in the early 1990s, EOSDIS has represented a democratization of data, a break from the past when data dissemination was at the discretion of project scientists. Its "open data" policy is so highly valued and well received by its user communities that it has influenced other agencies, even those of other countries, to adopt the same open policy. In the last ~10 years EOSDIS has matured to serve very well users within any given science community in which the varieties of data being used change infrequently. The unpleasant effects of interoperability barriers are now felt more often by users who try to use new data outside their existing familiar set. This paper first defines interoperability and identifies the purposes for achieving it. The sources of interoperability barriers, classified by the author into software, hardware, and human categories, are then examined. For a subset of issues related to software, it presents diagnoses obtained from the author's experience and his survey of the EOSDIS data finding, ordering, retrieving, and extraction services. It also reports on an analysis of his survey regarding tools provided by EOSDIS or its user communities that are intended to make routine data manipulations easier. Barriers in the hardware category are those resulting from differences in the orbit configurations of the spacecraft and differences in the remote sensing modality (active or passive), spectral and spatial resolutions, scanning strategies, etc. of the instruments. Such differences are best understood by considering the nature of remotely sensed observations. Human factors are further classified into institutional and individual subcategories. The former includes factors such as NASA's funding practices, and the latter relates to individuals' propensity for adopting new technologies. Finally, a strategy for overcoming these barriers is proposed.

  19. Usefulness of Urine Output Criteria for Early Detection of Acute Kidney Injury after Transcatheter Aortic Valve Implantation

    PubMed Central

    Shacham, Yacov; Rofe, Maytal; Leshem-Rubinow, Eran; Gal-Oz, Amir; Arbel, Yaron; Keren, Gad; Roth, Arie; Ben-Assa, Eyal; Halkin, Amir; Finkelstein, Ariel; Banai, Shmuel; Steinvil, Arie

    2014-01-01

    Background: Previous studies demonstrated that acute kidney injury (AKI) following transcatheter aortic valve implantation (TAVI) is frequent and associated with adverse outcomes. However, these studies only applied the serum creatinine (sCr) criteria while ignoring the urine output criteria. We hypothesized that adding the urine output criteria might contribute to an earlier diagnosis of AKI. Methods: We included 143 patients with severe aortic stenosis who underwent transfemoral TAVI between December 2012 and April 2014. Urine output was assessed hourly for at least 24 h following TAVI, and sCr was assessed at least daily until discharge. Based on the Valve Academic Research Consortium-2 (VARC-2), AKI was determined using both sCr and urine output criteria. We compared the incidence of AKI and time to AKI diagnosis based on these two methods. Results: The mean age was 81 ± 6 years (range 61-94) and 56% were male. AKI occurred in 27 (19%) patients, 13 (9%) of whom had AKI defined by sCr criteria. Twenty (14%) patients had AKI defined by urine output criteria, only 6 of whom had AKI also defined by sCr criteria. The use of urine output criteria resulted in earlier identification of AKI (18 ± 4 vs. 64 ± 57 h, p = 0.02) and was associated with lower sCr elevation in patients having AKI defined by only urine output criteria (0.03 ± 0.12 vs. 0.37 ± 0.06 mg/dl, p < 0.001). Conclusion: The use of the VARC-2 urine output criteria significantly increased the incidence of AKI and shortened the time to AKI diagnosis. PMID:25737679
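Operationally, a urine output criterion amounts to detecting a sustained run of oliguric hours. A hedged sketch follows; the threshold shown (below about 0.5 ml/kg/h for more than 6 h) reflects the common AKIN-style rule referenced by VARC-2, but the values and window handling are illustrative, so consult the actual VARC-2 definitions for clinical use.

```python
# Hedged sketch of an hourly urine-output screen in the spirit of the
# VARC-2/AKIN criterion. All values are synthetic and illustrative.
def oliguric_hours(hourly_ml, weight_kg, threshold=0.5):
    """Longest consecutive run of hours below the per-kg threshold."""
    run = longest = 0
    for ml in hourly_ml:
        run = run + 1 if ml / weight_kg < threshold else 0
        longest = max(longest, run)
    return longest

urine_ml = [60, 55, 30, 28, 25, 27, 26, 29, 24, 70]   # synthetic post-TAVI hours
if oliguric_hours(urine_ml, weight_kg=80) > 6:
    print("urine output criterion met: evaluate for AKI")
else:
    print("criterion not met on this window")
```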

  20. Data Modeling Challenges of Advanced Interoperability.

    PubMed

    Blobel, Bernd; Oemig, Frank; Ruotsalainen, Pekka

    2018-01-01

    Progressive health paradigms, involving many different disciplines and combining multiple policy domains, require advanced interoperability solutions. This results in special challenges for modeling health systems. The paper discusses classification systems for data models and enterprise business architectures and compares them with the ISO Reference Architecture. On that basis, existing definitions, specifications and standards of data models for interoperability are evaluated and their limitations are discussed. Amendments to correctly use those models and to better meet the aforementioned challenges are offered.

  1. An adaptive semantic based mediation system for data interoperability among Health Information Systems.

    PubMed

    Khan, Wajahat Ali; Khattak, Asad Masood; Hussain, Maqbool; Amin, Muhammad Bilal; Afzal, Muhammad; Nugent, Christopher; Lee, Sungyoung

    2014-08-01

    Heterogeneity in the management of complex medical data obstructs the attainment of data-level interoperability among Health Information Systems (HIS). This diversity stems from the compliance of HISs with different healthcare standards, and resolving it demands a mediation system that accurately interprets data in different heterogeneous formats. We propose an adaptive mediation system, the AdapteR Interoperability ENgine (ARIEN), that arbitrates between HISs compliant with different healthcare standards to achieve accurate and seamless information exchange and thus data interoperability. ARIEN stores the semantic mapping information between different standards in the Mediation Bridge Ontology (MBO) using ontology matching techniques. These mappings are produced by our System for Parallel Heterogeneity (SPHeRe) matching system together with a Personalized-Detailed Clinical Model (P-DCM) approach to guarantee mapping accuracy. The effectiveness of the mappings stored in the MBO is realized as accuracy in the transformation process between different standard formats. We evaluated the proposed system on the transformation of medical records between the Clinical Document Architecture (CDA) and Virtual Medical Record (vMR) standards. Using a pattern-oriented approach drawing on the MBO, the transformation process achieved an accuracy level of over 90% in conversion between the CDA and vMR standards. The proposed mediation system improves the overall communication process between HISs, providing accurate and seamless medical information exchange to ensure data interoperability and timely healthcare services to patients.
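
    A minimal sketch of the mapping-driven mediation described above, assuming flat field paths and a plain dictionary standing in for the Mediation Bridge Ontology; all field paths and values are hypothetical, not ARIEN's actual models:

        # Toy "mediation bridge": CDA-style field paths mapped to vMR-style paths.
        # (ARIEN's MBO is an ontology built by matching tools; a dict stands in here.)
        MBO_MAPPINGS = {
            "recordTarget/patientRole/id": "patient/id",
            "component/observation/code": "observationResult/code",
            "component/observation/value": "observationResult/value",
        }

        def mediate(source_record: dict) -> dict:
            """Translate a flat source record into the target standard's paths."""
            target = {}
            for src_path, value in source_record.items():
                tgt_path = MBO_MAPPINGS.get(src_path)
                if tgt_path is not None:  # fields without a mapping are dropped
                    target[tgt_path] = value
            return target

        cda_like = {"recordTarget/patientRole/id": "12345",
                    "component/observation/value": "7.2 mmol/L"}
        print(mediate(cda_like))
        # {'patient/id': '12345', 'observationResult/value': '7.2 mmol/L'}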

  2. A Framework for Seamless Interoperation of Heterogeneous Distributed Software Components

    DTIC Science & Technology

    2005-05-01

    interoperability, b) distributed resource discovery, and c) validation of quality requirements. Principles and prototypical systems were created to demonstrate the successful completion of the research.

  3. The BACnet Campus Challenge - Part 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Masica, Ken; Tom, Steve

    Here, the BACnet protocol was designed to achieve interoperability among building automation vendors and evolve over time to include new functionality as well as support new communication technologies such as the Ethernet and IP protocols as they became prevalent and economical in the market place. For large multi-building, multi-vendor campus environments, standardizing on the BACnet protocol as an implementation strategy can be a key component in meeting the challenge of an interoperable, flexible, and scalable building automation system. The interoperability of BACnet is especially important when large campuses with legacy equipment have DDC upgrades to facilities performed over different time frames and use different contractors that install equipment from different vendors under the guidance of different campus HVAC project managers. In these circumstances, BACnet can serve as a common foundation for interoperability when potential variability exists in approaches to the design-build process by numerous parties over time. Likewise, BACnet support for a range of networking protocols and technologies can be a key strategy for achieving flexible and scalable automation systems as campuses and enterprises expand networking infrastructures using standard interoperable protocols like IP and Ethernet.

  4. Study and validation of tools interoperability in JPSEC

    NASA Astrophysics Data System (ADS)

    Conan, V.; Sadourny, Y.; Jean-Marie, K.; Chan, C.; Wee, S.; Apostolopoulos, J.

    2005-08-01

    Digital imagery is important in many applications today, and the security of digital imagery is important today and is likely to gain in importance in the near future. The emerging international standard ISO/IEC JPEG-2000 Security (JPSEC) is designed to provide security for digital imagery, and in particular digital imagery coded with the JPEG-2000 image coding standard. One of the primary goals of a standard is to ensure interoperability between creators and consumers implemented by different manufacturers. The JPSEC standard, similar to the popular JPEG and MPEG family of standards, specifies only the bitstream syntax and the receiver's processing, and not how the bitstream is created or the details of how it is consumed. This paper examines interoperability for the JPSEC standard, and presents an example JPSEC consumption process which can provide insights into the design of JPSEC consumers. Initial interoperability tests between different groups with independently created implementations of JPSEC creators and consumers have been successful in providing the JPSEC security services of confidentiality (via encryption) and authentication (via message authentication codes, or MACs). Further interoperability work is on-going.
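
    The authentication service mentioned above rests on message authentication codes. The sketch below shows that generic creator/consumer mechanism with Python's standard library; it illustrates the MAC concept only, not JPSEC's actual bitstream syntax, and the key and segment bytes are placeholders:

        import hashlib
        import hmac

        def protect(segment: bytes, key: bytes) -> bytes:
            """Creator side: compute a MAC over a codestream segment."""
            return hmac.new(key, segment, hashlib.sha256).digest()

        def verify(segment: bytes, tag: bytes, key: bytes) -> bool:
            """Consumer side: recompute the MAC and compare in constant time."""
            return hmac.compare_digest(protect(segment, key), tag)

        key = b"shared-secret-key"
        segment = b"...JPEG-2000 packet data..."
        tag = protect(segment, key)
        print(verify(segment, tag, key))                # True
        print(verify(segment + b"tampered", tag, key))  # False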

  5. K-Means Based Fingerprint Segmentation with Sensor Interoperability

    NASA Astrophysics Data System (ADS)

    Yang, Gongping; Zhou, Guang-Tong; Yin, Yilong; Yang, Xiukun

    2010-12-01

    A critical step in an automatic fingerprint recognition system is the segmentation of fingerprint images. Existing methods are usually designed to segment fingerprint images originating from a certain sensor, so their performance is significantly affected when dealing with fingerprints collected by different sensors. This work studies the sensor interoperability of fingerprint segmentation algorithms, which refers to an algorithm's ability to adapt to raw fingerprints obtained from different sensors. We empirically analyze the sensor interoperability problem, and effectively address the issue by proposing a k-means based segmentation method called SKI. SKI clusters foreground and background blocks of a fingerprint image based on the k-means algorithm, where a fingerprint block is represented by a 3-dimensional feature vector consisting of block-wise coherence, mean, and variance (abbreviated as CMV). SKI also employs morphological postprocessing to achieve favorable segmentation results. We perform SKI on each fingerprint to ensure sensor interoperability. The interoperability and robustness of our method are validated by experiments performed on a number of fingerprint databases obtained from various sensors.
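
    A compact sketch of the CMV-plus-k-means idea described above, assuming a gradient-based coherence measure and a 16-pixel block size; the morphological postprocessing and the final foreground/background label assignment that SKI also performs are omitted:

        import numpy as np
        from sklearn.cluster import KMeans

        def cmv_features(img: np.ndarray, block: int = 16) -> np.ndarray:
            """Block-wise Coherence, Mean, Variance (CMV) features."""
            gy, gx = np.gradient(img.astype(float))
            feats = []
            for i in range(0, img.shape[0] - block + 1, block):
                for j in range(0, img.shape[1] - block + 1, block):
                    b = img[i:i+block, j:j+block].astype(float)
                    bx = gx[i:i+block, j:j+block]
                    by = gy[i:i+block, j:j+block]
                    gxx, gyy, gxy = (bx*bx).sum(), (by*by).sum(), (bx*by).sum()
                    # Standard orientation-field coherence of the block.
                    coh = np.sqrt((gxx - gyy)**2 + 4*gxy**2) / (gxx + gyy + 1e-9)
                    feats.append([coh, b.mean(), b.var()])
            return np.asarray(feats)

        img = np.random.rand(128, 128)   # stand-in for a fingerprint image
        # Two clusters: one will correspond to foreground, one to background.
        labels = KMeans(n_clusters=2, n_init=10).fit_predict(cmv_features(img))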

  6. Identity Management Systems in Healthcare: The Issue of Patient Identifiers

    NASA Astrophysics Data System (ADS)

    Soenens, Els

    According to a recent recommendation of the European Commission, now is the time for Europe to enhance interoperability in eHealth. Although interoperability of patient identifiers seems promising for matters of patient mobility, patient empowerment and effective access to care, we see that today there is indeed a considerable lack of interoperability in the field of patient identification. Looking from a socio-technical rather than a merely technical point of view, one can understand that the development and implementation of an identity management system in a specific healthcare context is influenced by particular social practices, affected by socio-economic history and the political climate, and regulated by specific data protection legislation. Consequently, the process of making patient identification in Europe more interoperable is a development beyond the semantic and syntactic levels. In this paper, we give some examples of today’s patient identifier systems in Europe, discuss the issue of interoperability of (unique) patient identifiers from a socio-technical point of view and try not to ignore the ‘privacy side’ of the story.

  7. The BACnet Campus Challenge - Part 1

    DOE PAGES

    Masica, Ken; Tom, Steve

    2015-12-01

    Here, the BACnet protocol was designed to achieve interoperability among building automation vendors and evolve over time to include new functionality as well as support new communication technologies such as the Ethernet and IP protocols as they became prevalent and economical in the market place. For large multi-building, multi-vendor campus environments, standardizing on the BACnet protocol as an implementation strategy can be a key component in meeting the challenge of an interoperable, flexible, and scalable building automation system. The interoperability of BACnet is especially important when large campuses with legacy equipment have DDC upgrades to facilities performed over different time frames and use different contractors that install equipment from different vendors under the guidance of different campus HVAC project managers. In these circumstances, BACnet can serve as a common foundation for interoperability when potential variability exists in approaches to the design-build process by numerous parties over time. Likewise, BACnet support for a range of networking protocols and technologies can be a key strategy for achieving flexible and scalable automation systems as campuses and enterprises expand networking infrastructures using standard interoperable protocols like IP and Ethernet.

  8. Smart Grid Interoperability Maturity Model Beta Version

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Widergren, Steven E.; Drummond, R.; Giroti, Tony

    The GridWise Architecture Council was formed by the U.S. Department of Energy to promote and enable interoperability among the many entities that interact with the electric power system. This balanced team of industry representatives proposes principles for the development of interoperability concepts and standards. The Council provides industry guidance and tools that make it an available resource for smart grid implementations. In the spirit of advancing interoperability of an ecosystem of smart grid devices and systems, this document presents a model for evaluating the maturity of the artifacts and processes that specify the agreement of parties to collaborate across an information exchange interface. You are expected to have a solid understanding of large, complex system integration concepts and experience in dealing with software component interoperation. Those without this technical background should read the Executive Summary for a description of the purpose and contents of the document. Other documents, such as checklists, guides, and whitepapers, exist for targeted purposes and audiences. Please see the www.gridwiseac.org website for more products of the Council that may be of interest to you.

  9. Presenting an evaluation model of the trauma registry software.

    PubMed

    Asadi, Farkhondeh; Paydar, Somayeh

    2018-04-01

    Trauma causes about 10% of deaths worldwide and is considered a global concern. This problem has led healthcare policy makers and managers to adopt a basic strategy in this context. Trauma registries have an important and basic role in decreasing mortality and the disabilities caused by trauma injuries. Today, different software systems are designed for trauma registries, and evaluating this software improves management and increases the efficiency and effectiveness of these systems. Therefore, the aim of this study is to present an evaluation model for trauma registry software. The present study is applied research. General and specific criteria of trauma registry software were identified by reviewing the literature, including books, articles, scientific documents, valid websites, and related software in this domain. Based on these general and specific criteria and the related software, a model for evaluating trauma registry software was proposed. From the proposed model, a checklist was designed, and its validity and reliability were evaluated. The model was presented, using the Delphi technique, to 12 experts and specialists, and an agreement coefficient of 75% was set as the threshold for applying changes. Finally, after the model was approved by the experts and professionals, the final version of the evaluation model for trauma registry software was presented. The evaluation criteria were organized into two groups: 1) general criteria and 2) specific criteria. General criteria of trauma registry software were classified into four main categories: 1) usability, 2) security, 3) maintainability, and 4) interoperability. Specific criteria were divided into four main categories: 1) data submission and entry, 2) reporting, 3) quality control, and 4) decision and research support. The model presented in this research introduces important general and specific criteria of trauma registry software, along with the subcriteria related to each main criterion, and was validated by experts in this field. It can therefore be used as a comprehensive model and a standard evaluation tool for measuring the efficiency, effectiveness, and performance improvement of trauma registry software.

  10. Evidence for Broadening Criteria for Atypical Depression Which May Define a Reactive Depressive Disorder.

    PubMed

    Silverstein, Brett; Angst, Jules

    2015-01-01

    Objective. To argue that additional symptoms should be added to the criteria for atypical depression. Method. Published research articles on atypical depression are reviewed. Results. (1) The original studies upon which the criteria for atypical depression were based cited fatigue, insomnia, pain, and loss of weight as characteristic symptoms. (2) Several studies of DSM depressive criteria found patients with atypical depression to exhibit high levels of insomnia, fatigue, and loss of appetite/weight. (3) Several studies have found atypical depression to be comorbid with headaches, bulimia, and body image issues. (4) Most probands who report atypical depression meet criteria for "somatic depression," defined as depression associated with several of the following: disordered eating, poor body image, headaches, fatigue, and insomnia. The gender difference in prevalence of atypical depression results from its overlap with somatic depression. Somatic depression is associated with psychosocial measures related to gender, linking it with the descriptions of atypical depression as "reactive" appearing in the studies upon which the original criteria for atypical depression were based. Conclusion. Insomnia, disordered eating, poor body image, and aches/pains should be added as criteria for atypical depression, matching the criteria for somatic depression and defining a reactive depressive disorder possibly distinct from endogenous melancholic depression.

  11. Implementing the HL7v3 standard in Croatian primary healthcare domain.

    PubMed

    Koncar, Miroslav

    2004-01-01

    The mission of HL7 Inc. is to provide standards for the exchange, management and integration of data that support clinical patient care and the management, delivery and evaluation of healthcare services. The scope of this work includes the specification of flexible, cost-effective approaches, standards, guidelines, methodologies, and related services for interoperability between healthcare information systems. In the field of medical information technologies, HL7 provides the world's most advanced information standards. Versions 1 and 2 of the HL7 standard have on the one hand solved many issues, but on the other demonstrated the size and complexity of the health information sharing problem. As the solution, a completely new methodology has been adopted, encompassed in the version 3 recommendations. This approach standardizes the Reference Information Model (RIM), which is the source of all domain models and message structures. Message design is now defined in detail, enabling interoperability between loosely coupled systems that are designed by different vendors and deployed in various environments. At the start of the Primary Healthcare Information System project, we decided to go directly to HL7v3. Implementing the HL7v3 standard in healthcare applications represents a challenging task. By using standardized refinement and localization methods we were able to define information models for the Croatian primary healthcare domain. The scope of our work includes clinical, financial and administrative data management, where in some cases we were compelled to introduce new HL7v3-compliant models. All of the HL7v3 transactions are digitally signed, using the W3C XML Digital Signature standard.
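
    The signing step behind a W3C XML Digital Signature can be illustrated as canonicalization followed by signing the canonical bytes. This is a simplified sketch (a real XML-DSig envelope adds SignedInfo/Reference structures, normally produced by a dedicated library), and the document content uses an HL7 example OID as a placeholder:

        from lxml import etree
        from cryptography.hazmat.primitives import hashes
        from cryptography.hazmat.primitives.asymmetric import rsa, padding

        doc = etree.fromstring(
            b"<ClinicalDocument><id root='2.16.840.1.113883.19.5'/></ClinicalDocument>")
        c14n = etree.tostring(doc, method="c14n")  # canonical XML, as XML-DSig requires

        key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
        signature = key.sign(c14n, padding.PKCS1v15(), hashes.SHA256())

        # Verification recomputes the canonical form and checks the signature;
        # the library raises InvalidSignature if the document was altered.
        key.public_key().verify(signature, c14n, padding.PKCS1v15(), hashes.SHA256())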

  12. Exploring NASA GES DISC Data with Interoperable Services

    NASA Technical Reports Server (NTRS)

    Zhao, Peisheng; Yang, Wenli; Hegde, Mahabal; Wei, Jennifer C.; Kempler, Steven; Pham, Long; Teng, William; Savtchenko, Andrey

    2015-01-01

    Overview of NASA GES DISC (NASA Goddard Earth Science Data and Information Services Center) data with interoperable services. Open-standard, interoperable services improve data discoverability, accessibility, and usability through metadata, catalogue, and portal standards, and achieve data, information, and knowledge sharing across applications with standardized interfaces and protocols. The relevant Open Geospatial Consortium (OGC) data services and specifications are: Web Coverage Service (WCS) for the data themselves; Web Map Service (WMS) for pictures of the data; Web Map Tile Service (WMTS) for pictures of data tiles; and Styled Layer Descriptors (SLD) for rendered styles.
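
    As an illustration of consuming such a service, the sketch below requests a rendered map through OWSLib's WMS client; the endpoint URL and layer name are placeholders, not a documented GES DISC service address:

        # Hedged sketch: fetching a map image from a WMS endpoint with OWSLib.
        from owslib.wms import WebMapService

        # The constructor issues a GetCapabilities request to the service.
        wms = WebMapService("https://example.gov/wms", version="1.1.1")
        print(list(wms.contents))                       # advertised layers

        img = wms.getmap(layers=["AIRS_Surface_Temp"],  # hypothetical layer name
                         srs="EPSG:4326",
                         bbox=(-180, -90, 180, 90),
                         size=(512, 256),
                         format="image/png")
        with open("map.png", "wb") as f:
            f.write(img.read())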

  13. Interoperability for Space Mission Monitor and Control: Applying Technologies from Manufacturing Automation and Process Control Industries

    NASA Technical Reports Server (NTRS)

    Jones, Michael K.

    1998-01-01

    Various issues associated with interoperability for space mission monitor and control are presented in viewgraph form. Specific topics include: 1) Space Project Mission Operations Control Architecture (SuperMOCA) goals and methods for achieving them; 2) Specifics on the architecture: open standards and layering, enhancing interoperability, and promoting commercialization; 3) An advertisement; 4) Status of the task - government/industry cooperation and architecture and technology demonstrations; and 5) Key features of messaging services and virtual devices.

  14. Implementation of a metadata architecture and knowledge collection to support semantic interoperability in an enterprise data warehouse.

    PubMed

    Dhaval, Rakesh; Borlawsky, Tara; Ostrander, Michael; Santangelo, Jennifer; Kamal, Jyoti; Payne, Philip R O

    2008-11-06

    In order to enhance interoperability between enterprise systems, and improve data validity and reliability throughout The Ohio State University Medical Center (OSUMC), we have initiated the development of an ontology-anchored metadata architecture and knowledge collection for our enterprise data warehouse. The metadata and corresponding semantic relationships stored in the OSUMC knowledge collection are intended to promote consistency and interoperability across the heterogeneous clinical, research, business and education information managed within the data warehouse.

  15. Indication Criteria for Total Knee Arthroplasty in Patients with Osteoarthritis - A Multi-perspective Consensus Study.

    PubMed

    Schmitt, Jochen; Lange, Toni; Günther, Klaus-Peter; Kopkow, Christian; Rataj, Elisabeth; Apfelbacher, Christian; Aringer, Martin; Böhle, Eckhardt; Bork, Hartmut; Dreinhöfer, Karsten; Friederich, Niklaus; Frosch, Karl-Heinz; Gravius, Sascha; Gromnica-Ihle, Erika; Heller, Karl-Dieter; Kirschner, Stephan; Kladny, Bernd; Kohlhof, Hendrik; Kremer, Michael; Leuchten, Nicolai; Lippmann, Maike; Malzahn, Jürgen; Meyer, Heiko; Sabatowski, Rainer; Scharf, Hanns-Peter; Stoeve, Johannes; Wagner, Richard; Lützner, Jörg

    2017-10-01

    Background and Objectives Knee osteoarthritis (OA) is a significant public health burden. Rates of total knee arthroplasty (TKA) in OA vary substantially between geographical regions, most likely due to the lack of standardised indication criteria. We set out to define indication criteria for the German healthcare system for TKA in patients with knee OA, on the basis of best evidence and transparent multi-stakeholder consensus. Methods We undertook a complex mixed methods study, including an iterative process of systematic appraisal of existing evidence, Delphi consensus methods and stakeholder conferences. We established a consensus panel representing key German national societies of healthcare providers (orthopaedic surgeons, rheumatologists, pain physicians, psychologists, physiotherapists), payers, and patient representatives. A priori defined consensus criteria were at least 70% agreement and less than 20% disagreement among the consensus panel. Agreement was sought for (1) core indication criteria defined as criteria that must be met to consider TKA in a normal patient with knee OA, (2) additional (not obligatory) indication criteria, (3) absolute contraindication criteria that generally prohibit TKA, and (4) risk factors that do not prohibit TKA, but usually do not lead to a recommendation for TKA. Results The following 5 core indication criteria were agreed within the panel: 1. intermittent (several times per week) or constant knee pain for at least 3 - 6 months; 2. radiological confirmation of structural knee damage (osteoarthritis, osteonecrosis); 3. inadequate response to conservative treatment, including pharmacological and non-pharmacological treatment for at least 3 - 6 months; 4. adverse impact of knee disease on patient's quality of life for at least 3 - 6 months; 5. patient-reported suffering/impairment due to knee disease. Additional indication criteria, contraindication criteria, and risk factors for adverse outcome were also agreed by a large majority within the multi-perspective stakeholder panel. Conclusion The defined indication criteria constitute a prerequisite for appropriate provision of TKA in patients with knee OA in Germany. In eligible patients, shared-decision making should eventually determine if TKA is performed or not. The next important steps are the implementation of the defined indication criteria, and the prospective investigation of predictors of success or failure of TKA in the context of routine care provision in Germany.

  16. Interoperability Outlook in the Big Data Future

    NASA Astrophysics Data System (ADS)

    Kuo, K. S.; Ramachandran, R.

    2015-12-01

    The establishment of distributed active archive centers (DAACs) as data warehouses and the standardization of file formats by NASA's Earth Observing System Data Information System (EOSDIS) doubtlessly propelled the interoperability of NASA Earth science data to unprecedented heights in the 1990s. However, we obviously still feel wanting two decades later. We believe the inadequate interoperability we experience is a result of the current practice in which data are first packaged into files before distribution, and only the metadata of these files are cataloged into databases and become searchable. Data therefore cannot be efficiently filtered, and any extensive study requires downloading large volumes of data files to a local system for processing and analysis. The need to download data not only creates duplication and inefficiency but also further impedes interoperability, because the analysis has to be performed locally by individual researchers in individual institutions. Each institution or researcher often has its or his/her own preference in the choice of data management practice as well as programming languages. Analysis results (derived data) so produced are thus subject to the differences of these practices, which later form formidable barriers to interoperability. A number of Big Data technologies are currently being examined and tested to address Big Earth Data issues. These technologies share one common characteristic: exploiting compute and storage affinity to more efficiently analyze large volumes and great varieties of data. Distributed active "archive" centers are likely to evolve into distributed active "analysis" centers, which not only archive data but also provide analysis services right where the data reside; "analysis" will become the more visible function of these centers. It is thus reasonable to expect interoperability to improve because analysis, in addition to data, becomes more centralized. Within a "distributed active analysis center" interoperability is almost guaranteed because data, analysis, and results can all be readily shared and reused. Effectively, with the establishment of "distributed active analysis centers", interoperation turns from a many-to-many problem into a less complicated few-to-few problem and becomes easier to solve.

  17. OGC and Grid Interoperability in enviroGRIDS Project

    NASA Astrophysics Data System (ADS)

    Gorgan, Dorian; Rodila, Denisa; Bacu, Victor; Giuliani, Gregory; Ray, Nicolas

    2010-05-01

    EnviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is a 4-year FP7 project aiming to address the subjects of ecologically unsustainable development and inadequate resource management. The project develops a Spatial Data Infrastructure of the Black Sea Catchment region. Geospatial technologies offer very specialized functionality for Earth Science oriented applications, while Grid oriented technology is able to support distributed and parallel processing. One challenge of the enviroGRIDS project is the interoperability between geospatial and Grid infrastructures, providing the basic and the extended features of both technologies. The geospatial interoperability technology has been promoted as a way of dealing with large volumes of geospatial data in distributed environments through the development of interoperable Web service specifications proposed by the Open Geospatial Consortium (OGC), with applications spread across multiple fields but especially in Earth observation research. Due to the huge volumes of data available in the geospatial domain and the additional issues this introduces (data management, secure data transfer, data distribution and data computation), an infrastructure capable of managing all those problems becomes an important requirement. The Grid promotes and facilitates the secure interoperation of heterogeneous distributed geospatial data within a distributed environment, supports the creation and management of large distributed computational jobs, and assures a security level for communication and transfer of messages based on certificates. This presentation analyses and discusses the most significant use cases for enabling OGC Web service interoperability with the Grid environment and focuses on the description and implementation of the most promising one. In these use cases we give special attention to issues such as: the relations between the computational Grid and the OGC Web service protocols, the advantages offered by Grid technology (such as providing secure interoperability between distributed geospatial resources), and the issues introduced by the integration of distributed geospatial data in a secure environment: data and service discovery, management, access and computation. The enviroGRIDS project proposes a new architecture which allows a flexible and scalable approach for integrating the geospatial domain, represented by the OGC Web services, with the Grid domain, represented by the gLite middleware. The parallelism offered by Grid technology is discussed and explored at the data level, management level and computation level. The analysis is carried out for OGC Web service interoperability in general, but specific details are emphasized for the Web Map Service (WMS), Web Feature Service (WFS), Web Coverage Service (WCS), Web Processing Service (WPS) and Catalog Service for the Web (CSW). Issues regarding the mapping and the interoperability between the OGC and the Grid standards and protocols are analyzed, as they are the base for solving the communication problems between the two environments: Grid and geospatial. The presentation mainly highlights how the Grid environment and Grid application capabilities can be extended and utilized in geospatial interoperability.
Interoperability between geospatial and Grid infrastructures provides features such as the specific geospatial complex functionality and the high-power computation and security of the Grid, high spatial model resolution and wide geographical area coverage, and flexible combination and interoperability of the geographical models. In accordance with Service Oriented Architecture concepts and the requirements of interoperability between geospatial and Grid infrastructures, each main function is visible from the enviroGRIDS portal and consequently from end-user applications such as Decision Maker/Citizen oriented applications. The enviroGRIDS portal is the single way for the user to get into the system, and the portal presents a uniform graphical user interface. Main reference for further information: [1] enviroGRIDS Project, http://www.envirogrids.net/

  18. Trends of Underweight and Overweight/Obesity Among Japanese Schoolchildren From 2003 to 2012, Defined by Body Mass Index and Percentage Overweight Cutoffs.

    PubMed

    Shirasawa, Takako; Ochiai, Hirotaka; Nanri, Hinako; Nishimura, Rimei; Ohtsu, Tadahiro; Hoshino, Hiromi; Tajima, Naoko; Kokaze, Akatsuki

    2015-01-01

    We investigated the prevalence and trends of underweight and overweight/obesity in a population-based sample of Japanese schoolchildren from 2003 to 2012, defined by body mass index (BMI) and percentage overweight (POW). Subjects comprised fourth and seventh graders from the town of Ina, Japan, from 2003 to 2012. The height and weight of each subject were measured. Children were classified as underweight, normal weight, or overweight/obese using two criteria: BMI cutoff points proposed by the International Obesity Task Force and cutoffs based on POW in Japan. Data from 4367 fourth graders and 3724 seventh graders were analyzed. The prevalences of underweight and overweight as defined by POW criteria were lower than those based on BMI criteria. There was a decrease in the prevalence of overweight among fourth-grade boys and girls and seventh-grade girls according to BMI; this decrease was also observed when POW criteria were used to define overweight. The prevalence and trends of both underweight and overweight as defined by POW were underestimated among Japanese schoolchildren compared with those determined using BMI. The results of this study also suggest that trends in underweight and overweight/obesity using POW criteria are similar to those based on BMI criteria among schoolchildren in Japan.
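
    An illustrative classification along the two criteria compared above. The BMI cutoffs in the sketch are placeholders for the age- and sex-specific IOTF table values, and the ±20% POW thresholds follow the convention commonly used in Japan; treat all constants as assumptions, not the study's exact parameters:

        def bmi(weight_kg: float, height_m: float) -> float:
            return weight_kg / height_m**2

        def classify_bmi(b: float, under_cut=15.0, over_cut=19.8) -> str:
            # Placeholder IOTF-style cutoffs for one age/sex stratum.
            return ("underweight" if b < under_cut
                    else "overweight" if b >= over_cut else "normal")

        def classify_pow(weight_kg: float, standard_weight_kg: float) -> str:
            # POW = percentage deviation from the standard weight for
            # height, sex, and age (standard weight supplied from tables).
            pow_pct = 100 * (weight_kg - standard_weight_kg) / standard_weight_kg
            return ("underweight" if pow_pct <= -20
                    else "overweight" if pow_pct >= 20 else "normal")

        print(classify_bmi(bmi(35.0, 1.40)))                 # BMI ~17.9 -> normal
        print(classify_pow(35.0, standard_weight_kg=33.0))   # +6% -> normal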

  19. Transportation communications interoperability : phase 2, resource evaluation.

    DOT National Transportation Integrated Search

    2006-12-01

    Based on the Arizona Department of Transportation's (ADOT) previous SPR-561 Needs Assessment study, this report continues the efforts to enhance radio interoperability between Department of Public Safety (DPS) Highway Patrol officers and ...

  20. Analysis of OPACITY and PLAID Protocols for Contactless Smart Cards

    DTIC Science & Technology

    2012-09-01

    Table-of-contents fragments only (no abstract available): Access Control; Threats; Synchronization; Simple Integration and Interoperability; Modes of Operation; Suggested Key ...

  1. Plugfest 2009: Global Interoperability in Telerobotics and Telemedicine

    PubMed Central

    King, H. Hawkeye; Hannaford, Blake; Kwok, Ka-Wai; Yang, Guang-Zhong; Griffiths, Paul; Okamura, Allison; Farkhatdinov, Ildar; Ryu, Jee-Hwan; Sankaranarayanan, Ganesh; Arikatla, Venkata; Tadano, Kotaro; Kawashima, Kenji; Peer, Angelika; Schauß, Thomas; Buss, Martin; Miller, Levi; Glozman, Daniel; Rosen, Jacob; Low, Thomas

    2014-01-01

    Despite the great diversity of teleoperator designs and applications, their underlying control systems have many similarities. These similarities can be exploited to enable interoperability between heterogeneous systems. We have developed a network data specification, the Interoperable Telerobotics Protocol, that can be used for Internet-based control of a wide range of teleoperators. In this work we test interoperable telerobotics on the global Internet, focusing on the telesurgery application domain. Fourteen globally dispersed telerobotic master and slave systems were connected in thirty trials in one twenty-four-hour period. Users performed common manipulation tasks to demonstrate effective master-slave operation. With twenty-eight (93%) successful, unique connections, the results show a high potential for standardizing telerobotic operation. Furthermore, new paradigms for telesurgical operation and training are presented, including a networked surgery trainer and upper-limb exoskeleton control of micro-manipulators. PMID:24748993

  2. Semantically Interoperable XML Data

    PubMed Central

    Vergara-Niedermayr, Cristobal; Wang, Fusheng; Pan, Tony; Kurc, Tahsin; Saltz, Joel

    2013-01-01

    XML is ubiquitously used as an information exchange platform for web-based applications in healthcare, life sciences, and many other domains. Proliferating XML data are now managed through the latest native XML database technologies. XML data sources conforming to common XML schemas can be shared and integrated with syntactic interoperability. Semantic interoperability can be achieved through semantic annotations of data models using common data elements linked to concepts from ontologies. In this paper, we present a framework and software system to support the development of semantically interoperable XML-based data sources that can be shared through a Grid infrastructure. We also present our work on supporting semantically validated XML data through semantic annotations for XML Schema, semantic validation, and semantic authoring of XML data. We demonstrate the use of the system for a biomedical database of medical image annotations and markups. PMID:25298789
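
    The core idea of the annotation approach can be sketched as attaching an ontology concept reference to an XML element, which a consumer then resolves against the ontology; the namespace, attribute name, and concept identifier below are hypothetical, not the paper's actual annotation syntax:

        from lxml import etree

        # An element carrying a hypothetical ontology concept reference.
        xml = etree.fromstring(
            b'<lesion xmlns:sa="http://example.org/semantic-annotation" '
            b'sa:concept="ExampleOntology:0001234">mass</lesion>')

        # A consumer reads the reference and resolves it against the ontology,
        # so two sources using different tag names can still agree on meaning.
        concept = xml.get("{http://example.org/semantic-annotation}concept")
        print(concept)  # ExampleOntology:0001234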

  3. Architecture for interoperable software in biology.

    PubMed

    Bare, James Christopher; Baliga, Nitin S

    2014-07-01

    Understanding biological complexity demands a combination of high-throughput data and interdisciplinary skills. One way to bring to bear the necessary combination of data types and expertise is by encapsulating domain knowledge in software and composing that software to create a customized data analysis environment. To this end, simple flexible strategies are needed for interconnecting heterogeneous software tools and enabling data exchange between them. Drawing on our own work and that of others, we present several strategies for interoperability and their consequences, in particular, a set of simple data structures--list, matrix, network, table and tuple--that have proven sufficient to achieve a high degree of interoperability. We provide a few guidelines for the development of future software that will function as part of an interoperable community of software tools for biological data analysis and visualization.
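
    A sketch of the simple-data-structures idea under an assumed JSON wire format (the paper argues for the structures themselves, not any particular serialization): one tool emits a network as plain lists and tuples, another ingests and filters it:

        import json

        # A small "network" exchanged between tools as plain data structures:
        # nodes as a list, edges as (source, target, weight) tuples.
        network = {
            "nodes": ["geneA", "geneB", "geneC"],
            "edges": [("geneA", "geneB", 0.9), ("geneB", "geneC", 0.4)],
        }

        wire = json.dumps(network)       # tool 1 emits (tuples become lists)
        received = json.loads(wire)      # tool 2 ingests
        strong = [e for e in received["edges"] if e[2] > 0.5]
        print(strong)                    # [['geneA', 'geneB', 0.9]]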

  4. Interoperability format translation and transformation between IFC architectural design file and simulation file formats

    DOEpatents

    Chao, Tian-Jy; Kim, Younghun

    2015-02-03

    Automatically translating a building architecture file format (Industry Foundation Class) to a simulation file, in one aspect, may extract data and metadata used by a target simulation tool from a building architecture file. Interoperability data objects may be created and the extracted data is stored in the interoperability data objects. A model translation procedure may be prepared to identify a mapping from a Model View Definition to a translation and transformation function. The extracted data may be transformed using the data stored in the interoperability data objects, an input Model View Definition template, and the translation and transformation function to convert the extracted data to correct geometric values needed for a target simulation file format used by the target simulation tool. The simulation file in the target simulation file format may be generated.
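
    A skeletal rendering of the translation steps in the abstract, with every function name, rule, and output format being hypothetical scaffolding rather than the patented implementation:

        import json

        def extract(ifc_path: str) -> dict:
            """Pull the data/metadata the target simulation tool needs from the IFC file."""
            return {"walls": [{"id": "W1", "area_m2": 12.5}]}  # stand-in parse result

        def translate(data: dict, mvd_template: str) -> dict:
            """Apply the Model View Definition mapping to produce target values."""
            rules = {"walls": lambda w: {"surface": w["id"], "area": w["area_m2"]}}
            return {k: [rules[k](item) for item in v]
                    for k, v in data.items() if k in rules}

        def emit(sim_data: dict, out_path: str) -> None:
            """Write the simulation tool's input file (format depends on the tool)."""
            with open(out_path, "w") as f:
                json.dump(sim_data, f, indent=2)

        emit(translate(extract("building.ifc"), "energy-sim.mvd"), "model.simjson")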

  5. The Long Road to Semantic Interoperability in Support of Public Health: Experiences from Two States

    PubMed Central

    Vreeman, Daniel J.; Grannis, Shaun J.

    2014-01-01

    Proliferation of health information technologies creates opportunities to improve clinical and public health, including high quality, safer care and lower costs. To maximize such potential benefits, health information technologies must readily and reliably exchange information with other systems. However, evidence from public health surveillance programs in two states suggests that operational clinical information systems often fail to use available standards, a barrier to semantic interoperability. Furthermore, analysis of existing policies incentivizing semantic interoperability suggests they have limited impact and are fragmented. In this essay, we discuss three approaches for increasing semantic interoperability to support national goals for using health information technologies. A clear, comprehensive strategy requiring collaborative efforts by clinical and public health stakeholders is suggested as a guide for the long road towards better population health data and outcomes. PMID:24680985

  6. Geoscience Information Network (USGIN) Solutions for Interoperable Open Data Access Requirements

    NASA Astrophysics Data System (ADS)

    Allison, M. L.; Richard, S. M.; Patten, K.

    2014-12-01

    The geosciences are leading the development of free, interoperable open access to data. The US Geoscience Information Network (USGIN) is a freely available data integration framework, jointly developed by the USGS and the Association of American State Geologists (AASG) in compliance with international standards and protocols, that provides easy discovery, access, and interoperability for geoscience data. USGIN standards include the geologic exchange language GeoSciML (v3.2), which enables instant interoperability of geologic formation data and is also the base standard used by the 117-nation OneGeology consortium. The USGIN deployment of NGDS serves as a continent-scale operational demonstration of the expanded OneGeology vision to provide access to all geoscience data worldwide. USGIN is developed to accommodate a variety of applications; for example, the International Renewable Energy Agency streams data live to the Global Atlas of Renewable Energy. Alternatively, users without robust data sharing systems can download and implement a free software package, "GINstack", to easily deploy web services for exposing data online for discovery and access. The White House Open Data Access Initiative requires all federally funded research projects and federal agencies to make their data publicly accessible in an open-source, interoperable format, with metadata. USGIN currently incorporates all aspects of the Initiative, as it emphasizes interoperability. The system is successfully deployed as the National Geothermal Data System (NGDS), officially launched at the White House Energy Datapalooza in May 2014. The USGIN Foundation has been established to ensure this technology continues to be accessible and available.

  7. Articulation Management for Intelligent Integration of Information

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; Tran, Peter B.; Clancy, Daniel (Technical Monitor)

    2001-01-01

    When combining data from distinct sources, there is a need to share meta-data and other knowledge about the various source domains. Due to semantic inconsistencies and heterogeneity of representations, problems arise when multiple domains are merged: knowledge that is irrelevant to the task of interoperation is included, making the result unnecessarily complex. This heterogeneity problem can be eliminated by mediating the conflicts and managing the intersections of the domains. For interoperation and intelligent access to heterogeneous information, the focus is on the intersection of the knowledge, since the intersection defines the required articulation rules. An algebra over domains has been proposed that uses articulation rules to support disciplined manipulation of domain knowledge resources. The objective of a domain algebra is to provide the capability to interrogate many domain knowledge resources which are largely semantically disjoint. The algebra formally supports the tasks of selecting, combining, extending, specializing, and modifying components from a diverse set of domains. This paper presents a domain algebra and demonstrates the use of articulation rules to link declarative interfaces for Internet and enterprise applications. In particular, it discusses the articulation implementation as part of a production system capable of operating over the domain described by the IDL (interface description language) of objects registered in multiple CORBA servers.
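
    A toy rendering of articulation between two domains, where only the intersection expressed by articulation rules crosses the boundary; the vocabularies and rules are invented for illustration and stand in for the formal algebra:

        from typing import Optional

        FLIGHT_DOMAIN = {"aircraft", "tail_number", "route", "fuel_load"}
        MAINT_DOMAIN = {"airframe", "serial_no", "work_order", "fuel_load"}

        # Articulation rules: assertions that terms in different domains correspond.
        RULES = {("aircraft", "airframe"), ("tail_number", "serial_no"),
                 ("fuel_load", "fuel_load")}

        def articulate(term: str) -> Optional[str]:
            """Map a flight-domain term into the maintenance domain, if a rule exists."""
            for flight_term, maint_term in RULES:
                if flight_term == term:
                    return maint_term
            return None

        print(articulate("tail_number"))  # 'serial_no'
        print(articulate("route"))        # None: outside the articulated intersection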

  8. The Human Phenotype Ontology project: linking molecular biology and disease through phenotype data

    PubMed Central

    Köhler, Sebastian; Doelken, Sandra C.; Mungall, Christopher J.; Bauer, Sebastian; Firth, Helen V.; Bailleul-Forestier, Isabelle; Black, Graeme C. M.; Brown, Danielle L.; Brudno, Michael; Campbell, Jennifer; FitzPatrick, David R.; Eppig, Janan T.; Jackson, Andrew P.; Freson, Kathleen; Girdea, Marta; Helbig, Ingo; Hurst, Jane A.; Jähn, Johanna; Jackson, Laird G.; Kelly, Anne M.; Ledbetter, David H.; Mansour, Sahar; Martin, Christa L.; Moss, Celia; Mumford, Andrew; Ouwehand, Willem H.; Park, Soo-Mi; Riggs, Erin Rooney; Scott, Richard H.; Sisodiya, Sanjay; Vooren, Steven Van; Wapner, Ronald J.; Wilkie, Andrew O. M.; Wright, Caroline F.; Vulto-van Silfhout, Anneke T.; de Leeuw, Nicole; de Vries, Bert B. A.; Washingthon, Nicole L.; Smith, Cynthia L.; Westerfield, Monte; Schofield, Paul; Ruef, Barbara J.; Gkoutos, Georgios V.; Haendel, Melissa; Smedley, Damian; Lewis, Suzanna E.; Robinson, Peter N.

    2014-01-01

    The Human Phenotype Ontology (HPO) project, available at http://www.human-phenotype-ontology.org, provides a structured, comprehensive and well-defined set of 10,088 classes (terms) describing human phenotypic abnormalities and 13,326 subclass relations between the HPO classes. In addition we have developed logical definitions for 46% of all HPO classes using terms from ontologies for anatomy, cell types, function, embryology, pathology and other domains. This allows interoperability with several resources, especially those containing phenotype information on model organisms such as mouse and zebrafish. Here we describe the updated HPO database, which provides annotations of 7,278 human hereditary syndromes listed in OMIM, Orphanet and DECIPHER to classes of the HPO. Various meta-attributes such as frequency, references and negations are associated with each annotation. Several large-scale projects worldwide utilize the HPO for describing phenotype information in their datasets. We have therefore generated equivalence mappings to other phenotype vocabularies such as LDDB, Orphanet, MedDRA, UMLS and phenoDB, allowing integration of existing datasets and interoperability with multiple biomedical resources. We have created various ways to access the HPO database content using flat files, a MySQL database, and Web-based tools. All data and documentation on the HPO project can be found online. PMID:24217912

  9. Where to Publish and Find Ontologies? A Survey of Ontology Libraries

    PubMed Central

    d'Aquin, Mathieu; Noy, Natalya F.

    2011-01-01

    One of the key promises of the Semantic Web is its potential to enable and facilitate data interoperability. The ability of data providers and application developers to share and reuse ontologies is a critical component of this data interoperability: if different applications and data sources use the same set of well defined terms for describing their domain and data, it will be much easier for them to “talk” to one another. Ontology libraries are the systems that collect ontologies from different sources and facilitate the tasks of finding, exploring, and using these ontologies. Thus ontology libraries can serve as a link in enabling diverse users and applications to discover, evaluate, use, and publish ontologies. In this paper, we provide a survey of the growing—and surprisingly diverse—landscape of ontology libraries. We highlight how the varying scope and intended use of the libraries affect their features, content, and potential exploitation in applications. From reviewing eleven ontology libraries, we identify a core set of questions that ontology practitioners and users should consider in choosing an ontology library for finding ontologies or publishing their own. We also discuss the research challenges that emerge from this survey, for the developers of ontology libraries to address. PMID:22408576

  10. Supporting interoperability of collaborative networks through engineering of a service-based Mediation Information System (MISE 2.0)

    NASA Astrophysics Data System (ADS)

    Benaben, Frederick; Mu, Wenxin; Boissel-Dallier, Nicolas; Barthe-Delanoe, Anne-Marie; Zribi, Sarah; Pingaud, Herve

    2015-08-01

    The Mediation Information System Engineering project is currently finishing its second iteration (MISE 2.0). The main objective of this scientific project is to provide any emerging collaborative situation with methods and tools to deploy a Mediation Information System (MIS). MISE 2.0 aims at defining and designing a service-based platform dedicated to initiating and supporting the interoperability of collaborative situations among potential partners. This MISE 2.0 platform implements a model-driven engineering approach to the design of a service-oriented MIS dedicated to supporting the collaborative situation. The approach is structured in three layers, each providing its own key innovations: (i) the gathering of individual and collaborative knowledge to provide appropriate collaborative business behaviour (key point: knowledge management, including semantics, exploitation and capitalisation), (ii) the deployment of a mediation information system able to computerise the previously deduced collaborative processes (key point: the automatic generation of collaborative workflows, including connection with existing devices or services), and (iii) the management of the agility of the obtained collaborative network of organisations (key point: supervision of collaborative situations and relevant exploitation of the gathered data). MISE covers business issues (through BPM), technical issues (through an SOA) and agility issues of collaborative situations (through EDA).

  11. Large Scale eHealth Deployment in Europe: Insights from Concurrent Use of Standards.

    PubMed

    Eichelberg, Marco; Chronaki, Catherine

    2016-01-01

    Large-scale eHealth deployment projects face a major challenge when called to select the right set of standards and tools to achieve sustainable interoperability in an ecosystem including both legacy systems and new systems reflecting technological trends and progress. There is no single standard that covers all the needs of an eHealth project, and there is a multitude of overlapping and perhaps competing standards that can be employed to define document formats, terminology, and communication protocols, mirroring alternative technical approaches and schools of thought. eHealth projects need to respond to the important question of how alternative or inconsistently implemented standards and specifications can be used to ensure practical interoperability and long-term sustainability in large-scale eHealth deployment. In the eStandards project, 19 European case studies reporting from R&D and large-scale eHealth deployment and policy projects were analyzed. Although this study is not exhaustive, by reflecting on the concepts, standards, and tools in concurrent use, and on the successes, failures, and lessons learned, this paper offers practical insights into how eHealth deployment projects can make the most of the available eHealth standards and tools, and how standards and profile developing organizations can serve users while embracing sustainability and technical innovation.

  12. Implementation and Evaluation of Four Interoperable Open Standards for the Internet of Things.

    PubMed

    Jazayeri, Mohammad Ali; Liang, Steve H L; Huang, Chih-Yuan

    2015-09-22

    Recently, researchers are focusing on a new use of the Internet called the Internet of Things (IoT), in which enabled electronic devices can be remotely accessed over the Internet. As the realization of the IoT concept is still in its early stages, manufacturers of Internet-connected devices and IoT web service providers are defining their own proprietary protocols based on their targeted applications. Consequently, the IoT is heterogeneous in terms of hardware capabilities and communication protocols. Addressing these heterogeneities by following open standards is a necessary step toward communicating with the full variety of IoT devices. In this research, we assess the feasibility of applying existing open standards on resource-constrained IoT devices. The standard protocols developed in this research are OGC PUCK over Bluetooth, TinySOS, SOS over CoAP, and the OGC SensorThings API. We believe that by hosting open standard protocols on IoT devices, not only do the devices become self-describable, self-contained, and interoperable, but innovative applications can also be easily developed with standardized interfaces. In addition, we use memory consumption, request message size, response message size, and response latency to benchmark the efficiency of the implemented protocols. In all, this research presents and evaluates standards-based solutions to better understand the feasibility of applying existing standards to the IoT vision.
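
    For a flavor of one of the evaluated standards, the sketch below posts an observation to an OGC SensorThings API service; the base URL and Datastream id are placeholders, while the entity layout (Datastreams, Observations, phenomenonTime, result) follows the standard's data model:

        import json
        import urllib.request

        BASE = "http://example.org/SensorThingsService/v1.0"  # hypothetical endpoint
        obs = {"phenomenonTime": "2015-09-22T12:00:00Z", "result": 21.4}

        # Create an Observation linked to Datastream(1) via a standard POST.
        req = urllib.request.Request(
            f"{BASE}/Datastreams(1)/Observations",
            data=json.dumps(obs).encode(),
            headers={"Content-Type": "application/json"},
            method="POST")
        with urllib.request.urlopen(req) as resp:
            print(resp.status)   # 201 Created on success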

  13. An Assessment of Coherence Between Early Warning and Response Systems and Serious Cross-Border Health Threats in the European Union and Turkey

    PubMed Central

    Elif Ekmekci, Perihan

    2017-01-01

    Disease outbreaks have attracted the attention of the public health community to early warning and response systems (EWRS) for communicable diseases and other cross-border threats to health. The European Union (EU) and the World Health Organization (WHO) have published regulations in this area. Decision 1082/2013/EU brought a new approach to the management of public health threats in EU member states. It introduced several innovations, including establishing a Health Security Committee; preparedness and response planning; joint procurement of medical countermeasures; ad hoc monitoring for biological, chemical, and environmental threats; EWRS; and recognition of emergency situations and interoperability between various sectors. Turkey, as an acceding country to the EU and a member of the WHO, has been improving its national public health system to meet EU legislation and WHO standards. This article first explains EWRS as defined in Decision 1082/2013/EU and Turkey’s obligations to align its public health laws with the EU acquis. EWRS in Turkey are then addressed, particularly their coherence with EU policies regarding preparedness and response, alert notification, and interoperability between health and other sectors. Finally, the challenges and limitations of the current Turkish system are discussed and further improvements are suggested. PMID:27511433

  14. Interoperability and security in wireless body area network infrastructures.

    PubMed

    Warren, Steve; Lebak, Jeffrey; Yao, Jianchu; Creekmore, Jonathan; Milenkovic, Aleksandar; Jovanov, Emil

    2005-01-01

    Wireless body area networks (WBANs) and their supporting information infrastructures offer unprecedented opportunities to monitor state of health without constraining the activities of a wearer. These mobile point-of-care systems are now realizable due to the convergence of technologies such as low-power wireless communication standards, plug-and-play device buses, off-the-shelf development kits for low-power microcontrollers, handheld computers, electronic medical records, and the Internet. To increase acceptance of personal monitoring technology while lowering equipment cost, advances must be made in interoperability (at both the system and device levels) and security. This paper presents an overview of WBAN infrastructure work in these areas currently underway in the Medical Component Design Laboratory at Kansas State University (KSU) and at the University of Alabama in Huntsville (UAH). KSU efforts include the development of wearable health status monitoring systems that utilize ISO/IEEE 11073, Bluetooth, Health Level 7, and OpenEMed. WBAN efforts at UAH include the development of wearable activity and health monitors that incorporate ZigBee-compliant wireless sensor platforms with hardware-level encryption and the TinyOS development environment. WBAN infrastructures are complex, requiring many functional support elements. To realize these infrastructures through collaborative efforts, organizations such as KSU and UAH must define and utilize standard interfaces, nomenclature, and security approaches.

  15. A data types profile suitable for use with ISO EN 13606.

    PubMed

    Sun, Shanghua; Austin, Tony; Kalra, Dipak

    2012-12-01

    ISO EN 13606 is a five part International Standard specifying how Electronic Healthcare Record (EHR) information should be communicated between different EHR systems and repositories. Part 1 of the standard defines an information model for representing the EHR information itself, including the representation of types of data value. A later International Standard, ISO 21090:2010, defines a comprehensive set of models for data types needed by all health IT systems. This latter standard is vast, and duplicates some of the functions already handled by ISO EN 13606 part 1. A profile (sub-set) of ISO 21090 would therefore be expected to provide EHR system vendors with a more specially tailored set of data types to implement and avoid the risk of providing more than one modelling option for representing the information properties. This paper describes the process and design decisions made for developing a data types profile for EHR interoperability.
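
    As an illustration of the kind of data type such a profile constrains, here is a coded-value type in the ISO 21090 style rendered as a plain dataclass; the field subset follows the familiar code/codeSystem/displayName pattern but is not the normative profile described in the paper:

        from dataclasses import dataclass
        from typing import Optional

        @dataclass
        class CD:
            """Coded-value data type in the ISO 21090 style (illustrative subset)."""
            code: str                        # concept identifier in the terminology
            codeSystem: str                  # OID of the terminology
            displayName: Optional[str] = None

        bp = CD(code="38341003",                      # SNOMED CT: hypertensive disorder
                codeSystem="2.16.840.1.113883.6.96",  # SNOMED CT OID
                displayName="Hypertensive disorder")
        print(bp)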

  16. A Journey in Standard Development: The Core Manufacturing Simulation Data (CMSD) Information Model.

    PubMed

    Lee, Yung-Tsun Tina

    2015-01-01

    This report documents a journey "from research to an approved standard" of a NIST-led standard development activity. That standard, Core Manufacturing Simulation Data (CMSD) information model, provides neutral structures for the efficient exchange of manufacturing data in a simulation environment. The model was standardized under the auspices of the international Simulation Interoperability Standards Organization (SISO). NIST started the research in 2001 and initiated the standardization effort in 2004. The CMSD standard was published in two SISO Products. In the first Product, the information model was defined in the Unified Modeling Language (UML) and published in 2010 as SISO-STD-008-2010. In the second Product, the information model was defined in Extensible Markup Language (XML) and published in 2013 as SISO-STD-008-01-2012. Both SISO-STD-008-2010 and SISO-STD-008-01-2012 are intended to be used together.

  17. Applications of software-defined radio (SDR) technology in hospital environments.

    PubMed

    Chávez-Santiago, Raúl; Mateska, Aleksandra; Chomu, Konstantin; Gavrilovska, Liljana; Balasingham, Ilangko

    2013-01-01

    A software-defined radio (SDR) is a radio communication system where the major part of its functionality is implemented by means of software in a personal computer or embedded system. Such a design paradigm has the major advantage of producing devices that can receive and transmit widely different radio protocols based solely on the software used. This flexibility opens several application opportunities in hospital environments, where a large number of wired and wireless electronic devices must coexist in confined areas like operating rooms and intensive care units. This paper outlines some possible applications in the 2360-2500 MHz frequency band. These applications include the integration of wireless medical devices in a common communication platform for seamless interoperability, and cognitive radio (CR) for body area networks (BANs) and wireless sensor networks (WSNs) for medical environmental surveillance. The description of a proof-of-concept CR prototype is also presented.

  18. Moving Beyond the 10,000 Ways That Don't Work

    NASA Astrophysics Data System (ADS)

    Bermudez, L. E.; Arctur, D. K.; Rueda, C.

    2009-12-01

    From his research in developing light bulb filaments, Thomas Edison provides us with a good lesson for advancing any venture. He said "I have not failed, I've just found 10,000 ways that won't work." Advancing data and access interoperability is one of those ventures that is difficult to achieve because of the differences among the participating communities. Even within the marine domain, different communities exist, and with them different technologies (formats and protocols) to publish data and its descriptions, and different vocabularies to name things (e.g. parameters, sensor types). Reducing the heterogeneity of technologies is accomplished not only by adopting standards, but also by creating profiles and advancing tools that use those standards. In some cases, standards are advanced by building from existing tools. But what is the best strategy? Edison could provide us with a hint. Prototypes and test beds are essential to achieve interoperability among geospatial communities. The Open Geospatial Consortium (OGC) calls them interoperability experiments. The World Wide Web Consortium (W3C) calls them incubator projects. Prototypes help test and refine specifications. The Marine Metadata Interoperability (MMI) Initiative, which is advancing marine data integration and re-use by promoting community solutions, understood this strategy and started an interoperability demonstration with the SURA Coastal Ocean Observing and Prediction (SCOOP) program. This interoperability demonstration transformed into the OGC Ocean Science Interoperability Experiment (Oceans IE). The Oceans IE brings together the Ocean-Observing community to advance interoperability of ocean observing systems by using OGC Standards. The Oceans IE Phase I investigated the use of OGC Web Feature Service (WFS) and OGC Sensor Observation Service (SOS) standards for representing and exchanging point data records from fixed in-situ marine platforms. The Oceans IE Phase I produced an engineering best practices report, advanced reference implementations, and submitted various change requests that are now being considered by the OGC SOS working group. Building on Phase I, and with a focus on semantically-enabled services, Oceans IE Phase II will continue the use and improvement of OGC specifications in the marine community. We will present the lessons learned and, in particular, the strategy of experimenting with technologies to advance standards for publishing data in marine communities, which could also help advance interoperability in other geospatial communities. We will also discuss the growing collaborations among ocean-observing standards organizations that will bring about the institutional acceptance needed for these technologies and practices to gain traction globally.

  19. DSM-5 AND ICD-11 DEFINITIONS OF POSTTRAUMATIC STRESS DISORDER: INVESTIGATING “NARROW” AND “BROAD” APPROACHES

    PubMed Central

    Stein, Dan J.; McLaughlin, Katie A.; Koenen, Karestan C.; Atwoli, Lukoye; Friedman, Matthew J.; Hill, Eric D.; Maercker, Andreas; Petukhova, Maria; Shahly, Victoria; van Ommeren, Mark; Alonso, Jordi; Borges, Guilherme; de Girolamo, Giovanni; de Jonge, Peter; Demyttenaere, Koen; Florescu, Silvia; Karam, Elie G.; Kawakami, Norito; Matschinger, Herbert; Okoliyski, Michail; Posada-Villa, Jose; Scott, Kate M.; Viana, Maria Carmen; Kessler, Ronald C.

    2014-01-01

    Background The development of the Diagnostic and Statistical Manual of Mental Disorders 5th edition (DSM-5) and ICD-11 has led to reconsideration of diagnostic criteria for posttraumatic stress disorder (PTSD). The World Mental Health (WMH) Surveys allow investigation of the implications of the changing criteria compared to DSM-IV and ICD-10. Methods WMH Surveys in 13 countries asked respondents to enumerate all their lifetime traumatic events (TEs) and randomly selected one TE per respondent for PTSD assessment. DSM-IV and ICD-10 PTSD were assessed for the 23,936 respondents who reported lifetime TEs in these surveys with the fully structured Composite International Diagnostic Interview (CIDI). DSM-5 and proposed ICD-11 criteria were approximated. Associations of the different criteria sets with indicators of clinical severity (distress-impairment, suicidality, comorbid fear-distress disorders, PTSD symptom duration) were examined to investigate the implications of using the different systems. Results A total of 5.6% of respondents met criteria for “broadly defined” PTSD (i.e., full criteria in at least one diagnostic system), with prevalence ranging from 3.0% with DSM-5 to 4.4% with ICD-10. Only one-third of broadly defined cases met criteria in all four systems and another one-third in only one system (narrowly defined cases). Between-system differences in indicators of clinical severity suggest that ICD-10 criteria are least strict and DSM-IV criteria most strict. The more striking result, though, is that significantly elevated indicators of clinical significance were found even for narrowly defined cases for each of the four diagnostic systems. Conclusions These results argue for a broad definition of PTSD defined by any one of the different systems to capture all clinically significant cases of PTSD in future studies. PMID:24894802

  20. Special issue on enabling open and interoperable access to Planetary Science and Heliophysics databases and tools

    NASA Astrophysics Data System (ADS)

    2018-01-01

    The large amount of data generated by modern space missions calls for a change of organization of data distribution and access procedures. Although long term archives exist for telescopic and space-borne observations, high-level functions need to be developed on top of these repositories to make Planetary Science and Heliophysics data more accessible and to favor interoperability. Results of simulations and reference laboratory data also need to be integrated to support and interpret the observations. Interoperable software and interfaces have recently been developed in many scientific domains. The Virtual Observatory (VO) interoperable standards developed for Astronomy by the International Virtual Observatory Alliance (IVOA) can be adapted to Planetary Sciences, as demonstrated by the VESPA (Virtual European Solar and Planetary Access) team within the Europlanet-H2020-RI project. Other communities have developed their own standards: GIS (Geographic Information System) for Earth and planetary surface tools, SPASE (Space Physics Archive Search and Extract) for space plasma, PDS4 (NASA Planetary Data System, version 4) and IPDA (International Planetary Data Alliance) for planetary mission archives, etc., and an effort to make them all interoperable is starting, including automated workflows to process related data from different sources.

  1. Interoperability of Neuroscience Modeling Software

    PubMed Central

    Cannon, Robert C.; Gewaltig, Marc-Oliver; Gleeson, Padraig; Bhalla, Upinder S.; Cornelis, Hugo; Hines, Michael L.; Howell, Fredrick W.; Muller, Eilif; Stiles, Joel R.; Wils, Stefan; De Schutter, Erik

    2009-01-01

    Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to the duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible including the development of portable model description standards, the adoption of common simulation languages or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “Neuro-IT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19-20 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. PMID:17873374

  2. A development framework for semantically interoperable health information systems.

    PubMed

    Lopez, Diego M; Blobel, Bernd G M E

    2009-02-01

    Semantic interoperability is a basic challenge to be met for new generations of distributed, communicating and co-operating health information systems (HIS) enabling shared care and e-Health. Analysis, design, implementation and maintenance of such systems and intrinsic architectures have to follow a unified development methodology. The Generic Component Model (GCM) is used as a framework for modeling any system to evaluate and harmonize state of the art architecture development approaches and standards for health information systems as well as to derive a coherent architecture development framework for sustainable, semantically interoperable HIS and their components. The proposed methodology is based on the Rational Unified Process (RUP), taking advantage of its flexibility to be configured for integrating other architectural approaches such as Service-Oriented Architecture (SOA), Model-Driven Architecture (MDA), ISO 10746, and HL7 Development Framework (HDF). Existing architectural approaches have been analyzed, compared and finally harmonized towards an architecture development framework for advanced health information systems. Starting with the requirements for semantic interoperability derived from paradigm changes for health information systems, and supported in formal software process engineering methods, an appropriate development framework for semantically interoperable HIS has been provided. The usability of the framework has been exemplified in a public health scenario.

  3. A SOA-Based Platform to Support Clinical Data Sharing.

    PubMed

    Gazzarata, R; Giannini, B; Giacomini, M

    2017-01-01

    The eSource Data Interchange Group, part of the Clinical Data Interchange Standards Consortium, proposed five scenarios to guide stakeholders in the development of solutions for the capture of eSource data. The fifth scenario was subdivided into four tiers to adapt the functionality of electronic health records to support clinical research. In order to develop a system belonging to the "Interoperable" Tier, the authors decided to adopt the service-oriented architecture paradigm to support technical interoperability, Health Level Seven Version 3 messages combined with LOINC (Logical Observation Identifiers Names and Codes) vocabulary to ensure semantic interoperability, and Healthcare Services Specification Project standards to provide process interoperability. The developed architecture enhances the integration between patient-care practice and medical research, allowing clinical data sharing between two hospital information systems and four clinical data management systems/clinical registries. The core is formed by a set of standardized cloud services connected through standardized interfaces, involving client applications. The system was approved by the medical staff, since it reduces the workload for the management of clinical trials. Although this architecture can realize the "Interoperable" Tier, the current solution actually covers the "Connected" Tier, due to local hospital policy restrictions.
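
    As a hedged illustration of the semantic layer described above, the sketch below tags a shared observation with a LOINC code so that every receiving system resolves it to the same concept; the message structure is deliberately simplified (real HL7 Version 3 payloads are XML).

    ```python
    # Simplified observation payload: the LOINC code, not the local display
    # name, carries the meaning across systems.
    observation = {
        "code": "2345-7",        # LOINC: Glucose [Mass/volume] in Serum or Plasma
        "codeSystem": "LOINC",
        "value": 98,
        "unit": "mg/dL",
    }

    # A registry or hospital system keys on the code, independent of whatever
    # local label ("blood sugar", "glicemia", ...) was used at the point of care.
    assert observation["code"] == "2345-7"
    print(observation)
    ```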

  4. Using a Foundational Ontology for Reengineering a Software Enterprise Ontology

    NASA Astrophysics Data System (ADS)

    Perini Barcellos, Monalessa; de Almeida Falbo, Ricardo

    The knowledge about software organizations is considerably relevant to software engineers. The use of a common vocabulary for representing the useful knowledge about software organizations involved in software projects is important for several reasons, such as to support knowledge reuse and to allow communication and interoperability between tools. Domain ontologies can be used to define a common vocabulary for sharing and reuse of knowledge about some domain. Foundational ontologies can be used for evaluating and re-designing domain ontologies, giving them real-world semantics. This paper presents an evaluation of a Software Enterprise Ontology that was reengineered using the Unified Foundational Ontology (UFO) as a basis.

  5. Using knowledge rules for pharmacy mapping.

    PubMed

    Shakib, Shaun C; Che, Chengjian; Lau, Lee Min

    2006-01-01

    The 3M Health Information Systems (HIS) Healthcare Data Dictionary (HDD) is used to encode and structure patient medication data for the Electronic Health Record (EHR) of the Department of Defense's (DoD's) Armed Forces Health Longitudinal Technology Application (AHLTA). HDD Subject Matter Experts (SMEs) are responsible for initial and maintenance mapping of disparate, standalone medication master files from all 100 DoD host sites worldwide to a single concept-based vocabulary, to accomplish semantic interoperability. To achieve higher levels of automation, SMEs began defining a growing set of knowledge rules. These knowledge rules were implemented in a pharmacy mapping tool, which enhanced consistency through automation and increased the mapping rate by 29%.
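
    The following sketch illustrates what such a knowledge rule might look like; the rules, the vocabulary entry, and the concept identifier are invented for illustration and are not the 3M HDD implementation.

    ```python
    import re

    # Each rule normalizes one source of lexical variation in local entries.
    KNOWLEDGE_RULES = [
        (re.compile(r"\bhcl\b", re.I), "hydrochloride"),  # expand salt abbreviation
        (re.compile(r"\btabs?\b", re.I), "tablet"),       # normalize dose form
        (re.compile(r"\s+"), " "),                        # collapse whitespace
    ]

    # Target concept-based vocabulary (hypothetical concept identifier).
    VOCABULARY = {"metformin hydrochloride 500 mg tablet": "HDD-0042"}

    def map_to_concept(local_entry: str):
        normalized = local_entry.strip().lower()
        for pattern, replacement in KNOWLEDGE_RULES:
            normalized = pattern.sub(replacement, normalized)
        return VOCABULARY.get(normalized)  # None -> route to a human SME

    print(map_to_concept("Metformin HCl 500 mg Tab"))  # -> HDD-0042
    ```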

  6. Interoperable Archetypes With a Three Folded Terminology Governance.

    PubMed

    Pederson, Rune; Ellingsen, Gunnar

    2015-01-01

    The use of openEHR archetypes increases the interoperability of clinical terminology, and in doing so improves the availability of clinical terminology for both primary and secondary purposes. Where clinical terminology is employed in the EPR system, research reports conflicting results for the use of structuring and standardization as measurements of success. In order to elucidate this concept, this paper focuses on the effort to establish a national repository for openEHR-based archetypes in Norway, where clinical terminology could be included with a three-folded benefit for interoperability.

  7. CCP interoperability and system stability

    NASA Astrophysics Data System (ADS)

    Feng, Xiaobing; Hu, Haibo

    2016-09-01

    To control counterparty risk, financial regulations such as the Dodd-Frank Act are increasingly requiring standardized derivatives trades to be cleared by central counterparties (CCPs). It is anticipated that in the near future, CCPs across the world will be linked through interoperability agreements that facilitate risk sharing but also serve as a conduit for transmitting shocks. This paper theoretically studies a networked network of CCPs linked through interoperability arrangements. The major finding is that different configurations of the networked network of CCPs give rise to different cascading-failure properties.

  8. The Importance of State and Context in Safe Interoperable Medical Systems

    PubMed Central

    Jaffe, Michael B.; Robkin, Michael; Rausch, Tracy; Arney, David; Goldman, Julian M.

    2016-01-01

    This paper describes why “device state” and “patient context” information are necessary components of device models for safe interoperability. This paper includes a discussion of the importance of describing the roles of devices with respect to interactions (including human user workflows involving devices, and device to device communication) within a system, particularly those intended for use at the point-of-care, and how this role information is communicated. In addition, it describes the importance of clinical scenarios in creating device models for interoperable devices. PMID:27730013

  9. Integrating technology to improve medication administration.

    PubMed

    Prusch, Amanda E; Suess, Tina M; Paoletti, Richard D; Olin, Stephen T; Watts, Starann D

    2011-05-01

    The development, implementation, and evaluation of an i.v. interoperability program to advance medication safety at the bedside are described. I.V. interoperability integrates intelligent infusion devices (IIDs), the bar-code-assisted medication administration system, and the electronic medication administration record system into a bar-code-driven workflow that populates provider-ordered, pharmacist-validated infusion parameters on IIDs. The purpose of this project was to improve medication safety through the integration of these technologies and decrease the potential for error during i.v. medication administration. Four key phases were essential to developing and implementing i.v. interoperability: (a) preparation, (b) i.v. interoperability pilot, (c) preliminary validation, and (d) expansion. The establishment of pharmacy involvement in i.v. interoperability resulted in two additional safety checks: pharmacist infusion rate oversight and nurse independent validation of the autoprogrammed rate. After instituting i.v. interoperability, monthly compliance to the telemetry drug library increased to a mean ± S.D. of 72.1% ± 2.1% from 56.5% ± 1.5%, and the medical-surgical nursing unit's drug library monthly compliance rate increased to 58.6% ± 2.9% from 34.1% ± 2.6% (p < 0.001 for both comparisons). The number of manual pump edits decreased with both telemetry and medical-surgical drug libraries, demonstrating a reduction from 56.9 ± 12.8 to 14.2 ± 3.9 and from 61.2 ± 15.4 to 14.7 ± 3.8, respectively (p < 0.001 for both comparisons). Through the integration and incorporation of pharmacist oversight for rate changes, the telemetry and medical-surgical patient care areas demonstrated a 32% reduction in reported monthly errors involving i.v. administration of heparin. By integrating two stand-alone technologies, i.v. interoperability was implemented to improve medication administration. Medication errors were reduced, nursing workflow was simplified, and pharmacists became involved in checking infusion rates of i.v. medications.

  10. Turning Interoperability Operational with GST

    NASA Astrophysics Data System (ADS)

    Schaeben, Helmut; Gabriel, Paul; Gietzel, Jan; Le, Hai Ha

    2013-04-01

    GST - Geosciences in space and time is being developed and implemented as a hub to facilitate the exchange of spatially and temporally indexed multi-dimensional geoscience data and corresponding geomodels amongst partners. It originates from TUBAF's contribution to the EU project "ProMine" and its prospective extensions are TUBAF's contribution to the current EU project "GeoMol". As of today, it provides basic components of a geodata infrastructure as required to establish interoperability with respect to geosciences. Generally, interoperability means the facilitation of cross-border and cross-sector information exchange, taking into account legal, organisational, semantic and technical aspects, cf. Interoperability Solutions for European Public Administrations (ISA), http://ec.europa.eu/isa/. Practical interoperability for partners of a joint geoscience project, say European Geological Surveys acting in a border region, means in particular provision of IT technology to exchange spatially and maybe additionally temporally indexed multi-dimensional geoscience data and corresponding models, i.e. the objects composing geomodels capturing the geometry, topology, and various geoscience contents. Geodata Infrastructure (GDI) and interoperability are objectives of several initiatives, e.g. INSPIRE, OneGeology-Europe, and most recently EGDI-SCOPE to name just the most prominent ones. Then there are quite a few markup languages (ML) related to geographical or geological information like GeoSciML, EarthResourceML, BoreholeML, ResqML for reservoir characterization, earth and reservoir models, and many others featuring geoscience information. Several Web Services are focused on geographical or geoscience information. The Open Geospatial Consortium (OGC) promotes specifications of a Web Feature Service (WFS), a Web Map Service (WMS), a Web Coverage Service (WCS), a Web 3D Service (W3DS), and many more. It will be clarified how GST is related to these initiatives, especially how it complies with existing or developing standards or quasi-standards and how it applies and extends services towards interoperability in the Earth sciences.

  11. Cross-domain Collaborative Research and People Interoperability: Beyond Knowledge Representation Frameworks

    NASA Astrophysics Data System (ADS)

    Fox, P. A.; Diviacco, P.; Busato, A.

    2016-12-01

    Geo-scientific research collaboration commonly faces complex systems where multiple skills and competences are needed at the same time. The efficacy of such collaboration among researchers then becomes of paramount importance. Multidisciplinary studies draw from domains that are far from each other. Researchers also need to understand how to extract the data they need and eventually produce something that can be used by others. The management of information and knowledge in this perspective is non-trivial. Interoperability is frequently sought in computer-to-computer environments, so as to overcome mismatches in vocabulary, data formats, coordinate reference systems and so on. Successful researcher collaboration also relies on interoperability of the people! Smaller, synchronous, face-to-face settings for researchers are known to enhance people interoperability. However, changing settings, whether geographically or temporally, or increasing the team size, diversity, and expertise, requires people-computer-people-computer (...) interoperability. To date, knowledge representation frameworks have been proposed but not proven to be necessary and sufficient to achieve multi-way interoperability. In this contribution, we address the epistemology and sociology of science, advocating for a fluid perspective where science is mostly a social construct, conditioned by cognitive issues, especially cognitive bias. Bias cannot be obliterated; on the contrary, it must be carefully taken into consideration. Information-centric interfaces built from different perspectives and ways of thinking by actors with different points of view, approaches and aims are proposed as a means for enhancing people interoperability in computer-based settings. The contribution will provide details on the approach of augmenting and interfacing knowledge representation frameworks with the cognitive-conceptual frameworks for people that are needed to meet and exceed collaborative research goals in the 21st century. A web-based collaborative portal that integrates both approaches has been developed and will be presented. Reports will be given on initial tests, which have encouraging results.

  12. Ocean Data Interoperability Platform (ODIP): developing a common framework for global marine data management

    NASA Astrophysics Data System (ADS)

    Glaves, H. M.

    2015-12-01

    In recent years marine research has become increasingly multidisciplinary in its approach with a corresponding rise in the demand for large quantities of high quality interoperable data as a result. This requirement for easily discoverable and readily available marine data is currently being addressed by a number of regional initiatives with projects such as SeaDataNet in Europe, Rolling Deck to Repository (R2R) in the USA and the Integrated Marine Observing System (IMOS) in Australia, having implemented local infrastructures to facilitate the exchange of standardised marine datasets. However, each of these systems has been developed to address local requirements and created in isolation from those in other regions. Multidisciplinary marine research on a global scale necessitates a common framework for marine data management which is based on existing data systems. The Ocean Data Interoperability Platform project is seeking to address this requirement by bringing together selected regional marine e-infrastructures for the purposes of developing interoperability across them. By identifying the areas of commonality and incompatibility between these data infrastructures, and leveraging the development activities and expertise of these individual systems, three prototype interoperability solutions are being created which demonstrate the effective sharing of marine data and associated metadata across the participating regional data infrastructures as well as with other target international systems such as GEO, COPERNICUS etc. These interoperability solutions combined with agreed best practice and approved standards, form the basis of a common global approach to marine data management which can be adopted by the wider marine research community. To encourage implementation of these interoperability solutions by other regional marine data infrastructures an impact assessment is being conducted to determine both the technical and financial implications of deploying them alongside existing services. The associated best practice and common standards are also being disseminated to the user community through relevant accreditation processes and related initiatives such as the Research Data Alliance and the Belmont Forum.

  13. Ambulatory measurement of the scapulohumeral rhythm: intra- and inter-operator agreement of a protocol based on inertial and magnetic sensors.

    PubMed

    Parel, I; Cutti, A G; Fiumana, G; Porcellini, G; Verni, G; Accardo, A P

    2012-04-01

    To measure the scapulohumeral rhythm (SHR) in outpatient settings, the motion analysis protocol named ISEO (INAIL Shoulder and Elbow Outpatient protocol) was developed, based on inertial and magnetic sensors. To complete the sensor-to-segment calibration, ISEO requires the involvement of an operator for sensor placement and for positioning the patient's arm in a predefined posture. Since this can affect the measure, this study aimed at quantifying ISEO intra- and inter-operator agreement. Forty subjects were considered, together with two operators, A and B. Three measurement sessions were completed for each subject: two by A and one by B. In each session, the humerus and scapula rotations were measured during sagittal and scapular plane elevation movements. ISEO intra- and inter-operator agreement were assessed by computing, between sessions, the: (1) similarity of the scapulohumeral patterns through the Coefficient of Multiple Correlation (CMC(2)), both considering and excluding the difference of the initial value of the scapula rotations between two sessions (inter-session offset); (2) 95% Smallest Detectable Difference (SDD(95)) in scapula range of motion. Results for CMC(2) showed that the intra- and inter-operator agreement is acceptable (median≥0.85, lower-whisker ≥ 0.75) for most of the scapula rotations, independently from the movement and the inter-session offset. The only exception is the agreement for scapula protraction-retraction and for scapula medio-lateral rotation during abduction (inter-operator), which is acceptable only if the inter-session offset is removed. SDD(95) values ranged from 4.4° to 8.6° for the inter-operator and between 4.9° and 8.5° for the intra-operator agreement. In conclusion, ISEO presents a high intra- and inter-operator agreement, particularly with the scapula inter-session offset removed. Copyright © 2011 Elsevier B.V. All rights reserved.

  14. Positive train control interoperability and networking research : final report.

    DOT National Transportation Integrated Search

    2015-12-01

    This document describes the initial development of an ITC PTC Shared Network (IPSN), a hosted : environment to support the distribution, configuration management, and IT governance of Interoperable : Train Control (ITC) Positive Train Control (PTC) s...

  15. 78 FR 46582 - Federal Advisory Committee Act; Communications Security, Reliability, and Interoperability Council

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-01

    ... FEDERAL COMMUNICATIONS COMMISSION Federal Advisory Committee Act; Communications Security, Reliability, and Interoperability Council AGENCY: Federal Communications Commission. ACTION: Notice of public... persons that the Federal Communications Commission's (FCC or Commission) Communications Security...

  16. CCSDS SM and C Mission Operations Interoperability Prototype

    NASA Technical Reports Server (NTRS)

    Lucord, Steven A.

    2010-01-01

    This slide presentation reviews the prototype of the Spacecraft Monitor and Control (SM&C) Operations for interoperability among space agencies. This particular prototype was exercised with the German Space Agency (DLR) to test the ideas for interagency coordination.

  17. RESTFul based heterogeneous Geoprocessing workflow interoperation for Sensor Web Service

    NASA Astrophysics Data System (ADS)

    Yang, Chao; Chen, Nengcheng; Di, Liping

    2012-10-01

    Advanced sensors on board satellites offer detailed Earth observations. A workflow is one approach for designing, implementing and constructing a flexible and live link between these sensors' resources and users. It can coordinate, organize and aggregate the distributed sensor Web services to meet the requirement of a complex Earth observation scenario. A RESTful-based workflow interoperation method is proposed to integrate heterogeneous workflows into an interoperable unit. The Atom protocols are applied to describe and manage workflow resources. The XML Process Definition Language (XPDL) and Business Process Execution Language (BPEL) workflow standards are applied to structure a workflow that accesses sensor information and one that processes it separately. Then, a scenario for nitrogen dioxide (NO2) from a volcanic eruption is used to investigate the feasibility of the proposed method. The RESTful-based workflow interoperation system can describe, publish, discover, access and coordinate heterogeneous Geoprocessing workflows.
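
    As a rough sketch of the resource-description idea, the code below builds a minimal Atom entry for a workflow resource; the identifiers and link relation are invented for illustration and are not the paper's actual schema.

    ```python
    import xml.etree.ElementTree as ET

    ATOM = "http://www.w3.org/2005/Atom"
    ET.register_namespace("", ATOM)

    # An Atom entry describing one Geoprocessing workflow as a web resource.
    entry = ET.Element(f"{{{ATOM}}}entry")
    ET.SubElement(entry, f"{{{ATOM}}}title").text = "NO2 plume extraction workflow"
    ET.SubElement(entry, f"{{{ATOM}}}id").text = "urn:example:workflow:no2-eruption"
    ET.SubElement(entry, f"{{{ATOM}}}link",
                  {"rel": "related", "href": "http://example.org/workflows/no2/run"})

    print(ET.tostring(entry, encoding="unicode"))
    ```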

  18. Metadata behind the Interoperability of Wireless Sensor Networks

    PubMed Central

    Ballari, Daniela; Wachowicz, Monica; Callejo, Miguel Angel Manso

    2009-01-01

    Wireless Sensor Networks (WSNs) produce changes of status that are frequent, dynamic and unpredictable, and cannot be represented using a linear cause-effect approach. Consequently, a new approach is needed to handle these changes in order to support dynamic interoperability. Our approach is to introduce the notion of context as an explicit representation of changes of a WSN status inferred from metadata elements, which in turn, leads towards a decision-making process about how to maintain dynamic interoperability. This paper describes the developed context model to represent and reason over different WSN status based on four types of contexts, which have been identified as sensing, node, network and organisational contexts. The reasoning has been addressed by developing contextualising and bridges rules. As a result, we were able to demonstrate how contextualising rules have been used to reason on changes of WSN status as a first step towards maintaining dynamic interoperability. PMID:22412330
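
    A minimal sketch of a contextualising rule in this spirit, assuming hypothetical metadata field names, might classify a node's context from its reported metadata:

    ```python
    def node_context(metadata: dict) -> str:
        """Infer a node-context state from raw WSN metadata elements."""
        battery = metadata.get("battery_level", 1.0)          # fraction of full charge
        silent_s = metadata.get("seconds_since_contact", 0)   # time since last report
        if silent_s > 600:
            return "unreachable"   # network-context change: node silent >10 min
        if battery < 0.2:
            return "degraded"      # node-context change: low energy budget
        return "operational"

    # Consumers react to the inferred context instead of raw status changes.
    print(node_context({"battery_level": 0.15, "seconds_since_contact": 30}))  # degraded
    ```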

  20. Interoperability format translation and transformation between IFC architectural design file and simulation file formats

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chao, Tian-Jy; Kim, Younghun

    Automatically translating a building architecture file format (Industry Foundation Class) to a simulation file, in one aspect, may extract data and metadata used by a target simulation tool from a building architecture file. Interoperability data objects may be created and the extracted data is stored in the interoperability data objects. A model translation procedure may be prepared to identify a mapping from a Model View Definition to a translation and transformation function. The extracted data may be transformed using the data stored in the interoperability data objects, an input Model View Definition template, and the translation and transformation function to convert the extracted data to correct geometric values needed for a target simulation file format used by the target simulation tool. The simulation file in the target simulation file format may be generated.
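
    A schematic sketch of that pipeline is shown below; the record layout, class, and function names are placeholders rather than the patented implementation.

    ```python
    from dataclasses import dataclass

    @dataclass
    class InteropObject:
        """Intermediate container decoupling the IFC source from the target format."""
        name: str
        attributes: dict

    def extract(ifc_records: list) -> list:
        # Keep only the entities the target simulation tool needs (walls, here).
        return [InteropObject(r["name"], r) for r in ifc_records if r.get("kind") == "wall"]

    def transform(obj: InteropObject, unit_scale: float = 0.001) -> dict:
        # e.g. convert millimetres to the metres expected by the simulation tool
        return {"id": obj.name, "height_m": obj.attributes["height_mm"] * unit_scale}

    walls = extract([{"kind": "wall", "name": "W1", "height_mm": 2700}])
    print([transform(w) for w in walls])  # [{'id': 'W1', 'height_m': 2.7}]
    ```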

  1. Clinical data interoperability based on archetype transformation.

    PubMed

    Costa, Catalina Martínez; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás

    2011-10-01

    The semantic interoperability between health information systems is a major challenge to improve the quality of clinical practice and patient safety. In recent years many projects have faced this problem and provided solutions based on specific standards and technologies in order to satisfy the needs of a particular scenario. Most of such solutions cannot be easily adapted to new scenarios, thus more global solutions are needed. In this work, we have focused on the semantic interoperability of electronic healthcare records standards based on the dual model architecture and we have developed a solution that has been applied to ISO 13606 and openEHR. The technological infrastructure combines reference models, archetypes and ontologies, with the support of Model-driven Engineering techniques. For this purpose, the interoperability infrastructure developed in previous work by our group has been reused and extended to cover the requirements of data transformation. Copyright © 2011 Elsevier Inc. All rights reserved.

  2. Finalizing the CCSDS Space-Data Link Layer Security Protocol: Setup and Execution of the Interoperability Testing

    NASA Technical Reports Server (NTRS)

    Fischer, Daniel; Aguilar-Sanchez, Ignacio; Saba, Bruno; Moury, Gilles; Biggerstaff, Craig; Bailey, Brandon; Weiss, Howard; Pilgram, Martin; Richter, Dorothea

    2015-01-01

    The protection of data transmitted over the space-link is an issue of growing importance also for civilian space missions. Through the Consultative Committee for Space Data Systems (CCSDS), space agencies have reacted to this need by specifying the Space Data-Link Layer Security (SDLS) protocol which provides confidentiality and integrity services for the CCSDS Telemetry (TM), Telecommand (TC) and Advanced Orbiting Services (AOS) space data-link protocols. This paper describes the approach of the CCSDS SDLS working group to specify and execute the necessary interoperability tests. It first details the individual SDLS implementations that have been produced by ESA, NASA, and CNES and then the overall architecture that allows the interoperability tests between them. The paper reports on the results of the interoperability tests and identifies relevant aspects for the evolution of the test environment.

  3. Sharing and interoperation of Digital Dongying geospatial data

    NASA Astrophysics Data System (ADS)

    Zhao, Jun; Liu, Gaohuan; Han, Lit-tao; Zhang, Rui-ju; Wang, Zhi-an

    2006-10-01

    The Digital Dongying project was put forward by Dongying city, Shandong province, and authenticated by the Ministry of Information Industry, Ministry of Science and Technology and Ministry of Construction P.R.CHINA in 2002. After five years of building, the informatization level of Dongying has reached an advanced degree. In order to advance the building of Digital Dongying and to realize geospatial data sharing, geographic information sharing standards were drawn up and put into practice. Secondly, the Digital Dongying Geographic Information Sharing Platform has been constructed and developed; it is a highly integrated platform combining WebGIS, 3S (GIS, GPS, RS), object-oriented RDBMS, Internet, DCOM, etc. It provides an indispensable platform for the sharing and interoperation of Digital Dongying geospatial data. According to the standards, and based on the platform, sharing and interoperation of "Digital Dongying" geospatial data have come into practice and good results have been obtained. However, a sound leadership group is necessary for data sharing and interoperation.

  4. Operational Interoperability Challenges on the Example of GEOSS and WIS

    NASA Astrophysics Data System (ADS)

    Heene, M.; Buesselberg, T.; Schroeder, D.; Brotzer, A.; Nativi, S.

    2015-12-01

    The following poster highlights the operational interoperability challenges using the example of the Global Earth Observation System of Systems (GEOSS) and the World Meteorological Organization Information System (WIS). At the heart of both systems is a catalogue of earth observation data, products and services, but with different metadata management concepts. While WIS has strong governance with its own metadata profile for its hundreds of thousands of metadata records, GEOSS adopted a more open approach for its ten million records. Furthermore, the development of WIS - as an operational system - follows a roadmap with committed downward compatibility, while the GEOSS development process is more agile. The poster discusses how interoperability can be achieved for the different metadata management concepts and how a proxy concept helps to couple two systems that follow different development methodologies. Furthermore, the poster highlights the importance of monitoring and backup concepts as a verification method for operational interoperability.

  5. Argo: an integrative, interactive, text mining-based workbench supporting curation

    PubMed Central

    Rak, Rafal; Rowley, Andrew; Black, William; Ananiadou, Sophia

    2012-01-01

    Curation of biomedical literature is often supported by the automatic analysis of textual content that generally involves a sequence of individual processing components. Text mining (TM) has been used to enhance the process of manual biocuration, but has been focused on specific databases and tasks rather than an environment integrating TM tools into the curation pipeline, catering for a variety of tasks, types of information and applications. Processing components usually come from different sources and often lack interoperability. The well established Unstructured Information Management Architecture is a framework that addresses interoperability by defining common data structures and interfaces. However, most of the efforts are targeted towards software developers and are not suitable for curators, or are otherwise inconvenient to use on a higher level of abstraction. To overcome these issues we introduce Argo, an interoperable, integrative, interactive and collaborative system for text analysis with a convenient graphic user interface to ease the development of processing workflows and boost productivity in labour-intensive manual curation. Robust, scalable text analytics follow a modular approach, adopting component modules for distinct levels of text analysis. The user interface is available entirely through a web browser that saves the user from going through often complicated and platform-dependent installation procedures. Argo comes with a predefined set of processing components commonly used in text analysis, while giving the users the ability to deposit their own components. The system accommodates various areas and levels of user expertise, from TM and computational linguistics to ontology-based curation. One of the key functionalities of Argo is its ability to seamlessly incorporate user-interactive components, such as manual annotation editors, into otherwise completely automatic pipelines. As a use case, we demonstrate the functionality of an in-built manual annotation editor that is well suited for in-text corpus annotation tasks. Database URL: http://www.nactem.ac.uk/Argo PMID:22434844

  6. Applications of the pipeline environment for visual informatics and genomics computations

    PubMed Central

    2011-01-01

    Background Contemporary informatics and genomics research require efficient, flexible and robust management of large heterogeneous data, advanced computational tools, powerful visualization, reliable hardware infrastructure, interoperability of computational resources, and detailed data and analysis-protocol provenance. The Pipeline is a client-server distributed computational environment that facilitates the visual graphical construction, execution, monitoring, validation and dissemination of advanced data analysis protocols. Results This paper reports on the applications of the LONI Pipeline environment to address two informatics challenges - graphical management of diverse genomics tools, and the interoperability of informatics software. Specifically, this manuscript presents the concrete details of deploying general informatics suites and individual software tools to new hardware infrastructures, the design, validation and execution of new visual analysis protocols via the Pipeline graphical interface, and integration of diverse informatics tools via the Pipeline eXtensible Markup Language syntax. We demonstrate each of these processes using several established informatics packages (e.g., miBLAST, EMBOSS, mrFAST, GWASS, MAQ, SAMtools, Bowtie) for basic local sequence alignment and search, molecular biology data analysis, and genome-wide association studies. These examples demonstrate the power of the Pipeline graphical workflow environment to enable integration of bioinformatics resources which provide a well-defined syntax for dynamic specification of the input/output parameters and the run-time execution controls. Conclusions The LONI Pipeline environment http://pipeline.loni.ucla.edu provides a flexible graphical infrastructure for efficient biomedical computing and distributed informatics research. The interactive Pipeline resource manager enables the utilization and interoperability of diverse types of informatics resources. The Pipeline client-server model provides computational power to a broad spectrum of informatics investigators - experienced developers and novice users, users with or without access to advanced computational resources (e.g., Grid, data), as well as basic and translational scientists. The open development, validation and dissemination of computational networks (pipeline workflows) facilitates the sharing of knowledge, tools, protocols and best practices, and enables the unbiased validation and replication of scientific findings by the entire community. PMID:21791102

  7. Design and Implement an Interoperable Internet of Things Application Based on an Extended OGC SensorThings API Standard

    NASA Astrophysics Data System (ADS)

    Huang, C. Y.; Wu, C. H.

    2016-06-01

    The Internet of Things (IoT) is an infrastructure that interconnects uniquely-identifiable devices using the Internet. By interconnecting everyday appliances, various monitoring and physical mashup applications can be constructed to improve people's daily life. However, IoT devices created by different manufacturers follow different proprietary protocols and cannot communicate with each other. This heterogeneity issue causes different products to be locked in multiple closed ecosystems that we call IoT silos. In order to address this issue, a common industrial solution is the hub approach, which implements connectors to communicate with IoT devices following different protocols. However, with the growing number of proprietary protocols proposed by device manufacturers, IoT hubs need to support and maintain a lot of customized connectors. Hence, we believe the ultimate solution to address the heterogeneity issue is to follow open and interoperable standards. Among the existing IoT standards, the Open Geospatial Consortium (OGC) SensorThings API standard supports a comprehensive conceptual model and query functionalities. The first version of SensorThings API mainly focuses on connecting to IoT devices and sharing sensor observations online, which is the sensing capability. Besides the sensing capability, IoT devices could also be controlled via the Internet, which is the tasking capability. While the tasking capability was not included in the first version of the SensorThings API standard, this research aims at defining the tasking capability profile and integrating it with the SensorThings API standard, which we call the extended-SensorThings API in this paper. In general, this research proposes a lightweight JSON-based web service description, the "Tasking Capability Description", allowing device owners and manufacturers to describe different IoT device protocols. Through the extended-SensorThings API, users and applications can follow a coherent protocol to control IoT devices that use different communication protocols, which could consequently achieve an interoperable Internet of Things infrastructure.
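
    The sketch below shows what such a lightweight, JSON-based description might look like; the field names are illustrative assumptions, not the normative extended-SensorThings API schema.

    ```python
    import json

    # Hypothetical Tasking Capability Description for a controllable device.
    tasking_capability = {
        "name": "set-led-state",
        "description": "Switch the device's LED on or off",
        "taskingParameters": [
            {"name": "state", "type": "boolean"}  # the one controllable parameter
        ],
        "protocolBinding": "mqtt",  # transport the service uses behind the scenes
    }

    # A client reads the description and builds a valid task without knowing
    # the manufacturer's proprietary protocol.
    print(json.dumps(tasking_capability, indent=2))
    ```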

  8. A generic method for improving the spatial interoperability of medical and ecological databases.

    PubMed

    Ghenassia, A; Beuscart, J B; Ficheur, G; Occelli, F; Babykina, E; Chazard, E; Genin, M

    2017-10-03

    The availability of big data in healthcare and the intensive development of data reuse and georeferencing have opened up perspectives for health spatial analysis. However, fine-scale spatial studies of ecological and medical databases are limited by the change of support problem and thus a lack of spatial unit interoperability. The use of spatial disaggregation methods to solve this problem introduces errors into the spatial estimations. Here, we present a generic, two-step method for merging medical and ecological databases that avoids the use of spatial disaggregation methods, while maximizing the spatial resolution. Firstly, a mapping table is created after one or more transition matrices have been defined. The latter link the spatial units of the original databases to the spatial units of the final database. Secondly, the mapping table is validated by (1) comparing the covariates contained in the two original databases, and (2) checking the spatial validity with a spatial continuity criterion and a spatial resolution index. We used our novel method to merge a medical database (the French national diagnosis-related group database, containing 5644 spatial units) with an ecological database (produced by the French National Institute of Statistics and Economic Studies, and containing 36,594 spatial units). The mapping table yielded 5632 final spatial units. The mapping table's validity was evaluated by comparing the number of births in the medical database and the ecological databases in each final spatial unit. The median [interquartile range] relative difference was 2.3% [0; 5.7]. The spatial continuity criterion was low (2.4%), and the spatial resolution index was greater than for most French administrative areas. Our innovative approach improves interoperability between medical and ecological databases and facilitates fine-scale spatial analyses. We have shown that disaggregation models and large aggregation techniques are not necessarily the best ways to tackle the change of support problem.
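
    A toy version of the validation step (comparing a shared covariate per final spatial unit) could look like the sketch below; the units, counts, and the symmetric relative-difference formula are assumptions for illustration.

    ```python
    # Births per final spatial unit, as counted in each source database.
    births_medical = {"unit_A": 120, "unit_B": 80, "unit_C": 200}     # DRG database
    births_ecological = {"unit_A": 118, "unit_B": 85, "unit_C": 197}  # INSEE database

    def relative_difference(a: float, b: float) -> float:
        return abs(a - b) / ((a + b) / 2) * 100  # symmetric % difference

    diffs = sorted(relative_difference(births_medical[u], births_ecological[u])
                   for u in births_medical)
    median = diffs[len(diffs) // 2]  # exact median for an odd number of units
    print(f"median relative difference: {median:.1f}%")  # 1.7% here
    ```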

  9. Study and Simulation of Enhancements for TCP (Transmission Control Protocol) Performance Over Noisy, High-Latency Links

    NASA Technical Reports Server (NTRS)

    Shepard, Timothy J.; Partridge, Craig; Coulter, Robert

    1997-01-01

    The designers of the TCP/IP protocol suite explicitly included support of satellites in their design goals. The goal of the Internet Project was to design a protocol which could be layered over different networking technologies to allow them to be concatenated into an internet. The results of this project included two protocols, IP and TCP. IP is the protocol used by all elements in the network and it defines the standard packet format for IP datagrams. TCP is the end-to-end transport protocol commonly used between end systems on the Internet to derive a reliable bi-directional byte-pipe service from the underlying unreliable IP datagram service. Satellite links are explicitly mentioned in Vint Cerf's 2-page article which appeared in 1980 in CCR [2] to introduce the specifications for IP and TCP. In the past fifteen years, TCP has been demonstrated to work over many differing networking technologies, including paths containing satellite links. So if satellite links were in the minds of the designers from the beginning, what is the problem? The problem is that the performance of TCP has in some cases been disappointing. A goal of the authors of the original specification of TCP was to specify only enough behavior to ensure interoperability. The specification left a number of important decisions, in particular how much data is to be sent when, to the implementor. This was deliberately done. Leaving performance-related decisions to the implementor would allow TCP to be tuned and adapted to different networks and situations in the future without the need to revise the specification of the protocol or break interoperability. Interoperability would continue while future implementations would be allowed flexibility to adapt to needs which could not be anticipated at the time of the original protocol design.
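
    A back-of-envelope calculation (not from the report itself) shows why a fixed window disappoints over satellite paths: a sender can keep at most one window of data in flight per round trip, so throughput is capped at window/RTT regardless of link bandwidth.

    ```python
    rtt_s = 0.56           # ~560 ms round trip over a geostationary satellite hop
    window_bytes = 65535   # classic TCP window limit without window scaling

    max_throughput_bps = window_bytes * 8 / rtt_s
    print(f"max ~{max_throughput_bps / 1e6:.2f} Mbit/s")  # ~0.94 Mbit/s
    ```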

  10. User-Defined Material Model for Progressive Failure Analysis

    NASA Technical Reports Server (NTRS)

    Knight, Norman F. Jr.; Reeder, James R. (Technical Monitor)

    2006-01-01

    An overview of different types of composite material system architectures and a brief review of progressive failure material modeling methods used for structural analysis including failure initiation and material degradation are presented. Different failure initiation criteria and material degradation models are described that define progressive failure formulations. These progressive failure formulations are implemented in a user-defined material model (or UMAT) for use with the ABAQUS/Standard nonlinear finite element analysis tool. The failure initiation criteria include the maximum stress criteria, maximum strain criteria, the Tsai-Wu failure polynomial, and the Hashin criteria. The material degradation model is based on the ply-discounting approach where the local material constitutive coefficients are degraded. Applications and extensions of the progressive failure analysis material model address two-dimensional plate and shell finite elements and three-dimensional solid finite elements. Implementation details and use of the UMAT subroutine are described in the present paper. Parametric studies for composite structures are discussed to illustrate the features of the progressive failure modeling methods that have been implemented.
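
    As a minimal sketch of one named criterion, the code below checks the maximum stress criterion for a single ply and applies a ply-discount knockdown; the allowables and the degradation factor are illustrative numbers, and a production UMAT would be Fortran inside ABAQUS rather than Python.

    ```python
    def max_stress_failed(s11, s22, Xt, Xc, Yt, Yc) -> bool:
        """Ply fails if any stress component exceeds its directional allowable."""
        fibre_ok = -Xc <= s11 <= Xt    # fibre-direction tension/compression
        matrix_ok = -Yc <= s22 <= Yt   # transverse (matrix) direction
        return not (fibre_ok and matrix_ok)

    # Ply-discount degradation: once a ply fails, knock down its stiffness.
    E1 = 150e9  # fibre-direction modulus, Pa
    if max_stress_failed(s11=1.6e9, s22=20e6, Xt=1.5e9, Xc=1.2e9, Yt=50e6, Yc=200e6):
        E1 *= 1e-3  # degraded constitutive coefficient
    print(E1)  # 150 GPa knocked down to 0.15 GPa
    ```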

  11. A Story of a Crashed Plane in US-Mexican border

    NASA Astrophysics Data System (ADS)

    Bermudez, Luis; Hobona, Gobe; Vretanos, Peter; Peterson, Perry

    2013-04-01

    A plane has crashed on the US-Mexican border. The search and rescue command center planner needs to find information about the crash site, a mountain, nearby mountains for the establishment of a communications tower, as well as ranches for setting up a local incident center. Events like this one occur all over the world and exchanging information seamlessly is key to saving lives and preventing further disasters. This abstract describes an interoperability testbed that applied this scenario using technologies based on Open Geospatial Consortium (OGC) standards. The OGC, which has about 500 members, serves as a global forum for the collaboration of developers and users of spatial data products and services, and to advance the development of international standards for geospatial interoperability. The OGC Interoperability Program conducts international interoperability testbeds, such as the OGC Web Services Phase 9 (OWS-9), that encourages rapid development, testing, validation, demonstration and adoption of open, consensus based standards and best practices. The Cross-Community Interoperability (CCI) thread in OWS-9 advanced the Web Feature Service for Gazetteers (WFS-G) by providing a Single Point of Entry Global Gazetteer (SPEGG), where a user can submit a single query and access global geographic names data across multiple Federal names databases. Currently users must make two queries with differing input parameters against two separate databases to obtain authoritative cross border geographic names data. The gazetteers in this scenario included: GNIS and GNS. GNIS or Geographic Names Information System is managed by USGS. It was first developed in 1964 and contains information about domestic and Antarctic names. GNS or GeoNET Names Server provides the Geographic Names Data Base (GNDB) and it is managed by National Geospatial Intelligence Agency (NGA). GNS has been in service since 1994, and serves names for areas outside the United States and its dependent areas, as well as names for undersea features. The following challenges were advanced: Cascaded WFS-G servers (allowing multiple WFSs to be queried through a "parent" WFS), implemented name query filters (e.g. fuzzy search, text search), implemented dealing with multilingualism and diacritics, implemented advanced spatial constraints (e.g. search by radial search and nearest neighbor) and semantically mediated feature types (e.g. mountain vs. hill). To enable semantic mediation, a series of semantic mappings were defined between the NGA GNS, USGS GNIS and the Alexandria Digital Library (ADL) Gazetteer. The mappings were encoded in the Web Ontology Language (OWL) to enable them to be used by semantic web technologies. The semantic mappings were then published for ingestion into a semantic mediator that used the mappings to associate location types from one gazetteer with location types in another. The semantic mediator was then able to transform requests on the fly, providing a single point of entry WFS-G to multiple gazetteers. The presentation will provide a live demonstration of the work performed, highlight main developments, and discuss future development.

  12. Ocean Data Interoperability Platform (ODIP): developing a common global framework for marine data management through international collaboration

    NASA Astrophysics Data System (ADS)

    Glaves, Helen

    2015-04-01

    Marine research is rapidly moving away from traditional discipline specific science to a wider ecosystem level approach. This more multidisciplinary approach to ocean science requires large amounts of good quality, interoperable data to be readily available for use in an increasing range of new and complex applications. Significant amounts of marine data and information are already available throughout the world as a result of e-infrastructures being established at a regional level to manage and deliver marine data to the end user. However, each of these initiatives has been developed to address specific regional requirements and independently of those in other regions. Establishing a common framework for marine data management on a global scale necessitates that there is interoperability across these existing data infrastructures and active collaboration between the organisations responsible for their management. The Ocean Data Interoperability Platform (ODIP) project is promoting co-ordination between a number of these existing regional e-infrastructures including SeaDataNet and Geo-Seas in Europe, the Integrated Marine Observing System (IMOS) in Australia, the Rolling Deck to Repository (R2R) in the USA and the international IODE initiative. To demonstrate this co-ordinated approach the ODIP project partners are currently working together to develop several prototypes to test and evaluate potential interoperability solutions for solving the incompatibilities between the individual regional marine data infrastructures. However, many of the issues being addressed by the Ocean Data Interoperability Platform are not specific to marine science. For this reason many of the outcomes of this international collaborative effort are equally relevant and transferable to other domains.

  13. Trends of Underweight and Overweight/Obesity Among Japanese Schoolchildren From 2003 to 2012, Defined by Body Mass Index and Percentage Overweight Cutoffs

    PubMed Central

    Shirasawa, Takako; Ochiai, Hirotaka; Nanri, Hinako; Nishimura, Rimei; Ohtsu, Tadahiro; Hoshino, Hiromi; Tajima, Naoko; Kokaze, Akatsuki

    2015-01-01

    Background We investigated the prevalence and trends of underweight and overweight/obesity in a population-based sample of Japanese schoolchildren from 2003 to 2012, defined by body mass index (BMI) and percentage overweight (POW). Methods Subjects comprised fourth and seventh graders from the town of Ina, Japan, from 2003 to 2012. The height and weight of each subject were measured. Children were classified as underweight, normal weight, or overweight/obese using two criteria: BMI cutoff points proposed by the International Obesity Task Force and cutoffs based on POW in Japan. Results Data from 4367 fourth graders and 3724 seventh graders were analyzed. The prevalences of underweight and overweight as defined by the POW criteria were lower than those based on the BMI criteria. There was a decrease in the prevalence of overweight among fourth-grade boys and girls and seventh-grade girls according to BMI; this decrease was also observed when POW criteria were used to define overweight. Conclusions The prevalence and trends of both underweight and overweight as defined by POW were underestimated among Japanese schoolchildren compared to those determined using BMI. The results of this study also suggest that trends in underweight and overweight/obesity using POW criteria are similar to those based on BMI criteria among schoolchildren in Japan. PMID:25986154
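
    For readers unfamiliar with the two definitions, the sketch below contrasts them in code. The POW formula is the standard Japanese definition (weight relative to the sex-, age- and height-specific standard body weight); the ±20% cutoffs are commonly used school-health values stated here as an assumption, since the abstract does not reproduce them, and the IOTF BMI cutoffs (age- and sex-specific tables) are omitted.

        def bmi(weight_kg, height_m):
            """Body mass index: weight divided by height squared."""
            return weight_kg / height_m ** 2

        def pow_percent(weight_kg, standard_weight_kg):
            """Percentage overweight relative to the Japanese standard body
            weight for the child's sex, age and height (supplied by caller)."""
            return 100.0 * (weight_kg - standard_weight_kg) / standard_weight_kg

        def classify_by_pow(p):
            # Assumed cutoffs, commonly used in Japanese school health checks;
            # the paper's exact thresholds are not given in the abstract.
            if p <= -20.0:
                return "underweight"
            if p >= 20.0:
                return "overweight/obese"
            return "normal weight"

        print(round(bmi(45.0, 1.50), 1))                    # -> 20.0
        print(classify_by_pow(pow_percent(45.0, 34.0)))     # -> overweight/obese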

  14. Insulin secretion and action in North Indian women during pregnancy.

    PubMed

    Arora, G P; Almgren, P; Thaman, R G; Pal, A; Groop, L; Vaag, A; Prasad, R B; Brøns, C

    2017-10-01

    The relative role(s) of impaired insulin secretion vs. insulin resistance in the development of gestational diabetes mellitus depends upon multiple risk factors and diagnostic criteria. Here, we explored their relative contributions to gestational diabetes as defined by the WHO 1999 (GDM1999) and adapted WHO 2013 (GDM2013) criteria, excluding the 1-h glucose value, in a high-risk Indian population from Punjab. Insulin secretion (HOMA2-B) and insulin action (HOMA2-IR) were assessed in 4665 Indian women with or without gestational diabetes defined by the GDM1999 or adapted GDM2013 criteria. Gestational diabetes defined using both criteria was associated with decreased insulin secretion compared with pregnant women with normal glucose tolerance. Women with gestational diabetes defined by the adapted GDM2013, but not GDM1999 criteria, were more insulin resistant than pregnant women with normal glucose tolerance, and furthermore displayed lower insulin secretion than GDM1999 women. Urban habitat, illiteracy, high age and low BMI were independently associated with reduced insulin secretion, whereas Sikh religion, increasing age and BMI, as well as a family history of diabetes, were independently associated with increased insulin resistance. Gestational diabetes risk factors influence insulin secretion and action in North Indian women in a differential manner. Gestational diabetes classified using the adapted GDM2013 compared with GDM1999 criteria is associated with more severe impairments of insulin secretion and action. © 2017 Diabetes UK.
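
    The HOMA2 indices used in the study come from the HOMA2 computer model, which has no closed-form expression. As a hedged illustration only, the sketch below computes the first-generation HOMA1 approximations of the same two quantities:

        # Hedged illustration: the study used HOMA2 indices from the
        # University of Oxford HOMA2 computer model. The HOMA1 formulas
        # below are the simpler first-generation approximations of the
        # same quantities, not what the authors computed.
        def homa1_ir(fasting_glucose_mmol_l, fasting_insulin_mu_l):
            """HOMA1 insulin resistance index."""
            return fasting_glucose_mmol_l * fasting_insulin_mu_l / 22.5

        def homa1_b(fasting_glucose_mmol_l, fasting_insulin_mu_l):
            """HOMA1 beta-cell function (%); valid for glucose > 3.5 mmol/L."""
            return 20.0 * fasting_insulin_mu_l / (fasting_glucose_mmol_l - 3.5)

        print(round(homa1_ir(5.1, 10.0), 2))  # -> 2.27
        print(round(homa1_b(5.1, 10.0), 1))   # -> 125.0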

  15. When They Can't Talk, Lives Are Lost : What Public Officials Need to Know About Interoperability

    DOT National Transportation Integrated Search

    2003-02-01

    This publication was developed to facilitate education and discussion among and between elected and appointed officials, their representative associations, and public safety representatives on public safety wireless communications interoperability. T...

  16. Design and Implementation of a REST API for the Human Well Being Index (HWBI)

    EPA Science Inventory

    Interoperable software development uses principles of component reuse, systems integration, flexible data transfer, and standardized ontological documentation to promote access, reuse, and integration of code. While interoperability principles are increasingly considered technolo...
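
    As a hedged sketch of what such a service might look like, and not the EPA's actual HWBI API, the following minimal Flask endpoint (the route, parameter names, and scoring stub are all hypothetical) returns a composite index for a requested location:

        # A minimal sketch, assuming a Flask-style service; not the EPA's
        # actual HWBI API.
        from flask import Flask, jsonify, request

        app = Flask(__name__)

        @app.route("/hwbi/score")
        def hwbi_score():
            county = request.args.get("county", "unknown")
            # Placeholder: a real service would aggregate domain indicators
            # (health, safety, social cohesion, ...) into a composite index.
            return jsonify({"county": county, "hwbi": 52.3})

        if __name__ == "__main__":
            app.run(port=5000)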

  18. CityGML - Interoperable semantic 3D city models

    NASA Astrophysics Data System (ADS)

    Gröger, Gerhard; Plümer, Lutz

    2012-07-01

    CityGML is the international standard of the Open Geospatial Consortium (OGC) for the representation and exchange of 3D city models. It defines the three-dimensional geometry, topology, semantics and appearance of the most relevant topographic objects in urban or regional contexts. These definitions are provided in different, well-defined Levels-of-Detail (multiresolution model). The focus of CityGML is on the semantic aspects of 3D city models, their structures, taxonomies and aggregations, allowing users to employ virtual 3D city models for advanced analysis and visualization tasks in a variety of application domains such as urban planning, indoor/outdoor pedestrian navigation, environmental simulations, cultural heritage, or facility management. This is in contrast to purely geometrical/graphical models such as KML, VRML, or X3D, which do not provide sufficient semantics. CityGML is based on the Geography Markup Language (GML), which provides a standardized geometry model. Due to this model and its well-defined semantics and structures, CityGML facilitates interoperable data exchange in the context of geo web services and spatial data infrastructures. Since its standardization in 2008, CityGML has come into worldwide use: tools from notable companies in the geospatial field provide CityGML interfaces, and many applications and projects use the standard. CityGML also has a strong impact on science: numerous approaches use CityGML, particularly its semantics, for disaster management, emergency response, or energy-related applications as well as for visualization; contribute to CityGML by improving its consistency and validity; or use CityGML, particularly its different Levels-of-Detail, as a source or target for generalization. This paper gives an overview of CityGML, its underlying concepts, its Levels-of-Detail, how to extend it, its applications, its likely future development, and the role it plays in scientific research. Furthermore, its relationship to other standards from the fields of computer graphics and computer-aided architectural design and to the prospective INSPIRE model is discussed, as well as the impact CityGML has had, and is having, on the software industry, on applications of 3D city models, and on science generally.
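
    Because CityGML is GML-based XML, its semantic content is accessible with ordinary tooling. A minimal sketch, using the CityGML 2.0 namespace URIs and a placeholder file name, of extracting building heights:

        # A minimal sketch of reading building heights from a CityGML 2.0
        # file with the Python standard library; the namespace URIs match
        # CityGML 2.0, but the input file name is a placeholder.
        import xml.etree.ElementTree as ET

        NS = {
            "bldg": "http://www.opengis.net/citygml/building/2.0",
            "gml": "http://www.opengis.net/gml",
        }

        tree = ET.parse("city_model.gml")  # hypothetical input file
        for building in tree.iter("{%s}Building" % NS["bldg"]):
            gml_id = building.get("{%s}id" % NS["gml"])
            height = building.find("bldg:measuredHeight", NS)
            if height is not None:
                print(gml_id, height.text, height.get("uom"))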

  19. Lessons learned from the implementation of remote control for the interoperability standard ISO/IEEE11073-20601 in a standard weighing scale.

    PubMed

    Barrón-González, Héctor Gilberto; Martínez-Espronceda, Miguel; Trigo, Jesús Daniel; Led, Santiago; Serrano, Luis

    2016-01-01

    The Point of Care (PoC) version of the interoperability standard ISO/IEEE11073 (X73) provided a mechanism to remotely control agents through documents X73-10201 and X73-20301. The newer version of X73, oriented to Personal Health Devices (PHD), provides no such mechanism. The authors are working toward a common proposal with the PHD Working Group (PHD-WG) to adapt the remote control capabilities from X73PoC to X73PHD. However, this theoretical adaptation has to be implemented and tested to evaluate whether its inclusion entails an acceptable overhead and extra cost. Such a proof-of-concept assessment is the main objective of this paper. For the sake of simplicity, a weighing scale with configurable operation was chosen as the use case. First, the model was defined in a previous stage of the research. Second, the implementation methodology, both in terms of hardware and software, was defined and executed. Third, an evaluation methodology to test the remote control features was defined. Then, a thorough comparison between a weighing scale with and without remote control was performed. The results indicate that, when implementing remote control in a weighing scale, the relative weight of this feature represents an overhead of as much as 53%, whereas the number of Implementation Conformance Statements (ICSs) to be satisfied by the manufacturer increases by as much as 34% relative to the implementation without remote control. The new feature facilitates remote control of PHDs but, at the same time, increases overhead and costs; manufacturers therefore need to weigh this trade-off. As a conclusion, this proof-of-concept helps foster the evolution of the remote control proposal to extend X73PHD, promotes its inclusion as part of the standard, and illustrates the methodological steps for extrapolating it to other specializations. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  20. 76 FR 71048 - Assistance to Firefighters Grant Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-16

    ... the highest-scoring applications for awards. Applications that involve interoperable communications... categories: (1) Activities designed to reach high- risk target groups and mitigate incidences of death and... communications project is consistent with the Statewide Communications Interoperability Plan (SCIP). If the State...

  1. Identification of port communication equipment needs for safety, security, and interoperability

    DOT National Transportation Integrated Search

    2007-12-01

    Identification of Port Communication Equipment Needs for Safety, Security, and Interoperability is a major concern for current and future needs. The data demonstrate that two-way radios should be the most effective method of communication in both rou...

  2. Conceptual Model Formalization in a Semantic Interoperability Service Framework: Transforming Relational Database Schemas to OWL.

    PubMed

    Bravo, Carlos; Suarez, Carlos; González, Carolina; López, Diego; Blobel, Bernd

    2014-01-01

    Healthcare information is distributed across multiple heterogeneous and autonomous systems, and accessing and sharing distributed information sources is a challenging task. To contribute to meeting this challenge, this paper presents a formal, complete and semi-automatic transformation service from Relational Databases to the Web Ontology Language. The proposed service makes use of an algorithm that can transform several data models from different domains, relying mainly on inheritance rules. The paper emphasizes the relevance of integrating the proposed approach into an ontology-based interoperability service to achieve semantic interoperability.
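
    The paper's algorithm is not reproduced here, but the general table-to-class style of transformation it describes can be sketched as follows (invented schema and naming conventions; the output is Turtle-like fragments without prefix declarations):

        # A minimal sketch, under assumed naming conventions, of a
        # rule-based relational-to-OWL transformation: tables become OWL
        # classes, columns become datatype properties, and foreign keys
        # become object properties. Not the paper's algorithm.
        schema = {
            "Patient": {"columns": ["id", "name"], "foreign_keys": {}},
            "Encounter": {"columns": ["id", "date"],
                          "foreign_keys": {"patient_id": "Patient"}},
        }

        def to_owl(schema, base="http://example.org/onto#"):
            lines = []
            for table, spec in schema.items():
                lines.append(f"<{base}{table}> a owl:Class .")
                for col in spec["columns"]:
                    lines.append(f"<{base}{table}.{col}> a owl:DatatypeProperty ;"
                                 f" rdfs:domain <{base}{table}> .")
                for fk, target in spec["foreign_keys"].items():
                    lines.append(f"<{base}{table}.{fk}> a owl:ObjectProperty ;"
                                 f" rdfs:domain <{base}{table}> ;"
                                 f" rdfs:range <{base}{target}> .")
            return "\n".join(lines)

        print(to_owl(schema))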

  3. FLTSATCOM interoperability applications

    NASA Astrophysics Data System (ADS)

    Woolford, Lynn

    A mobile Fleet Satellite Communications (FLTSATCOM) system called the Mobile Operational Control Center (MOCC) was developed and has demonstrated the ability to interoperate with many of the current FLTSATCOM command and control channels. This low-cost system is secure in all its communications, is lightweight, and provides a gateway for other communications formats. The major elements of this system are a personal computer, a protocol microprocessor, and off-the-shelf mobile communication components. It is concluded that, with both FLTSATCOM channel protocol and data format interoperability, the MOCC has the ability to provide vital information in or near real time, which significantly improves mission effectiveness.

  4. A framework for semantic interoperability in healthcare: a service oriented architecture based on health informatics standards.

    PubMed

    Ryan, Amanda; Eklund, Peter

    2008-01-01

    Healthcare information is composed of many types of varying and heterogeneous data. Semantic interoperability in healthcare is especially important when all these different types of data need to interact. Presented in this paper is a solution to interoperability in healthcare based on a standards-based middleware software architecture used in enterprise solutions. This architecture has been translated into the healthcare domain using a messaging and modeling standard which upholds the ideals of the Semantic Web (HL7 V3) combined with a well-known standard terminology of clinical terms (SNOMED CT).

  5. Scientific Digital Libraries, Interoperability, and Ontologies

    NASA Technical Reports Server (NTRS)

    Hughes, J. Steven; Crichton, Daniel J.; Mattmann, Chris A.

    2009-01-01

    Scientific digital libraries serve complex and evolving research communities. Justifications for the development of scientific digital libraries include the desire to preserve science data and the promises of information interconnectedness, correlative science, and system interoperability. Shared ontologies are fundamental to fulfilling these promises. We present a tool framework, some informal principles, and several case studies where shared ontologies are used to guide the implementation of scientific digital libraries. The tool framework, based on an ontology modeling tool, was configured to develop, manage, and keep shared ontologies relevant within changing domains and to promote the interoperability, interconnectedness, and correlation desired by scientists.

  6. Current Barriers to Large-scale Interoperability of Traceability Technology in the Seafood Sector.

    PubMed

    Hardt, Marah J; Flett, Keith; Howell, Colleen J

    2017-08-01

    Interoperability is a critical component of full-chain digital traceability, but is almost nonexistent in the seafood industry. Using both quantitative and qualitative methodology, this study explores the barriers impeding progress toward large-scale interoperability among digital traceability systems in the seafood sector from the perspectives of seafood companies, technology vendors, and supply chains as a whole. We highlight lessons from recent research and field work focused on implementing traceability across full supply chains and make some recommendations for next steps in terms of overcoming challenges and scaling current efforts. © 2017 Institute of Food Technologists®.

  7. A Spectrum of Interoperability: The Site for Science Prototype for the NSDL; Re-Inventing the Wheel? Standards, Interoperability and Digital Cultural Content; Preservation Risk Management for Web Resources: Virtual Remote Control in Cornell's Project Prism; Safekeeping: A Cooperative Approach to Building a Digital Preservation Resource; Object Persistence and Availability in Digital Libraries; Illinois Digital Cultural Heritage Community-Collaborative Interactions among Libraries, Museums and Elementary Schools.

    ERIC Educational Resources Information Center

    Arms, William Y.; Hillmann, Diane; Lagoze, Carl; Krafft, Dean; Marisa, Richard; Saylor, John; Terizzi, Carol; Van de Sompel, Herbert; Gill, Tony; Miller, Paul; Kenney, Anne R.; McGovern, Nancy Y.; Botticelli, Peter; Entlich, Richard; Payette, Sandra; Berthon, Hilary; Thomas, Susan; Webb, Colin; Nelson, Michael L.; Allen, B. Danette; Bennett, Nuala A.; Sandore, Beth; Pianfetti, Evangeline S.

    2002-01-01

    Discusses digital libraries, including interoperability, metadata, and international standards; Web resource preservation efforts at Cornell University; digital preservation at the National Library of Australia; object persistence and availability; collaboration among libraries, museums and elementary schools; Asian digital libraries; and a Web…

  8. Interoperability of Electronic Health Records: A Physician-Driven Redesign.

    PubMed

    Miller, Holly; Johns, Lucy

    2018-01-01

    PURPOSE: Electronic health records (EHRs), now used by hundreds of thousands of providers and encouraged by federal policy, have the potential to improve quality and decrease costs in health care. But interoperability, although technically feasible among different EHR systems, is the weak link in the chain of logic. Interoperability is inhibited by poor understanding, by suboptimal implementation, and at times by a disinclination to dilute market share or patient base on the part of vendors or providers, respectively. The intent of this project has been to develop a series of practicable recommendations that, if followed by EHR vendors and users, can promote and enhance interoperability, helping EHRs reach their potential. METHODOLOGY: A group of 11 physicians, one nurse, and one health policy consultant, practicing from California to Massachusetts, has developed a document titled "Feature and Function Recommendations To Optimize Clinician Usability of Direct Interoperability To Enhance Patient Care" that offers recommendations from the clinician point of view. This report introduces some of these recommendations and suggests their implications for policy and the "virtualization" of EHRs. CONCLUSION: Widespread adoption of even a few of these recommendations by designers and vendors would enable a major advance toward the "Triple Aim" of improving the patient experience, improving the health of populations, and reducing per capita costs.

  9. The Sea Monitoring Virtual Research Community (VRC) in the EVER-EST Project (a virtual research environment for the Earth Sciences).

    NASA Astrophysics Data System (ADS)

    Foglini, Federica; Boero, Ferdinando; Guarino, Raffaele

    2016-04-01

    The EU's H2020 EVER-EST Project is dedicated to the realization of a Virtual Research Environment (VRE) for Earth Science researchers during 2015-2018. In this framework, Sea Monitoring is one of the four use-case VRCs chosen to validate the EVER-EST e-infrastructure, which aims to represent a wide and multidisciplinary Earth Science domain. The objective of the Sea Monitoring Virtual Research Community (VRC) is to provide useful and applicable contributions to the identification and definition of the variables indicated by the European Commission in the Marine Directive under the framework for Good Environmental Status (GES). The European Marine Strategy Framework Directive (MSFD, http://ec.europa.eu/environment/marine/index_en.htm) has defined the descriptors of Good Environmental Status in marine waters. The first descriptor is biodiversity; the second is the presence of non-indigenous species; the remaining nine (even when they consider physical, chemical or geological variables) require proper functioning of the ecosystem, linked to a good state of biodiversity. The Sea Monitoring VRC is directed at providing practical methods, procedures and protocols to support a coherent and widely accepted interpretation of Descriptors 1 (biodiversity), 2 (non-indigenous species), 4 (food webs) and 6 (seafloor integrity) identified in GES. In that context, the criteria and methodological standards already identified by the European Commission will be taken into account, together with the activities and projects in progress in the marine framework. This search for practical methods to estimate and measure GES parameters requires close cooperation among specialists from different disciplines, including biologists, geologists, geophysicists, oceanographers, Earth observation experts and others. It will also require a number of different types of scientific data and observations (e.g. biological, chemico-physical, etc.) from different inputs and sensors (e.g. remote sensing, on-site buoys, marine stations, administrations, citizen observations, etc.). Furthermore, different communities require support and guidance to be able to effectively interoperate and share practices, methods, standards and terminologies. The EVER-EST VRE will provide the Sea Monitoring VRC user community with an innovative framework aimed at enhancing their ability to interoperate and share knowledge, experience and methods for GES assessment and monitoring. Furthermore, the Sea Monitoring VRC will focus on the implementation of Research Objects (ROs; semantically rich aggregations of resources bringing together data, documents and methods in scientific investigations) for GES assessment, to be shared among the wide sea-monitoring community for the first time.

  10. Dispelling myths about rare disease registry system development

    PubMed Central

    2013-01-01

    Rare disease registries (RDRs) are an essential tool to improve knowledge and monitor interventions for rare diseases. If designed appropriately, the patient- and disease-related information captured within them can become the cornerstone for effective diagnosis and new therapies. Surprisingly, however, registries possess a diverse range of functionality, operate in different, often incompatible, software environments and serve various, and sometimes incongruous, purposes. Given the ambitious goals of the International Rare Diseases Research Consortium (IRDiRC) for 2020 and beyond, RDRs must be designed with the agility to evolve and efficiently interoperate in an ever-changing rare disease landscape, as well as to cater for rapid changes in information and communication technologies. In this paper, we contend that RDR requirements will also evolve in response to a number of factors, such as changing disease definitions and diagnostic criteria, the requirement to integrate patient/disease information from advances in biotechnology and/or phenotyping approaches, and the need to adapt dynamically to security and privacy concerns. We dispel a number of myths in RDR development, outline key criteria for robust and sustainable RDR implementation and introduce the concept of an RDR Checklist to guide future RDR development. PMID:24131574

  11. Transition of advanced technology to military, homeland security, and law enforcement users

    NASA Astrophysics Data System (ADS)

    Jarrett, Stephen M.

    2004-09-01

    With the attack on the United States and the subsequent war on terror and the wars in Afghanistan and Iraq, a need has been exposed for the transition of technology to all of our defenders, both combat forces on the foreign battlefield and domestic forces here at home. The establishment of the Department of Homeland Security has also provided a focus on inserting technology to dramatically improve the capability of airport security forces, law enforcement, and all first-responder networks. The drastic increase in the use of Special Forces in combat has also required new advanced technology capabilities at a much faster rate of development than the standard military procurement system allows. Technology developers must address the questions of interoperability, cost, and commercialization, how these groups will use the delivered technology, and the adoption criteria of users in the deployment environment. The successful transition to the field must address the formation of complex concepts of operations in the user's adoption criteria. Prototype transitions for two systems, a pocket infrared camera and an acoustic/seismic detector, will be highlighted, with emphasis on their effect on the wars in Iraq and Afghanistan and on the heightening of homeland security.

  12. 75 FR 417 - National Protection and Programs Directorate; Statewide Communication Interoperability Plan...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-05

    ... Communication Interoperability Plan Implementation Report AGENCY: National Protection and Programs Directorate... Directorate/Cybersecurity and Communications/Office of Emergency Communications, has submitted the following... INFORMATION: The Office of Emergency Communications (OEC), formed under Title XVIII of the Homeland Security...

  13. A Conceptual Framework to Enhance the Interoperability of Observatories among Countries, Continents and the World

    NASA Astrophysics Data System (ADS)

    Loescher, H.; Fundamental Instrument Unit

    2013-05-01

    Ecological research addresses challenges relating to the dynamics of the planet, such as changes in climate, biodiversity, ecosystem functioning and services, carbon and energy cycles, natural and human-induced hazards, and adaptation and mitigation strategies, that involve many science and engineering disciplines and cross national boundaries. Because of the global nature of these challenges, greater international collaboration is required for knowledge sharing and technology deployment to advance earth science investigations and enhance societal benefits. For example, the Working Group on Biodiversity Preservation and Ecosystem Services (PCAST 2011) noted the scale and complexity of the physical and human resources needed to address these challenges. Many of the most pressing ecological research questions require global-scale data and global-scale solutions (Suresh 2012), e.g., interdisciplinary data access from data centers managing ecological resources and hazards, drought, heat islands, and the carbon cycle, or data used to forecast the rate of spread of invasive species or zoonotic diseases. Variability and change at one location or in one region may well result from the superposition of global processes coupled with regional and local modes of variability. For example, we know that the El Niño-Southern Oscillation, a large-scale mode of variability in the coupled terrestrial-aquatic-atmospheric system, correlates with variability in regional rainfall and ecosystem function. It is therefore a high priority of government and non-government organizations to develop the necessary large-scale, world-class research infrastructures for environmental research, and the framework by which these data can be shared, discovered, and utilized by a broad user community of scientists and policymakers alike. Given that there are many, albeit nascent, efforts to build new environmental observatories/networks globally (e.g., EU-ICOS, EU-Lifewatch, AU-TERN, China-CERN, GEOSS, GEO-BON, NutNet, etc.) and domestically (e.g., NSF-CZO, USDA-LTAR, DOE-NGEE, Soil Carbon Network, etc.), there is a strong and mutual desire to assure the interoperability of data. Interoperability is the degree to which each of the following is mapped between observatories (entities): i) science requirements linked with science questions, ii) traceability of measurements to nationally and internationally accepted standards, iii) how data products are derived, i.e., algorithms, procedures, and methods, and iv) the bioinformatics, which broadly include data formats, metadata, controlled vocabularies, and semantics. Here, we explore the rationale and focus areas for interoperability, the governance and work structures, example projects (NSF-NEON, EU-ICOS, and AU-TERN), and the emergent roles of scientists in these endeavors.

  14. Integrating semantic dimension into openEHR archetypes for the management of cerebral palsy electronic medical records.

    PubMed

    Ellouze, Afef Samet; Bouaziz, Rafik; Ghorbel, Hanen

    2016-10-01

    Integrating the semantic dimension into clinical archetypes is necessary when modeling medical records: it enables semantic interoperability, supports semantic operations on clinical data, and provides a higher design quality of Electronic Medical Record (EMR) systems. However, to obtain these advantages, designers need to use archetypes that cover the semantic features of the clinical concepts involved in their specific applications. In fact, most of the archetypes filed in open repositories are expressed in the Archetype Definition Language (ADL), which defines only the syntactic structure of clinical concepts and thus weakens semantic processing of EMR content in the semantic web environment. This paper focuses on the modeling of an EMR prototype for infants affected by Cerebral Palsy (CP), using the dual-model approach and integrating semantic web technologies. Such modeling supports better delivery of quality care and ensures semantic interoperability between the information systems of all involved therapies. First, the data to be documented are identified and collected from the involved therapies. Subsequently, the data are analyzed and arranged into archetypes expressed in ADL. During this step, open archetype repositories are explored in order to find suitable archetypes. Then, the ADL archetypes are transformed into archetypes expressed in OWL-DL (Ontology Web Language - Description Logic). Finally, we construct an ontological source related to these archetypes, enabling their annotation to facilitate data extraction and making it possible to apply semantic activities to such archetypes. The result is the integration of the semantic dimension into an EMR modeled in accordance with the archetype approach. The feasibility of our solution is shown through the development of a prototype, named "CP-SMS", which ensures semantic exploitation of the CP EMR. This prototype provides the following features: (i) creation of CP EMR instances and their checking against a knowledge base that we constructed through interviews with domain experts, (ii) translation of the initial CP ADL archetypes into CP OWL-DL archetypes, (iii) creation of an ontological source that we can use to annotate the obtained archetypes, and (iv) enrichment of the ontological source and integration of semantic relations, fueling the ontology with new concepts, ensuring consistency and eliminating ambiguity between concepts. The degree of semantic interoperability that can be reached between EMR systems depends strongly on the quality of the archetypes used. Thus, the integration of the semantic dimension into the archetype modeling process is crucial. By creating an ontological source and annotating archetypes, we create a supportive platform ensuring semantic interoperability between archetype-based EMR systems. Copyright © 2016. Published by Elsevier Inc.

  15. A framework for developing objective and measurable recovery criteria for threatened and endangered species.

    PubMed

    Himes Boor, Gina K

    2014-02-01

    For species listed under the U.S. Endangered Species Act (ESA), the U.S. Fish and Wildlife Service and National Marine Fisheries Service are tasked with writing recovery plans that include "objective, measurable criteria" that define when a species is no longer at risk of extinction, but neither the act itself nor agency guidelines provide an explicit definition of objective, measurable criteria. Past reviews of recovery plans, including one published in 2012, show that many criteria lack quantitative metrics with clear biological rationale and are not meeting the measurable and objective mandate. I reviewed how objective, measurable criteria have been defined implicitly and explicitly in peer-reviewed literature, the ESA, other U.S. statutes, and legal decisions. Based on a synthesis of these sources, I propose the following 6 standards be used as minimum requirements for objective, measurable criteria: contain a quantitative threshold with calculable units, stipulate a timeframe over which they must be met, explicitly define the spatial extent or population to which they apply, specify a sampling procedure that includes sample size, specify a statistical significance level, and include justification by providing scientific evidence that the criteria define a species whose extinction risk has been reduced to the desired level. To meet these 6 standards, I suggest that recovery plans be explicitly guided by and organized around a population viability modeling framework even if data or agency resources are too limited to complete a viability model. When data and resources are available, recovery criteria can be developed from the population viability model results; when data and resources are insufficient for model implementation, extinction risk thresholds can be used as criteria. A recovery-planning approach centered on viability modeling will also yield appropriately focused data-acquisition and monitoring plans and will facilitate a seamless transition from recovery planning to delisting. © 2013 Society for Conservation Biology.
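
    The six proposed standards lend themselves to a simple screening checklist. A minimal sketch, with illustrative field names not taken from the paper:

        # A minimal sketch, not from the paper: encoding the six proposed
        # standards as a checklist that a draft recovery criterion can be
        # screened against. Field names are invented for illustration.
        from dataclasses import dataclass

        @dataclass
        class RecoveryCriterion:
            quantitative_threshold: bool   # threshold with calculable units
            timeframe_stipulated: bool     # period over which it must be met
            spatial_extent_defined: bool   # population/area it applies to
            sampling_procedure: bool       # includes sample size
            significance_level: bool       # statistical significance stated
            justified_by_evidence: bool    # links threshold to extinction risk

            def is_objective_and_measurable(self):
                return all(vars(self).values())

        draft = RecoveryCriterion(True, True, True, False, False, True)
        print(draft.is_objective_and_measurable())  # -> False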

  16. EPA Scientific Knowledge Management Assessment and Needs

    EPA Science Inventory

    A series of activities have been conducted by a core group of EPA scientists from across the Agency. The activities were initiated in 2012 and the focus was to increase the reuse and interoperability of science software at EPA. The need for increased reuse and interoperability ...

  17. Interoperability in Personalized Adaptive Learning

    ERIC Educational Resources Information Center

    Aroyo, Lora; Dolog, Peter; Houben, Geert-Jan; Kravcik, Milos; Naeve, Ambjorn; Nilsson, Mikael; Wild, Fridolin

    2006-01-01

    Personalized adaptive learning requires semantic-based and context-aware systems to manage the Web knowledge efficiently as well as to achieve semantic interoperability between heterogeneous information resources and services. The technological and conceptual differences can be bridged either by means of standards or via approaches based on the…

  18. Towards Cross-Organizational Innovative Business Process Interoperability Services

    NASA Astrophysics Data System (ADS)

    Karacan, Ömer; Del Grosso, Enrico; Carrez, Cyril; Taglino, Francesco

    This paper presents the vision and initial results of the COIN (FP7-IST-216256) European project for the development of open source Collaborative Business Process Interoperability (CBPip) in cross-organisational business collaboration environments following the Software-as-a-Service Utility (SaaS-U) paradigm.

  19. 47 CFR 0.192 - Emergency Response Interoperability Center.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Organization Public Safety and Homeland Security Bureau § 0.192 Emergency Response Interoperability Center. (a... Public Safety and Homeland Security Bureau to develop, recommend, and administer policy goals, objectives... and procedures for the 700 MHz public safety broadband wireless network and other public safety...

  20. 47 CFR 0.192 - Emergency Response Interoperability Center.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Organization Public Safety and Homeland Security Bureau § 0.192 Emergency Response Interoperability Center. (a... Public Safety and Homeland Security Bureau to develop, recommend, and administer policy goals, objectives... and procedures for the 700 MHz public safety broadband wireless network and other public safety...

  1. 78 FR 15722 - Federal Advisory Committee Act; Communications Security, Reliability, and Interoperability Council

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-12

    ... Alerting, E9-1-1 Location Accuracy, Network Security Best Practices, DNSSEC Implementation Practices for... FEDERAL COMMUNICATIONS COMMISSION Federal Advisory Committee Act; Communications Security... Security, Reliability, and Interoperability Council (CSRIC) meeting that was scheduled for March 6, 2013 is...

  2. A Service Oriented Architecture Approach to Achieve Interoperability between Immunization Information Systems in Iran

    PubMed Central

    Hosseini, Masoud; Ahmadi, Maryam; Dixon, Brian E.

    2014-01-01

    Clinical decision support (CDS) systems can support vaccine forecasting and immunization reminders; however, immunization decision-making requires data from fragmented, independent systems. Interoperability and accurate data exchange between immunization information systems (IIS) is an essential factor in utilizing immunization CDS systems. Service oriented architecture (SOA) and Health Level 7 (HL7) are dominant standards for web-based exchange of clinical information. We implemented a system based on SOA and HL7 v3 to support immunization CDS in Iran. We evaluated system performance by exchanging 1500 immunization records for roughly 400 infants between two IISs. System turnaround time is less than a minute for synchronous operation calls, and the retrieved immunization histories of infants were always identical in the different systems. CDS-generated reports accorded with immunization guidelines, and the calculations of next visit times were accurate. Interoperability is rare or nonexistent between IIS. Since inter-state data exchange is rare in the United States, this approach could be a good prototype for achieving interoperability of immunization information. PMID:25954452

  3. Achieving interoperability for metadata registries using comparative object modeling.

    PubMed

    Park, Yu Rang; Kim, Ju Han

    2010-01-01

    Achieving data interoperability between organizations relies upon agreed meaning and representation (metadata) of data. For managing and registering metadata, many organizations have built metadata registries (MDRs) in various domains based on the international standard for the MDR framework, ISO/IEC 11179. Following this trend, two public MDRs in the biomedical domain have been created, the United States Health Information Knowledgebase (USHIK) and the cancer Data Standards Registry and Repository (caDSR), from the U.S. Department of Health & Human Services and the National Cancer Institute (NCI), respectively. Most MDRs are implemented with indiscriminate extensions to satisfy organization-specific needs and to work around semantic and structural limitations of ISO/IEC 11179. As a result, it is difficult to achieve interoperability among multiple MDRs. In this paper, we propose an integrated metadata object model for achieving interoperability among multiple MDRs. To evaluate this model, we developed an XML Schema Definition (XSD)-based metadata exchange format. We created an XSD-based metadata exporter supporting both the integrated metadata object model and organization-specific MDR formats.
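
    A minimal sketch of the kind of XML export such an exchange format enables; the element names are invented and do not reproduce the paper's XSD:

        # A minimal sketch, with invented element names, of exporting an
        # ISO/IEC 11179-style data element to an XML exchange format; the
        # actual XSD developed in the paper is not reproduced here.
        import xml.etree.ElementTree as ET

        def export_data_element(identifier, name, datatype, definition):
            root = ET.Element("DataElement", {"id": identifier})
            ET.SubElement(root, "Name").text = name
            ET.SubElement(root, "ValueDomainDatatype").text = datatype
            ET.SubElement(root, "Definition").text = definition
            return ET.tostring(root, encoding="unicode")

        print(export_data_element("DE-001", "Systolic blood pressure",
                                  "integer", "Peak arterial pressure, mmHg"))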

  4. A Service Oriented Architecture Approach to Achieve Interoperability between Immunization Information Systems in Iran.

    PubMed

    Hosseini, Masoud; Ahmadi, Maryam; Dixon, Brian E

    2014-01-01

    Clinical decision support (CDS) systems can support vaccine forecasting and immunization reminders; however, immunization decision-making requires data from fragmented, independent systems. Interoperability and accurate data exchange between immunization information systems (IIS) is an essential factor in utilizing immunization CDS systems. Service oriented architecture (SOA) and Health Level 7 (HL7) are dominant standards for web-based exchange of clinical information. We implemented a system based on SOA and HL7 v3 to support immunization CDS in Iran. We evaluated system performance by exchanging 1500 immunization records for roughly 400 infants between two IISs. System turnaround time is less than a minute for synchronous operation calls, and the retrieved immunization histories of infants were always identical in the different systems. CDS-generated reports accorded with immunization guidelines, and the calculations of next visit times were accurate. Interoperability is rare or nonexistent between IIS. Since inter-state data exchange is rare in the United States, this approach could be a good prototype for achieving interoperability of immunization information.

  5. Interoperable and accessible census and survey data from IPUMS.

    PubMed

    Kugler, Tracy A; Fitch, Catherine A

    2018-02-27

    The first version of the Integrated Public Use Microdata Series (IPUMS) was released to users in 1993, and since that time IPUMS has come to stand for interoperable and accessible census and survey data. Initially created to harmonize U.S. census microdata over time, IPUMS now includes microdata from the U.S. and international censuses and from surveys on health, employment, and other topics. IPUMS also provides geo-spatial data, aggregate population data, and environmental data. IPUMS supports ten data products, each disseminating an integrated data collection with a set of tools that make complex data easy to find, access, and use. Key features are record-level integration to create interoperable datasets, user-friendly interfaces, and comprehensive metadata and documentation. The IPUMS philosophy aligns closely with the FAIR principles of findability, accessibility, interoperability, and re-usability. IPUMS data have catalyzed knowledge generation across a wide range of social science and other disciplines, as evidenced by the large volume of publications and other products created by the vast IPUMS user community.

  6. Assessment of Collaboration and Interoperability in an Information Management System to Support Bioscience Research

    PubMed Central

    Myneni, Sahiti; Patel, Vimla L.

    2009-01-01

    Biomedical researchers often have to work on massive, detailed, and heterogeneous datasets that raise new challenges of information management. This study reports an investigation into the nature of the problems faced by the researchers in two bioscience test laboratories when dealing with their data management applications. Data were collected using ethnographic observations, questionnaires, and semi-structured interviews. The major problems identified in working with these systems were related to data organization, publications, and collaboration. The interoperability standards were analyzed using a C4I framework at the level of connection, communication, consolidation, and collaboration. Such an analysis was found to be useful in judging the capabilities of data management systems at different levels of technological competency. While collaboration and system interoperability are the “must have” attributes of these biomedical scientific laboratory information management applications, usability and human interoperability are the other design concerns that must also be addressed for easy use and implementation. PMID:20351900

  7. Auditing Consistency and Usefulness of LOINC Use among Three Large Institutions - Using Version Spaces for Grouping LOINC Codes

    PubMed Central

    Lin, M.C.; Vreeman, D.J.; Huff, S.M.

    2012-01-01

    Objectives We wanted to develop a method for evaluating the consistency and usefulness of LOINC code use across different institutions, and to evaluate the degree of interoperability that can be attained when using LOINC codes for laboratory data exchange. Our specific goals were to: 1) determine whether any contradictory knowledge exists in LOINC; 2) determine how many LOINC codes were used in a truly interoperable fashion between systems; and 3) provide suggestions for improving the semantic interoperability of LOINC. Methods We collected Extensional Definitions (EDs) of LOINC usage from three institutions. The version space approach was used to divide LOINC codes into small sets, which made auditing of LOINC use across the institutions feasible. We then compared pairings of LOINC codes from the three institutions for consistency and usefulness. Results The numbers of LOINC codes evaluated were 1,917, 1,267 and 1,693, as obtained from ARUP, Intermountain and Regenstrief, respectively. There were 2,022, 2,030, and 2,301 version spaces among ARUP & Intermountain, Intermountain & Regenstrief and ARUP & Regenstrief, respectively. Using the EDs as the gold standard, there were 104, 109 and 112 pairs containing contradictory knowledge and there were 1,165, 765 and 1,121 semantically interoperable pairs. The interoperable pairs were classified into three levels: 1) Level I – no loss of meaning, complete information was exchanged by identical codes; 2) Level II – no loss of meaning, but processing of the data was needed to make it completely comparable; 3) Level III – some loss of meaning, for example, tests with a specific ‘method’ could be rolled up with tests that were ‘methodless’. Conclusions There are variations in the way LOINC is used for data exchange that result in some data not being truly interoperable across different enterprises. To improve its semantic interoperability, we need to detect and correct any contradictory knowledge within LOINC and add computable relationships that can be used for making reliable inferences about the data. The LOINC committee should also provide detailed guidance on best practices for mapping from local codes to LOINC codes and for using LOINC codes in data exchange. PMID:22306382
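
    A minimal sketch, with invented local test names and deliberately illustrative codes, of the basic audit question behind the study: do two institutions map the same test to the same LOINC code? (The version-space grouping itself is not reproduced.)

        # Hypothetical mapping tables; the LOINC codes shown are for
        # illustration only and should not be taken as authoritative.
        site_a = {"serum sodium": "2951-2", "serum potassium": "2823-3"}
        site_b = {"serum sodium": "2951-2", "serum potassium": "2828-2"}

        for test in sorted(set(site_a) & set(site_b)):
            a, b = site_a[test], site_b[test]
            status = ("interoperable (identical code)" if a == b
                      else "needs review (codes differ: %s vs %s)" % (a, b))
            print(test, "->", status)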

  8. Selection of remedial alternatives for mine sites: a multicriteria decision analysis approach.

    PubMed

    Betrie, Getnet D; Sadiq, Rehan; Morin, Kevin A; Tesfamariam, Solomon

    2013-04-15

    The selection of remedial alternatives for mine sites is a complex task because it involves multiple criteria, often with conflicting objectives. However, the existing framework used to select remedial alternatives lacks multicriteria decision analysis (MCDA) aids and does not consider uncertainty in the selection of alternatives. The objective of this paper is to improve the existing framework by introducing deterministic and probabilistic MCDA methods. The Preference Ranking Organization Method for Enrichment Evaluation (PROMETHEE) methods were implemented in this study. The MCDA involves preparing the inputs to the PROMETHEE methods (identifying the alternatives, defining the criteria, deriving the criteria weights using the analytic hierarchy process (AHP), defining the probability distributions of the criteria weights, and conducting Monte Carlo simulation (MCS)); running the PROMETHEE methods on these inputs; and conducting a sensitivity analysis. A case study was presented to demonstrate the improved framework at a mine site. The results showed that the improved framework provides a reliable way of selecting remedial alternatives as well as quantifying the impact of different criteria on the selection. Copyright © 2013 Elsevier Ltd. All rights reserved.
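
    For readers unfamiliar with PROMETHEE II, a minimal sketch using the usual-criterion preference function; the alternatives, criterion scores and weights are invented, and all criteria are assumed to be maximized:

        # PROMETHEE II net-flow ranking with the usual-criterion preference
        # function (preference 1 if an alternative strictly beats another
        # on a criterion, else 0), weighted and averaged over opponents.
        def promethee_ii(scores, weights):
            names = list(scores)
            def pref(a, b):  # weighted preference of a over b
                return sum(w for w, sa, sb in
                           zip(weights, scores[a], scores[b]) if sa > sb)
            n = len(names)
            phi = {a: sum(pref(a, b) - pref(b, a)
                          for b in names if b != a) / (n - 1)
                   for a in names}
            return sorted(phi.items(), key=lambda kv: -kv[1])

        scores = {  # invented criteria: effectiveness, cost rating, durability
            "cover system": [0.8, 0.6, 0.7],
            "water treatment": [0.9, 0.4, 0.5],
            "passive wetland": [0.6, 0.9, 0.6],
        }
        print(promethee_ii(scores, weights=[0.5, 0.3, 0.2]))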

  9. 3D topography measurements on correlation cells—a new approach to forensic ballistics identifications

    NASA Astrophysics Data System (ADS)

    Song, John; Chu, Wei; Tong, Mingsi; Soons, Johannes

    2014-06-01

    Based on three-dimensional (3D) topography measurements on correlation cells, the National Institute of Standards and Technology (NIST) has developed the ‘NIST Ballistics Identification System (NBIS)’, aimed at accurate ballistics identifications and fast ballistics evidence searches. The 3D topographies are divided into arrays of correlation cells to identify ‘valid correlation areas’ and eliminate ‘invalid correlation areas’ from the matching and identification procedure. A ‘congruent matching cells’ (CMC) method using three types of identification parameters of the paired correlation cells (the cross-correlation function maximum CCFmax, the spatial registration position in x-y, and the registration angle θ) is used for high-accuracy ballistics identifications. ‘Synchronous processing’ is proposed for correlating multiple cell pairs at the same time to increase correlation speed. The proposed NBIS can be used for correlation of both geometrical topographies and optical intensity images. All the correlation parameters and algorithms are in the public domain and subject to open tests. An error rate reporting procedure has been developed that can greatly add to the scientific support for the firearm and toolmark identification specialty and give confidence to the trier of fact in court proceedings. The NBIS is engineered to employ transparent identification parameters and criteria, statistical models and correlation algorithms. In this way, interoperability between different ballistics identification systems can be more easily achieved. This interoperability will make the NBIS suitable for ballistics identifications and evidence searches with large national databases, such as the National Integrated Ballistic Information Network in the United States.
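
    The cell-level similarity at the core of the CMC method is a normalized cross-correlation maximum. A minimal sketch for the zero-shift case only; a full implementation would also search over x-y registration and rotation angle θ:

        # Normalized cross-correlation of two topography cells at zero
        # shift; synthetic data stand in for measured 3D topographies.
        import numpy as np

        def ccf_zero_shift(cell_a, cell_b):
            a = cell_a - cell_a.mean()
            b = cell_b - cell_b.mean()
            return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

        rng = np.random.default_rng(0)
        ref = rng.normal(size=(64, 64))                  # reference cell
        same = ref + 0.1 * rng.normal(size=ref.shape)    # same source, noisy
        other = rng.normal(size=ref.shape)               # different source
        print(round(ccf_zero_shift(ref, same), 3))       # high correlation
        print(round(ccf_zero_shift(ref, other), 3))      # near zero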

  10. Ontology-based knowledge representation for resolution of semantic heterogeneity in GIS

    NASA Astrophysics Data System (ADS)

    Liu, Ying; Xiao, Han; Wang, Limin; Han, Jialing

    2017-07-01

    Lack of semantic interoperability in geographical information systems has been identified as the main obstacle to data sharing and database integration. New methods are needed to overcome the problems of semantic heterogeneity, and ontologies are considered one approach to supporting geographic information sharing. This paper presents an ontology-driven integration approach to help detect and, where possible, resolve semantic conflicts. Its originality is that each data source participating in the integration process contains an ontology that defines the meaning of its own data. This approach automates the integration through a semantic integration algorithm. Finally, land classification in a field GIS is described as an example.

  11. Using Knowledge Rules for Pharmacy Mapping

    PubMed Central

    Shakib, Shaun C.; Che, Chengjian; Lau, Lee Min

    2006-01-01

    The 3M Health Information Systems (HIS) Healthcare Data Dictionary (HDD) is used to encode and structure patient medication data for the Electronic Health Record (EHR) of the Department of Defense’s (DoD’s) Armed Forces Health Longitudinal Technology Application (AHLTA). HDD Subject Matter Experts (SMEs) are responsible for the initial and maintenance mapping of disparate, standalone medication master files from all 100 DoD host sites worldwide to a single concept-based vocabulary to accomplish semantic interoperability. To achieve higher levels of automation, SMEs began defining a growing set of knowledge rules. These knowledge rules were implemented in a pharmacy mapping tool, which enhanced consistency through automation and increased the mapping rate by 29%. PMID:17238709

  12. 17 CFR Appendix A to Part 38 - Guidance on Compliance With Designation Criteria

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... the criteria for designation. To the extent that compliance with, or satisfaction of, a criterion for designation is not self-explanatory from the face of the contract market's rules (as defined in § 40.1 of this... FACILITY—The board of trade shall—(A) establish and enforce rules defining, or specifications detailing...

  13. 15 CFR 768.6 - Criteria.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... DETERMINATION PROCEDURES AND CRITERIA § 768.6 Criteria. BIS will evaluate the evidence contained in a FAS or TAC... defined in § 768.1(d) of this part. In order to initiate an assessment, each FAS and TAC certification...

  14. 15 CFR 768.6 - Criteria.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... DETERMINATION PROCEDURES AND CRITERIA § 768.6 Criteria. BIS will evaluate the evidence contained in a FAS or TAC... defined in § 768.1(d) of this part. In order to initiate an assessment, each FAS and TAC certification...

  15. 15 CFR 768.6 - Criteria.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... DETERMINATION PROCEDURES AND CRITERIA § 768.6 Criteria. BIS will evaluate the evidence contained in a FAS or TAC... defined in § 768.1(d) of this part. In order to initiate an assessment, each FAS and TAC certification...

  16. 15 CFR 768.6 - Criteria.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... DETERMINATION PROCEDURES AND CRITERIA § 768.6 Criteria. BIS will evaluate the evidence contained in a FAS or TAC... defined in § 768.1(d) of this part. In order to initiate an assessment, each FAS and TAC certification...

  17. 15 CFR 768.6 - Criteria.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... DETERMINATION PROCEDURES AND CRITERIA § 768.6 Criteria. BIS will evaluate the evidence contained in a FAS or TAC... defined in § 768.1(d) of this part. In order to initiate an assessment, each FAS and TAC certification...

  18. The Next Stage: Moving from Isolated Digital Collections to Interoperable Digital Libraries.

    ERIC Educational Resources Information Center

    Besser, Howard

    2002-01-01

    Presents a conceptual framework for digital library development and discusses how to move from isolated digital collections to interoperable digital libraries. Topics include a history of digital libraries; user-centered architecture; stages of technological development; standards, including metadata; and best practices. (Author/LRW)

  19. Proceedings of the ITS Standards Program Review and Interoperability Workshop

    DOT National Transportation Integrated Search

    1997-12-17

    An ITS Standards Program Review and Interoperability Workshop was held on Dec. 17-18, 1997 in Arlington, Va. It was sponsored by the U.S. DOT, ITS America, George Mason University (GMU) and the University of Michigan. The purpose was to review the US...

  20. Putting the School Interoperability Framework to the Test

    ERIC Educational Resources Information Center

    Mercurius, Neil; Burton, Glenn; Hopkins, Bill; Larsen, Hans

    2004-01-01

    The Jurupa Unified School District in Southern California recently partnered with Microsoft, Dell and the Zone Integration Group for the implementation of a School Interoperability Framework (SIF) database repository model throughout the district (Magner 2002). A two-week project--the Integrated District Education Applications System, better known…

  1. 75 FR 66752 - Smart Grid Interoperability Standards; Notice of Technical Conference

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-29

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. RM11-2-000] Smart Grid... adoption of Smart Grid Interoperability Standards (Standards) in their States. On October 6, 2010, the....m. Eastern time in conjunction with the NARUC/FERC Collaborative on Smart Response (Collaborative...

  2. 76 FR 4102 - Smart Grid Interoperability Standards; Supplemental Notice of Technical Conference

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-24

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. RM11-2-000] Smart Grid... Federal Energy Regulatory Commission announced that a Technical Conference on Smart Grid Interoperability... National Institute of Standards and Technology are ready for Commission consideration in a rulemaking...

  3. An Agent-Based Dynamic Model for Analysis of Distributed Space Exploration Architectures

    NASA Astrophysics Data System (ADS)

    Sindiy, Oleg V.; DeLaurentis, Daniel A.; Stein, William B.

    2009-07-01

    A range of complex challenges, but also potentially unique rewards, underlie the development of exploration architectures that use a distributed, dynamic network of resources across the solar system. From a methodological perspective, the prime challenge is to systematically model the evolution (and quantify comparative performance) of such architectures, under uncertainty, to effectively direct further study of specialized trajectories, spacecraft technologies, concept of operations, and resource allocation. A process model for System-of-Systems Engineering is used to define time-varying performance measures for comparative architecture analysis and identification of distinguishing patterns among interoperating systems. Agent-based modeling serves as the means to create a discrete-time simulation that generates dynamics for the study of architecture evolution. A Solar System Mobility Network proof-of-concept problem is introduced representing a set of longer-term, distributed exploration architectures. Options within this set revolve around deployment of human and robotic exploration and infrastructure assets, their organization, interoperability, and evolution, i.e., a system-of-systems. Agent-based simulations quantify relative payoffs for a fully distributed architecture (which can be significant over the long term), the latency period before they are manifest, and the up-front investment (which can be substantial compared to alternatives). Verification and sensitivity results provide further insight on development paths and indicate that the framework and simulation modeling approach may be useful in architectural design of other space exploration mass, energy, and information exchange settings.
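
    A minimal discrete-time, agent-style sketch (invented dynamics, not the paper's model) of the latency-then-payoff pattern the simulations quantify: returns stay low until enough interoperable infrastructure has been deployed:

        # Invented toy dynamics for illustration only: up-front investment
        # in infrastructure agents produces a latency period before the
        # science payoff rate rises.
        def simulate(steps=10, infra_build_rate=1, payoff_threshold=5):
            infrastructure, payoff = 0, 0.0
            for t in range(steps):
                infrastructure += infra_build_rate   # deploy infrastructure agents
                rate = 2.0 if infrastructure >= payoff_threshold else 0.5
                payoff += rate                       # science return this step
                print(f"t={t:2d} infra={infrastructure:2d} payoff={payoff:5.1f}")
            return payoff

        simulate()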

  4. Implementation of Medical Information Exchange System Based on EHR Standard

    PubMed Central

    Han, Soon Hwa; Kim, Sang Guk; Jeong, Jun Yong; Lee, Bi Na; Choi, Myeong Seon; Kim, Il Kon; Park, Woo Sung; Ha, Kyooseob; Cho, Eunyoung; Kim, Yoon; Bae, Jae Bong

    2010-01-01

    Objectives To develop effective ways of sharing patients' medical information, we developed a new medical information exchange system (MIES) based on a registry server, which enabled us to exchange different types of data generated by various systems. Methods To assure that patients' medical information can be effectively exchanged under different system environments, we adopted the standardized data transfer methods and terminologies suggested by the Center for Interoperable Electronic Healthcare Record (CIEHR) of Korea in order to guarantee interoperability. Regarding information security, MIES followed the security guidelines suggested by the CIEHR of Korea. This study aimed to develop the security systems essential for the implementation of online services, such as encryption of communication, server security, database security, protection against hacking, content security, and network security. Results The registry server managed information exchange as well as the registration information of the clinical document architecture (CDA) documents, and the CDA Transfer Server was used to locate and transmit the proper CDA document from the relevant repository. The CDA viewer showed the CDA documents via connection with the information systems of related hospitals. Conclusions This research chose transfer items and defined document standards following the CDA standard, so that exchange of CDA documents between different systems became possible through ebXML. The proposed MIES was designed as an independent central registry server model in order to guarantee the essential security of patients' medical information. PMID:21818447
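
    A minimal sketch of the registry-server pattern described above: the central registry stores only registration metadata about CDA documents, each hospital repository holds the documents themselves, and a consumer first queries the registry and then retrieves the document from the indicated repository. Class names, URLs, and fields are illustrative inventions, not the MIES interfaces.

        from dataclasses import dataclass

        @dataclass
        class RegistryEntry:
            document_id: str
            patient_id: str
            repository_url: str  # where the CDA document actually resides

        class Registry:
            """Central registry: holds registration metadata, never the documents."""
            def __init__(self):
                self._entries = []

            def register(self, entry: RegistryEntry):
                self._entries.append(entry)

            def find(self, patient_id: str):
                return [e for e in self._entries if e.patient_id == patient_id]

        # Hospital repositories keyed by URL; values map document id -> CDA XML.
        repositories = {
            "https://hospital-a.example/cda": {"doc-1": "<ClinicalDocument>...</ClinicalDocument>"},
        }

        registry = Registry()
        registry.register(RegistryEntry("doc-1", "patient-42", "https://hospital-a.example/cda"))

        # Consumer: locate the document via the registry, then fetch it from the repository.
        for entry in registry.find("patient-42"):
            cda_xml = repositories[entry.repository_url][entry.document_id]
            print(entry.document_id, "->", cda_xml[:24])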

  5. Implementation and Evaluation of Four Interoperable Open Standards for the Internet of Things

    PubMed Central

    Jazayeri, Mohammad Ali; Liang, Steve H. L.; Huang, Chih-Yuan

    2015-01-01

    Recently, researchers have been focusing on a new use of the Internet called the Internet of Things (IoT), in which enabled electronic devices can be accessed remotely over the Internet. As the realization of the IoT concept is still in its early stages, manufacturers of Internet-connected devices and IoT web service providers are defining their own proprietary protocols based on their targeted applications. Consequently, the IoT is heterogeneous in terms of hardware capabilities and communication protocols. Addressing these heterogeneities by following open standards is a necessary step toward communicating with the full variety of IoT devices. In this research, we assess the feasibility of applying existing open standards on resource-constrained IoT devices. The standard protocols developed in this research are OGC PUCK over Bluetooth, TinySOS, SOS over CoAP, and the OGC SensorThings API. We believe that by hosting open standard protocols on IoT devices, not only do the devices become self-describable, self-contained, and interoperable, but innovative applications can also be easily developed with standardized interfaces. In addition, we use memory consumption, request message size, response message size, and response latency to benchmark the efficiency of the implemented protocols. In all, this research presents and evaluates standards-based solutions to better understand the feasibility of applying existing standards to the IoT vision. PMID:26402683
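
    The efficiency metrics used above (request size, response size, response latency) can be reproduced for any HTTP-accessible service with a few lines of standard-library Python. The URL below is a placeholder; real measurements against TinySOS, SOS over CoAP, or SensorThings endpoints would require an actual device or test server, plus a CoAP client library for the non-HTTP protocols.

        import time
        import urllib.request

        def measure(url: str, n: int = 5):
            """Measure mean response latency and message sizes for a GET request."""
            latencies, response_size = [], 0
            for _ in range(n):
                start = time.perf_counter()
                with urllib.request.urlopen(url, timeout=10) as resp:
                    body = resp.read()
                latencies.append(time.perf_counter() - start)
                response_size = len(body)
            request_size = len(f"GET {url} HTTP/1.1\r\n\r\n".encode())
            return {
                "request_bytes": request_size,
                "response_bytes": response_size,
                "mean_latency_s": sum(latencies) / len(latencies),
            }

        print(measure("http://example.com/"))  # placeholder endpoint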

  6. [The role of Integrating the Healthcare Enterprise (IHE) in telemedicine].

    PubMed

    Bergh, B; Brandner, A; Heiß, J; Kutscha, U; Merzweiler, A; Pahontu, R; Schreiweis, B; Yüksekogul, N; Bronsch, T; Heinze, O

    2015-10-01

    Telemedicine systems are already used today in a variety of areas to improve patient care. The lack of standardization in these solutions, however, limits their interoperability. Internationally accepted standards can help solve this problem. With Integrating the Healthcare Enterprise (IHE), a worldwide initiative of users and vendors is working on the use of defined standards for specific use cases by describing those use cases in so-called IHE Profiles. The aim of this work is to determine how telemedicine applications can be implemented using IHE Profiles. Based on a literature review, exemplary telemedicine applications are described and the technical capabilities of IHE Profiles are evaluated. These IHE Profiles are examined for their usability and are then evaluated in exemplary telemedicine application architectures. IHE Profiles can be identified that are useful for intersectoral patient records (e.g., the PEHR at Heidelberg) as well as for point-to-point communication where no patient record is involved. In the area of patient records, the IHE Profile "Cross-Enterprise Document Sharing (XDS)" is often used. Point-to-point communication can be supported using the IHE Profile "Cross-Enterprise Document Media Interchange (XDM)". IHE-based telemedicine applications give caregivers the possibility of being informed about their patients using data from intersectoral patient records, and reusing the standardized interfaces in other scenarios also offers potential savings.

  7. An Assessment of Coherence Between Early Warning and Response Systems and Serious Cross-Border Health Threats in the European Union and Turkey.

    PubMed

    Ekmekci, Perihan Elif

    2016-12-01

    Disease outbreaks have attracted the attention of the public health community to early warning and response systems (EWRS) for communicable diseases and other cross-border threats to health. The European Union (EU) and the World Health Organization (WHO) have published regulations in this area. Decision 1082/2013/EU brought a new approach to the management of public health threats in EU member states and introduced several innovations: establishing a Health Security Committee; preparedness and response planning; joint procurement of medical countermeasures; ad hoc monitoring for biological, chemical, and environmental threats; EWRS; and recognition of an emergency situation and interoperability between various sectors. Turkey, as an acceding country to the EU and a member of the WHO, has been improving its national public health system to meet EU legislation and WHO standards. This article first explains EWRS as defined in Decision 1082/2013/EU and Turkey's obligations to align its public health laws with the EU acquis. EWRS in Turkey are then addressed, particularly their coherence with EU policies regarding preparedness and response, alert notification, and interoperability between health and other sectors. Finally, the challenges and limitations of the current Turkish system are discussed and further improvements are suggested. (Disaster Med Public Health Preparedness. 2016;10:883-892).

  8. Developing a Standard Method for Link-Layer Security of CCSDS Space Communications

    NASA Technical Reports Server (NTRS)

    Biggerstaff, Craig

    2009-01-01

    Communications security for space systems has been a specialized field generally far removed from considerations of mission interoperability and cross-support; in fact, these considerations often have been viewed as intrinsically opposed to security objectives. The space communications protocols defined by the Consultative Committee for Space Data Systems (CCSDS) have a twenty-five-year history of successful use in over 400 missions. While the CCSDS Telemetry, Telecommand, and Advanced Orbiting Systems protocols for use at OSI Layer 2 are operationally mature, these protocols have had no direct support for communications security techniques. Link-layer communications security has been successfully implemented in the past using mission-unique methods, but never before with the objective of facilitating cross-support and interoperability. This paper discusses the design of a standard method for cryptographic authentication, encryption, and replay protection at the data link layer that can be integrated into existing CCSDS protocols without disruption to legacy communications services. Integrating cryptographic operations into existing data structures and processing sequences requires a careful assessment of the potential impediments within spacecraft, ground stations, and operations centers. The objective of this work is to provide a sound method for cryptographic encapsulation of frame data that also facilitates Layer 2 virtual channel switching, such that a mission may procure data transport services as needed without involving third parties in the cryptographic processing, or split independent data streams for separate cryptographic processing.
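
    The core idea, cryptographic encapsulation of frame data with authentication, encryption, and replay protection, can be sketched with an AEAD cipher plus a monotonically increasing sequence number, as below. The frame layout is an illustrative invention, not the CCSDS Space Data Link Security format, and the sketch relies on the third-party Python `cryptography` package.

        import struct
        from cryptography.hazmat.primitives.ciphers.aead import AESGCM

        def protect_frame(aesgcm: AESGCM, seq: int, header: bytes, payload: bytes) -> bytes:
            """Encrypt and authenticate a frame; the header is authenticated but
            left in the clear so virtual-channel switching can still read it."""
            nonce = struct.pack(">IQ", 0, seq)  # 12-byte nonce derived from the sequence number
            return header + struct.pack(">Q", seq) + aesgcm.encrypt(nonce, payload, header)

        def accept_frame(aesgcm: AESGCM, frame: bytes, header_len: int, last_seq: int):
            header = frame[:header_len]
            (seq,) = struct.unpack(">Q", frame[header_len:header_len + 8])
            if seq <= last_seq:
                raise ValueError("replayed or reordered frame rejected")
            nonce = struct.pack(">IQ", 0, seq)
            payload = aesgcm.decrypt(nonce, frame[header_len + 8:], header)
            return seq, header, payload

        aesgcm = AESGCM(AESGCM.generate_key(bit_length=256))
        frame = protect_frame(aesgcm, 1, b"VC-07", b"telemetry data")
        print(accept_frame(aesgcm, frame, header_len=5, last_seq=0))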

  9. Implementation of Medical Information Exchange System Based on EHR Standard.

    PubMed

    Han, Soon Hwa; Lee, Min Ho; Kim, Sang Guk; Jeong, Jun Yong; Lee, Bi Na; Choi, Myeong Seon; Kim, Il Kon; Park, Woo Sung; Ha, Kyooseob; Cho, Eunyoung; Kim, Yoon; Bae, Jae Bong

    2010-12-01

    To develop effective ways of sharing patients' medical information, we developed a new medical information exchange system (MIES) based on a registry server, which enabled us to exchange different types of data generated by various systems. To assure that patients' medical information can be effectively exchanged under different system environments, we adopted the standardized data transfer methods and terminologies suggested by the Center for Interoperable Electronic Healthcare Record (CIEHR) of Korea in order to guarantee interoperability. Regarding information security, MIES followed the security guidelines suggested by the CIEHR of Korea. This study aimed to develop the security systems essential for the implementation of online services, such as encryption of communication, server security, database security, protection against hacking, content security, and network security. The registry server managed information exchange as well as the registration information of the clinical document architecture (CDA) documents, and the CDA Transfer Server was used to locate and transmit the proper CDA document from the relevant repository. The CDA viewer showed the CDA documents via connection with the information systems of related hospitals. This research chose transfer items and defined document standards following the CDA standard, so that exchange of CDA documents between different systems became possible through ebXML. The proposed MIES was designed as an independent central registry server model in order to guarantee the essential security of patients' medical information.

  10. The impact of interoperability of electronic health records on ambulatory physician practices: a discrete-event simulation study.

    PubMed

    Zhou, Yuan; Ancker, Jessica S; Upadhye, Mandar; McGeorge, Nicolette M; Guarrera, Theresa K; Hegde, Sudeep; Crane, Peter W; Fairbanks, Rollin J; Bisantz, Ann M; Kaushal, Rainu; Lin, Li

    2013-01-01

    The effect of health information technology (HIT) on efficiency and workload among clinical and nonclinical staff has been debated, with conflicting evidence about whether electronic health records (EHRs) increase or decrease effort. No study to date, however, has examined the effect of interoperability quantitatively using discrete-event simulation techniques. To estimate the impact of EHR systems with various levels of interoperability on the day-to-day tasks and operations of ambulatory physician offices, interviews and observations were used to collect workflow data from 12 adult primary and specialty practices. A discrete-event simulation model was constructed to represent patient flows and the clinical and administrative tasks of physicians and staff members. High levels of EHR interoperability were associated with reduced time spent by providers on four tasks: preparing lab reports, requesting lab orders, prescribing medications, and writing referrals. The implementation of an EHR was associated with less time spent by administrators but more time spent by physicians, compared with time spent at paper-based practices. In addition, the presence of EHRs and of interoperability did not significantly affect the time usage of registered nurses or the total visit time and waiting time of patients. This study suggests that the impact of using HIT on the work efficiency of clinical and nonclinical staff varies; overall, however, it appears to improve time efficiency more for administrators than for physicians and nurses.
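
    A toy discrete-event simulation in the spirit of the study: task-completion events are ordered by time in a priority queue, and provider documentation tasks shrink when the EHR is interoperable. The durations and the interoperability speed-up are invented effect sizes for illustration only, not estimates from the paper.

        import heapq
        import random

        random.seed(1)

        def simulate_clinic(n_patients=50, interoperable=False):
            """Single-provider discrete-event simulation of documentation tasks."""
            speedup = 0.7 if interoperable else 1.0  # hypothetical effect size
            events, clock = [], 0.0
            for pid in range(n_patients):
                clock += random.expovariate(1 / 10.0)  # arrivals ~10 min apart
                heapq.heappush(events, (clock, pid))
            busy_until, total_work = 0.0, 0.0
            while events:
                arrival, pid = heapq.heappop(events)
                start = max(arrival, busy_until)
                service = speedup * random.uniform(4.0, 8.0)  # lab reports, orders, referrals
                busy_until = start + service
                total_work += service
            return total_work

        print("paper-based workload:", round(simulate_clinic(), 1), "min")
        print("interoperable-EHR workload:", round(simulate_clinic(interoperable=True), 1), "min")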

  11. IHE cross-enterprise document sharing for imaging: interoperability testing software

    PubMed Central

    2010-01-01

    Background With the deployment of Electronic Health Records (EHR), interoperability testing in healthcare is becoming crucial. The EHR enables access to prior diagnostic information in order to assist in health decisions. It is a virtual system that results from the cooperation of several heterogeneous distributed systems, so interoperability between peers is essential. Achieving interoperability requires various types of testing: implementations need to be tested using software that simulates communication partners and provides test data and test plans. Results In this paper we describe software used to test systems involved in sharing medical images within the EHR. Our software is used as part of the Integrating the Healthcare Enterprise (IHE) testing process to test the Cross-Enterprise Document Sharing for Imaging (XDS-I) integration profile. We describe its architecture and functionalities; we also expose the challenges encountered and discuss the chosen design solutions. Conclusions The EHR is being deployed in several countries, and the EHR infrastructure will continuously evolve to embrace advances in the information technology domain. Our software is built on a web framework to allow for easy evolution with web technology. The testing software is publicly available; it can be used by system implementers to test their implementations. It can also be used by site integrators to verify and test the interoperability of systems, and by developers to understand specification ambiguities or to resolve implementation difficulties. PMID:20858241

  12. IHE cross-enterprise document sharing for imaging: interoperability testing software.

    PubMed

    Noumeir, Rita; Renaud, Bérubé

    2010-09-21

    With the deployment of Electronic Health Records (EHR), interoperability testing in healthcare is becoming crucial. The EHR enables access to prior diagnostic information in order to assist in health decisions. It is a virtual system that results from the cooperation of several heterogeneous distributed systems, so interoperability between peers is essential. Achieving interoperability requires various types of testing: implementations need to be tested using software that simulates communication partners and provides test data and test plans. In this paper we describe software used to test systems involved in sharing medical images within the EHR. Our software is used as part of the Integrating the Healthcare Enterprise (IHE) testing process to test the Cross-Enterprise Document Sharing for Imaging (XDS-I) integration profile. We describe its architecture and functionalities; we also expose the challenges encountered and discuss the chosen design solutions. The EHR is being deployed in several countries, and the EHR infrastructure will continuously evolve to embrace advances in the information technology domain. Our software is built on a web framework to allow for easy evolution with web technology. The testing software is publicly available; it can be used by system implementers to test their implementations. It can also be used by site integrators to verify and test the interoperability of systems, and by developers to understand specification ambiguities or to resolve implementation difficulties.

  13. The RICORDO approach to semantic interoperability for biomedical data and models: strategy, standards and solutions

    PubMed Central

    2011-01-01

    Background The practice and research of medicine generates considerable quantities of data and model resources (DMRs). Although in principle biomedical resources are re-usable, in practice few can currently be shared. In particular, the clinical communities in physiology and pharmacology research, as well as medical education (i.e., the PPME communities), face considerable operational and technical obstacles in sharing data and models. Findings We outline the efforts of the PPME communities to achieve automated semantic interoperability for clinical resource documentation in collaboration with the RICORDO project. Current community practices in resource documentation and knowledge management are reviewed, and the requirements and improvements sought by the PPME communities for current documentation practices are discussed. The RICORDO plan and effort in creating a representational framework and associated open software toolkit for the automated management of PPME metadata resources is also described. Conclusions RICORDO is providing the PPME community with tools to effect, share, and reason over clinical resource annotations. This work contributes to the semantic interoperability of DMRs through ontology-based annotation by (i) supporting more effective navigation and re-use of clinical DMRs and (ii) sustaining interoperability operations based on the criterion of biological similarity. Operations facilitated by RICORDO will range from automated dataset matching to model merging and the management of complex simulation workflows. In effect, RICORDO is contributing to community standards for resource sharing and interoperability. PMID:21878109

  14. Interoperable web applications for sharing data and products of the International DORIS Service

    NASA Astrophysics Data System (ADS)

    Soudarin, L.; Ferrage, P.

    2017-12-01

    The International DORIS Service (IDS) was created in 2003 under the umbrella of the International Association of Geodesy (IAG) to foster scientific research related to the French satellite tracking system DORIS and to deliver scientific products, mostly related to the International Earth Rotation and Reference Systems Service (IERS). Since its start, the organization has continuously evolved, leading to additional and improved operational products from an expanded set of DORIS Analysis Centers. In addition, IDS has developed services for sharing data and products with users. Metadata and interoperable web applications are provided to explore, visualize, and download key products such as the position time series of the geodetic points materialized at the ground tracking stations. The Global Geodetic Observing System (GGOS) encourages the IAG Services to develop such interoperable facilities on their websites; the objective for GGOS is to set up an interoperable portal through which the data and products produced by the IAG Services can be served to the user community. We present the web applications proposed by IDS to visualize time series of geodetic observables and to get information about the tracking ground stations and the tracked satellites, and we discuss IDS's future plans for meeting the recommendations of GGOS. The presentation also addresses the need for the IAG Services to adopt a common metadata thesaurus to describe data and products, and interoperability standards to share them.

  15. MRML: an extensible communication protocol for interoperability and benchmarking of multimedia information retrieval systems

    NASA Astrophysics Data System (ADS)

    Mueller, Wolfgang; Mueller, Henning; Marchand-Maillet, Stephane; Pun, Thierry; Squire, David M.; Pecenovic, Zoran; Giess, Christoph; de Vries, Arjen P.

    2000-10-01

    While interoperability in the area of relational databases is ensured by common communication protocols (e.g. ODBC/JDBC using SQL), Content-Based Image Retrieval Systems (CBIRS) and other multimedia retrieval systems lack both a common query language and a common communication protocol. Besides its obvious short-term convenience, interoperability of systems is crucial for the exchange and analysis of user data. In this paper, we present and describe an extensible XML-based query markup language, called MRML (Multimedia Retrieval Markup Language). MRML is primarily designed to ensure interoperability between different content-based multimedia retrieval systems, while allowing researchers to preserve their freedom in extending their systems as needed. MRML encapsulates multimedia queries in a way that enables multimedia (MM) query languages, MM content descriptions, MM query engines, and MM user interfaces to grow independently from each other, reaching a maximum of interoperability while ensuring a maximum of freedom for the developer. To benefit from this, only a few simple design principles have to be respected when extending MRML for one's private needs. The design of extensions within the MRML framework is described in detail in the paper. MRML has been implemented and tested for the CBIRS Viper, using the user interface Snake Charmer. Both are part of the GNU project and can be downloaded from our site.
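
    For a flavor of what an XML-encapsulated multimedia query looks like, the sketch below builds a minimal MRML-style message with the Python standard library. The element and attribute names are simplified approximations for exposition, so treat them as illustrative rather than as the published MRML schema.

        import xml.etree.ElementTree as ET

        def build_query(session_id: str, example_image_url: str, relevance: float) -> bytes:
            """Build a query-by-example message: the client marks an image as a
            relevant example, and the engine returns up to result-size matches."""
            mrml = ET.Element("mrml", {"session-id": session_id})
            query = ET.SubElement(mrml, "query-step", {"result-size": "20"})
            examples = ET.SubElement(query, "user-relevance-element-list")
            ET.SubElement(examples, "user-relevance-element", {
                "image-location": example_image_url,
                "user-relevance": str(relevance),
            })
            return ET.tostring(mrml, encoding="utf-8", xml_declaration=True)

        print(build_query("demo-1", "http://example.org/img/123.jpg", 1.0).decode())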

  16. Building the Synergy between Public Sector and Research Data Infrastructures

    NASA Astrophysics Data System (ADS)

    Craglia, Massimo; Friis-Christensen, Anders; Ostländer, Nicole; Perego, Andrea

    2014-05-01

    INSPIRE is a European Directive aiming to establish an EU-wide spatial data infrastructure to give cross-border access to information that can be used to support EU environmental policies, as well as other policies and activities having an impact on the environment. In order to ensure cross-border interoperability of the data infrastructures operated by EU Member States, INSPIRE sets out a framework based on common specifications for metadata, data, network services, data and service sharing, and monitoring and reporting. The implementation of INSPIRE has reached important milestones: the INSPIRE Geoportal was launched in 2011, providing a single access point for the discovery of INSPIRE data and services across EU Member States (currently about 300K), and all the technical specifications for the interoperability of data across the 34 INSPIRE themes were adopted at the end of 2013. During this period a number of EU and international initiatives have been launched concerning cross-domain interoperability and (Linked) Open Data. In particular, the EU Open Data Portal, launched in December 2012, made provisions to access government and scientific data from EU institutions and bodies, and the EU ISA Programme (Interoperability Solutions for European Public Administrations) promotes cross-sector interoperability by sharing and re-using EU-wide and national standards and components. Moreover, the Research Data Alliance (RDA), an initiative jointly funded by the European Commission, the US National Science Foundation and the Australian Research Council, was launched in March 2013 to promote scientific data sharing and interoperability. The Joint Research Centre of the European Commission (JRC), besides being the technical coordinator of the implementation of INSPIRE, is also actively involved in initiatives promoting cross-sector re-use in INSPIRE and sustainable approaches to the evolution of technologies, in particular how to support Linked Data in INSPIRE and the use of global persistent identifiers. It is evident that government and scientific data infrastructures are currently facing a number of issues that have already been addressed in INSPIRE. Sharing experiences and competencies will avoid re-inventing the wheel and help promote the cross-domain adoption of consistent solutions. Indeed, one of the lessons learnt from INSPIRE and the initiatives in which the JRC is involved is that government and research data are not two separate worlds: government data are commonly used as a basis to create scientific data, and vice versa. Consequently, it is fundamental to adopt a consistent approach to address the interoperability and data management issues shared by both government and scientific data. The presentation illustrates some of the lessons learnt during the implementation of INSPIRE and in work on data and service interoperability coordinated with European and international initiatives. We describe a number of critical interoperability issues and barriers affecting both scientific and government data, concerning, e.g., data terminologies, quality, and licensing, and propose how these problems could be effectively addressed by a closer collaboration of the government and scientific communities and the sharing of experiences and practices.

  17. Accuracy of Carotid Duplex Criteria in Diagnosis of Significant Carotid Stenosis in Asian Patients.

    PubMed

    Dharmasaroja, Pornpatr A; Uransilp, Nattaphol; Watcharakorn, Arvemas; Piyabhan, Pritsana

    2018-03-01

    Extracranial carotid stenosis can be diagnosed by the velocity criteria of carotid duplex ultrasound. Whether these criteria accurately define the severity of internal carotid artery (ICA) stenosis in Asian patients remains to be proven. The purpose of this study was to evaluate the accuracy of 2 carotid duplex velocity criteria in defining significant carotid stenosis. Carotid duplex studies and magnetic resonance angiography were reviewed. Criteria 1 was that recommended by the Society of Radiologists in Ultrasound: moderate stenosis (50%-69%), peak systolic velocity (PSV) 125-230 cm/s and diastolic velocity (DV) 40-100 cm/s; severe stenosis (>70%), PSV greater than 230 cm/s and DV greater than 100 cm/s. Criteria 2 used PSV greater than 140 cm/s with DV less than 110 cm/s to define moderate stenosis (50%-75%), and PSV greater than 140 cm/s with DV greater than 110 cm/s for severe stenosis (76%-95%). A total of 854 ICA segments were reviewed. There was moderate stenosis in 72 ICAs, severe stenosis in 50 ICAs, and occlusion in 78 ICAs. Criteria 2 had slightly lower sensitivity but higher specificity and accuracy than criteria 1 in detecting moderate stenosis (criteria 1: sensitivity 95%, specificity 83%, accuracy 84%; criteria 2: sensitivity 92%, specificity 92%, accuracy 92%). In detecting severe ICA stenosis, however, no significant difference in sensitivity, specificity, or accuracy was found (criteria 1: sensitivity 82%, specificity 99.57%, accuracy 98%; criteria 2: sensitivity 86%, specificity 99.68%, accuracy 99%). In the subgroup of moderate stenosis, the criteria using ICA PSV greater than 140 cm/s had higher specificity and accuracy than the criteria using ICA PSV 125-230 cm/s, whereas there was no significant difference in the detection of severe stenosis or occlusion of the ICA. Copyright © 2018 National Stroke Association. Published by Elsevier Inc. All rights reserved.
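
    Because both sets of velocity criteria reported above are simple threshold rules, they can be transcribed directly into code; the Python function below does exactly that for illustration and is, of course, not a diagnostic tool.

        def grade_stenosis(psv: float, dv: float, criteria: int) -> str:
            """Grade ICA stenosis from peak systolic (PSV) and diastolic (DV)
            velocities in cm/s, using the two criteria described in the study."""
            if criteria == 1:  # Society of Radiologists in Ultrasound
                if psv > 230 and dv > 100:
                    return "severe (>70%)"
                if 125 <= psv <= 230 and 40 <= dv <= 100:
                    return "moderate (50%-69%)"
            elif criteria == 2:
                if psv > 140 and dv > 110:
                    return "severe (76%-95%)"
                if psv > 140 and dv < 110:
                    return "moderate (50%-75%)"
            return "below moderate-stenosis thresholds"

        print(grade_stenosis(psv=250, dv=120, criteria=1))  # severe by criteria 1
        print(grade_stenosis(psv=150, dv=60, criteria=2))   # moderate by criteria 2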

  18. [Prognostic criteria of efficacy of programmed laparoscopic sanitation of the abdominal cavity in peritonitis].

    PubMed

    Salakhov, E K; Vlasov, A P; Bolotskyh, V A

    To define prognostic criteria for the efficacy of programmed laparoscopic sanitation of the abdominal cavity in peritonitis, 32 patients who underwent programmed laparoscopic sanitation of the abdominal cavity for peritonitis caused by various acute surgical diseases were studied. Twelve of them subsequently required relaparotomy because the laparoscopic sanitation proved insufficiently effective. Comprehensive clinical examination and laboratory assessment of selected indexes of homeostasis and oxidative status were conducted. Prognostic clinical and laboratory criteria for the efficacy of laparoscopic abdominal sanitation were proposed after analysis of intraoperative data from the primary surgery and laboratory values on the first postoperative day. The proposed prognostic criteria make it possible to define the further management of patients with peritonitis after primary laparotomy.

  19. Proposed Interoperability Readiness Level Assessment for Mission Critical Interfaces During Navy Acquisition

    DTIC Science & Technology

    2010-12-01

    This involves zeroing and recreating the interoperability arrays and other variables used in the simulation. Since the constants do not change from run... ...Using this algorithm, the process of encrypting/decrypting data requires very little computation, and the generation of the random pads can be

  20. 78 FR 66298 - Promoting Interoperability in the 700 MHz Commercial Spectrum; Requests for Waiver and Extension...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-05

    ... FEDERAL COMMUNICATIONS COMMISSION 47 CFR Part 27 [WT Docket Nos. 12-69, 12-332; FCC 13-136] Promoting Interoperability in the 700 MHz Commercial Spectrum; Requests for Waiver and Extension of Lower 700 MHz Band Interim Construction Benchmark Deadlines AGENCY: Federal Communications Commission...

  1. Exploring Interoperability as a Multidimensional Challenge for Effective Emergency Response

    ERIC Educational Resources Information Center

    Santisteban, Hiram

    2010-01-01

    Purpose. The purpose of this research was to further an understanding of how the federal government is addressing the challenges of interoperability for emergency response or crisis management (FEMA, 2009) by informing the development of standards through the review of current congressional law, commissions, studies, executive orders, and…

  2. An Access Control and Trust Management Framework for Loosely-Coupled Multidomain Environments

    ERIC Educational Resources Information Center

    Zhang, Yue

    2010-01-01

    Multidomain environments where multiple organizations interoperate with each other are becoming a reality as can be seen in emerging Internet-based enterprise applications. Access control to ensure secure interoperation in such an environment is a crucial challenge. A multidomain environment can be categorized as "tightly-coupled" and…

  3. OpenICE medical device interoperability platform overview and requirement analysis.

    PubMed

    Arney, David; Plourde, Jeffrey; Goldman, Julian M

    2018-02-23

    We give an overview of OpenICE, an open source implementation of the ASTM standard F2761 for the Integrated Clinical Environment (ICE) that leverages medical device interoperability, together with an analysis of the clinical and non-functional requirements and community process that inspired its design.

  4. 47 CFR 90.525 - Administration of interoperability channels.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... RADIO SERVICES PRIVATE LAND MOBILE RADIO SERVICES Regulations Governing the Licensing and Use of... meeting the requirements of § 90.523 may operate mobile or portable units on the Interoperability channels... Commission provided it holds a part 90 license. All persons operating mobile or portable units under this...

  5. SURA Coastal Ocean Observing and Prediction (SCOOP) Program: Integrating Marine Science and Information Technology

    DTIC Science & Technology

    2006-09-30

    coastal phenomena. OBJECTIVES SURA is creating a SCOOP “Grid” that extends the interoperability enabled by the World Wide Web. The coastal ... community faces special challenges with respect to achieving a level of interoperability that can leverage emerging Grid technologies. With that in mind

  6. Waveform Diversity and Design for Interoperating Radar Systems

    DTIC Science & Technology

    2013-01-01

    Università di Pisa, Dipartimento di Ingegneria dell'Informazione: Elettronica, Informatica, Telecomunicazioni, Via Girolamo Caruso 16, 56122 Pisa, Italy.

  7. RuleML-Based Learning Object Interoperability on the Semantic Web

    ERIC Educational Resources Information Center

    Biletskiy, Yevgen; Boley, Harold; Ranganathan, Girish R.

    2008-01-01

    Purpose: The present paper aims to describe an approach for building the Semantic Web rules for interoperation between heterogeneous learning objects, namely course outlines from different universities, and one of the rule uses: identifying (in)compatibilities between course descriptions. Design/methodology/approach: As proof of concept, a rule…

  8. 75 FR 81605 - Smart Grid Interoperability Standards; Notice of Technical Conference

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-28

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. RM11-2-000] Smart Grid Interoperability Standards; Notice of Technical Conference December 21, 2010. Take notice that the Federal Energy... National Institute of Standards and Technology and included in this proceeding are ready for Commission...

  9. 78 FR 59903 - Emission Mask Requirements for Digital Technologies on 800 MHz NPSPAC Channels; Analog FM...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-30

    ...] Emission Mask Requirements for Digital Technologies on 800 MHz NPSPAC Channels; Analog FM Capability on Mutual Aid and Interoperability Channels AGENCY: Federal Communications Commission. ACTION: Proposed rule... interoperability channels. These proposals could help safeguard public safety licensees in the NPSPAC band from...

  10. Using the EC decision on case definitions for communicable diseases as a terminology source--lessons learned.

    PubMed

    Balkanyi, Laszlo; Heja, Gergely; Nagy, Attila

    2014-01-01

    Extracting scientifically accurate terminology from an EU public health regulation is part of the knowledge engineering work at the European Centre for Disease Prevention and Control (ECDC). ECDC operates information systems at the crossroads of many areas, posing a challenge for transparency and consistency. Semantic interoperability is based on the Terminology Server (TS). TS value sets (structured vocabularies) describe shared domains such as "diseases", "organisms", "public health terms", "geo-entities", "organizations", "administrative terms", and others. We extracted information from the relevant EC Implementing Decision on case definitions for reporting communicable diseases, which lists 53 notifiable infectious diseases and contains clinical, diagnostic, laboratory, and epidemiological criteria. We performed a consistency check and a simplification-abstraction step; we represented lab criteria as triplets, 'y' procedural result / of 'x' organism-substance / on 'z' specimen, and identified negations. The resulting new case definition value set represents the various formalized criteria, while the existing disease value set has been extended with new signs and symptoms. New organisms enriched the organism value set. Other new categories have been added to the public health value set, such as transmission modes, substances, specimens, and procedures. We identified problem areas, such as (a) some classification errors; (b) inconsistent granularity of conditions; (c) seemingly nonsensical criteria and medical trivialities; (d) possible logical errors; and (e) seemingly factual errors that might be phrasing errors. We think our hypothesis regarding room for possible improvements is valid: there are some open issues, and a further improved legal text might lead to more precise epidemiologic data collection. It should be noted that formal representation for automatic classification of cases was out of scope; such a task would require other formalisms, e.g., those used by rule-based decision support systems.
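
    The triplet representation of laboratory criteria described above maps naturally onto a small data structure; the Python sketch below shows one way to encode it, with field values that are invented examples rather than entries from the ECDC Terminology Server.

        from typing import NamedTuple

        class LabCriterion(NamedTuple):
            """'y' procedural result / of 'x' organism-substance / on 'z' specimen."""
            procedural_result: str
            organism_substance: str
            specimen: str
            negated: bool = False

        criteria = [
            LabCriterion("isolation", "Salmonella enterica", "stool"),
            LabCriterion("nucleic acid detection", "Bordetella pertussis", "nasopharyngeal swab"),
            LabCriterion("antibody response", "measles virus", "serum", negated=True),
        ]

        for c in criteria:
            prefix = "NOT " if c.negated else ""
            print(f"{prefix}{c.procedural_result} / of {c.organism_substance} / on {c.specimen}")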

  11. Survivability Modeling & Simulation(Aircraft Survivability, Fall 2009)

    DTIC Science & Technology

    2009-01-01

    The Human Effectiveness Directorate is responsible for providing injury assessments for most modern Air Force ejection systems, for...developing ejection test mannequins, and for continuing to define human injury limits and criteria. The directorate maintains a man-rated horizontal...Using numerous models and testing, the directorate can define ejection/impact injury criteria for aircraft equipment to prevent personnel injuries

  12. Sharing Health Big Data for Research - A Design by Use Cases: The INSHARE Platform Approach.

    PubMed

    Bouzillé, Guillaume; Westerlynck, Richard; Defossez, Gautier; Bouslimi, Dalel; Bayat, Sahar; Riou, Christine; Busnel, Yann; Le Guillou, Clara; Cauvin, Jean-Michel; Jacquelinet, Christian; Pladys, Patrick; Oger, Emmanuel; Stindel, Eric; Ingrand, Pierre; Coatrieux, Gouenou; Cuggia, Marc

    2017-01-01

    Sharing and exploiting Health Big Data (HBD) raises two main challenges. The first is data protection and governance, which must take legal, ethical, and deontological aspects into account to enable a trusted, transparent, win-win relationship between researchers, citizens, and data providers. The second is lack of interoperability, since data are compartmentalized and syntactically and semantically heterogeneous. The INSHARE project uses an experimental proof of concept to explore how recent technologies can overcome such issues. With six data providers, the platform was designed in three steps: (1) analyze use cases, needs, and requirements; (2) define the data-sharing governance and secure access to the platform; and (3) define the platform specifications. Three use cases, drawn from 5 studies and 11 data sources, were analyzed for the platform design. Governance derived from the SCANNER model was adapted to data sharing. The platform architecture integrates a data repository and hosting, semantic integration services, data processing, aggregate computing, data quality and integrity monitoring, ID linking, a multisource query builder, visualization and data export services, data governance, a study management service, and security, including data watermarking.

  13. Integration of robotic resources into FORCEnet

    NASA Astrophysics Data System (ADS)

    Nguyen, Chinh; Carroll, Daniel; Nguyen, Hoa

    2006-05-01

    The Networked Intelligence, Surveillance, and Reconnaissance (NISR) project integrates robotic resources into Composeable FORCEnet to control and exploit unmanned systems over extremely long distances. The foundations are built upon FORCEnet, the U.S. Navy's process to define C4ISR for net-centric operations, and the Navy Unmanned Systems Common Control Roadmap to develop technologies and standards for interoperability, data sharing, publish-and-subscribe methodology, and software reuse. The paper defines the goals and boundaries for NISR with a focus on the system architecture, including the design tradeoffs necessary for unmanned systems in a net-centric model. Special attention is given to two specific scenarios demonstrating the integration of unmanned ground and water surface vehicles into the open-architecture, web-based command-and-control information-management system of Composeable FORCEnet. Planned spiral development for NISR will improve collaborative control, expand robotic sensor capabilities, address multiple domains including underwater and aerial platforms, and extend the distributive communications infrastructure for battlespace optimization of unmanned systems in net-centric operations.

  14. What Is Trophoblast? A Combination of Criteria Define Human First-Trimester Trophoblast

    PubMed Central

    Lee, Cheryl Q.E.; Gardner, Lucy; Turco, Margherita; Zhao, Nancy; Murray, Matthew J.; Coleman, Nicholas; Rossant, Janet; Hemberger, Myriam; Moffett, Ashley

    2016-01-01

    Summary Controversy surrounds reports describing the derivation of human trophoblast cells from placentas and embryonic stem cells (ESC), partly due to the difficulty in identifying markers that define cells as belonging to the trophoblast lineage. We have selected criteria that are characteristic of primary first-trimester trophoblast: a set of protein markers, HLA class I profile, methylation of ELF5, and expression of microRNAs (miRNAs) from the chromosome 19 miRNA cluster (C19MC). We tested these criteria on cells previously reported to show some phenotypic characteristics of trophoblast: bone morphogenetic protein (BMP)-treated human ESC and 2102Ep, an embryonal carcinoma cell line. Both cell types only show some, but not all, of the four trophoblast criteria. Thus, BMP-treated human ESC have not fully differentiated to trophoblast. Our study identifies a robust panel, including both protein and non-protein-coding markers that, in combination, can be used to reliably define cells as characteristic of early trophoblast. PMID:26862703
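
    Because the panel above is explicitly a conjunction of four criteria, the classification logic reduces to an all-of check, as in the schematic Python sketch below; the boolean inputs merely stand in for the wet-lab assays and are not a substitute for them.

        def is_first_trimester_trophoblast(protein_markers: bool,
                                           hla_class_i_profile: bool,
                                           elf5_methylation_status: bool,
                                           c19mc_mirna_expression: bool) -> bool:
            """All four criteria must hold; meeting only some is insufficient,
            as reported for BMP-treated ESC and the 2102Ep line."""
            return all([protein_markers, hla_class_i_profile,
                        elf5_methylation_status, c19mc_mirna_expression])

        # Cells meeting only some of the criteria are not classified as trophoblast:
        print(is_first_trimester_trophoblast(True, False, True, False))  # False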

  15. Coalition readiness management system preliminary interoperability experiment (CReaMS PIE)

    NASA Astrophysics Data System (ADS)

    Clark, Peter; Ryan, Peter; Zalcman, Lucien; Robbie, Andrew

    2003-09-01

    The United States Navy (USN) has initiated the Coalition Readiness Management System (CReaMS) Initiative to enhance coalition warfighting readiness through advancing development of a team interoperability training and combined mission rehearsal capability. It integrates evolving cognitive team learning principles and processes with advanced technology innovations to produce an effective and efficient team learning environment. The JOint Air Navy Networking Environment (JOANNE) forms the Australian component of CReaMS. The ultimate goal is to link Australian Defence simulation systems with the USN Battle Force Tactical Training (BFTT) system to demonstrate and achieve coalition level warfare training in a synthetic battlespace. This paper discusses the initial Preliminary Interoperability Experiment (PIE) involving USN and Australian Defence establishments.

  16. A Shovel-Ready Solution to Fill the Nursing Data Gap in the Interdisciplinary Clinical Picture.

    PubMed

    Keenan, Gail M; Lopez, Karen Dunn; Sousa, Vanessa E C; Stifter, Janet; Macieira, Tamara G R; Boyd, Andrew D; Yao, Yingwei; Herdman, T Heather; Moorhead, Sue; McDaniel, Anna; Wilkie, Diana J

    2018-01-01

    To critically evaluate the 2014 American Academy of Nursing (AAN) call-to-action plan for generating interoperable nursing data, drawing on the healthcare literature. The AAN's plan will not generate the nursing data needed to participate in big data science initiatives in the short term because Logical Observation Identifiers Names and Codes (LOINC) and Systematized Nomenclature of Medicine - Clinical Terms (SNOMED CT) are not yet ripe for generating interoperable data. Well-tested, viable alternatives exist. The authors present recommendations for revisions to the AAN's plan and an evidence-based alternative for generating interoperable nursing data in the near term. These revisions can ultimately lead to the proposed terminology goals of the AAN's plan in the long term. © 2017 NANDA International, Inc.

  17. Archetype-based data warehouse environment to enable the reuse of electronic health record data.

    PubMed

    Marco-Ruiz, Luis; Moner, David; Maldonado, José A; Kolstrup, Nils; Bellika, Johan G

    2015-09-01

    The reuse of data captured during health care delivery is essential to satisfy the demands of clinical research and clinical decision support systems. Main barriers to such reuse are the existence of legacy data formats and the high granularity of the data when stored in an electronic health record (EHR) system. Thus, we need mechanisms to standardize, aggregate, and query data concealed in the EHRs, to allow their reuse whenever they are needed. The objective was to create a data warehouse infrastructure using archetype-based technologies, standards, and query languages to enable the interoperability needed for data reuse. The work presented makes use of best-of-breed archetype-based data transformation and storage technologies to create a workflow for the modeling, extraction, transformation, and loading of EHR proprietary data into standardized data repositories. We converted legacy data and performed patient-centered aggregations via archetype-based transformations; specific-purpose aggregations were later performed at the query level for particular use cases. Laboratory test results of a population of 230,000 patients belonging to Troms and Finnmark counties in Norway, requested between January 2013 and November 2014, were standardized. Normalization of the test records was performed by defining transformation and aggregation functions between the laboratory records and an archetype. These mappings were used to automatically generate openEHR-compliant data, which were loaded into an archetype-based data warehouse. Once the data were loaded, we defined indicators linked to the data in the warehouse to monitor test activity for Salmonella and Pertussis using the Archetype Query Language. Archetype-based standards and technologies can thus be used to create a data warehouse environment that enables data from EHR systems to be reused in clinical research and decision support systems. With this approach, existing EHR data become available in a standardized and interoperable format, thus opening a world of possibilities toward semantic or concept-based reuse, query, and communication of clinical data. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
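
    To make the workflow concrete, here is a schematic Python transformation of a legacy laboratory record into an archetype-shaped structure, followed by an AQL-style indicator query. The legacy field names, the mapping, and the query text are invented for illustration; real openEHR tooling would validate against actual archetypes and parse genuine AQL.

        # Legacy, system-specific lab record (field names are hypothetical).
        legacy = {"pat": "42", "code": "SALM", "res": "positive", "ts": "2014-03-01"}

        def to_archetype(record: dict) -> dict:
            """Map a proprietary record onto an archetype-like lab-result entry."""
            return {
                "archetype_id": "openEHR-EHR-OBSERVATION.laboratory_test_result.v1",
                "subject": record["pat"],
                "data": {
                    "test_name": {"SALM": "Salmonella culture"}[record["code"]],
                    "result": record["res"],
                    "time": record["ts"],
                },
            }

        print(to_archetype(legacy))

        # An AQL-style indicator query (illustrative sketch, not validated syntax):
        aql = """
        SELECT COUNT(o) FROM EHR e CONTAINS OBSERVATION o
          [openEHR-EHR-OBSERVATION.laboratory_test_result.v1]
        WHERE o/data/test_name = 'Salmonella culture'
        """
        print(aql)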

  18. Criteria for the use of regression analysis for remote sensing of sediment and pollutants

    NASA Technical Reports Server (NTRS)

    Whitlock, C. H.; Kuo, C. Y.; Lecroy, S. R.

    1982-01-01

    An examination of the limitations, requirements, and precision of the linear multiple-regression technique for quantification of marine environmental parameters is conducted. Both environmental and optical physics conditions have been defined for which an exact solution to the signal response equations has the same form as the multiple-regression equation. Various statistical parameters are examined to define criteria for selection of an unbiased fit when upwelled radiance values contain error and are correlated with each other. Field experimental data are examined to define data smoothing requirements in order to satisfy the criteria of Daniel and Wood (1971). Recommendations are made concerning improved selection of ground-truth locations to maximize variance and to minimize physical errors associated with the remote sensing experiment.
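
    As a minimal numerical sketch of the technique under discussion, the following fits an ordinary least-squares multiple regression to several synthetic upwelled-radiance bands and reports R-squared, one of the statistical parameters one would inspect when selecting an unbiased fit. The synthetic data stand in for field measurements.

        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic upwelled radiance in three bands (columns) for 30 stations.
        X = rng.uniform(0.0, 1.0, size=(30, 3))
        true_coef = np.array([5.0, -2.0, 1.5])
        sediment = X @ true_coef + 0.3 + rng.normal(0.0, 0.1, size=30)  # e.g., mg/L

        # Linear multiple regression: add an intercept column, solve by least squares.
        A = np.column_stack([np.ones(len(X)), X])
        coef, *_ = np.linalg.lstsq(A, sediment, rcond=None)

        predicted = A @ coef
        r2 = 1.0 - np.sum((sediment - predicted) ** 2) / np.sum((sediment - sediment.mean()) ** 2)
        print("intercept and band coefficients:", np.round(coef, 3))
        print("R^2:", round(r2, 4))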

  19. 49 CFR 232.603 - Design, interoperability, and configuration management requirements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...: 1999; Revised 2002, 2007); (3) AAR S-4220, “ECP Cable-Based Brake DC Power Supply—Performance...; Revised: 2004); (7) AAR S-4260, “ECP Brake and Wire Distributed Power Interoperability Test Procedures...) Approval. A freight train or freight car equipped with an ECP brake system and equipment covered by the AAR...

  20. 76 FR 10295 - Implementing a Nationwide, Broadband, Interoperable Public Safety Network in the 700 MHz Band

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-24

    ... Docket 07-100; FCC 11-6] Implementing a Nationwide, Broadband, Interoperable Public Safety Network in the... framework for the nationwide public safety broadband network. This document considers and proposes... broadband networks operating in the 700 MHz band. This document addresses public safety broadband network...

  1. 76 FR 62309 - Implementing a Nationwide, Broadband, Interoperable Public Safety Network in the 700 MHz Band

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-07

    ... FEDERAL COMMUNICATIONS COMMISSION 47 CFR Chapter I [PS Docket No. 06-229; WT Docket 06-150; WP Docket 07-100; FCC 11-113] Implementing a Nationwide, Broadband, Interoperable Public Safety Network in the 700 MHz Band AGENCY: Federal Communications Commission. ACTION: Final rule. SUMMARY: In this...

  2. Interoperability Gap Challenges for Learning Object Repositories & Learning Management Systems

    ERIC Educational Resources Information Center

    Mason, Robert T.

    2011-01-01

    An interoperability gap exists between Learning Management Systems (LMSs) and Learning Object Repositories (LORs). Learning Objects (LOs) and the associated Learning Object Metadata (LOM) that is stored within LORs adhere to a variety of LOM standards. A common LOM standard found in LORs is the Sharable Content Object Reference Model (SCORM)…

  3. Promoting Interoperability: The Case for Discipline-Specific PSAPS

    DTIC Science & Technology

    2014-12-01

    incidents for two reasons: first, numerous steel and concrete floors that affected signal penetration; and second, so many different companies were... [Naval Postgraduate School thesis by Thomas Michael Walsh, December 2014; thesis advisor: Fathali Moghaddam. Approved for public release; distribution is unlimited.]

  4. Perceived Progress, Risks of Information Sharing and Interoperability between Military Organizations and Federal Agencies

    ERIC Educational Resources Information Center

    Aburto, Rafael

    2014-01-01

    This qualitative study examined efforts by the military organizations and federal agencies to improve information sharing, interoperability, and systems integration in all business practices. More specifically, a survey instrument with six open-ended and eight demographic questions was used to measure the perceived progress, issues, challenges of…

  5. Look who's talking. A guide to interoperability groups and resources.

    PubMed

    2011-06-01

    There are huge challenges in getting medical devices to communicate with other devices and to information systems. Fortunately, a number of groups have emerged to help hospitals cope. Here's a description of the most prominent ones, including useful web links for each. We also discuss the latest and most pertinent interoperability standards.

  6. Improving component interoperability and reusability with the java connection framework (JCF): overview and application to the ages-w environmental model

    USDA-ARS?s Scientific Manuscript database

    Environmental modeling framework (EMF) design goals are multi-dimensional and often include many aspects of general software framework development. Many functional capabilities offered by current EMFs are closely related to interoperability and reuse aspects. For example, an EMF needs to support dev...

  7. 76 FR 2598 - Requests for Waiver of Various Petitioners To Allow the Establishment of 700 MHz Interoperable...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-14

    ... Broadband Networks AGENCY: Federal Communications Commission. ACTION: Final rule; waiver. SUMMARY: In this... Homeland Security Bureau (Bureau) approved an initial set of technical requirements for public safety... file an interoperability showing a renewed opportunity to do so and to proceed with network deployment...

  8. The Efficacy of Security Force Assistance in Countering Violent Extremist Organizations in Failed and Potentially Failing States

    DTIC Science & Technology

    2010-06-11

    maintenance funds when the purpose of the training is to enhance interoperability, familiarization, and safety training. While technically this type of...funding may not be used for SFA, significant gray area exists when determining what training is interoperability, familiarization, and safety and what

  9. On the feasibility of interoperable schemes in hand biometrics.

    PubMed

    Morales, Aythami; González, Ester; Ferrer, Miguel A

    2012-01-01

    Personal recognition through hand-based biometrics has attracted the interest of many researchers in the last twenty years. A significant number of proposals based on different procedures and acquisition devices have been published in the literature. However, comparisons between devices and their interoperability have not been thoroughly studied. This paper tries to fill this gap by proposing procedures to improve the interoperability among different hand biometric schemes. The experiments were conducted on a database made up of 8,320 hand images acquired from six different hand biometric schemes, including a flat scanner, webcams at different wavelengths, high quality cameras, and contactless devices. Acquisitions on both sides of the hand were included. Our experiment includes four feature extraction methods which determine the best performance among the different scenarios for two of the most popular hand biometrics: hand shape and palm print. We propose smoothing techniques at the image and feature levels to reduce interdevice variability. Results suggest that comparative hand shape offers better performance in terms of interoperability than palm prints, but palm prints can be more effective when using similar sensors.
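
    The feature-level smoothing mentioned above amounts to low-pass filtering each feature vector before matching, so that sensor-specific high-frequency detail is attenuated; a moving-average version of the idea is sketched below in Python with invented feature values (the paper's actual smoothing procedures may differ).

        import numpy as np

        def smooth_features(features: np.ndarray, window: int = 5) -> np.ndarray:
            """Moving-average smoothing of a 1-D feature vector to reduce
            interdevice variability before template matching."""
            kernel = np.ones(window) / window
            return np.convolve(features, kernel, mode="same")

        # Hypothetical hand-shape feature vectors from two different sensors.
        scanner = np.array([10.0, 10.4, 9.8, 10.1, 10.3, 9.9, 10.2, 10.0])
        webcam = scanner + np.random.default_rng(3).normal(0.0, 0.4, size=8)

        raw = np.linalg.norm(scanner - webcam)
        smoothed = np.linalg.norm(smooth_features(scanner) - smooth_features(webcam))
        print(f"distance before smoothing: {raw:.3f}, after: {smoothed:.3f}")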

  10. The e-MapScholar project—an example of interoperability in GIScience education

    NASA Astrophysics Data System (ADS)

    Purves, R. S.; Medyckyj-Scott, D. J.; Mackaness, W. A.

    2005-03-01

    The proliferation of the use of digital spatial data in learning and teaching provides a set of opportunities and challenges for the development of e-learning materials suitable for use by a broad spectrum of disciplines in Higher Education. Effective e-learning materials must both provide engaging materials with which the learner can interact and be relevant to the learners' disciplinary and background knowledge. Interoperability aims to allow sharing of data and materials through the use of common agreements and specifications. Shared learning materials can take advantage of interoperable components to provide customisable components, and must consider issues in sharing data across institutional borders. The e-MapScholar project delivers teaching materials related to spatial data, which are customisable with respect to both context and location. Issues in the provision of such interoperable materials are discussed, including suitable levels of granularity of materials, the provision of tools to facilitate customisation and mechanisms to deliver multiple data sets and the metadata issues related to such materials. The examples shown make extensive use of the OpenGIS consortium specifications in the delivery of spatial data.

  11. The Need to Apply Medical Device Informatics in Developing Standards for Safe Interoperable Medical Systems.

    PubMed

    Weininger, Sandy; Jaffe, Michael B; Goldman, Julian M

    2017-01-01

    Medical device and health information technology systems are increasingly interdependent, with users demanding increased interoperability. Related safety standards must be developed taking this systems perspective into account. In this article, we describe the current development of medical device standards and the need for these standards to address medical device informatics. Medical device information should be gathered from a broad range of clinical scenarios to lay the foundation for safe medical device interoperability. Five clinical examples show how medical device informatics principles, if applied in the development of medical device standards, could help facilitate the development of safe interoperable medical device systems. These examples illustrate the clinical implications of the failure to capture important signals and device attributes. We provide recommendations relating to the coordination between historically separate standards development groups, some of which focus on safety and effectiveness and others on health informatics. We identify the need for a shared understanding among stakeholders and describe organizational structures to promote cooperation such that device-to-device interactions and related safety information are considered during standards development.

  12. The Need to Apply Medical Device Informatics in Developing Standards for Safe Interoperable Medical Systems

    PubMed Central

    Weininger, Sandy; Jaffe, Michael B.; Goldman, Julian M

    2016-01-01

    Medical device and health information technology systems are increasingly interdependent, with users demanding increased interoperability. Related safety standards must be developed taking this systems perspective into account. In this article we describe the current development of medical device standards and the need for these standards to address medical device informatics. Medical device information should be gathered from a broad range of clinical scenarios to lay the foundation for safe medical device interoperability. Five clinical examples show how medical device informatics principles, if applied in the development of medical device standards, could help facilitate the development of safe interoperable medical device systems. These examples illustrate the clinical implications of the failure to capture important signals and device attributes. We provide recommendations relating to the coordination between historically separate standards development groups, some of which focus on safety and effectiveness and others on health informatics. We identify the need for a shared understanding among stakeholders and describe organizational structures to promote cooperation such that device-to-device interactions and related safety information are considered during standards development. PMID:27584685

  13. On the Feasibility of Interoperable Schemes in Hand Biometrics

    PubMed Central

    Morales, Aythami; González, Ester; Ferrer, Miguel A.

    2012-01-01

    Personal recognition through hand-based biometrics has attracted the interest of many researchers over the last twenty years, and a significant number of proposals based on different procedures and acquisition devices have been published in the literature. However, comparisons between devices and their interoperability have not been studied thoroughly. This paper aims to fill that gap by proposing procedures to improve interoperability among different hand biometric schemes. The experiments were conducted on a database of 8,320 hand images acquired with six different hand biometric schemes, including a flat scanner, webcams at different wavelengths, high-quality cameras, and contactless devices; acquisitions of both sides of the hand were included. Our experiments evaluate four feature extraction methods to determine the best performance across the different scenarios for two of the most popular hand biometrics: hand shape and palm print. We propose smoothing techniques at the image and feature levels to reduce interdevice variability. Results suggest that hand shape offers better interoperability than palm print, although palm print can be more effective when similar sensors are used. PMID:22438714
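
    The abstract mentions smoothing at both the image and feature levels. The exact techniques used in the paper are not reproduced here, but a minimal sketch of what such preprocessing commonly looks like (Gaussian blur on images, moving-average smoothing on feature vectors) is:

        # Illustrative sketch of smoothing at the image level (Gaussian
        # blur) and the feature level (moving average) to reduce
        # inter-device variability; the paper's actual parameters may differ.
        import numpy as np
        from scipy.ndimage import gaussian_filter

        def smooth_image(image: np.ndarray, sigma: float = 2.0) -> np.ndarray:
            """Suppress sensor-specific texture/noise before feature extraction."""
            return gaussian_filter(image.astype(float), sigma=sigma)

        def smooth_features(features: np.ndarray, window: int = 5) -> np.ndarray:
            """Moving-average smoothing of a 1-D feature vector."""
            kernel = np.ones(window) / window
            return np.convolve(features, kernel, mode="same")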

  14. Extravehicular activity space suit interoperability.

    PubMed

    Skoog, A I; McBarron JW 2nd; Severin, G I

    1995-10-01

    The European Space Agency (ESA) and the Russian Space Agency (RKA) are jointly developing a new space suit system, the EVA Suit 2000, for improved extravehicular activity (EVA) capabilities in support of the MIR Space Station Programme. Recent national policy agreements between the U.S. and Russia on planned cooperation in manned spaceflight also include joint EVA. With an increasing number of space suit systems and a higher operational frequency towards the end of this century, improved interoperability for both routine and emergency operations is of paramount importance. It is thus timely to report the current status of ongoing work on international EVA interoperability conducted by the Committee on EVA Protocols and Operations of the International Academy of Astronautics, initiated in 1991. This paper summarises the current EVA interoperability issues to be harmonised and presents quantified vehicle interface requirements for the current U.S. Shuttle EMU, the Russian MIR Orlan DMA, and the new European/Russian EVA Suit 2000 extravehicular systems. Major critical or incompatible suit/mother-craft interfaces for different combinations are discussed, and recommendations for standardisation are given.

  15. Standard-compliant real-time transmission of ECGs: harmonization of ISO/IEEE 11073-PHD and SCP-ECG.

    PubMed

    Trigo, Jesús D; Chiarugi, Franco; Alesanco, Alvaro; Martínez-Espronceda, Miguel; Chronaki, Catherine E; Escayola, Javier; Martínez, Ignacio; García, José

    2009-01-01

    Ambient assisted living and integrated care in an aging society build on the vision of a lifelong Electronic Health Record, calling for interoperability of healthcare information systems and medical devices. For medical devices this aim can be achieved through consistent implementation of harmonized international interoperability standards. The ISO/IEEE 11073 (x73) family of standards is a reference standard for medical device interoperability. Its Personal Health Device (PHD) version covers several devices, but an ECG device specialization is not yet available. Meanwhile, the SCP-ECG standard for short-term diagnostic ECGs (EN 1064) has recently been approved as the international standard ISO/IEEE 11073-91064:2009. In this paper, the relationships between a proposed x73-PHD model for an ECG device and the fields of the SCP-ECG standard are investigated. A proof-of-concept implementation of the proposed x73-PHD ECG model is also presented, identifying open issues to be addressed by standards development for wider adoption of the x73-PHD standards.
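
    To make the idea of field-level harmonization concrete, a hedged sketch of a mapping table from x73-PHD ECG attributes to SCP-ECG sections follows. The attribute names are illustrative placeholders; the paper's actual proposed mapping is not reproduced here.

        # Hypothetical sketch of attribute harmonization between an
        # x73-PHD ECG model and SCP-ECG. Attribute names are illustrative
        # placeholders, not the mapping proposed in the paper.
        X73_TO_SCP = {
            "patient-id":       "Section 1 (patient/recording data)",
            "lead-labels":      "Section 3 (lead definitions)",
            "sample-rate":      "Section 6 (rhythm data description)",
            "waveform-samples": "Section 6 (rhythm data)",
        }

        def map_attribute(x73_attr: str) -> str:
            """Return the SCP-ECG section where an x73 attribute would land."""
            return X73_TO_SCP.get(x73_attr, "no direct SCP-ECG counterpart")

    Attributes that fall through to "no direct SCP-ECG counterpart" are exactly the open issues such a harmonization exercise is meant to surface.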

  16. EUnetHTA information management system: development and lessons learned.

    PubMed

    Chalon, Patrice X; Kraemer, Peter

    2014-11-01

    The aim of this study was to describe the techniques used to reach consensus on common standards to be implemented in the EUnetHTA Information Management System (IMS), and to describe how interoperability between tools was explored. Three face-to-face meetings were organized to identify and agree on common standards for the development of online tools. Two tools were created to demonstrate the added value of implementing interoperability standards at local levels, and developers of tools outside EUnetHTA were identified and contacted. Four common standards were agreed on by consensus, and all EUnetHTA tools have consequently been modified or designed accordingly. RDF Site Summary (RSS) has demonstrated good potential to support rapid dissemination of HTA information. Contacts outside EUnetHTA resulted in direct collaboration (HTA glossary, HTAi Vortal), evaluation of options for interoperability between tools (CRD HTA database), or a formal framework to prepare cooperation on concrete projects (INAHTA projects database). Although entitled a project on IT infrastructure, the work program was also about people: when partners have to agree on complex topics, fostering a cohesive group dynamic and hosting face-to-face meetings adds value and enhances mutual understanding. The adoption of widespread standards enhanced the homogeneity of the EUnetHTA tools and should contribute to their wider use, and therefore to the general objective of EUnetHTA. The initiatives on interoperability of systems need to be developed further to support a generally interoperable information system that could benefit the whole HTA community.
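
    RSS is named as the dissemination mechanism, and its interoperability benefit is that any generic RSS library can consume such a feed. As a small illustration (the feed URL below is a placeholder, not a real EUnetHTA endpoint):

        # Illustrative sketch: consuming an RSS feed of HTA publications
        # with the third-party `feedparser` library. The feed URL is a
        # placeholder, not an actual EUnetHTA endpoint.
        import feedparser  # pip install feedparser

        def latest_hta_items(feed_url: str = "https://example.org/hta.rss"):
            """Yield (title, link) pairs from a standards-compliant RSS feed."""
            feed = feedparser.parse(feed_url)
            for entry in feed.entries:
                yield entry.title, entry.link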

  17. Secure Remote Access Issues in a Control Center Environment

    NASA Technical Reports Server (NTRS)

    Pitts, Lee; McNair, Ann R. (Technical Monitor)

    2002-01-01

    The International Space Station (ISS) has reached an operational state and serves local and remote users. Onboard payload systems are managed by the Huntsville Operations Support Center (HOSC). Users access HOSC systems via Internet protocols in support of daily operations, preflight simulation, and test. In support of this diverse user community, a modern security architecture has been implemented. The architecture has evolved over time from an isolated but open system to one that supports local and remote access to the ISS across broad geographic regions. This has been accomplished through an evolved security strategy, PKI, and custom design. This paper describes the migration process and the lessons learned, including product decision criteria, rationale, and the use of commodity products in the final architecture. It also stresses the need for interoperability among the various products and the effects of seemingly insignificant details.
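
    The abstract names PKI as one building block of the remote access architecture. As a hedged sketch of what certificate-based client authentication over TLS looks like in practice (the host name and certificate paths are placeholders; no actual HOSC configuration is implied):

        # Illustrative sketch of PKI-based client authentication over TLS
        # using the third-party `requests` library. Host and certificate
        # paths are placeholders; no actual HOSC details are implied.
        import requests  # pip install requests

        def fetch_telemetry(url: str = "https://example.org/telemetry"):
            """Authenticate with a client certificate issued by the site's CA."""
            return requests.get(
                url,
                cert=("client.crt", "client.key"),  # client certificate + key
                verify="ca_bundle.pem",             # trusted CA certificates
                timeout=10,
            )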

  18. Secure Payload Access to the International Space Station

    NASA Technical Reports Server (NTRS)

    Pitts, R. Lee; Reid, Chris

    2002-01-01

    The International Space Station (ISS) has reached an operational state and serves local and remote users. Onboard payload systems are managed by the Huntsville Operations Support Center (HOSC). Users access HOSC systems via Internet protocols in support of daily operations, preflight simulation, and test. In support of this diverse user community, a modern security architecture has been implemented. The architecture has evolved over time from an isolated but open system to one that supports local and remote access to the ISS across broad geographic regions. This has been accomplished through an evolved security strategy, PKI, and custom design. This paper describes the migration process and the lessons learned, including product decision criteria, rationale, and the use of commodity products in the final architecture. It also stresses the need for interoperability among the various products and the effects of seemingly insignificant details.

  19. Bantam System Technology Project

    NASA Technical Reports Server (NTRS)

    Moon, J. M.; Beveridge, J. R.

    1998-01-01

    This report focuses on determining a best-value, low-risk, low-cost, and highly reliable Data and Command System (DCS) to support the launch of low-cost vehicles carrying small payloads into low Earth orbit. The ground-based DCS is considered a component of the overall ground and flight support system, which also includes the flight computer, mission planning system, and simulator. Interfaces between the DCS and these other component systems are considered, as are the operational aspects of the mission and of the selected DCS. The project involved defining requirements, defining an efficient operations concept, defining a DCS architecture that satisfies the requirements and concept, conducting a market survey of commercial and government off-the-shelf DCS candidate systems, and rating the candidate systems against the requirements and concept. The primary conclusion is that several low-cost, off-the-shelf DCS solutions exist and can be employed to provide very low-cost operations and low recurring maintenance costs. The primary recommendation is that the DCS design/specification should be integrated within the ground and flight support system design as early as possible to ensure ease of interoperability and efficient allocation of automation functions among the component systems.
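
    Rating candidate systems against requirements is a standard trade-study exercise. A minimal sketch of such weighted scoring follows; the criteria, weights, candidate names, and scores are invented for illustration and are not taken from the report.

        # Minimal sketch of a weighted trade study for rating candidate
        # DCS products against requirements. Criteria, weights, and scores
        # are invented for illustration only.
        CRITERIA_WEIGHTS = {"cost": 0.4, "reliability": 0.4, "interoperability": 0.2}

        candidates = {
            "COTS-A": {"cost": 8, "reliability": 7, "interoperability": 9},
            "GOTS-B": {"cost": 6, "reliability": 9, "interoperability": 7},
        }

        def weighted_score(scores: dict) -> float:
            """Sum of criterion scores weighted by importance."""
            return sum(CRITERIA_WEIGHTS[c] * s for c, s in scores.items())

        best = max(candidates, key=lambda name: weighted_score(candidates[name]))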

  20. A Custom Approach for a Flexible, Real-Time and Reliable Software Defined Utility.

    PubMed

    Zaballos, Agustín; Navarro, Joan; Martín De Pozuelo, Ramon

    2018-02-28

    Information and communication technologies (ICTs) have enabled the evolution of traditional electric power distribution networks towards a new paradigm referred to as the smart grid. However, the different elements that compose the ICT plane of a smart grid are usually conceived as isolated systems, typically resulting in rigid hardware architectures that are hard to interoperate, manage, and adapt to new situations. In recent years, software-defined systems that take advantage of software and high-speed data network infrastructures have emerged as a promising alternative to classic ad hoc approaches in terms of integration, automation, real-time reconfiguration, and resource reusability. The purpose of this paper is to propose the use of software-defined utilities (SDUs) to address the latent deployment and management limitations of smart grids. More specifically, the implementation of a prototype data storage and management system for a smart grid by means of SDUs is introduced, demonstrating the feasibility of this approach. The system features a hybrid cloud architecture able to meet the data storage requirements of electric utilities and adapt to their ever-evolving needs. The experiments conducted endorse the feasibility of this solution and encourage practitioners to direct their efforts in this direction.
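
    The hybrid cloud storage idea can be made concrete with a hedged sketch of a policy-driven tier selector. The tier names and thresholds below are invented for illustration and are not the authors' design.

        # Hypothetical sketch of policy-driven tier selection in a
        # hybrid-cloud storage system for smart grid data. Tier names and
        # thresholds are invented for illustration, not taken from the paper.
        from dataclasses import dataclass

        @dataclass
        class DataItem:
            name: str
            max_latency_ms: float  # how quickly the item must be readable
            retention_days: int    # how long it must be kept

        def choose_tier(item: DataItem) -> str:
            """Route latency-critical data locally, bulk archives to the cloud."""
            if item.max_latency_ms < 50:
                return "local-edge"      # e.g. substation-level storage
            if item.retention_days > 365:
                return "cloud-archive"   # cheap long-term object storage
            return "cloud-standard"

        # Example: real-time phasor measurements stay local.
        tier = choose_tier(DataItem("pmu-stream", 10.0, 30))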
