Science.gov

Sample records for achieve semantic interoperability

  1. Real Time Semantic Interoperability in AD HOC Networks of Geospatial Data Sources: Challenges, Achievements and Perspectives

    NASA Astrophysics Data System (ADS)

    Mostafavi, M. A.; Bakillah, M.

    2012-07-01

    Recent advances in geospatial technologies have made large amounts of geospatial data available. Meanwhile, new developments in Internet and communication technologies have created a shift from isolated geospatial databases to ad hoc networks of geospatial data sources, where data sources can join or leave the network and form groups to share data and services. However, effective integration and sharing of geospatial data among these data sources and their users are hampered by semantic heterogeneities. These heterogeneities affect the spatial, temporal and thematic aspects of geospatial concepts. There have been many efforts to address semantic interoperability issues in the geospatial domain, mainly focused on resolving heterogeneities caused by different and implicit representations of concepts. However, many approaches have concentrated on the thematic aspects, leaving aside the explicit representation of spatial and temporal aspects. Also, most semantic interoperability approaches for networks have focused on automating the semantic mapping process, while the ad hoc network structure is continuously modified by source addition or removal, formation of groups, etc. This dynamic aspect is often neglected by those approaches. This paper proposes a conceptual framework for real-time semantic interoperability in ad hoc networks of geospatial data sources. The framework presents the fundamental elements of real-time semantic interoperability through a hierarchy of interrelated semantic states and processes. We then use it to frame a discussion of the achievements already made, the challenges that remain, and perspectives on addressing those challenges.

  2. Approaching semantic interoperability in Health Level Seven

    PubMed Central

    Alschuler, Liora

    2010-01-01

    ‘Semantic interoperability’ is a driving objective behind many of Health Level Seven's standards. The objective in this paper is to take a step back, consider what semantic interoperability means, assess whether or not it has been achieved, and, if not, determine what concrete next steps can be taken to get closer. A framework for measuring semantic interoperability is proposed, using a technique called the ‘Single Logical Information Model’ framework, which relies on an operational definition of semantic interoperability and an understanding that interoperability improves incrementally. Whether semantic interoperability tomorrow will enable one computer to talk to another, much as one person can talk to another person, is a matter for speculation. It is assumed, however, that what gets measured gets improved, and in that spirit this framework is offered as a means to improvement. PMID:21106995

  3. Achieving clinical statement interoperability using R-MIM and archetype-based semantic transformations.

    PubMed

    Kilic, Ozgur; Dogac, Asuman

    2009-07-01

    Effective use of electronic healthcare records (EHRs) has the potential to positively influence both the quality and the cost of health care. Consequently, sharing patients' EHRs is becoming a global priority in the healthcare information technology domain. This paper addresses the interoperability of EHR structure and content. It describes how two different EHR standards derived from the same reference information model (RIM) can be mapped to each other by using archetypes, refined message information model (R-MIM) derivations, and semantic tools. It is also demonstrated that well-defined R-MIM derivation rules help trace class properties back to their origins when the R-MIMs of two EHR standards are derived from the same RIM. Using well-defined rules also enables finding equivalences between the properties of the source and target EHRs. Yet an R-MIM still defines concepts at a generic level. Archetypes (or templates), on the other hand, constrain an R-MIM to domain-specific concepts and hence provide finer-granularity semantics. Therefore, while mapping clinical statements between EHRs, we also make use of the archetype semantics. Derivation statements are inferred from the Web Ontology Language definitions of the RIM, the R-MIMs, and the archetypes. Finally, we show how to transform Health Level Seven clinical statement instances to EHRcom clinical statement instances and vice versa by using the generated mapping definitions.
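    The property-tracing idea in this abstract can be sketched in a few lines. The following is a hypothetical illustration, not HL7's actual artifacts: the class and property names and the derivation maps are invented, standing in for recorded R-MIM derivation rules.

```python
# Hypothetical sketch: when two R-MIMs are derived from the same RIM with
# recorded derivation rules, properties can be traced back to their RIM
# origin, and properties sharing an origin are candidate equivalences.
# All class/property names are illustrative, not taken from HL7 artifacts.

# Derivation maps: R-MIM property -> RIM property it was derived from
rmim_a_derivations = {
    "ClinicalStatement.effectiveTime": "Act.effectiveTime",
    "ClinicalStatement.code": "Act.code",
}
rmim_b_derivations = {
    "EntryItem.time": "Act.effectiveTime",
    "EntryItem.meaning": "Act.code",
}

def find_equivalences(src, dst):
    """Pair source and target properties that trace to the same RIM origin."""
    origin_to_dst = {origin: prop for prop, origin in dst.items()}
    return {prop: origin_to_dst[origin]
            for prop, origin in src.items() if origin in origin_to_dst}

mapping = find_equivalences(rmim_a_derivations, rmim_b_derivations)
print(mapping)
```

    Each shared RIM origin yields one candidate property equivalence for the generated mapping definitions.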

  4. A logical approach to semantic interoperability in healthcare.

    PubMed

    Bird, Linda; Brooks, Colleen; Cheong, Yu Chye; Tun, Nwe Ni

    2011-01-01

    Singapore is in the process of rolling out a number of national e-health initiatives, including the National Electronic Health Record (NEHR). A critical enabler in the journey towards semantic interoperability is a Logical Information Model (LIM) that harmonises the semantics of the information structure with the terminology. The Singapore LIM uses a combination of international standards, including ISO 13606-1 (a reference model for electronic health record communication), ISO 21090 (healthcare datatypes), and SNOMED CT (healthcare terminology). The LIM is accompanied by a logical design approach, used to generate interoperability artifacts, and incorporates mechanisms for achieving unidirectional and bidirectional semantic interoperability.

  5. Semantic Interoperability on the Web

    DTIC Science & Technology

    2000-01-01

    these agents would not be affected by presentation changes if the pages were available in XML, they would still break if the XML representation of the... these semantics into tools that are used to interpret or translate the XML documents, but software tools cannot acquire these semantics independently...mapping differences in naming conventions. As with natural language, XML DTDs have the problems of polysemy and synonymy. For example, the elements

  6. ARGOS policy brief on semantic interoperability.

    PubMed

    Kalra, Dipak; Musen, Mark; Smith, Barry; Ceusters, Werner; De Moor, Georges

    2011-01-01

    Semantic interoperability is one of the priority themes of the ARGOS Trans-Atlantic Observatory. This topic represents a globally recognised challenge that must be addressed if electronic health records are to be shared among heterogeneous systems, and the information in them exploited to the maximum benefit of patients, professionals, health services, research, and industry. Progress in this multi-faceted challenge has been piecemeal, and valuable lessons have been learned, and approaches discovered, in Europe and in the US that can be shared and combined. Experts from both continents have met at three ARGOS workshops during 2010 and 2011 to share understanding of these issues and how they might be tackled collectively from both sides of the Atlantic. This policy brief summarises the problems and the reasons why they are important to tackle, and also why they are so difficult. It outlines the major areas of semantic innovation that exist and that are available to help address this challenge. It proposes a series of next steps that need to be championed on both sides of the Atlantic if further progress is to be made in sharing and analysing electronic health records meaningfully. Semantic interoperability requires the use of standards, not only for EHR data to be transferred and structurally mapped into a receiving repository, but also for the clinical content of the EHR to be interpreted in conformity with the original meanings intended by its authors. Wide-scale engagement with professional bodies, globally, is needed to develop these clinical information standards. Accurate and complete clinical documentation, faithful to the patient's situation, and interoperability between systems, require widespread and dependable access to published and maintained collections of coherent and quality-assured semantic resources, including models such as archetypes and templates that would (1) provide clinical context, (2) be mapped to interoperability standards for EHR data

  7. Information Management Challenges in Achieving Coalition Interoperability

    DTIC Science & Technology

    2001-12-01

    SEINE CEDEX, FRANCE RTO MEETING PROCEEDINGS 64 Information Management Challenges in Achieving Coalition Interoperability (les Défis de la gestion de l’information dans la mise en œuvre de...) ...collection of papers presented, and the resultant discussions.

  8. Semantic Interoperability in Clinical Decision Support Systems: A Systematic Review.

    PubMed

    Marco-Ruiz, Luis; Bellika, Johan Gustav

    2015-01-01

    The interoperability of Clinical Decision Support (CDS) systems with other health information systems has become one of the main limitations to their broad adoption. Semantic interoperability must be granted in order to share CDS modules across different health information systems. Currently, numerous standards for different purposes are available to enable the interoperability of CDS systems. We performed a literature review to identify and provide an overview of the available standards that enable CDS interoperability in the areas of clinical information, decision logic, terminology, and web service interfaces.

  9. Cross border semantic interoperability for clinical research: the EHR4CR semantic resources and services

    PubMed Central

    Daniel, Christel; Ouagne, David; Sadou, Eric; Forsberg, Kerstin; Gilchrist, Mark Mc; Zapletal, Eric; Paris, Nicolas; Hussain, Sajjad; Jaulent, Marie-Christine; Kalra, Dipak

    2016-01-01

    With the development of platforms enabling the use of routinely collected clinical data in the context of international clinical research, scalable solutions for cross-border semantic interoperability need to be developed. Within the context of the IMI EHR4CR project, we first defined the requirements and evaluation criteria of the EHR4CR semantic interoperability platform and then developed the semantic resources and supportive services and tooling to assist hospital sites in standardizing their data to allow the execution of the project use cases. The experience gained from the evaluation of the EHR4CR platform, accessing semantically equivalent data elements across 11 participating European EHR systems from 5 countries, demonstrated how far the mediation model and mapping efforts met the expected requirements of the project. Developers of semantic interoperability platforms are beginning to address a core set of requirements in order to reach the goal of cross-border semantic integration of data. PMID:27570649

  10. Establishing semantic interoperability of biomedical metadata registries using extended semantic relationships.

    PubMed

    Park, Yu Rang; Yoon, Young Jo; Kim, Hye Hyeon; Kim, Ju Han

    2013-01-01

    Achieving semantic interoperability is critical for biomedical data sharing between individuals, organizations and systems. The ISO/IEC 11179 MetaData Registry (MDR) standard has been recognized as one solution for this purpose. The standard model, however, is limited: concepts consisting of two or more values, such as blood pressure with its systolic and diastolic components, cannot be represented. We addressed the structural limitations of ISO/IEC 11179 with an integrated metadata object model in our previous research. In the present study, we introduce semantic extensions for the model by defining three new types of semantic relationships: dependency, composite and variable relationships. To evaluate our extensions in a real-world setting, we measured the efficiency of metadata reduction by mapping extracted metadata to existing elements. We extracted metadata from the College of American Pathologists Cancer Protocols and then evaluated our extensions. With no semantic loss, one third of the extracted metadata could be eliminated, suggesting a better strategy for implementing clinical MDRs with improved efficiency and utility.
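    A minimal sketch of the composite relationship described above, using the blood-pressure example. The class and registry layout are invented for illustration and are not the paper's actual model.

```python
# Illustrative sketch (not the paper's actual model): extending flat
# ISO/IEC 11179-style data elements with a "composite" relationship so a
# multi-valued concept such as blood pressure can reference its components.
from dataclasses import dataclass, field

@dataclass
class DataElement:
    name: str
    unit: str = None
    # extended semantic relationship: names of component data elements
    composite_of: list = field(default_factory=list)

registry = {}
def register(de):
    registry[de.name] = de
    return de

register(DataElement("systolic blood pressure", "mmHg"))
register(DataElement("diastolic blood pressure", "mmHg"))
register(DataElement("blood pressure",
                     composite_of=["systolic blood pressure",
                                   "diastolic blood pressure"]))

def components(name):
    """Resolve a composite data element to its registered components."""
    return [registry[c] for c in registry[name].composite_of]

print([c.name for c in components("blood pressure")])
```

    A flat registry would need either an unrepresentable multi-valued element or two unrelated elements; the composite link keeps both components and their relationship.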

  11. Achieving interoperability for metadata registries using comparative object modeling.

    PubMed

    Park, Yu Rang; Kim, Ju Han

    2010-01-01

    Achieving data interoperability between organizations relies upon agreed meaning and representation (metadata) of data. For managing and registering metadata, many organizations in various domains have built metadata registries (MDRs) based on the international standard MDR framework, ISO/IEC 11179. Following this trend, two public MDRs in the biomedical domain have been created, the United States Health Information Knowledgebase (USHIK) and the cancer Data Standards Registry and Repository (caDSR), from the U.S. Department of Health & Human Services and the National Cancer Institute (NCI), respectively. Most MDRs are implemented with indiscriminate extensions to satisfy organization-specific needs and to work around the semantic and structural limitations of ISO/IEC 11179. As a result, it is difficult to achieve interoperability among multiple MDRs. In this paper, we propose an integrated metadata object model for achieving interoperability among multiple MDRs. To evaluate this model, we developed an XML Schema Definition (XSD)-based metadata exchange format. We created an XSD-based metadata exporter supporting both the integrated metadata object model and organization-specific MDR formats.
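    An XML metadata exporter in the spirit described above might look like the following sketch. The element and attribute names are invented, not the paper's actual exchange schema.

```python
# Hedged sketch of an XML metadata exporter; the document layout is a
# made-up stand-in for an XSD-governed metadata exchange format.
import xml.etree.ElementTree as ET

def export_data_elements(elements):
    """Serialize data-element dicts to a simple XML exchange document."""
    root = ET.Element("metadataExchange", version="1.0")
    for de in elements:
        node = ET.SubElement(root, "dataElement", id=de["id"])
        ET.SubElement(node, "name").text = de["name"]
        ET.SubElement(node, "definition").text = de["definition"]
    return ET.tostring(root, encoding="unicode")

xml_doc = export_data_elements([
    {"id": "DE-001", "name": "blood pressure",
     "definition": "Arterial pressure, systolic/diastolic, in mmHg"},
])
print(xml_doc)
```

    In practice the exporter would validate the output against the XSD so every registry consuming the exchange format can rely on the same structure.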

  12. Open PHACTS: semantic interoperability for drug discovery.

    PubMed

    Williams, Antony J; Harland, Lee; Groth, Paul; Pettifer, Stephen; Chichester, Christine; Willighagen, Egon L; Evelo, Chris T; Blomberg, Niklas; Ecker, Gerhard; Goble, Carole; Mons, Barend

    2012-11-01

    Open PHACTS is a public-private partnership between academia, publishers, small and medium-sized enterprises and pharmaceutical companies. The goal of the project is to deliver and sustain an 'open pharmacological space' (OPS) using and enhancing state-of-the-art semantic web standards and technologies. It is focused on practical and robust applications to solve specific questions in drug discovery research. OPS is intended to facilitate improvements in drug discovery in academia and industry and to support open innovation and in-house non-public drug discovery research. This paper lays out the challenges and how the Open PHACTS project is hoping to address these challenges technically and socially.

  13. An adaptive semantic based mediation system for data interoperability among Health Information Systems.

    PubMed

    Khan, Wajahat Ali; Khattak, Asad Masood; Hussain, Maqbool; Amin, Muhammad Bilal; Afzal, Muhammad; Nugent, Christopher; Lee, Sungyoung

    2014-08-01

    Heterogeneity in the management of complex medical data obstructs the attainment of data-level interoperability among Health Information Systems (HIS). This diversity stems from the compliance of HISs with different healthcare standards. Its solution demands a mediation system that accurately interprets data in different heterogeneous formats. We propose an adaptive mediation system, the AdapteR Interoperability ENgine (ARIEN), that arbitrates between HISs compliant with different healthcare standards to achieve accurate and seamless information exchange. ARIEN stores the semantic mapping information between different standards in the Mediation Bridge Ontology (MBO) using ontology matching techniques. These mappings are provided by our System for Parallel Heterogeneity (SPHeRe) matching system and the Personalized-Detailed Clinical Model (P-DCM) approach to guarantee mapping accuracy. The effectiveness of the mappings stored in the MBO is assessed by evaluating the accuracy of the transformation process between different standard formats. We evaluated the proposed system on the transformation of medical records between the Clinical Document Architecture (CDA) and Virtual Medical Record (vMR) standards. Using a pattern-oriented approach drawing on the MBO, the transformation process achieved over 90% accuracy in conversion between CDA and vMR. The proposed mediation system improves the overall communication process between HISs and provides accurate, seamless medical information exchange to ensure data interoperability and timely healthcare services to patients.
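    The mediation step can be illustrated with a toy sketch: a mapping table stands in for the Mediation Bridge Ontology and translates field paths between two record formats. The field names are invented and do not reflect the real CDA or vMR schemas.

```python
# Toy mediation sketch: a bridge mapping (stand-in for the MBO) renames
# fields of a flat record from a CDA-like layout to a vMR-like layout.
# All field paths are illustrative placeholders.
cda_to_vmr = {
    "recordTarget.patient.name": "patient.demographics.name",
    "component.observation.value": "observationResult.value",
}

def mediate(record, bridge):
    """Rename a flat record's fields according to the bridge mappings."""
    return {bridge[k]: v for k, v in record.items() if k in bridge}

cda_record = {
    "recordTarget.patient.name": "Jane Doe",
    "component.observation.value": "120/80 mmHg",
}
vmr_record = mediate(cda_record, cda_to_vmr)
print(vmr_record)
```

    A real mediation engine additionally restructures nested content and converts value representations; the ontology makes the mappings reusable rather than hard-coded per system pair.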

  14. A secure semantic interoperability infrastructure for inter-enterprise sharing of electronic healthcare records.

    PubMed

    Boniface, Mike; Watkins, E Rowland; Saleh, Ahmed; Dogac, Asuman; Eichelberg, Marco

    2006-01-01

    Healthcare professionals need access to accurate and complete healthcare records for effective assessment, diagnosis and treatment of patients. The non-interoperability of healthcare information systems means that inter-enterprise access to a patient's history over many distributed encounters is difficult to achieve. The ARTEMIS project has developed a secure semantic web service infrastructure for the interoperability of healthcare information systems. Healthcare professionals share services and medical information using a web service annotation and mediation environment based on functional and clinical semantics derived from healthcare standards. Healthcare professionals discover medical information about individuals using a patient identification protocol based on pseudonymous information. The management of care pathways and access to medical information is based on a well-defined business process allowing healthcare providers to negotiate collaboration and data access agreements within the context of strict legislative frameworks.

  15. Semantic Integration for Marine Science Interoperability Using Web Technologies

    NASA Astrophysics Data System (ADS)

    Rueda, C.; Bermudez, L.; Graybeal, J.; Isenor, A. W.

    2008-12-01

    The Marine Metadata Interoperability Project, MMI (http://marinemetadata.org) promotes the exchange, integration, and use of marine data through enhanced data publishing, discovery, documentation, and accessibility. A key effort is the definition of an Architectural Framework and Operational Concept for Semantic Interoperability (http://marinemetadata.org/sfc), which is complemented with the development of tools that realize critical use cases in semantic interoperability. In this presentation, we describe a set of such Semantic Web tools that allow performing important interoperability tasks, ranging from the creation of controlled vocabularies and the mapping of terms across multiple ontologies, to the online registration, storage, and search services needed to work with the ontologies (http://mmisw.org). This set of services uses Web standards and technologies, including the Resource Description Framework (RDF), the Web Ontology Language (OWL), Web services, and toolkits for Rich Internet Application development. We will describe the following components: MMI Ontology Registry: The MMI Ontology Registry and Repository provides registry and storage services for ontologies. Entries in the registry are associated with projects defined by the registered users. Sophisticated search functions, for example according to metadata items and vocabulary terms, are also provided, and client applications can submit search requests using the W3C SPARQL Query Language for RDF. Voc2RDF: This component converts an ASCII comma-delimited set of terms and definitions into an RDF file. Voc2RDF facilitates the creation of controlled vocabularies by using a simple form-based user interface. Created vocabularies and their descriptive metadata can be submitted to the MMI Ontology Registry for versioning and community access. VINE: The Vocabulary Integration Environment component allows the user to map vocabulary terms across multiple ontologies. Various relationships can be established, for example …
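    The Voc2RDF idea of turning comma-delimited term/definition pairs into RDF can be sketched with the standard library alone, serializing to N-Triples with SKOS labelling properties. The vocabulary namespace URI here is made up; this is not MMI's actual tool.

```python
# Minimal stdlib sketch of the Voc2RDF idea: CSV rows of (term, definition)
# become RDF triples serialized as N-Triples, using SKOS properties.
# The http://example.org/vocab/ namespace is an invented placeholder.
import csv, io

VOCAB = "http://example.org/vocab/"
SKOS = "http://www.w3.org/2004/02/skos/core#"

def voc2ntriples(csv_text):
    triples = []
    for term, definition in csv.reader(io.StringIO(csv_text)):
        subj = f"<{VOCAB}{term.strip().replace(' ', '_')}>"
        triples.append(f'{subj} <{SKOS}prefLabel> "{term.strip()}" .')
        triples.append(f'{subj} <{SKOS}definition> "{definition.strip()}" .')
    return "\n".join(triples)

nt = voc2ntriples("sea surface temperature,Temperature of the ocean surface\n")
print(nt)
```

    Once the terms are RDF resources, they can be registered, versioned, and queried with SPARQL like any other ontology content.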

  16. Towards Semantic Interoperability Between C2 Systems Following the Principles of Distributed Simulation

    DTIC Science & Technology

    2011-06-01

    [Diagram labels from the original figure: Existing system; Common Ground; Transformation; Semantic Description (A); Semantic Description (B); World Knowledge; Translation Rules; Ontology Operations] ...the Simulation Interoperability Standards Organization (SISO), which has played a major role in these efforts, has succeeded in establishing

  17. An approach to define semantics for BPM systems interoperability

    NASA Astrophysics Data System (ADS)

    Rico, Mariela; Caliusco, María Laura; Chiotti, Omar; Rosa Galli, María

    2015-04-01

    This article proposes defining semantics for Business Process Management (BPM) systems interoperability through the ontology of Electronic Business Documents (EBD) used to interchange the information required to perform cross-organizational processes. The generated semantic model allows aligning enterprises' business processes to support cross-organizational processes by matching the business ontology of each business partner with the EBD ontology. The result is a flexible software architecture that allows dynamically defining cross-organizational business processes by reusing the EBD ontology. For developing the semantic model, a method is presented that is based on a strategy for discovering entity features whose interpretation depends on the context, and representing them to enrich the ontology. The proposed method complements ontology learning techniques that cannot infer semantic features not represented in the data sources. To improve the representation of these entity features, the method proposes using widely accepted ontologies for representing time entities and relations, physical quantities, measurement units, official country names, and currencies and funds, among others. When ontology reuse is not possible, the method identifies whether the feature is simple or complex, and defines a strategy to be followed. An empirical validation of the approach has been performed through a case study.

  18. A federated semantic metadata registry framework for enabling interoperability across clinical research and care domains.

    PubMed

    Sinaci, A Anil; Laleci Erturkmen, Gokce B

    2013-10-01

    In order to enable secondary use of Electronic Health Records (EHRs) by bridging the interoperability gap between the clinical care and research domains, this paper introduces a unified methodology and a supporting framework that together bring the power of metadata registries (MDR) and semantic web technologies. We introduce a federated semantic metadata registry framework that extends the ISO/IEC 11179 standard and enables integration of data element registries through Linked Open Data (LOD) principles, whereby each Common Data Element (CDE) can be uniquely referenced, queried and processed to enable syntactic and semantic interoperability. Each CDE and its components are maintained as LOD resources enabling semantic links with other CDEs, terminology systems and implementation-dependent content models, hence facilitating semantic search, more effective reuse and semantic interoperability across different application domains. There are several important efforts addressing semantic interoperability in the healthcare domain, such as the IHE DEX profile proposal, CDISC SHARE and CDISC2RDF. Our architecture complements these by providing a framework to interlink existing data element registries and repositories, multiplying their potential for semantic interoperability to a greater extent. The open source implementation of the federated semantic MDR framework presented in this paper is the core of the semantic interoperability layer of the SALUS project, which enables the execution of post-marketing safety analysis studies on top of existing EHR systems.
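    The Linked Open Data idea above can be illustrated with a toy in-memory graph: each CDE is a URI-identified resource whose links (for example, a sameAs-style link to a CDE in another registry) can be followed across registries. All URIs, labels, and link names are invented placeholders, not SALUS resources.

```python
# Illustrative sketch: CDEs as URI-identified resources with followable
# links across registries. URIs and link names are invented placeholders.
graph = {
    "http://registryA.example/cde/bp": {
        "label": "blood pressure",
        "sameAs": ["http://registryB.example/cde/bloodPressure"],
    },
    "http://registryB.example/cde/bloodPressure": {
        "label": "blood pressure (registry B)",
        "sameAs": [],
    },
}

def equivalents(uri, seen=None):
    """Follow sameAs-style links transitively to collect equivalent CDEs."""
    seen = seen or {uri}
    for other in graph.get(uri, {}).get("sameAs", []):
        if other not in seen:
            seen.add(other)
            equivalents(other, seen)
    return seen

print(sorted(equivalents("http://registryA.example/cde/bp")))
```

    Because every CDE is a dereferenceable resource, the same traversal works across registry boundaries, which is what makes the federation scale.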

  19. Semantic Document Model to Enhance Data and Knowledge Interoperability

    NASA Astrophysics Data System (ADS)

    Nešić, Saša

    To enable document data and knowledge to be efficiently shared and reused across application, enterprise, and community boundaries, desktop documents should be completely open and queryable resources, whose data and knowledge are represented in a form understandable to both humans and machines. At the same time, these are the requirements that desktop documents need to satisfy in order to contribute to the visions of the Semantic Web. With the aim of achieving this goal, we have developed the Semantic Document Model (SDM), which turns desktop documents into Semantic Documents: uniquely identified and semantically annotated composite resources that can be instantiated into human-readable (HR) and machine-processable (MP) forms. In this paper, we present the SDM along with an RDF and ontology-based solution for the MP document instance. Moreover, on top of the proposed model, we have built the Semantic Document Management System (SDMS), which provides a set of services that exploit the model. As an application example that takes advantage of SDMS services, we have extended MS Office with a set of tools that enables users to transform MS Office documents (e.g., MS Word and MS PowerPoint) into Semantic Documents, and to search local and distant semantic document repositories for document content units (CUs) over Semantic Web protocols.

  20. CityGML - Interoperable semantic 3D city models

    NASA Astrophysics Data System (ADS)

    Gröger, Gerhard; Plümer, Lutz

    2012-07-01

    CityGML is the international standard of the Open Geospatial Consortium (OGC) for the representation and exchange of 3D city models. It defines the three-dimensional geometry, topology, semantics and appearance of the most relevant topographic objects in urban or regional contexts. These definitions are provided in different, well-defined Levels-of-Detail (multiresolution model). The focus of CityGML is on the semantic aspects of 3D city models, their structures, taxonomies and aggregations, allowing users to employ virtual 3D city models for advanced analysis and visualization tasks in a variety of application domains such as urban planning, indoor/outdoor pedestrian navigation, environmental simulations, cultural heritage, or facility management. This is in contrast to purely geometrical/graphical models such as KML, VRML, or X3D, which do not provide sufficient semantics. CityGML is based on the Geography Markup Language (GML), which provides a standardized geometry model. Due to this model and its well-defined semantics and structures, CityGML facilitates interoperable data exchange in the context of geo web services and spatial data infrastructures. Since its standardization in 2008, CityGML has come into worldwide use: tools from notable companies in the geospatial field provide CityGML interfaces, and many applications and projects use the standard. CityGML is also having a strong impact on science: numerous approaches use CityGML, particularly its semantics, for disaster management, emergency response, or energy-related applications as well as for visualizations; they contribute to CityGML, improving its consistency and validity; or they use CityGML, particularly its different Levels-of-Detail, as a source or target for generalizations. This paper gives an overview of CityGML, its underlying concepts, its Levels-of-Detail, how to extend it, its applications, its likely future development, and the role it plays in scientific research.

  21. Towards ISO 13606 and openEHR archetype-based semantic interoperability.

    PubMed

    Martínez-Costa, Catalina; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás

    2009-01-01

    Semantic interoperability of clinical standards is a major challenge in eHealth across Europe. It would allow healthcare professionals to manage the complete electronic healthcare record of the patient regardless of which institution generated each clinical session. Clinical archetypes are fundamental for the achievement of semantic interoperability, but they are built for particular electronic healthcare record standards. Therefore, methods for transforming archetypes between standards are needed. In this work, a method for transforming archetypes between ISO 13606 and openEHR, based on Model-Driven Engineering and Semantic Web technologies, is presented.

  22. Semantic interoperability between clinical and public health information systems for improving public health services.

    PubMed

    Lopez, Diego M; Blobel, Bernd G M E

    2007-01-01

    Improving public health services requires comprehensively integrating all services, including medical, social, community, and public health ones. Therefore, the development of integrated health information services has to start by considering the business processes, rules and information semantics of the involved domains. The paper proposes a business and information architecture for the specification of a future-proof national integrated system, concretely the requirements for semantic integration between public health surveillance and clinical information systems. The architecture is a semantically interoperable approach because it describes business processes, rules and information semantics based on national policy documents and expressed in a standard language such as the Unified Modeling Language (UML). With the enterprise and information models formalized, the development of semantically interoperable health IT components and services is supported.

  23. The Long Road to Semantic Interoperability in Support of Public Health: Experiences from Two States

    PubMed Central

    Vreeman, Daniel J.; Grannis, Shaun J.

    2014-01-01

    Proliferation of health information technologies creates opportunities to improve clinical and public health, including high quality, safer care and lower costs. To maximize such potential benefits, health information technologies must readily and reliably exchange information with other systems. However, evidence from public health surveillance programs in two states suggests that operational clinical information systems often fail to use available standards, a barrier to semantic interoperability. Furthermore, analysis of existing policies incentivizing semantic interoperability suggests they have limited impact and are fragmented. In this essay, we discuss three approaches for increasing semantic interoperability to support national goals for using health information technologies. A clear, comprehensive strategy requiring collaborative efforts by clinical and public health stakeholders is suggested as a guide for the long road towards better population health data and outcomes. PMID:24680985

  24. Reporting Device Observations for semantic interoperability of surgical devices and clinical information systems.

    PubMed

    Andersen, Björn; Ulrich, Hannes; Rehmann, Daniel; Kock, Ann-Kristin; Wrage, Jan-Hinrich; Ingenerf, Josef

    2015-08-01

    Service-oriented medical device architectures are progressing from interdisciplinary research projects to international standardisation: a new set of IEEE 11073 proposals shall pave the way to industry acceptance. This expected availability of device observations in a standardised representation enables secondary usage if interoperability with clinical information systems can be achieved. The Device Observation Reporter (DOR) described in this work is a gateway that connects these realms. After a user chooses a selection of signals from different devices in the digital operating room, the DOR records these semantically described values for a specified duration. Upon completion, the signal descriptions and values are transformed into Health Level Seven version 2 messages and sent to a hospital information system/electronic health record system within the clinical IT network. The successful integration of device data for documentation and use in clinical information systems can further leverage the novel device communication standard proposals. Complementing these, an Integrating the Healthcare Enterprise profile will aid commercial implementers in achieving interoperability. Their solutions could incorporate clinical knowledge to autonomously select signal combinations and generate reports of diagnostic and interventional procedures, thus saving time and effort in surgical documentation.
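    The reporting step can be sketched as packaging device observations into pipe-delimited HL7 version 2 segments. The segment content below is deliberately simplified for illustration and is not a conformant IEEE 11073-to-HL7 mapping; observation codes and facility names are invented.

```python
# Hedged sketch: one OBX segment per device observation, prefixed by a
# minimal MSH header. HL7 v2 separates segments with carriage returns.
def observations_to_hl7v2(observations):
    """Build a minimal HL7 v2-style ORU message with one OBX per observation."""
    segments = ["MSH|^~\\&|DOR|OR1|HIS|HOSPITAL|20150801120000||ORU^R01|1|P|2.6"]
    for i, (code, value, unit) in enumerate(observations, start=1):
        # OBX: set id, value type (NM = numeric), code, value, units, status F
        segments.append(f"OBX|{i}|NM|{code}||{value}|{unit}|||||F")
    return "\r".join(segments)

msg = observations_to_hl7v2([
    ("HR", 72, "bpm"),    # heart rate from a monitoring device
    ("SpO2", 98, "%"),    # pulse oximetry
])
print(msg.replace("\r", "\n"))
```

    A production gateway would use a full HL7 library, proper coding systems for OBX-3, and acknowledgement handling, but the segment-per-observation shape stays the same.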

  5. Facilitating Semantic Interoperability Among Ocean Data Systems: ODIP-R2R Student Outcomes

    NASA Astrophysics Data System (ADS)

    Stocks, K. I.; Chen, Y.; Shepherd, A.; Chandler, C. L.; Dockery, N.; Elya, J. L.; Smith, S. R.; Ferreira, R.; Fu, L.; Arko, R. A.

    2014-12-01

    With informatics providing an increasingly important set of tools for geoscientists, it is critical to train the next generation of scientists in information and data techniques. The NSF-supported Rolling Deck to Repository (R2R) Program works with the academic fleet community to routinely document, assess, and preserve the underway sensor data from U.S. research vessels. The Ocean Data Interoperability Platform (ODIP) is an EU-US-Australian collaboration fostering interoperability among regional e-infrastructures through workshops and joint prototype development. The need to align terminology between systems is a common challenge across all of the ODIP prototypes. Five R2R students were supported to address aspects of semantic interoperability within ODIP: (1) developing a vocabulary matching service that links terms from different vocabularies with similar concepts; the service implements the Google Refine reconciliation service interface, so that users can work in the familiar Google Refine application while linking terms from different vocabularies; (2) developing Resource Description Framework (RDF) resources that map Shipboard Automated Meteorological Oceanographic System (SAMOS) vocabularies to internationally served vocabularies, with each SAMOS vocabulary term (data parameter and quality control flag) described as an RDF resource page; these RDF resources allow for enhanced discoverability and retrieval of SAMOS data by enabling data searches based on parameter; (3) improving data retrieval and interoperability by exposing data and mapped vocabularies using Semantic Web technologies; we have collaborated with ODIP participating organizations to build a generalized data model that will be used to populate a SPARQL endpoint providing expressive querying over our data files; (4) mapping local and regional vocabularies used by R2R to those used by ODIP partners, described more fully in a companion poster; and (5) making published Linked Data
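    The vocabulary matching service described above can be sketched in plain Python, without the OpenRefine dependency: rank candidate terms by string similarity and return them in the shape of a reconciliation response. The vocabularies and threshold below are invented for illustration.

```python
from difflib import SequenceMatcher

# Toy vocabularies: local SAMOS-style parameter labels vs. terms from an
# internationally served vocabulary (all terms are illustrative).
REMOTE = ["Air Temperature", "Sea Surface Temperature", "Wind Speed", "Salinity"]

def reconcile(term, candidates, cutoff=0.75):
    """Return candidates ranked by string similarity, mimicking the
    JSON shape of a Google Refine reconciliation response."""
    scored = [(c, SequenceMatcher(None, term.lower(), c.lower()).ratio())
              for c in candidates]
    matches = [{"name": c, "score": round(s, 2), "match": s >= cutoff}
               for c, s in sorted(scored, key=lambda x: -x[1]) if s > 0.4]
    return {"result": matches}

result = reconcile("sea surface temp", REMOTE)
```

    A real service would compare concept definitions, not just labels, but the response shape is what lets Google Refine act as the friendly user interface.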

  6. Interoperability.

    PubMed

    Jarvis, Dennis H; Jarvis, Jacqueline H

    2010-01-01

    This chapter gives an educational overview of:
    * the roles that ontology and process play in interoperability;
    * the processes that can be employed to realise interoperability and their supporting technologies;
    * interoperability solutions employed in the health informatics sector within the conceptual model presented in the chapter;
    * directions for future research in the area of interoperability for health informatics.

  7. An Approach to Semantic Interoperability for Improved Capability Exchanges in Federations of Systems

    ERIC Educational Resources Information Center

    Moschoglou, Georgios

    2013-01-01

    This study seeks an affirmative answer to the question of whether a knowledge-based approach to system-of-systems interoperation using semantic web standards and technologies can provide the centralized control of the capability for exchanging data and services that is lacking in a federation of systems. Given the need to collect and share real-time…

  8. RuleML-Based Learning Object Interoperability on the Semantic Web

    ERIC Educational Resources Information Center

    Biletskiy, Yevgen; Boley, Harold; Ranganathan, Girish R.

    2008-01-01

    Purpose: The present paper aims to describe an approach for building the Semantic Web rules for interoperation between heterogeneous learning objects, namely course outlines from different universities, and one of the rule uses: identifying (in)compatibilities between course descriptions. Design/methodology/approach: As proof of concept, a rule…

  9. Sharing meanings: developing interoperable semantic technologies to enhance reproducibility in earth and environmental science research

    NASA Astrophysics Data System (ADS)

    Schildhauer, M.

    2015-12-01

    Earth and environmental scientists are familiar with the entities, processes, and theories germane to their field of study, and comfortable collecting and analyzing data in their area of interest. Yet, while there appears to be consistency and agreement as to the scientific "terms" used to describe features in their data and analyses, aside from a few fundamental physical characteristics, such as mass or velocity, there can be broad tolerances, if not considerable ambiguity, in how many earth science "terms" map to the underlying "concepts" that they actually represent. This ambiguity in meanings, or "semantics", creates major problems for scientific reproducibility. It greatly impedes the ability to replicate results by making it difficult to determine the specifics of the intended meanings of terms such as "deforestation" or "carbon flux" as to scope, composition, magnitude, etc. In addition, semantic ambiguity complicates assemblage of comparable data for reproducing results, due to ambiguous or idiosyncratic labels for measurements, such as percent cover of forest, where the term "forest" is undefined; or where a reported output of "total carbon emissions" might include CO2 emissions but not methane emissions. In this talk, we describe how the NSF-funded DataONE repository for earth and environmental science data (http://dataone.org) is using W3C-standard languages (RDF/OWL) to build an ontology for clarifying concepts embodied in heterogeneous data and model outputs. With an initial focus on carbon cycling concepts using terrestrial biospheric model outputs and LTER productivity data, we describe how we are achieving interoperability with "semantic vocabularies" (or ontologies) from aligned earth and life science domains, including OBO Foundry ontologies such as ENVO and BCO, the ISO/OGC O&M model, and the NSF EarthCube GeoLink project. Our talk will also discuss best practices that may be helpful for other groups interested in constructing their own
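    As a hedged sketch of the kind of alignment statements such an ontology effort produces, the snippet below emits SKOS mapping triples in Turtle that pin down what a local data label means; the URIs, prefixes and scope note are placeholders, not real ENVO or GeoLink identifiers.

```python
# Emit SKOS mapping triples (Turtle) aligning a local measurement label
# with a concept from a shared ontology. All URIs are illustrative.
PREFIXES = """@prefix skos: <http://www.w3.org/2004/02/skos/core#> .
@prefix local: <http://example.org/vocab/> .
@prefix envo: <http://example.org/envo/> .
"""

def skos_mapping(local_term, relation, target_uri, note):
    """Build one mapping statement with a scope note clarifying intent."""
    assert relation in ("exactMatch", "closeMatch", "broadMatch")
    return (f"local:{local_term} skos:{relation} <{target_uri}> ;\n"
            f'    skos:scopeNote "{note}" .\n')

doc = PREFIXES + skos_mapping(
    "forest_cover", "closeMatch", "http://example.org/envo/ENVO_01000174",
    "Percent canopy cover; excludes shrubland")
```

    The scope note is where the otherwise-implicit definition ("what counts as forest?") is made explicit and machine-readable.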

  10. Kosovo Armed Forces Development; Achieving NATO Non-Article 5 Crisis Response Operations Interoperability

    DTIC Science & Technology

    2014-12-12

    A thesis examining whether the current Kosovo Armed Forces development program will make the force interoperable and effective in NATO-led Non-Article 5 Crisis Response Operations.

  11. Interoperability and different ways of knowing: How semantics can aid in cross-cultural understanding

    NASA Astrophysics Data System (ADS)

    Pulsifer, P. L.; Parsons, M. A.; Duerr, R. E.; Fox, P. A.; Khalsa, S. S.; McCusker, J. P.; McGuinness, D. L.

    2012-12-01

    differences in its application. Furthermore, it is an analog encoding scheme whose meaning has evolved over time. By semantically modeling the egg code, its subtle variations, and how it connects to other data, we illustrate a mechanism for translating across data formats and representations. But there are limits to what semantically modeling the egg code can achieve. The egg code and common operational sea ice formats do not address community needs, notably the timing and processes of sea ice freeze-up and break-up, which have profound impact on local hunting, shipping, oil exploration, and safety. We work with local experts from four very different Indigenous communities and scientific creators of sea ice forecasts to establish an understanding of concepts and terminology related to fall freeze-up and spring break-up in the individually represented regions. This helps expand our conceptions of sea ice while also aiding in understanding across cultures and communities, and in passing knowledge to younger generations. This is an early step toward extending concepts of interoperability to very different ways of knowing, to make data truly relevant and locally useful.

  12. Interoperable cross-domain semantic and geospatial framework for automatic change detection

    NASA Astrophysics Data System (ADS)

    Kuo, Chiao-Ling; Hong, Jung-Hong

    2016-01-01

    With the increasingly diverse types of geospatial data established over the last few decades, semantic interoperability in integrated applications has attracted much interest in the field of Geographic Information Systems (GIS). This paper proposes a new strategy and framework to process cross-domain geodata at the semantic level. The framework leverages the semantic equivalence of concepts between domains through a bridge ontology and facilitates the integrated use of data from different domains, which has long been considered an essential strength of GIS but is impeded by the lack of understanding of the semantics implicitly hidden in the data. We choose the task of change detection to demonstrate how the introduction of ontology concepts can effectively make this integration possible. We analyze the common properties of geodata and change detection factors, then construct rules and summarize possible change scenarios for making final decisions. The use of topographic map data to detect changes in land use shows promising success, as far as the improvement of efficiency and level of automation is concerned. We believe the ontology-oriented approach will enable a new way of integrating data across different domains from the perspective of semantic interoperability, and even open a new dimension for the future of GIS.
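    A bridge ontology reduced to its simplest possible form can be sketched as a concept mapping between the two domain vocabularies plus a decision rule; the concept names below are invented for illustration, not taken from the paper.

```python
# Bridge "ontology": which topographic-map concept is semantically
# equivalent to which land-use concept. Concept names are illustrative.
BRIDGE = {
    "building": "built-up area",
    "paddy": "agricultural land",
    "forest": "forest land",
}

def detect_change(topo_concept, landuse_concept):
    """Flag a change only when the datasets disagree at the semantic
    level; return None when no bridge axiom covers the concept."""
    mapped = BRIDGE.get(topo_concept)
    if mapped is None:
        return None  # cannot decide automatically without an axiom
    return mapped != landuse_concept
```

    A full framework adds rules over geometry and time as well, but the key point is that comparison happens between mapped concepts, not raw labels.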

  13. An approach for the semantic interoperability of ISO EN 13606 and OpenEHR archetypes.

    PubMed

    Martínez-Costa, Catalina; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás

    2010-10-01

    The communication between health information systems of hospitals and primary care organizations is currently an important challenge to improve the quality of clinical practice and patient safety. However, clinical information is usually distributed among several independent systems that may be syntactically or semantically incompatible. This fact prevents healthcare professionals from accessing clinical information of patients in an understandable and normalized way. In this work, we address the semantic interoperability of two EHR standards: OpenEHR and ISO EN 13606. Both standards follow the dual model approach which distinguishes information and knowledge, this being represented through archetypes. The solution presented here is capable of transforming OpenEHR archetypes into ISO EN 13606 and vice versa by combining Semantic Web and Model-driven Engineering technologies. The resulting software implementation has been tested using publicly available collections of archetypes for both standards.

  14. Investigating the semantic interoperability of laboratory data exchanged using LOINC codes in three large institutions.

    PubMed

    Lin, Ming-Chin; Vreeman, Daniel J; Huff, Stanley M

    2011-01-01

    LOINC codes are seeing increased use in many organizations. In this study, we examined the barriers to semantic interoperability that still exist in electronic data exchange of laboratory results even when LOINC codes are being used as the observation identifiers. We analyzed semantic interoperability of laboratory data exchanged using LOINC codes in three large institutions. To simplify the analytic process, we divided the laboratory data into quantitative and non-quantitative tests. The analysis revealed many inconsistencies even when LOINC codes are used to exchange laboratory data. For quantitative tests, the most frequent problems were inconsistencies in the use of units of measure: variations in the strings used to represent units (unrecognized synonyms), use of units that result in different magnitudes of the numeric quantity, and missing units of measure. For non-quantitative tests, the most frequent problems were acronyms/synonyms, different classes of elements in enumerated lists, and the use of free text. Our findings highlight the limitations of interoperability in current laboratory reporting.
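    The unit-of-measure problems reported above suggest a normalisation step before comparing results across sites. A minimal sketch, assuming a canonical unit of mg/dL and illustrative synonym and conversion tables (a real system would use a standard unit vocabulary such as UCUM):

```python
# Normalise (value, unit) pairs onto a canonical unit so results from
# different institutions become comparable. Tables are illustrative.
UNIT_SYNONYMS = {"mg/dl": "mg/dL", "MG/DL": "mg/dL", "g/l": "g/L"}
TO_CANONICAL = {       # factor converting to mg/dL
    "mg/dL": 1.0,
    "g/L": 100.0,      # different magnitude, same quantity
}

def normalize(value, unit):
    """Resolve unit synonyms, convert magnitude, or raise for an
    unrecognised unit string (the 'missing/unknown unit' case)."""
    unit = UNIT_SYNONYMS.get(unit, unit)
    try:
        return round(value * TO_CANONICAL[unit], 6), "mg/dL"
    except KeyError:
        raise ValueError(f"unrecognised unit: {unit!r}")
```

    The three failure modes named in the abstract map directly onto the three branches: synonym resolution, magnitude conversion, and the error raised for missing or unrecognised units.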

  15. Implementation of a metadata architecture and knowledge collection to support semantic interoperability in an enterprise data warehouse.

    PubMed

    Dhaval, Rakesh; Borlawsky, Tara; Ostrander, Michael; Santangelo, Jennifer; Kamal, Jyoti; Payne, Philip R O

    2008-11-06

    In order to enhance interoperability between enterprise systems, and improve data validity and reliability throughout The Ohio State University Medical Center (OSUMC), we have initiated the development of an ontology-anchored metadata architecture and knowledge collection for our enterprise data warehouse. The metadata and corresponding semantic relationships stored in the OSUMC knowledge collection are intended to promote consistency and interoperability across the heterogeneous clinical, research, business and education information managed within the data warehouse.

  16. Cohort Selection and Management Application Leveraging Standards-based Semantic Interoperability and a Groovy DSL.

    PubMed

    Bucur, Anca; van Leeuwen, Jasper; Chen, Njin-Zu; Claerhout, Brecht; de Schepper, Kristof; Perez-Rey, David; Paraiso-Medina, Sergio; Alonso-Calvo, Raul; Mehta, Keyur; Krykwinski, Cyril

    2016-01-01

    This paper describes a new Cohort Selection application implemented to support streamlining the definition phase of multi-centric clinical research in oncology. Our approach aims at both ease of use and precision in defining the selection filters expressing the characteristics of the desired population. The application leverages our standards-based Semantic Interoperability Solution and a Groovy DSL to provide high expressiveness in the definition of filters and flexibility in their composition into complex selection graphs including splits and merges. Widely-adopted ontologies such as SNOMED-CT are used to represent the semantics of the data and to express concepts in the application filters, facilitating data sharing and collaboration on joint research questions in large communities of clinical users. The application supports patient data exploration and efficient collaboration in multi-site, heterogeneous and distributed data environments.
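    The filter expressiveness described above can be approximated in plain Python with composable predicates; this is a sketch of the idea, not the paper's Groovy DSL, and the record fields and concept identifiers below are illustrative.

```python
# Composable cohort-selection filters: splits are separate predicates,
# merges are f_or, and a selection graph is a composed predicate.
def f_and(*preds):
    return lambda p: all(pred(p) for pred in preds)

def f_or(*preds):  # a 'merge' of two selection branches
    return lambda p: any(pred(p) for pred in preds)

def has_code(code):  # e.g. a SNOMED-CT-style concept id as a string
    return lambda p: code in p["codes"]

def age_between(lo, hi):
    return lambda p: lo <= p["age"] <= hi

cohort_filter = f_and(age_between(40, 75),
                      f_or(has_code("254837009"),
                           has_code("363406005")))

patients = [
    {"age": 55, "codes": {"254837009"}},
    {"age": 30, "codes": {"254837009"}},   # fails the age filter
    {"age": 60, "codes": {"44054006"}},    # fails the diagnosis merge
]
selected = [p for p in patients if cohort_filter(p)]
```

    Expressing diagnoses as ontology concept identifiers rather than local strings is what makes the same filter portable across sites.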

  17. Cohort Selection and Management Application Leveraging Standards-based Semantic Interoperability and a Groovy DSL

    PubMed Central

    Bucur, Anca; van Leeuwen, Jasper; Chen, Njin-Zu; Claerhout, Brecht; de Schepper, Kristof; Perez-Rey, David; Paraiso-Medina, Sergio; Alonso-Calvo, Raul; Mehta, Keyur; Krykwinski, Cyril

    2016-01-01

    This paper describes a new Cohort Selection application implemented to support streamlining the definition phase of multi-centric clinical research in oncology. Our approach aims at both ease of use and precision in defining the selection filters expressing the characteristics of the desired population. The application leverages our standards-based Semantic Interoperability Solution and a Groovy DSL to provide high expressiveness in the definition of filters and flexibility in their composition into complex selection graphs including splits and merges. Widely-adopted ontologies such as SNOMED-CT are used to represent the semantics of the data and to express concepts in the application filters, facilitating data sharing and collaboration on joint research questions in large communities of clinical users. The application supports patient data exploration and efficient collaboration in multi-site, heterogeneous and distributed data environments. PMID:27570644

  18. Case Study for Integration of an Oncology Clinical Site in a Semantic Interoperability Solution based on HL7 v3 and SNOMED-CT: Data Transformation Needs.

    PubMed

    Ibrahim, Ahmed; Bucur, Anca; Perez-Rey, David; Alonso, Enrique; de Hoog, Matthy; Dekker, Andre; Marshall, M Scott

    2015-01-01

    This paper describes the data transformation pipeline defined to support the integration of a new clinical site in a standards-based semantic interoperability environment. The available datasets combined structured and free-text patient data in Dutch, collected in the context of radiation therapy in several cancer types. Our approach aims at both efficiency and data quality. We combine custom-developed scripts, standard tools and manual validation by clinical and knowledge experts. We identified key challenges emerging from the several sources of heterogeneity in our case study (systems, language, data structure, clinical domain) and implemented solutions that we will further generalize for the integration of new sites. We conclude that the required effort for data transformation is manageable which supports the feasibility of our semantic interoperability solution. The achieved semantic interoperability will be leveraged for the deployment and evaluation at the clinical site of applications enabling secondary use of care data for research. This work has been funded by the European Commission through the INTEGRATE (FP7-ICT-2009-6-270253) and EURECA (FP7-ICT-2011-288048) projects.

  19. Case Study for Integration of an Oncology Clinical Site in a Semantic Interoperability Solution based on HL7 v3 and SNOMED-CT: Data Transformation Needs

    PubMed Central

    Ibrahim, Ahmed; Bucur, Anca; Perez-Rey, David; Alonso, Enrique; de Hoog, Matthy; Dekker, Andre; Marshall, M. Scott

    2015-01-01

    This paper describes the data transformation pipeline defined to support the integration of a new clinical site in a standards-based semantic interoperability environment. The available datasets combined structured and free-text patient data in Dutch, collected in the context of radiation therapy in several cancer types. Our approach aims at both efficiency and data quality. We combine custom-developed scripts, standard tools and manual validation by clinical and knowledge experts. We identified key challenges emerging from the several sources of heterogeneity in our case study (systems, language, data structure, clinical domain) and implemented solutions that we will further generalize for the integration of new sites. We conclude that the required effort for data transformation is manageable which supports the feasibility of our semantic interoperability solution. The achieved semantic interoperability will be leveraged for the deployment and evaluation at the clinical site of applications enabling secondary use of care data for research. This work has been funded by the European Commission through the INTEGRATE (FP7-ICT-2009-6-270253) and EURECA (FP7-ICT-2011-288048) projects. PMID:26306242

  20. Archetype-based knowledge management for semantic interoperability of electronic health records.

    PubMed

    Garde, Sebastian; Chen, Rong; Leslie, Heather; Beale, Thomas; McNicoll, Ian; Heard, Sam

    2009-01-01

    Formal modeling of clinical content that can be made available internationally is one of the most promising pathways to semantic interoperability of health information. Drawing on the extensive experience from openEHR archetype research and implementation work, we present the latest research and development in this area to improve semantic interoperability of Electronic Health Records (EHRs) using openEHR (ISO 13606) archetypes. Archetypes as the formal definition of clinical content need to be of high technical and clinical quality. We will start with a brief introduction of the openEHR architecture, followed by presentations on specific topics related to the management of a wide range of clinical knowledge artefacts. We will describe a web-based review process for archetypes that enables international involvement and ensures that released archetypes are technically and clinically correct. Tools for validation of archetypes will be presented, along with templates and compliance templates. In combination, these enable the openEHR computing platform to serve as the foundation for safely sharing the information clinicians need, for using this information within computerized clinical guidelines and decision support, and for migrating legacy data.

  1. caCORE version 3: Implementation of a model driven, service-oriented architecture for semantic interoperability

    PubMed Central

    Komatsoulis, George A.; Warzel, Denise B.; Hartel, Frank W.; Shanbhag, Krishnakant; Chilukuri, Ram; Fragoso, Gilberto; de Coronado, Sherri; Reeves, Dianne M.; Hadfield, Jillaine B.; Ludet, Christophe; Covitz, Peter A.

    2008-01-01

    One of the requirements for a federated information system is interoperability, the ability of one computer system to access and use the resources of another system. This feature is particularly important in biomedical research systems, which need to coordinate a variety of disparate types of data. In order to meet this need, the National Cancer Institute Center for Bioinformatics (NCICB) has created the cancer Common Ontologic Representation Environment (caCORE), an interoperability infrastructure based on Model Driven Architecture. The caCORE infrastructure provides a mechanism to create interoperable biomedical information systems. Systems built using the caCORE paradigm address both aspects of interoperability: the ability to access data (syntactic interoperability) and understand the data once retrieved (semantic interoperability). This infrastructure consists of an integrated set of three major components: a controlled terminology service (Enterprise Vocabulary Services), a standards-based metadata repository (the cancer Data Standards Repository) and an information system with an Application Programming Interface (API) based on Domain Model Driven Architecture. This infrastructure is being leveraged to create a Semantic Service Oriented Architecture (SSOA) for cancer research by the National Cancer Institute’s cancer Biomedical Informatics Grid (caBIG™). PMID:17512259

  2. Interoperability in Personalized Adaptive Learning

    ERIC Educational Resources Information Center

    Aroyo, Lora; Dolog, Peter; Houben, Geert-Jan; Kravcik, Milos; Naeve, Ambjorn; Nilsson, Mikael; Wild, Fridolin

    2006-01-01

    Personalized adaptive learning requires semantic-based and context-aware systems to manage the Web knowledge efficiently as well as to achieve semantic interoperability between heterogeneous information resources and services. The technological and conceptual differences can be bridged either by means of standards or via approaches based on the…

  3. A Proof-of-Concept for Semantically Interoperable Federation of IoT Experimentation Facilities

    PubMed Central

    Lanza, Jorge; Sanchez, Luis; Gomez, David; Elsaleh, Tarek; Steinke, Ronald; Cirillo, Flavio

    2016-01-01

    The Internet-of-Things (IoT) is unanimously identified as one of the main pillars of future smart scenarios. The potential of IoT technologies and deployments has been already demonstrated in a number of different application areas, including transport, energy, safety and healthcare. However, despite the growing number of IoT deployments, the majority of IoT applications tend to be self-contained, thereby forming application silos. A lightweight, data-centric integration and combination of these silos presents several challenges that still need to be addressed. Indeed, the ability to combine and synthesize data streams and services from diverse IoT platforms and testbeds holds the promise of increasing the potential of smart applications in terms of size, scope and targeted business context. In this article, a proof-of-concept implementation that federates two different IoT experimentation facilities by means of semantic-based technologies will be described. The specification and design of the implemented system and information models will be described together with the practical details of the developments carried out and its integration with the existing IoT platforms supporting the aforementioned testbeds. Overall, the system described in this paper demonstrates that it is possible to open new horizons in the development of IoT applications and experiments at a global scale, that transcend the (silo) boundaries of individual deployments, based on the semantic interconnection and interoperability of diverse IoT platforms and testbeds. PMID:27367695

  4. A Proof-of-Concept for Semantically Interoperable Federation of IoT Experimentation Facilities.

    PubMed

    Lanza, Jorge; Sanchez, Luis; Gomez, David; Elsaleh, Tarek; Steinke, Ronald; Cirillo, Flavio

    2016-06-29

    The Internet-of-Things (IoT) is unanimously identified as one of the main pillars of future smart scenarios. The potential of IoT technologies and deployments has been already demonstrated in a number of different application areas, including transport, energy, safety and healthcare. However, despite the growing number of IoT deployments, the majority of IoT applications tend to be self-contained, thereby forming application silos. A lightweight, data-centric integration and combination of these silos presents several challenges that still need to be addressed. Indeed, the ability to combine and synthesize data streams and services from diverse IoT platforms and testbeds holds the promise of increasing the potential of smart applications in terms of size, scope and targeted business context. In this article, a proof-of-concept implementation that federates two different IoT experimentation facilities by means of semantic-based technologies will be described. The specification and design of the implemented system and information models will be described together with the practical details of the developments carried out and its integration with the existing IoT platforms supporting the aforementioned testbeds. Overall, the system described in this paper demonstrates that it is possible to open new horizons in the development of IoT applications and experiments at a global scale, that transcend the (silo) boundaries of individual deployments, based on the semantic interconnection and interoperability of diverse IoT platforms and testbeds.

  5. Analyzing SNOMED CT and HL7 terminology binding for semantic interoperability on post-genomic clinical trials.

    PubMed

    Aso, Santiago; Perez-Rey, David; Alonso-Calvo, Raul; Rico-Diez, Antonio; Bucur, Anca; Claerhout, Brecht; Maojo, Victor

    2013-01-01

    Current post-genomic clinical trials in cancer involve the collaboration of several institutions. Multi-centric retrospective analysis requires advanced methods to ensure semantic interoperability. In this scenario, the objective of the EU-funded INTEGRATE project is to provide an infrastructure to share knowledge and data in post-genomic breast cancer clinical trials. This paper presents the process carried out in this project to bind domain terminologies in the area, such as SNOMED CT, to the HL7 v3 Reference Information Model (RIM). The proposed terminology binding follows the HL7 recommendations, but also considers important issues such as overlapping concepts and domain terminology coverage. Although there are limitations due to the large heterogeneity of the data in the area, the proposed process has been successfully applied within the context of the INTEGRATE project. Improving the semantic interoperability of patient data from modern breast cancer clinical trials aims to enhance clinical practice in oncology.

  6. W2E--Wellness Warehouse Engine for Semantic Interoperability of Consumer Health Data.

    PubMed

    Honko, Harri; Andalibi, Vafa; Aaltonen, Timo; Parak, Jakub; Saaranen, Mika; Viik, Jari; Korhonen, Ilkka

    2016-11-01

    Novel health monitoring devices and applications allow consumers easy and ubiquitous ways to monitor their health status. However, technologies from different providers lack both technical and semantic interoperability, and hence the resulting health data are often deeply tied to a specific service, which limits their reusability and utilization in different services. We have designed a Wellness Warehouse Engine (W2E) that bridges this gap and enables seamless exchange of data between different services. W2E provides interfaces to various data sources and makes data available to other services via a unified representational state transfer application programming interface. Importantly, it includes Unifier, an engine that transforms input data into generic units reusable by other services, and Analyzer, an engine that supports advanced analysis of input data, such as combining different data sources into new output parameters. In this paper, we describe the architecture of W2E and demonstrate its applicability by using it to unify data from four consumer activity trackers, using a test base of 20 subjects each carrying out three different tracking sessions. Finally, we discuss the challenges of building a scalable Unifier engine for the ever-growing number of new devices.
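    The Unifier/Analyzer split can be sketched as two small functions: one mapping device-specific fields onto generic units, one deriving a new output parameter from the unified records. The device payloads and field names below are invented for illustration.

```python
# Two trackers report the same walk in different fields and units.
RAW = [
    {"device": "trackerA", "steps": 8421, "distance_mi": 3.9},
    {"device": "trackerB", "steps": 8398, "distance_km": 6.3},
]

def unify(record):
    """Unifier: map device-specific fields to generic units (km)."""
    km = record.get("distance_km")
    if km is None:
        km = record["distance_mi"] * 1.609344   # statute miles -> km
    return {"device": record["device"], "steps": record["steps"],
            "distance_km": round(km, 2)}

def stride_length_m(record):
    """Analyzer: derive a new output parameter from unified inputs."""
    return round(record["distance_km"] * 1000 / record["steps"], 2)

unified = [unify(r) for r in RAW]
strides = {r["device"]: stride_length_m(r) for r in unified}
```

    Once both records are in generic units, cross-device parameters like stride length fall out of a single computation instead of per-device special cases.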

  7. A step-by-step methodology for enterprise interoperability projects

    NASA Astrophysics Data System (ADS)

    Chalmeta, Ricardo; Pazos, Verónica

    2015-05-01

    Enterprise interoperability is one of the key factors for enhancing enterprise competitiveness. Achieving enterprise interoperability is an extremely complex process which involves different technological, human and organisational elements. In this paper we present a framework to support enterprise interoperability. The framework has been developed taking into account the three domains of interoperability: Enterprise Modelling, Architecture and Platform, and Ontologies. The main novelty of the framework in comparison to existing ones is that it includes a step-by-step methodology that explains how to carry out an enterprise interoperability project taking into account different interoperability views, such as business, process, human resources, technology, knowledge and semantics.

  8. Semantic Interoperability for Computational Mineralogy: Experiences of the eMinerals Consortium

    NASA Astrophysics Data System (ADS)

    Walker, A. M.; White, T. O.; Dove, M. T.; Bruin, R. P.; Couch, P. A.; Tyer, R. P.

    2006-12-01

    The use of atomic scale computer simulation of minerals to obtain information for geophysics and environmental science has grown enormously over the past couple of decades. It is now routine to probe mineral behavior in the Earth's deep interior and in the surface environment by borrowing methods and simulation codes from computational chemistry and physics. It is becoming increasingly important to use methods embodied in more than one of these codes to solve any single scientific problem. However, scientific codes are rarely designed for easy interoperability and data exchange; data formats are often code-specific, poorly documented and fragile, liable to frequent change between software versions, and even compiler versions. This means that the scientist's simple desire to use the methodological approaches offered by multiple codes is frustrated, and even the sharing of data between collaborators becomes fraught with difficulties. The eMinerals consortium was formed in the early stages of the UK eScience program with the aim of developing the tools needed to apply atomic scale simulation to environmental problems in a grid-enabled world, and to harness the computational power offered by grid technologies to address some outstanding mineralogical problems. One example of the kind of problem we can tackle is the origin of the compressibility anomaly in silica glass. By passing data directly between simulation and analysis tools we were able to probe this effect in more detail than has previously been possible and have shown how the anomaly is related to the details of the amorphous structure. In order to approach this kind of problem we have constructed a mini-grid, a small scale and extensible combined compute- and data-grid that allows the execution of many calculations in parallel, and the transparent storage of semantically-rich marked-up result data. Importantly, we automatically capture multiple kinds of metadata and key results from each calculation. We

  9. An integrated framework to achieve interoperability in person-centric health management.

    PubMed

    Vergari, Fabio; Salmon Cinotti, Tullio; D'Elia, Alfredo; Roffia, Luca; Zamagni, Guido; Lamberti, Claudio

    2011-01-01

    The need for high-quality out-of-hospital healthcare is a known socioeconomic problem. Exploiting ICT's evolution, ad-hoc telemedicine solutions have been proposed in the past. Integrating such ad-hoc solutions in order to cost-effectively support the entire healthcare cycle is still a research challenge. In order to handle the heterogeneity of relevant information and to overcome the fragmentation of out-of-hospital instrumentation in person-centric healthcare systems, a shared and open source interoperability component can be adopted, which is ontology driven and based on the semantic web data model. The feasibility and the advantages of the proposed approach are demonstrated by presenting the use case of real-time monitoring of patients' health and their environmental context.

  10. An Integrated Framework to Achieve Interoperability in Person-Centric Health Management

    PubMed Central

    Vergari, Fabio; Salmon Cinotti, Tullio; D'Elia, Alfredo; Roffia, Luca; Zamagni, Guido; Lamberti, Claudio

    2011-01-01

    The need for high-quality out-of-hospital healthcare is a known socioeconomic problem. Exploiting ICT's evolution, ad-hoc telemedicine solutions have been proposed in the past. Integrating such ad-hoc solutions in order to cost-effectively support the entire healthcare cycle is still a research challenge. In order to handle the heterogeneity of relevant information and to overcome the fragmentation of out-of-hospital instrumentation in person-centric healthcare systems, a shared and open source interoperability component can be adopted, which is ontology driven and based on the semantic web data model. The feasibility and the advantages of the proposed approach are demonstrated by presenting the use case of real-time monitoring of patients' health and their environmental context. PMID:21811499

  11. A Service Oriented Architecture Approach to Achieve Interoperability between Immunization Information Systems in Iran.

    PubMed

    Hosseini, Masoud; Ahmadi, Maryam; Dixon, Brian E

    2014-01-01

    Clinical decision support (CDS) systems can support vaccine forecasting and immunization reminders; however, immunization decision-making requires data from fragmented, independent systems. Interoperability and accurate data exchange between immunization information systems (IIS) are essential for utilizing immunization CDS systems. Service oriented architecture (SOA) and Health Level 7 (HL7) are dominant standards for web-based exchange of clinical information. We implemented a system based on SOA and HL7 v3 to support immunization CDS in Iran. We evaluated system performance by exchanging 1500 immunization records for roughly 400 infants between two IISs. System turnaround time was less than a minute for synchronous operation calls, and the retrieved immunization histories of infants were always identical across systems. CDS-generated reports were in accordance with immunization guidelines, and the calculated next-visit times were accurate. Interoperability between IISs is rare or nonexistent. Since inter-state data exchange is rare in the United States, this approach could serve as a prototype for achieving interoperability of immunization information.
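The consistency check reported above (identical retrieved histories across systems) can be illustrated with a minimal sketch. This is not the paper's HL7 v3 implementation; the classes, names, and data below are invented, and a plain method call stands in for the synchronous SOA operation.

```python
# Hypothetical sketch: synchronous record exchange between two immunization
# information systems (IIS), checking that both return identical histories.
from dataclasses import dataclass, field

@dataclass
class IIS:
    """A minimal in-memory immunization registry (illustrative only)."""
    records: dict = field(default_factory=dict)  # patient_id -> [(vaccine, date)]

    def register(self, patient_id, vaccine, date):
        self.records.setdefault(patient_id, []).append((vaccine, date))

    def get_history(self, patient_id):
        # Sort so two systems holding the same facts return identical
        # histories regardless of insertion order.
        return sorted(self.records.get(patient_id, []))

def exchange(source: IIS, target: IIS, patient_id: str):
    """Push one patient's history from source to target (synchronous call)."""
    for vaccine, date in source.get_history(patient_id):
        target.register(patient_id, vaccine, date)

national = IIS()
provincial = IIS()
national.register("infant-001", "OPV", "2014-02-01")
national.register("infant-001", "DTP", "2014-03-10")
exchange(national, provincial, "infant-001")
assert national.get_history("infant-001") == provincial.get_history("infant-001")
```

Canonical ordering before comparison is what makes "always identical" a meaningful check when records arrive asynchronously in practice.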

  12. Establishing Semantic Interoperability, Under Denied, Disconnected, Intermittent, and Limited Telecommunications Conditions

    DTIC Science & Technology

    2014-06-01

    ...efforts on Data Distribution Services (DDS) for its applicability to military IT/C2 systems operating in a denied environment. DDS is advertised ...interoperability is simply the ability to send signals or bytes through a reliable physical connection. Technical integration is more feasible and less complex

  13. Community-Driven Initiatives to Achieve Interoperability for Ecological and Environmental Data

    NASA Astrophysics Data System (ADS)

    Madin, J.; Bowers, S.; Jones, M.; Schildhauer, M.

    2007-12-01

    interoperability by describing the semantics of data at the level of observation and measurement (rather than the traditional focus at the level of the data set) and will define the necessary specifications and technologies to facilitate semantic interpretation and integration of observational data for the environmental sciences. As such, this initiative will focus on unifying the various existing approaches for representing and describing observation data (e.g., SEEK's Observation Ontology, CUAHSI's Observation Data Model, NatureServe's Observation Data Standard, to name a few). Products of this initiative will be compatible with existing standards and build upon recent advances in knowledge representation (e.g., W3C's recommended Web Ontology Language, OWL) that have demonstrated practical utility in enhancing scientific communication and data interoperability in other communities (e.g., the genomics community). A community-sanctioned, extensible, and unified model for observational data will support metadata standards such as EML while reducing the "babel" of scientific dialects that currently impede effective data integration, which will in turn provide a strong foundation for enabling cross-disciplinary synthetic research in the ecological and environmental sciences.

  14. The Semantic Management of Environmental Resources within the Interoperable Context of the EuroGEOSS: Alignment of GEMET and the GEOSS SBAs

    NASA Astrophysics Data System (ADS)

    Cialone, Claudia; Stock, Kristin

    2010-05-01

    EuroGEOSS is a European Commission funded project. It aims to improve scientific understanding of the complex mechanisms driving the changes affecting our planet, and to identify and establish interoperable arrangements between environmental information systems. These systems would be sustained and operated by organizations with a clear mandate and resources, and made available following the specifications of existing frameworks such as GEOSS (the Global Earth Observation System of Systems)1 and INSPIRE (the Infrastructure for Spatial Information in the European Community)2. The EuroGEOSS project's infrastructure focuses on three thematic areas: forestry, drought and biodiversity. One of the important activities in the project is the retrieval, parsing and harmonization of the large amount of heterogeneous environmental data available at local, regional and global levels across these strategic areas. The challenge is to render it semantically and technically interoperable in a simple way. An initial step in achieving this semantic and technical interoperability involves the selection of appropriate classification schemes (for example, thesauri, ontologies and controlled vocabularies) to describe the resources in the EuroGEOSS framework. These classifications become a crucial part of the interoperable framework scaffolding because they allow data providers to describe their resources and thus support resource discovery, execution and orchestration at varying levels of complexity. However, at present, given the diverse range of environmental thesauri, controlled vocabularies and ontologies and the large number of resources provided by project participants, the selection of appropriate classification schemes involves a number of considerations. First of all, there is the semantic difficulty of selecting classification schemes that contain concepts relevant to each thematic area. Secondly, EuroGEOSS is intended to accommodate a number of

  15. Using architectures for semantic interoperability to create journal clubs for emergency response

    SciTech Connect

    Powell, James E; Collins, Linn M; Martinez, Mark L B

    2009-01-01

    In certain types of 'slow burn' emergencies, careful accumulation and evaluation of information can offer a crucial advantage. The SARS outbreak in the first decade of the 21st century was such an event, and ad hoc journal clubs played a critical role in assisting scientific and technical responders in identifying and developing various strategies for halting what could have become a dangerous pandemic. This research-in-progress paper describes a process for leveraging emerging semantic web and digital library architectures and standards to (1) create a focused collection of bibliographic metadata, (2) extract semantic information, (3) convert it to the Resource Description Framework /Extensible Markup Language (RDF/XML), and (4) integrate it so that scientific and technical responders can share and explore critical information in the collections.
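Step (3) of the process above, converting extracted bibliographic metadata to RDF/XML, can be sketched with the standard library. The record content and URI below are invented for illustration, and Dublin Core properties stand in for whatever bibliographic vocabulary the project actually used.

```python
# Minimal sketch: serialize one bibliographic record as RDF/XML using
# Dublin Core properties. All record values are hypothetical.
import xml.etree.ElementTree as ET

RDF = "http://www.w3.org/1999/02/22-rdf-syntax-ns#"
DC = "http://purl.org/dc/elements/1.1/"
ET.register_namespace("rdf", RDF)
ET.register_namespace("dc", DC)

def record_to_rdfxml(uri, title, creators, date):
    root = ET.Element(f"{{{RDF}}}RDF")
    desc = ET.SubElement(root, f"{{{RDF}}}Description", {f"{{{RDF}}}about": uri})
    ET.SubElement(desc, f"{{{DC}}}title").text = title
    for creator in creators:  # one dc:creator element per author
        ET.SubElement(desc, f"{{{DC}}}creator").text = creator
    ET.SubElement(desc, f"{{{DC}}}date").text = date
    return ET.tostring(root, encoding="unicode")

xml = record_to_rdfxml(
    "http://example.org/pubs/article-42",   # hypothetical identifier
    "Transmission dynamics of an emerging coronavirus",
    ["Author, A.", "Author, B."],
    "2003",
)
print(xml)
```

Once many records are in RDF, integration (step 4) reduces to merging graphs that share URIs, which is the property that makes RDF attractive for ad hoc collections like journal clubs.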

  16. United States Air Force (USAF) Semantic Interoperability Capabilities Based Assessment and Technology Roadmap

    DTIC Science & Technology

    2007-03-01

    GNOWSYS (Gnowledge Networking and Organizing System) is a web-based hybrid knowledge base with a kernel for semantic computing. It is developed in...instances in the 3store knowledge base. OntoEdit/OntoStudio: engineering environment for ontologies. Ontology Organizer: a DAML+OIL ontology editor with...Java, Javadoc, Jira, Subversion and Random. In addition, the project page has links to other third-party RDF converters for iCal, Palm, Outlook

  17. Semantic and Domain-based Interoperability (Interoperabilite Semantique et Interoperabilite basee sur un domaine)

    DTIC Science & Technology

    2011-11-01

    approval of the RTA Information Management Systems Branch is required for more than one copy to be made or an extract included in another publication...of knowledge-based methods based on ontologies for bridging semantic gaps between different systems. Finally, ‘Battle Management Language...sporadic, and some reference projects have been created, but on the one hand further in-depth research proves necessary and on the other hand, the

  18. Interoperability in clinical research: from metadata registries to semantically annotated CDISC ODM.

    PubMed

    Bruland, Philipp; Breil, Bernhard; Fritz, Fleur; Dugas, Martin

    2012-01-01

    Planning case report forms (CRFs) for data capture in clinical trials is a labor-intensive and poorly formalized process. These CRFs are often neither standardized nor based on defined data elements. Metadata registries such as the NCI caDSR provide the capability to create forms based on common data elements. However, exchanging these forms with clinical trial management systems through a standardized format like CDISC ODM is currently not offered. Thus, our objective was to develop a mapping model between NCI forms and ODM. We analyzed 3012 NCI forms and included common data elements based on their frequency and uniqueness. In this paper, we have created a mapping model between both formats and identified limitations of the conversion process: semantic codes retrieved from the caDSR registry did not allow a proper mapping to ODM items, and information such as the number of module repetitions was lost. In summary, our mapping model is feasible. However, the mapping of semantic concepts in ODM needs to be specified more precisely.
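The core of such a mapping can be sketched as a transformation from common-data-element descriptions to ODM-style `ItemDef` elements. The element and attribute names below follow CDISC ODM, but the input dictionaries are a deliberately simplified stand-in, not the real caDSR export format, and the identifier is hypothetical.

```python
# Illustrative sketch: map simplified common data elements (CDE-like
# dicts, NOT the actual caDSR format) to CDISC ODM-style ItemDefs.
import xml.etree.ElementTree as ET

def cdes_to_odm_itemdefs(cdes):
    """Build a MetaDataVersion element holding one ItemDef per CDE."""
    metadata = ET.Element("MetaDataVersion", OID="MDV.1", Name="Mapped CRF")
    for cde in cdes:
        item = ET.SubElement(
            metadata, "ItemDef",
            OID=f"IT.{cde['public_id']}",   # ODM OIDs must be unique
            Name=cde["name"],
            DataType=cde["datatype"],
        )
        question = ET.SubElement(item, "Question")
        ET.SubElement(question, "TranslatedText").text = cde["question"]
    return metadata

cdes = [
    {"public_id": "0001", "name": "Gender", "datatype": "text",
     "question": "What is the patient's gender?"},  # hypothetical CDE
]
odm = cdes_to_odm_itemdefs(cdes)
print(ET.tostring(odm, encoding="unicode"))
```

The lossy spots the authors report (semantic codes, module repetition counts) would show up here as CDE fields with no corresponding ODM attribute to carry them.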

  19. A Semantic Web-based System for Managing Clinical Archetypes.

    PubMed

    Fernandez-Breis, Jesualdo Tomas; Menarguez-Tortosa, Marcos; Martinez-Costa, Catalina; Fernandez-Breis, Eneko; Herrero-Sempere, Jose; Moner, David; Sanchez, Jesus; Valencia-Garcia, Rafael; Robles, Montserrat

    2008-01-01

    Archetypes facilitate the sharing of clinical knowledge and are therefore a basic tool for achieving interoperability between healthcare information systems. In this paper, a Semantic Web system for managing archetypes is presented. This system allows for the semantic annotation of archetypes, as well as for performing semantic searches. The current system is capable of working with both ISO 13606 and openEHR archetypes.

  20. Achieving control and interoperability through unified model-based systems and software engineering

    NASA Technical Reports Server (NTRS)

    Rasmussen, Robert; Ingham, Michel; Dvorak, Daniel

    2005-01-01

    Control and interoperation of complex systems is one of the most difficult challenges facing NASA's Exploration Systems Mission Directorate. An integrated but diverse array of vehicles, habitats, and supporting facilities, evolving over the long course of the enterprise, must perform ever more complex tasks while moving steadily away from the sphere of ground support and intervention.

  1. Achieving Interoperability Through Base Registries for Governmental Services and Document Management

    NASA Astrophysics Data System (ADS)

    Charalabidis, Yannis; Lampathaki, Fenareti; Askounis, Dimitris

    As digital infrastructures increase their presence worldwide, following the efforts of governments to provide citizens and businesses with high-quality one-stop services, there is a growing need for the systematic management of those newly defined and constantly transforming processes and electronic documents. E-government Interoperability Frameworks usually cater to the technical standards of e-government systems interconnection, but do not address service composition and use by citizens, businesses, or other administrations.

  2. Biologically Inspired Model for Visual Cognition Achieving Unsupervised Episodic and Semantic Feature Learning.

    PubMed

    Qiao, Hong; Li, Yinlin; Li, Fengfu; Xi, Xuanyang; Wu, Wei

    2016-10-01

    Recently, many biologically inspired visual computational models have been proposed. The design of these models follows the related biological mechanisms and structures, and these models provide new solutions for visual recognition tasks. In this paper, based on recent biological evidence, we propose a framework to mimic the active and dynamic learning and recognition process of the primate visual cortex. From the point of view of principles, the main contribution is that the framework can achieve unsupervised learning of episodic features (including key components and their spatial relations) and semantic features (semantic descriptions of the key components), which support higher-level cognition of an object. From the point of view of performance, the advantages of the framework are as follows: 1) learning episodic features without supervision-for a class of objects without prior knowledge, the key components, their spatial relations and cover regions can be learned automatically through a deep neural network (DNN); 2) learning semantic features based on episodic features-within the cover regions of the key components, the semantic geometrical values of these components can be computed based on contour detection; 3) forming the general knowledge of a class of objects-the general knowledge of a class of objects can be formed, mainly including the key components, their spatial relations and average semantic values, which is a concise description of the class; and 4) achieving higher-level cognition and dynamic updating-for a test image, the model can achieve classification and subclass semantic descriptions. Test samples with high confidence are then selected to dynamically update the whole model. Experiments are conducted on face images, and a good performance is achieved in each layer of the DNN and the semantic description learning process. Furthermore, the model can be generalized to recognition tasks of other objects with learning ability.

  3. Toward Semantic Interoperability in Home Health Care: Formally Representing OASIS Items for Integration into a Concept-oriented Terminology

    PubMed Central

    Choi, Jeungok; Jenkins, Melinda L.; Cimino, James J.; White, Thomas M.; Bakken, Suzanne

    2005-01-01

    Objective: The authors aimed to (1) formally represent OASIS-B1 concepts using the Logical Observation Identifiers, Names, and Codes (LOINC) semantic structure; (2) demonstrate integration of OASIS-B1 concepts into a concept-oriented terminology, the Medical Entities Dictionary (MED); (3) examine potential hierarchical structures within LOINC among OASIS-B1 and other nursing terms; and (4) illustrate a Web-based implementation for OASIS-B1 data entry using Dialogix, a software tool with a set of functions that supports complex data entry. Design and Measurements: Two hundred nine OASIS-B1 items were dissected into the six elements of the LOINC semantic structure and then integrated into the MED hierarchy. Each OASIS-B1 term was matched to LOINC-coded nursing terms, Home Health Care Classification, the Omaha System, and the Sign and Symptom Check-List for Persons with HIV, and the extent of the match was judged on a scale of 0 (no match) to 4 (exact match). OASIS-B1 terms were implemented as a Web-based survey using Dialogix. Results: Of 209 terms, 204 were successfully dissected into the elements of the LOINC semantic structure and integrated into the MED with minor revisions of MED semantics. One hundred fifty-one OASIS-B1 terms were mapped to one or more of the LOINC-coded nursing terms. Conclusion: The LOINC semantic structure offers a standard way to add home health care data to a comprehensive patient record to facilitate data sharing for monitoring outcomes across sites and to further terminology management, decision support, and accurate information retrieval for evidence-based practice. The cross-mapping results support the possibility of a hierarchical structure of the OASIS-B1 concepts within nursing terminologies in the LOINC database. PMID:15802480
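The dissection step described above places each item along LOINC's six semantic axes (Component, Property, Time, System, Scale, Method), which are joined with colons to form a fully specified name. The axis names are real LOINC structure; the example values below are illustrative, not the official LOINC entry for any OASIS item.

```python
# Sketch: represent an OASIS-style item along the six axes of the
# LOINC semantic structure. Example values are illustrative only.
from collections import namedtuple

LoincAxes = namedtuple(
    "LoincAxes",
    ["component", "property", "time", "system", "scale", "method"],
)

def fully_specified_name(axes: LoincAxes) -> str:
    # LOINC's fully specified name joins the six parts with colons.
    return ":".join(axes)

item = LoincAxes(
    component="Ability to dress upper body",
    property="Find",   # a finding
    time="Pt",         # point in time
    system="^Patient",
    scale="Ord",       # ordinal response scale
    method="OASIS",    # the assessment instrument as method
)
print(fully_specified_name(item))
```

Forcing every item through the same six slots is what makes terms from different instruments comparable; the five items that failed dissection are presumably ones that could not be decomposed this way.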

  4. A cloud-based approach for interoperable electronic health records (EHRs).

    PubMed

    Bahga, Arshdeep; Madisetti, Vijay K

    2013-09-01

    We present a cloud-based approach for the design of interoperable electronic health record (EHR) systems. Cloud computing environments provide several benefits to all the stakeholders in the healthcare ecosystem (patients, providers, payers, etc.). Lack of data interoperability standards and solutions has been a major obstacle in the exchange of healthcare data between different stakeholders. We propose an EHR system, the cloud health information systems technology architecture (CHISTAR), that achieves semantic interoperability through a generic design methodology which uses a reference model that defines a general-purpose set of data structures and an archetype model that defines the clinical data attributes. CHISTAR application components are designed using the cloud component model approach, which comprises loosely coupled components that communicate asynchronously. In this paper, we describe the high-level design of CHISTAR and the approaches for semantic interoperability, data integration, and security.
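The two-level methodology mentioned above (a generic reference model constrained by archetypes) can be sketched in a few lines. This is the general openEHR-style pattern, not CHISTAR's actual API; the class and field names are invented.

```python
# Minimal sketch of two-level modeling: a generic reference model
# (untyped named items) plus an archetype that constrains which
# clinical attributes a valid entry must carry. Names are illustrative.

class Entry:
    """Reference model: a generic bag of named data items."""
    def __init__(self, **items):
        self.items = items

class Archetype:
    """Archetype model: names the attributes a conforming Entry needs."""
    def __init__(self, name, required):
        self.name = name
        self.required = set(required)

    def validates(self, entry: Entry) -> bool:
        # An entry conforms if it supplies every required attribute.
        return self.required <= set(entry.items)

bp = Archetype("blood_pressure", ["systolic", "diastolic", "measured_at"])
reading = Entry(systolic=120, diastolic=80, measured_at="2013-09-01T10:00")
assert bp.validates(reading)
assert not bp.validates(Entry(systolic=120))  # incomplete reading
```

The payoff of the split is that the reference model (and the software built on it) stays fixed while clinicians evolve archetypes independently.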

  5. Towards a Common Platform to Support Business Processes, Services and Semantics

    NASA Astrophysics Data System (ADS)

    Piprani, Baba

    The search for the Holy Grail of interoperability of business processes, services and semantics continues with every new quest for a Silver Bullet. Most approaches towards interoperability focus narrowly on the simplistic notion of using technology to support cowboy-style development, without much regard to metadata or semantics. At the same time, the distortions of semantics created by many current modeling paradigms and approaches - including the disharmony created by a multiplicity of parallel approaches to standardization - are not helping us resolve the real issues facing knowledge and semantics management. This paper addresses some of the issues facing us - What have we achieved? Where did we go wrong? What are we doing right? - providing an encapsulated, candid snapshot of an approach to harmonizing our approach to interoperability, and proposes a common platform to support Business Processes, Services and Semantics.

  6. The HL7-OMG Healthcare Services Specification Project: Motivation, Methodology, and Deliverables for Enabling a Semantically Interoperable Service-oriented Architecture for Healthcare

    PubMed Central

    Kawamoto, Kensaku; Honey, Alan; Rubin, Ken

    2009-01-01

    Context The healthcare industry could achieve significant benefits through the adoption of a service-oriented architecture (SOA). The specification and adoption of standard software service interfaces will be critical to achieving these benefits. Objective To develop a replicable, collaborative framework for standardizing the interfaces of software services important to healthcare. Design Iterative, peer-reviewed development of a framework for generating interoperable service specifications that build on existing and ongoing standardization efforts. The framework was created under the auspices of the Healthcare Services Specification Project (HSSP), which was initiated in 2005 as a joint initiative between Health Level7 (HL7) and the Object Management Group (OMG). In this framework, known as the HSSP Service Specification Framework, HL7 identifies candidates for service standardization and defines normative Service Functional Models (SFMs) that specify the capabilities and conformance criteria for these services. OMG then uses these SFMs to generate technical service specifications as well as reference implementations. Measurements The ability of the framework to support the creation of multiple, interoperable service specifications useful for healthcare. Results Functional specifications have been defined through HL7 for four services: the Decision Support Service; the Entity Identification Service; the Clinical Research Filtered Query Service; and the Retrieve, Locate, and Update Service. Technical specifications and commercial implementations have been developed for two of these services within OMG. Furthermore, three additional functional specifications are being developed through HL7. Conclusions The HSSP Service Specification Framework provides a replicable and collaborative approach to defining standardized service specifications for healthcare. PMID:19717796

  7. Large scale healthcare data integration and analysis using the semantic web.

    PubMed

    Timm, John; Renly, Sondra; Farkash, Ariel

    2011-01-01

    Healthcare data interoperability can only be achieved when the semantics of the content is well defined and consistently implemented across heterogeneous data sources. Achieving these objectives of interoperability requires the collaboration of experts from several domains. This paper describes tooling that integrates Semantic Web technologies with common tools to facilitate cross-domain collaborative development for the purposes of data interoperability. Our approach is divided into stages of data harmonization and representation, model transformation, and instance generation. We applied our approach in Hypergenes, an EU funded project, where we used our method on the Essential Hypertension disease model using a CDA template. Our domain expert partners include clinical providers, clinical domain researchers, healthcare information technology experts, and a variety of clinical data consumers. We show that bringing Semantic Web technologies into the healthcare interoperability toolkit increases opportunities for beneficial collaboration, thus improving patient care and clinical research outcomes.

  8. Semantic Registration and Discovery System of Subsystems and Services within an Interoperable Coordination Platform in Smart Cities

    PubMed Central

    Rubio, Gregorio; Martínez, José Fernán; Gómez, David; Li, Xin

    2016-01-01

    Smart subsystems like traffic, Smart Homes, the Smart Grid, outdoor lighting, etc. are built in many urban areas, each with a set of services that are offered to citizens. These subsystems are managed by self-contained embedded systems. However, coordination and cooperation between them are scarce. An integration of these systems which truly represents a “system of systems” could introduce more benefits, such as allowing the development of new applications and collective optimization. The integration should allow maximum reusability of available services provided by entities (e.g., sensors or Wireless Sensor Networks). Thus, it is of major importance to facilitate the discovery and registration of available services and subsystems in an integrated way. Therefore, an ontology-based and automatic system for subsystem and service registration and discovery is presented. Using this proposed system, heterogeneous subsystems and services could be registered and discovered in a dynamic manner with additional semantic annotations. In this way, users are able to build customized applications across different subsystems by using available services. The proposed system has been fully implemented and a case study is presented to show the usefulness of the proposed method. PMID:27347965

  9. Semantic Registration and Discovery System of Subsystems and Services within an Interoperable Coordination Platform in Smart Cities.

    PubMed

    Rubio, Gregorio; Martínez, José Fernán; Gómez, David; Li, Xin

    2016-06-24

    Smart subsystems like traffic, Smart Homes, the Smart Grid, outdoor lighting, etc. are built in many urban areas, each with a set of services that are offered to citizens. These subsystems are managed by self-contained embedded systems. However, coordination and cooperation between them are scarce. An integration of these systems which truly represents a "system of systems" could introduce more benefits, such as allowing the development of new applications and collective optimization. The integration should allow maximum reusability of available services provided by entities (e.g., sensors or Wireless Sensor Networks). Thus, it is of major importance to facilitate the discovery and registration of available services and subsystems in an integrated way. Therefore, an ontology-based and automatic system for subsystem and service registration and discovery is presented. Using this proposed system, heterogeneous subsystems and services could be registered and discovered in a dynamic manner with additional semantic annotations. In this way, users are able to build customized applications across different subsystems by using available services. The proposed system has been fully implemented and a case study is presented to show the usefulness of the proposed method.
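The registration-and-discovery mechanism described in this record can be sketched with a toy registry: services register with semantic annotations, and discovery matches on them. Simple predicate/value pairs stand in for the ontology individuals of the real system, and all vocabulary terms and service names below are invented.

```python
# Sketch of ontology-style service registration and discovery.
# Annotations are (predicate, value) pairs standing in for ontology
# assertions; terms and service IDs are hypothetical.

class SemanticRegistry:
    def __init__(self):
        self.annotations = {}  # service_id -> set of (predicate, value)

    def register(self, service_id, annotations):
        """Register a service with its semantic annotations."""
        self.annotations[service_id] = set(annotations)

    def discover(self, **query):
        """Return services whose annotations include every query pair."""
        wanted = set(query.items())
        return sorted(sid for sid, ann in self.annotations.items()
                      if wanted <= ann)

registry = SemanticRegistry()
registry.register("lighting-01", {("domain", "outdoor-lighting"),
                                  ("observes", "luminosity")})
registry.register("traffic-07", {("domain", "traffic"),
                                 ("observes", "vehicle-count")})
print(registry.discover(observes="luminosity"))  # -> ['lighting-01']
```

A real ontology-backed registry would also match on subclass and equivalence relations (e.g., a query for "illumination" finding a "luminosity" sensor), which is precisely what flat keyword matching cannot do.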

  10. Interoperability and information discovery

    USGS Publications Warehouse

    Christian, E.

    2001-01-01

    In the context of information systems, there is interoperability when the distinctions between separate information systems are not a barrier to accomplishing a task that spans those systems. Interoperability so defined implies that there are commonalities among the systems involved and that one can exploit such commonalities to achieve interoperability. The challenge of a particular interoperability task is to identify relevant commonalities among the systems involved and to devise mechanisms that exploit those commonalities. The present paper focuses on the particular interoperability task of information discovery. The Global Information Locator Service (GILS) is described as a policy, standards, and technology framework for addressing interoperable information discovery on a global and long-term basis. While there are many mechanisms for people to discover and use all manner of data and information resources, GILS initiatives exploit certain key commonalities that seem to be sufficient to realize useful information discovery interoperability at a global, long-term scale. This paper describes ten of the specific commonalities that are key to GILS initiatives. It presents some of the practical implications for organizations in various roles: content provider, system engineer, intermediary, and searcher. The paper also provides examples of interoperable information discovery as deployed using GILS in four types of information communities: bibliographic, geographic, environmental, and government.

  11. GENESIS SciFlo: Choreographing Interoperable Web Services on the Grid using a Semantically-Enabled Dataflow Execution Environment

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Manipon, G.; Xing, Z.

    2007-12-01

    Access Protocol (OpenDAP) servers. SciFlo also publishes its own SOAP services for space/time query and subsetting of Earth Science datasets, and automated access to large datasets via lists of (FTP, HTTP, or DAP) URLs which point to on-line HDF or netCDF files. Typical distributed workflows obtain datasets by calling standard WMS/WCS servers or discovering and fetching data granules from ftp sites; invoke remote analysis operators available as SOAP services (interface described by a WSDL document); and merge results into binary containers (netCDF or HDF files) for further analysis using local executable operators. Naming conventions (HDFEOS and CF-1.0 for netCDF) are exploited to automatically understand and read on-line datasets. More interoperable conventions, and broader adoption of existing conventions, are vital if we are to "scale up" automated choreography of Web Services beyond toy applications. Recently, the ESIP Federation sponsored a collaborative activity in which several ESIP members developed collaborative science scenarios for atmospheric and aerosol science, and then choreographed services from multiple groups into demonstration workflows using the SciFlo engine and a Business Process Execution Language (BPEL) workflow engine. We will discuss the lessons learned from this activity, the need for standardized interfaces (like WMS/WCS), the difficulty in agreeing on even simple XML formats and interfaces, the benefits of doing collaborative science analysis at the "touch of a button" once services are connected, and further collaborations that are being pursued.

  12. Interoperability Measurement

    DTIC Science & Technology

    2008-08-01

    briefings, pictures, maps, spreadsheets, and databases, yet does not specify the application to generate these files, nor does it specify the format...operational activities, system functions, needlines, and information exchanges stored in a CADM database could be interoperability characters, but not...Interoperability Workshop (February 2003). February 2003. Linnaeus, C. Systema Naturae. 1st ed. Netherlands, 1735. Litwin, W., and A. Abdellatif

  13. Designing Information Interoperability

    SciTech Connect

    Gorman, Bryan L.; Shankar, Mallikarjun; Resseguie, David R.

    2009-01-01

    Examples of incompatible systems are offered with a discussion of the relationship between incompatibility and innovation. Engineering practices and the role of standards are reviewed as a means of resolving issues of incompatibility, with particular attention to the issue of innovation. Loosely coupled systems are described as a means of achieving and sustaining both interoperability and innovation in heterogeneous environments. A virtual unifying layer, in terms of a standard, a best practice, and a methodology, is proposed as a modality for designing information interoperability for enterprise applications. The Uniform Resource Identifier (URI), microformats, and Joshua Porter's AOF Method are described and presented as solutions for designing interoperable information sharing web sites. The Special Operations Force Information Access (SOFIA), a mock design, is presented as an example of information interoperability.

  14. User-centered semantic harmonization: a case study.

    PubMed

    Weng, Chunhua; Gennari, John H; Fridsma, Douglas B

    2007-06-01

    Semantic interoperability is one of the great challenges in biomedical informatics. Methods such as ontology alignment or use of metadata neither scale nor fundamentally alleviate semantic heterogeneity among information sources. In the context of the Cancer Biomedical Informatics Grid program, the Biomedical Research Integrated Domain Group (BRIDG) has been making an ambitious effort to harmonize existing information models for clinical research from a variety of sources and modeling agreed-upon semantics shared by the technical harmonization committee and the developers of these models. This paper provides some observations on this user-centered semantic harmonization effort and its inherent technical and social challenges. The authors also compare BRIDG with related efforts to achieve semantic interoperability in healthcare, including UMLS, InterMed, the Semantic Web, and the Ontology for Biomedical Investigations initiative. The BRIDG project demonstrates the feasibility of user-centered collaborative domain modeling as an approach to semantic harmonization, but also highlights a number of technology gaps in support of collaborative semantic harmonization that remain to be filled.

  15. Coalition Search and Rescue - Task Support Intelligent Task Achieving Agents on the Semantic Web

    DTIC Science & Technology

    2006-03-01

    Creating the Semantic Web", Fensel, D., Hendler, J., Liebermann, H. and Wahlster, W. (eds.), MIT Press, 2001. Uszok, A., Bradshaw, J. M., Jeffers... Agents on the World Wide Web, in Spinning the Semantic Web, Fensel, D., Hendler, J., Liebermann, H. and Wahlster, W. (eds.), Chapter 15, pp. 431-458

  16. Interoperation of heterogeneous CAD tools in Ptolemy II

    NASA Astrophysics Data System (ADS)

    Liu, Jie; Wu, Bicheng; Liu, Xiaojun; Lee, Edward A.

    1999-03-01

    Typical complex systems that involve microsensors and microactuators exhibit heterogeneity both at the implementation level and the problem level. For example, a system can be modeled using discrete events for digital circuits and SPICE-like analog descriptions for sensors. This heterogeneity exists not only across implementation domains, but also at different levels of abstraction. This naturally leads to a heterogeneous approach to system design that uses domain-specific models of computation (MoC) at various levels of abstraction to define a system, and leverages multiple CAD tools for simulation, verification and synthesis. As the size and scope of the system increase, the integration becomes too difficult and unmanageable if different tools are coordinated using simple scripts. In addition, for MEMS devices and mixed-signal circuits, it is essential to integrate tools with different MoCs to simulate the whole system. Ptolemy II, a heterogeneous system-level design tool, supports the interaction among different MoCs. This paper discusses heterogeneous CAD tool interoperability in the Ptolemy II framework. The key is to understand the semantic interface and classify the tools by their MoC and their level of abstraction. Interfaces are designed for each domain so that external tools can be easily wrapped. The interoperability of the tools then becomes the interoperability of the semantics. Ptolemy II can act as the standard interface among different tools to achieve overall design modeling. A micro-accelerometer with digital feedback is studied as an example.
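The wrapping idea in this abstract can be sketched in a few lines. Ptolemy II itself is a Java framework; the Python sketch below uses invented class names purely to illustrate the paper's central point that putting each external tool behind a per-MoC interface reduces tool interoperability to semantic interoperability.

```python
from abc import ABC, abstractmethod

class DomainInterface(ABC):
    """Hypothetical: one interface per model of computation (MoC);
    external CAD tools are wrapped behind it."""
    @abstractmethod
    def fire(self, signal: dict) -> dict:
        """Consume inputs and produce outputs under this domain's semantics."""

class DiscreteEventWrapper(DomainInterface):
    """Stand-in for a wrapped discrete-event simulator (e.g. a digital-circuit tool)."""
    def fire(self, signal):
        # A real wrapper would hand the timestamped events to the external
        # tool; here we pass them through to show only the contract.
        return signal

class AnalogSolverWrapper(DomainInterface):
    """Stand-in for a wrapped SPICE-like continuous-time solver."""
    def fire(self, signal):
        return signal

def co_simulate(stages, signal):
    """Once every tool honors the same interface, tool interoperability
    reduces to the interoperability of the domain semantics."""
    for stage in stages:
        signal = stage.fire(signal)
    return signal

result = co_simulate([AnalogSolverWrapper(), DiscreteEventWrapper()],
                     {"accel_out": (0.0, 1.2)})  # (timestamp, value)
```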

  17. Lemnos Interoperable Security Program

    SciTech Connect

    Stewart, John; Halbgewachs, Ron; Chavez, Adrian; Smith, Rhett; Teumim, David

    2012-01-31

    The manner in which control systems are being designed and operated in the energy sector is undergoing some of the most significant changes in its history due to the evolution of technology and the increasing number of interconnections to other systems. With these changes, however, come two significant challenges that the energy sector must face: 1) cyber security is more important than ever before, and 2) cyber security is more complicated than ever before. A key requirement in helping utilities and vendors alike meet these challenges is interoperability. While interoperability has been present in much of the discussion relating to technology utilized within the energy sector, and especially the Smart Grid, it has been absent in the context of cyber security. The Lemnos project addresses these challenges by focusing on the interoperability of devices utilized within utility control systems which support critical cyber security functions. In theory, interoperability is possible with many of the cyber security solutions available to utilities today. The reality is that the effort required to achieve cyber security interoperability is often a barrier for utilities. For example, consider IPsec, a widely used Internet protocol suite for defining virtual private networks, or "tunnels," to communicate securely through untrusted public and private networks. The IPsec protocol suite has a significant number of configuration options and encryption parameters to choose from, which must be agreed upon and adopted by both parties establishing the tunnel. The exercise of getting software or devices from different vendors to interoperate is labor intensive and requires a significant amount of security expertise by the end user. Scale this effort to a significant number of devices operating over a large geographical area and the challenge becomes so overwhelming that it often leads utilities to pursue solutions from a single vendor.
These single vendor solutions may inadvertently lock
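The configuration-matching problem described above for IPsec can be illustrated with a toy negotiation. The parameter names and value sets below are simplified stand-ins for the many real IKE/IPsec options (cipher, integrity hash, Diffie-Hellman group, lifetimes, ...); this is not a protocol implementation.

```python
# Each vendor's device supports only some option values; a tunnel can come
# up only if the two sides overlap on every knob.
vendor_a = {
    "encryption": {"aes-256-cbc", "aes-128-cbc"},
    "integrity": {"sha256", "sha1"},
    "dh_group": {14, 5},
}
vendor_b = {
    "encryption": {"aes-128-cbc", "3des-cbc"},
    "integrity": {"sha256"},
    "dh_group": {14, 2},
}

def negotiate(a, b):
    """Return the overlapping proposal, plus the knobs that block interop."""
    agreed, conflicts = {}, []
    for knob in a:
        common = a[knob] & b[knob]
        if common:
            agreed[knob] = common
        else:
            conflicts.append(knob)
    return agreed, conflicts

agreed, conflicts = negotiate(vendor_a, vendor_b)
# Here the pair can interoperate (aes-128-cbc / sha256 / group 14), but
# every added device multiplies the manual checking the abstract describes.
```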

  18. A Semantic Web Service and Simulation Framework to Intelligent Distributed Manufacturing

    SciTech Connect

    Son, Young Jun; Kulvatunyou, Boonserm; Cho, Hyunbo; Feng, Shaw

    2005-11-01

    To cope with today's fluctuating markets, a virtual enterprise (VE) concept can be employed to achieve the cooperation among independently operating enterprises. The success of VE depends on reliable interoperation among trading partners. This paper proposes a framework based on semantic web of manufacturing and simulation services to enable business and engineering collaborations between VE partners, particularly a design house and manufacturing suppliers.

  19. Toward the interoperability of HL7 v3 and SNOMED CT: a case study modeling mobile clinical treatment.

    PubMed

    Ryan, Amanda; Eklund, Peter; Esler, Brett

    2007-01-01

    Semantic interoperability in healthcare can be achieved by a tighter coupling of terminology and HL7 message models. In this paper, we highlight the difficulty of achieving this goal, but show how it can become attainable by basing HL7 message models on SNOMED CT concepts and relationships. We then demonstrate how this methodology has been applied to a set of clinical observations for use in the ePOC project, and discuss our findings.
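The coupling of message models to terminology that this abstract describes can be sketched as a data structure. The OID shown is the standard identifier for the SNOMED CT code system; the specific concept is used only as an illustration and is not drawn from the ePOC project.

```python
# Minimal sketch of binding an HL7-style observation element to a
# SNOMED CT concept, so the message carries terminology-level semantics.
observation = {
    "classCode": "OBS",   # HL7 v3 act class: observation
    "moodCode": "EVN",    # an event that actually occurred
    "code": {
        "codeSystem": "2.16.840.1.113883.6.96",  # SNOMED CT OID
        "code": "271649006",                     # illustrative concept:
        "displayName": "Systolic blood pressure",
    },
    "value": {"value": 120, "unit": "mm[Hg]"},
}

def is_snomed_coded(element):
    """A message element is terminology-bound if its code system is SNOMED CT."""
    return element.get("code", {}).get("codeSystem") == "2.16.840.1.113883.6.96"
```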

  20. Improving Interoperability in ePrescribing

    PubMed Central

    Åstrand, Bengt; Petersson, Göran

    2012-01-01

    Background The increased application of eServices in health care, in general, and ePrescribing (electronic prescribing) in particular, has brought quality and interoperability to the forefront. The application of standards has been put forward as one important factor in improving interoperability. However, less focus has been placed on other factors, such as stakeholders’ involvement and the measurement of interoperability. An information system (IS) can be regarded as an instrument for technology-mediated work communication. In this study, interoperability refers to the interoperation in the ePrescribing process, involving people, systems, procedures and organizations. We have focused on the quality of the ePrescription message as one component of the interoperation in the ePrescribing process. Objective The objective was to analyze how combined efforts to improve interoperability with the introduction of the new national ePrescription format (NEF) have affected interoperability in the ePrescribing process in Sweden, with a focus on the quality of the ePrescription message. Methods Consecutive sampling of electronic prescriptions in Sweden before and after the introduction of NEF was undertaken in April 2008 (pre-NEF) and April 2009 (post-NEF). Interoperability problems were identified and classified based on message format specifications and prescription rules. Results The introduction of NEF improved the interoperability of ePrescriptions substantially. In the pre-NEF sample, a total of 98.6% of the prescriptions had errors. In the post-NEF sample, only 0.9% of the prescriptions had errors. The mean number of errors per erroneous prescription was also lower: 4.8 in pre-NEF compared to 1.0 in post-NEF. Conclusions We conclude that systematic, comprehensive work on interoperability, covering technical, semantic, professional, judicial and process aspects and involving the stakeholders, resulted in an improved interoperability of e
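The error identification and classification step described in the Methods can be sketched as rule-based validation. The field names and format rules below are invented for illustration; they are not the actual NEF specification.

```python
import re

# Hypothetical format rules, standing in for message format specifications
# and prescription rules of the kind the study validated against.
RULES = {
    "patient_id": lambda v: bool(re.fullmatch(r"\d{12}", v or "")),
    "dosage_text": lambda v: bool(v and len(v) <= 254),
    "atc_code": lambda v: bool(re.fullmatch(r"[A-Z]\d{2}[A-Z]{2}\d{2}", v or "")),
}

def classify_errors(message):
    """Return the fields of a prescription message that violate the spec."""
    return [field for field, ok in RULES.items() if not ok(message.get(field))]

conformant = {"patient_id": "194001011234",
              "dosage_text": "1 tablet daily",
              "atc_code": "C07AB02"}
erroneous = {"patient_id": "abc",
             "dosage_text": "",
             "atc_code": "C07AB02"}
```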

  1. Effects of Semantic Ambiguity Detection Training on Reading Comprehension Achievement of English Learners with Learning Difficulties

    ERIC Educational Resources Information Center

    Jozwik, Sara L.; Douglas, Karen H.

    2016-01-01

    This study examined how explicit instruction in semantic ambiguity detection affected the reading comprehension and metalinguistic awareness of five English learners (ELs) with learning difficulties (e.g., attention deficit/hyperactivity disorder, specific learning disability). A multiple probe across participants design (Gast & Ledford, 2010)…

  2. Effects of Semantic Web Based Learning on Pre-Service Teachers' ICT Learning Achievement and Satisfaction

    ERIC Educational Resources Information Center

    Karalar, Halit; Korucu, Agah Tugrul

    2016-01-01

    Although the Semantic Web offers many opportunities for learners, its effects in the classroom are not well known. This study therefore explains how learning objects, defined using the terminology of a developed ontology and kept in an object repository, should be presented to learners with the aim of…

  3. Turning Interoperability Operational with GST

    NASA Astrophysics Data System (ADS)

    Schaeben, Helmut; Gabriel, Paul; Gietzel, Jan; Le, Hai Ha

    2013-04-01

    GST - Geosciences in space and time is being developed and implemented as a hub to facilitate the exchange of spatially and temporally indexed multi-dimensional geoscience data and corresponding geomodels amongst partners. It originates from TUBAF's contribution to the EU project "ProMine", and its prospective extensions are TUBAF's contribution to the current EU project "GeoMol". As of today, it provides the basic components of a geodata infrastructure as required to establish interoperability with respect to geosciences. Generally, interoperability means the facilitation of cross-border and cross-sector information exchange, taking into account legal, organisational, semantic and technical aspects; cf. Interoperability Solutions for European Public Administrations (ISA), http://ec.europa.eu/isa/. Practical interoperability for partners of a joint geoscience project, say European Geological Surveys acting in a border region, means in particular the provision of IT technology to exchange spatially, and maybe additionally temporally, indexed multi-dimensional geoscience data and corresponding models, i.e. the objects composing geomodels that capture the geometry, topology, and various geoscience contents. Geodata Infrastructure (GDI) and interoperability are objectives of several initiatives, e.g. INSPIRE, OneGeology-Europe, and most recently EGDI-SCOPE, to name just the most prominent ones. Then there are quite a few markup languages (ML) related to geographical or geological information, like GeoSciML, EarthResourceML, BoreholeML, and ResqML for reservoir characterization, earth and reservoir models, and many others featuring geoscience information. Several web services are focused on geographical or geoscience information. The Open Geospatial Consortium (OGC) promotes specifications of a Web Feature Service (WFS), a Web Map Service (WMS), a Web Coverage Service (WCS), a Web 3D Service (W3DS), and many more.
It will be clarified how GST is related to these initiatives, especially
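As a concrete example of the OGC services named in this record, a WFS 2.0 GetFeature request is just a parameterized HTTP URL. The endpoint and feature type below are placeholders, not an actual GST or survey service.

```python
from urllib.parse import urlencode

def wfs_get_feature_url(endpoint, type_name, count=10):
    """Build an OGC WFS 2.0 GetFeature request URL (KVP encoding)."""
    params = {
        "service": "WFS",
        "version": "2.0.0",
        "request": "GetFeature",
        "typeNames": type_name,  # WFS 2.0 uses 'typeNames' (plural)
        "count": count,          # WFS 2.0 replaces the 1.x 'maxFeatures'
    }
    return f"{endpoint}?{urlencode(params)}"

# Placeholder endpoint and GeoSciML feature type, purely illustrative:
url = wfs_get_feature_url("https://example.org/geoserver/wfs",
                          "gsml:GeologicUnit")
```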

  4. Clinical data interoperability based on archetype transformation.

    PubMed

    Costa, Catalina Martínez; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás

    2011-10-01

    The semantic interoperability between health information systems is a major challenge to improve the quality of clinical practice and patient safety. In recent years many projects have faced this problem and provided solutions based on specific standards and technologies in order to satisfy the needs of a particular scenario. Most of such solutions cannot be easily adapted to new scenarios, thus more global solutions are needed. In this work, we have focused on the semantic interoperability of electronic healthcare records standards based on the dual model architecture and we have developed a solution that has been applied to ISO 13606 and openEHR. The technological infrastructure combines reference models, archetypes and ontologies, with the support of Model-driven Engineering techniques. For this purpose, the interoperability infrastructure developed in previous work by our group has been reused and extended to cover the requirements of data transformation.

  5. Semantic Research for Digital Libraries.

    ERIC Educational Resources Information Center

    Chen, Hsinchun

    1999-01-01

    Discusses the need for semantic research in digital libraries to help overcome interoperability problems. Highlights include federal initiatives; semantic analysis; knowledge representations; human-computer interactions and information visualization; and the University of Illinois DLI (Digital Libraries Initiative) project through partnership with…

  6. ARTEMIS: towards a secure interoperability infrastructure for healthcare information systems.

    PubMed

    Boniface, Mike; Wilken, Paul

    2005-01-01

    The ARTEMIS project is developing a semantic web service based P2P interoperability infrastructure for healthcare information systems. The strict legislative framework in which these systems are deployed means that the interoperability of security and privacy mechanisms is an important requirement in supporting communication of electronic healthcare records across organisation boundaries. In ARTEMIS, healthcare providers define semantically annotated security and privacy policies for web services based on organisational requirements. The ARTEMIS mediator uses these semantic web service descriptions to broker between organisational policies by reasoning over security and clinical concept ontologies.

  7. Challenges of Space Mission Interoperability

    NASA Technical Reports Server (NTRS)

    Martin, Warren L.; Hooke, Adrian J.

    2007-01-01

    This viewgraph presentation reviews some of the international challenges to space mission interoperability. Interoperability is the technical capability of two or more systems or components to exchange information and to use the information that has been exchanged. One of the challenges that is addressed is the problem of spectrum bandwidth, and interference. The key to interoperability is the standardization of space communications services and protocols. Various levels of international cross support are reviewed: harmony, cooperation cross support and confederation cross support. The various international bodies charged with implementing cross support are reviewed. The goal of the Interagency Operations Advisory Group (IOAG) is to achieve plug-and-play operations where all that is required is for each of the systems to use an agreed communications medium, after which the systems configure each other for the purpose of exchanging information and subsequently effect such exchange automatically.

  8. Combining Archetypes with Fast Health Interoperability Resources in Future-proof Health Information Systems.

    PubMed

    Bosca, Diego; Moner, David; Maldonado, Jose Alberto; Robles, Montserrat

    2015-01-01

    Messaging standards, and specifically HL7 v2, are heavily used for the communication and interoperability of Health Information Systems. HL7 FHIR was created as an evolution of the messaging standards to achieve semantic interoperability. FHIR is somehow similar to other approaches like the dual model methodology as both are based on the precise modeling of clinical information. In this paper, we demonstrate how we can apply the dual model methodology to standards like FHIR. We show the usefulness of this approach for data transformation between FHIR and other specifications such as HL7 CDA, EN ISO 13606, and openEHR. We also discuss the advantages and disadvantages of defining archetypes over FHIR, and the consequences and outcomes of this approach. Finally, we exemplify this approach by creating a testing data server that supports both FHIR resources and archetypes.
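A minimal FHIR resource makes the "precise modeling of clinical information" mentioned above concrete. The sketch below shows an Observation with only its required elements (status and code) plus an illustrative LOINC coding and value; it is not drawn from the paper's testing server.

```python
import json

# Minimal HL7 FHIR Observation resource as JSON.
observation = {
    "resourceType": "Observation",
    "status": "final",
    "code": {
        "coding": [{
            "system": "http://loinc.org",
            "code": "8480-6",  # LOINC: systolic blood pressure (illustrative)
            "display": "Systolic blood pressure",
        }]
    },
    "valueQuantity": {"value": 120, "unit": "mmHg"},
}

# Serialize for exchange; any FHIR-aware consumer keys off resourceType.
payload = json.dumps(observation)
```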

  9. Controlled Vocabularies, Mini Ontologies and Interoperability (Invited)

    NASA Astrophysics Data System (ADS)

    King, T. A.; Walker, R. J.; Roberts, D.; Thieman, J.; Ritschel, B.; Cecconi, B.; Genot, V. N.

    2013-12-01

    Interoperability has been an elusive goal, but in recent years advances have been made using controlled vocabularies, mini-ontologies and a lot of collaboration. This has led to increased interoperability between disciplines in the U.S. and between international projects. We discuss the successful pattern followed by SPASE, IVOA and IPDA to achieve this new level of international interoperability. A key aspect of the pattern is open standards and open participation with interoperability achieved with shared services, public APIs, standard formats and open access to data. Many of these standards are expressed as controlled vocabularies and mini ontologies. To illustrate the pattern we look at SPASE related efforts and participation of North America's Heliophysics Data Environment and CDPP; Europe's Cluster Active Archive, IMPEx, EuroPlanet, ESPAS and HELIO; and Japan's magnetospheric missions. Each participating project has its own life cycle and successful standards development must always take this into account. A major challenge for sustained collaboration and interoperability is the limited lifespan of many of the participating projects. Innovative approaches and new tools and frameworks are often developed as competitively selected, limited term projects, but for sustainable interoperability successful approaches need to become part of a long term infrastructure. This is being encouraged and achieved in many domains and we are entering a golden age of interoperability.

  10. Improving healthcare middleware standards with semantic methods and technologies.

    PubMed

    Román, Isabel; Calvillo, Jorge; Roa, Laura M; Madinabeitia, Germán

    2008-01-01

    A critical issue in healthcare informatics is to facilitate the integration and interoperability of applications. This goal can be achieved through an open architecture based on a middleware that is independent from specific applications, useful for working with existing systems as well as for the integration of new ones. Several standards organizations are making efforts toward this target. This work is based on EN 12967-1,2,3, developed by CEN, which follows the ODP (Open Distributed Processing) methodology, providing a specification of distributed systems based on the definition of five viewpoints. However, only the three upper viewpoints are used to produce EN 12967; the two lower viewpoints should be considered in the implementation context. We are using the Semantic Grid for the lower views, and the Semantic Web and Web Services for the definition of the upper views. We analyze the benefits of using these methods and technologies and describe a methodology for the development of this semantic healthcare middleware in observance of European standards.

  11. Ontologies, knowledge representation, artificial intelligence - hype or prerequisites for international pHealth Interoperability?

    PubMed

    Blobel, Bernd

    2011-01-01

    Nowadays, eHealth and pHealth solutions have to meet advanced interoperability challenges. Enabling pervasive computing and even autonomic computing, pHealth system architectures cover many domains, scientifically managed by specialized disciplines using their specific ontologies. Therefore, semantic interoperability has to advance from a communication protocol to an ontology coordination challenge including semantic integration, bringing knowledge representation and artificial intelligence to the table. The resulting solutions comprehensively support multi-lingual and multi-jurisdictional environments.

  12. Semantic Sensor Web

    NASA Astrophysics Data System (ADS)

    Sheth, A.; Henson, C.; Thirunarayan, K.

    2008-12-01

    Sensors are distributed across the globe leading to an avalanche of data about our environment. It is possible today to utilize networks of sensors to detect and identify a multitude of observations, from simple phenomena to complex events and situations. The lack of integration and communication between these networks, however, often isolates important data streams and intensifies the existing problem of too much data and not enough knowledge. With a view to addressing this problem, the Semantic Sensor Web (SSW) [1] proposes that sensor data be annotated with semantic metadata that will both increase interoperability and provide contextual information essential for situational knowledge. Kno.e.sis Center's approach to SSW is an evolutionary one. It adds semantic annotations to the existing standard sensor languages of the Sensor Web Enablement (SWE) defined by OGC. These annotations enhance primarily syntactic XML-based descriptions in OGC's SWE languages with microformats, and W3C's Semantic Web languages- RDF and OWL. In association with semantic annotation and semantic web capabilities including ontologies and rules, SSW supports interoperability, analysis and reasoning over heterogeneous multi-modal sensor data. In this presentation, we will also demonstrate a mashup with support for complex spatio-temporal-thematic queries [2] and semantic analysis that utilize semantic annotations, multiple ontologies and rules. It uses existing services (e.g., GoogleMap) and semantics enhanced SWE's Sensor Observation Service (SOS) over weather and road condition data from various sensors that are part of Ohio's transportation network. Our upcoming plans are to demonstrate end to end (heterogeneous sensor to application) semantics support and study scalability of SSW involving thousands of sensors to about a billion triples. 
    Keywords: Semantic Sensor Web, Spatiotemporal thematic queries, Sensor Web Enablement, Sensor Observation Service [1] Amit Sheth, Cory Henson, Satya
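The core SSW idea described above, attaching semantic metadata to sensor observations, can be sketched by emitting RDF (Turtle) for a single road-condition reading. The namespace and property names below are simplified placeholders, not the exact OGC/W3C vocabularies.

```python
def observation_to_turtle(obs_id, prop, value, unit, time):
    """Render one sensor observation as RDF Turtle with semantic annotations."""
    return f"""@prefix ssw: <http://example.org/ssw#> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .

ssw:{obs_id} a ssw:Observation ;
    ssw:observedProperty ssw:{prop} ;
    ssw:hasValue "{value}"^^xsd:float ;
    ssw:unit "{unit}" ;
    ssw:resultTime "{time}"^^xsd:dateTime .
"""

# Illustrative reading from a road-weather sensor:
ttl = observation_to_turtle("obs42", "RoadSurfaceTemperature",
                            2.5, "Cel", "2008-12-01T06:00:00Z")
```

Once observations carry annotations like these, spatio-temporal-thematic queries and rule-based reasoning can run over heterogeneous sensor streams instead of raw XML.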

  13. Environmental Models as a Service: Enabling Interoperability through RESTful Endpoints and API Documentation.

    EPA Science Inventory

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...

  14. Environmental Models as a Service: Enabling Interoperability through RESTful Endpoints and API Documentation

    EPA Science Inventory

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...

  15. Environmental Models as a Service: Enabling Interoperability through RESTful Endpoints and API Documentation (presentation)

    EPA Science Inventory

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...

  16. Groundwater data network interoperability

    USGS Publications Warehouse

    Brodaric, Boyan; Booth, Nathaniel; Boisvert, Eric; Lucido, Jessica M.

    2016-01-01

    Water data networks are increasingly being integrated to answer complex scientific questions that often span large geographical areas and cross political borders. Data heterogeneity is a major obstacle that impedes interoperability within and between such networks. It is resolved here for groundwater data at five levels of interoperability, within a Spatial Data Infrastructure architecture. The result is a pair of distinct national groundwater data networks for the United States and Canada, and a combined data network in which they are interoperable. This combined data network enables, for the first time, transparent public access to harmonized groundwater data from both sides of the shared international border.

  17. The Relationship Between Responses to Science Concepts on a Semantic Differential Instrument and Achievement in Freshman Physics and Chemistry.

    ERIC Educational Resources Information Center

    Rothman, Arthur Israel

    Students taking freshman physics and freshman chemistry at The State University of New York at Buffalo (SUNYAB) were administered a science-related semantic differential instrument. This same test was administered to physics and chemistry graduate students from SUNYAB and the University of Rochester. A scoring procedure was developed which…

  18. Towards an interoperable International Lattice Datagrid

    SciTech Connect

    G. Beckett; P. Coddington; N. Ishii; B. Joo; D. Melkumyan; R. Ostrowski; D. Pleiter; M. Sato; J. Simone; C. Watson; S. Zhang

    2007-11-01

    The International Lattice Datagrid (ILDG) is a federation of several regional grids. Since most of these grids have reached production level, an increasing number of lattice scientists start to benefit from this new research infrastructure. The ILDG Middleware Working Group has the task of specifying the ILDG middleware such that interoperability among the different grids is achieved. In this paper we will present the architecture of the ILDG middleware and describe what has actually been achieved in recent years. Particular focus is given to interoperability and security issues. We will conclude with a short overview on issues which we plan to address in the near future.

  19. The caCORE Software Development Kit: Streamlining construction of interoperable biomedical information services

    PubMed Central

    Phillips, Joshua; Chilukuri, Ram; Fragoso, Gilberto; Warzel, Denise; Covitz, Peter A

    2006-01-01

    Background Robust, programmatically accessible biomedical information services that syntactically and semantically interoperate with other resources are challenging to construct. Such systems require the adoption of common information models, data representations and terminology standards as well as documented application programming interfaces (APIs). The National Cancer Institute (NCI) developed the cancer common ontologic representation environment (caCORE) to provide the infrastructure necessary to achieve interoperability across the systems it develops or sponsors. The caCORE Software Development Kit (SDK) was designed to provide developers both within and outside the NCI with the tools needed to construct such interoperable software systems. Results The caCORE SDK requires a Unified Modeling Language (UML) tool to begin the development workflow with the construction of a domain information model in the form of a UML Class Diagram. Models are annotated with concepts and definitions from a description logic terminology source using the Semantic Connector component. The annotated model is registered in the Cancer Data Standards Repository (caDSR) using the UML Loader component. System software is automatically generated using the Codegen component, which produces middleware that runs on an application server. The caCORE SDK was initially tested and validated using a seven-class UML model, and has been used to generate the caCORE production system, which includes models with dozens of classes. The deployed system supports access through object-oriented APIs with consistent syntax for retrieval of any type of data object across all classes in the original UML model. The caCORE SDK is currently being used by several development teams, including by participants in the cancer biomedical informatics grid (caBIG) program, to create compatible data services. caBIG compatibility standards are based upon caCORE resources, and thus the caCORE SDK has emerged as a key

  20. Buildings Interoperability Landscape

    SciTech Connect

    Hardin, Dave; Stephan, Eric G.; Wang, Weimin; Corbin, Charles D.; Widergren, Steven E.

    2015-12-31

    Through its Building Technologies Office (BTO), the United States Department of Energy’s Office of Energy Efficiency and Renewable Energy (DOE-EERE) is sponsoring an effort to advance interoperability for the integration of intelligent buildings equipment and automation systems, understanding the importance of integration frameworks and product ecosystems to this cause. This is important to BTO’s mission to enhance energy efficiency and save energy for economic and environmental purposes. For ecosystems of connected-buildings products and services from various manufacturers to flourish, the ICT aspects of the equipment need to integrate and operate simply and reliably. Within the concepts of interoperability lie the specification, development, and certification of equipment with standards-based interfaces that connect and work. Beyond this, a healthy community of stakeholders that contribute to and use interoperability work products must be developed. On May 1, 2014, the DOE convened a technical meeting to take stock of the current state of interoperability of connected equipment and systems in buildings. Several insights from that meeting helped facilitate a draft description of the landscape of interoperability for connected buildings, which focuses mainly on small and medium commercial buildings. This document revises the February 2015 landscape document to address reviewer comments, incorporate important insights from the Buildings Interoperability Vision technical meeting, and capture thoughts from that meeting about the topics to be addressed in a buildings interoperability vision. In particular, greater attention is paid to the state of information modeling in buildings and the great potential for near-term benefits in this area from progress and community alignment.

  1. 23 CFR 950.7 - Interoperability requirements.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... facility; and (3) Identify the noncash electronic technology likely to be in use within the next five years... FHWA that the selected toll collection system and technology achieves the highest reasonable degree of interoperability both with technology currently in use at other existing toll facilities and with technology...

  2. Architecture for interoperable software in biology.

    PubMed

    Bare, James Christopher; Baliga, Nitin S

    2014-07-01

    Understanding biological complexity demands a combination of high-throughput data and interdisciplinary skills. One way to bring to bear the necessary combination of data types and expertise is by encapsulating domain knowledge in software and composing that software to create a customized data analysis environment. To this end, simple flexible strategies are needed for interconnecting heterogeneous software tools and enabling data exchange between them. Drawing on our own work and that of others, we present several strategies for interoperability and their consequences, in particular, a set of simple data structures--list, matrix, network, table and tuple--that have proven sufficient to achieve a high degree of interoperability. We provide a few guidelines for the development of future software that will function as part of an interoperable community of software tools for biological data analysis and visualization.
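The paper's claim that a handful of plain data structures suffice for interoperability can be illustrated with the "table" case: rows of named fields that survive a round trip across a tool boundary (here, CSV text stands in for that boundary; the gene names are arbitrary examples).

```python
import csv
import io

# A 'table': rows of named fields, one of the paper's five simple structures.
rows = [
    {"gene": "rpoB", "expression": "2.3"},
    {"gene": "dnaK", "expression": "0.7"},
]

def table_to_csv(rows):
    """Serialize a table at a tool boundary."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(rows[0]))
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

def csv_to_table(text):
    """Reconstruct the table on the other side of the boundary."""
    return list(csv.DictReader(io.StringIO(text)))

roundtrip = csv_to_table(table_to_csv(rows))
```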

  3. MENTOR: an enabler for interoperable intelligent systems

    NASA Astrophysics Data System (ADS)

    Sarraipa, João; Jardim-Goncalves, Ricardo; Steiger-Garcao, Adolfo

    2010-07-01

    A community whose knowledge organisation is based on ontologies will enable an increase in the computational intelligence of its information systems. However, owing to the worldwide diversity of communities, a large number of knowledge representation elements that are not semantically coincident have appeared to represent the same segment of reality, becoming a barrier to business communications. Even if a domain community uses the same kinds of technologies in its information systems, such as ontologies, this does not resolve its semantic differences. To solve this interoperability problem, one solution is to use a reference ontology as an intermediary in communications between the community enterprises and the outside, while allowing the enterprises to keep their own ontologies and semantics unchanged internally. This work proposes MENTOR, a methodology to support the development of a common reference ontology for a group of organisations sharing the same business domain. The methodology is based on the mediator ontology (MO) concept, which assists the semantic transformations between each enterprise's ontology and the reference one. The MO enables each organisation to keep its own terminology, glossary and ontological structures, while providing seamless communication and interaction with the others.
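The mediator ontology (MO) idea described above can be sketched as a term-translation table: each enterprise keeps its local terms, and two terms interoperate when the mediator maps them to the same reference concept. All enterprise and term names below are invented for illustration.

```python
# Hypothetical mediator: local term -> reference-ontology concept.
MEDIATOR = {
    "enterpriseA": {"screw_m4": "ref:FasteningScrew_M4"},
    "enterpriseB": {"parafuso_4mm": "ref:FasteningScrew_M4"},
}

def to_reference(enterprise, local_term):
    """Translate a local term to the shared reference concept, if mapped."""
    return MEDIATOR[enterprise].get(local_term)

def same_concept(ent_a, term_a, ent_b, term_b):
    """Two local terms interoperate if they resolve to one reference concept."""
    ra, rb = to_reference(ent_a, term_a), to_reference(ent_b, term_b)
    return ra is not None and ra == rb

# Different local vocabularies, same meaning via the mediator:
ok = same_concept("enterpriseA", "screw_m4", "enterpriseB", "parafuso_4mm")
```

Each organisation's internal ontology stays untouched; only the mediator table grows as the community aligns terms.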

  4. Toward interoperable bioscience data.

    PubMed

    Sansone, Susanna-Assunta; Rocca-Serra, Philippe; Field, Dawn; Maguire, Eamonn; Taylor, Chris; Hofmann, Oliver; Fang, Hong; Neumann, Steffen; Tong, Weida; Amaral-Zettler, Linda; Begley, Kimberly; Booth, Tim; Bougueleret, Lydie; Burns, Gully; Chapman, Brad; Clark, Tim; Coleman, Lee-Ann; Copeland, Jay; Das, Sudeshna; de Daruvar, Antoine; de Matos, Paula; Dix, Ian; Edmunds, Scott; Evelo, Chris T; Forster, Mark J; Gaudet, Pascale; Gilbert, Jack; Goble, Carole; Griffin, Julian L; Jacob, Daniel; Kleinjans, Jos; Harland, Lee; Haug, Kenneth; Hermjakob, Henning; Ho Sui, Shannan J; Laederach, Alain; Liang, Shaoguang; Marshall, Stephen; McGrath, Annette; Merrill, Emily; Reilly, Dorothy; Roux, Magali; Shamu, Caroline E; Shang, Catherine A; Steinbeck, Christoph; Trefethen, Anne; Williams-Jones, Bryn; Wolstencroft, Katherine; Xenarios, Ioannis; Hide, Winston

    2012-01-27

    To make full use of research data, the bioscience community needs to adopt technologies and reward mechanisms that support interoperability and promote the growth of an open 'data commoning' culture. Here we describe the prerequisites for data commoning and present an established and growing ecosystem of solutions using the shared 'Investigation-Study-Assay' framework to support that vision.

  5. IMPI: Making MPI Interoperable.

    PubMed

    George, W L; Hagedorn, J G; Devaney, J E

    2000-01-01

    The Message Passing Interface (MPI) is the de facto standard for writing parallel scientific applications in the message passing programming paradigm. Implementations of MPI were not designed to interoperate, thereby limiting the environments in which parallel jobs could be run. We briefly describe a set of protocols, designed by a steering committee of current implementors of MPI, that enable two or more implementations of MPI to interoperate within a single application. Specifically, we introduce the set of protocols collectively called Interoperable MPI (IMPI). These protocols make use of novel techniques to handle difficult requirements such as maintaining interoperability among all IMPI implementations while also allowing for the independent evolution of the collective communication algorithms used in IMPI. Our contribution to this effort has been as a facilitator for meetings, editor of the IMPI Specification document, and as an early testbed for implementations of IMPI. This testbed is in the form of an IMPI conformance tester, a system that can verify the correct operation of an IMPI-enabled version of MPI.

  6. The semantic web in translational medicine: current applications and future directions.

    PubMed

    Machado, Catia M; Rebholz-Schuhmann, Dietrich; Freitas, Ana T; Couto, Francisco M

    2015-01-01

    Semantic web technologies offer an approach to data integration and sharing, even for resources developed independently or broadly distributed across the web. This approach is particularly suitable for scientific domains that profit from large amounts of data that reside in the public domain and have to be exploited in combination. Translational medicine is such a domain, which in addition has to integrate private data from the clinical domain with proprietary data from the pharmaceutical domain. In this survey, we present the results of our analysis of translational medicine solutions that follow a semantic web approach. We assessed these solutions in terms of their target medical use case; the resources covered to achieve their objectives; and their use of existing semantic web resources for the purposes of data sharing, data interoperability and knowledge discovery. Semantic web technologies seem to fulfill their role in facilitating the integration and exploration of data from disparate sources, but it is also clear that simply using them is not enough. It is fundamental to reuse resources, to define mappings between resources, and to share data and knowledge. All these aspects allow the instantiation of translational medicine at semantic web scale, resulting in a network of solutions that can share resources for a faster transfer of new scientific results into clinical practice. The envisioned network of translational medicine solutions is on its way, but it still requires resolving the challenges of sharing protected data and of integrating semantic-driven technologies into clinical practice.

  7. Model and Interoperability using Meta Data Annotations

    NASA Astrophysics Data System (ADS)

    David, O.

    2011-12-01

    Software frameworks and architectures need metadata to efficiently support model integration. Modelers have to know the context of a model, often stepping into modeling semantics and auxiliary information usually not provided in a concise structure and universal format consumable by a range of (modeling) tools. XML often seems the obvious solution for capturing metadata, but its wide adoption to facilitate model interoperability is limited by XML schema fragmentation, complexity, and verbosity outside of a data-automation process. Ontologies seem to overcome those shortcomings; however, the practical significance of their use remains to be demonstrated. OMS version 3 took a different approach to metadata representation. The fundamental building block of a modular model in OMS is a software component representing a single physical process, calibration method, or data access approach. Here, programming language features known as Annotations or Attributes were adopted. Within other (non-modeling) frameworks it has been observed that annotations lead to cleaner and leaner application code. Framework-supported model integration, traditionally accomplished using Application Programming Interface (API) calls, is now achieved using descriptive code annotations. Fully annotated components for various hydrological and Ag-system models now provide information directly for (i) model assembly and building, (ii) data flow analysis for implicit multi-threading or visualization, (iii) automated and comprehensive model documentation of component dependencies and physical data properties, (iv) automated model and component testing, calibration, and optimization, and (v) automated audit-traceability to account for all model resources leading to a particular simulation result. Such a non-invasive methodology leads to models and modeling components with only minimal dependencies on the modeling framework but a strong reference to their originating code. Since models and
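
    A rough Python analogue of the annotation idea (OMS 3 itself uses Java annotations): a decorator attaches declarative metadata to a model component, which a framework could later inspect for assembly, documentation, or data-flow analysis. The names and metadata fields below are illustrative, not the actual OMS API.

```python
# Hypothetical sketch of annotation-based component metadata, in the spirit
# of OMS 3's approach (which uses Java annotations, not Python decorators).

def describe(**meta):
    """Attach declarative metadata to a component without changing its code."""
    def wrap(obj):
        obj.__component_meta__ = meta
        return obj
    return wrap

@describe(process="infiltration", unit="mm/h", author="example")
def infiltration_rate(rainfall, soil_capacity):
    """A single-process component: toy infiltration calculation."""
    return min(rainfall, soil_capacity)

# A framework can read the metadata without executing the component:
meta = infiltration_rate.__component_meta__
```

    The point of the pattern is that the metadata travels with the component itself, so assembly and documentation tools need no separate configuration files.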

  8. Maturity model for enterprise interoperability

    NASA Astrophysics Data System (ADS)

    Guédria, Wided; Naudet, Yannick; Chen, David

    2015-01-01

    Historically, progress occurs when entities communicate, share information and together create something that no one individually could do alone. Moving beyond people to machines and systems, interoperability is becoming a key factor of success in all domains. In particular, interoperability has become a challenge for enterprises, to exploit market opportunities, to meet their own objectives of cooperation or simply to survive in a growing competitive world where the networked enterprise is becoming a standard. Within this context, many research works have been conducted over the past few years and enterprise interoperability has become an important area of research, ensuring the competitiveness and growth of European enterprises. Among others, enterprises have to control their interoperability strategy and enhance their ability to interoperate. This is the purpose of the interoperability assessment. Assessing interoperability maturity allows a company to know its strengths and weaknesses in terms of interoperability with its current and potential partners, and to prioritise actions for improvement. The objective of this paper is to define a maturity model for enterprise interoperability that takes into account existing maturity models while extending the coverage of the interoperability domain. The assessment methodology is also presented. Both are demonstrated with a real case study.

  9. Supply Chain Interoperability Measurement

    DTIC Science & Technology

    2015-06-19

    Supply Chain Interoperability Measurement DISSERTATION June 2015 Christos E. Chalyvidis, Major, Hellenic Air ...Force AFIT-ENS-DS-15-J-001 DEPARTMENT OF THE AIR FORCE AIR UNIVERSITY AIR FORCE INSTITUTE OF TECHNOLOGY Wright-Patterson Air Force...are those of the author and do not reflect the official policy or position of the United States Air Force, Department of Defense, or the United

  10. JCR VSIL Interoperability Testing

    DTIC Science & Technology

    2009-08-01

    JCR VSIL Interoperability Testing Paul Bounker, TARDEC JCR Warren, MI Tim Lee Joshua Walters DCS Corporation Alexandria, VA The Joint Center...for Robotics ( JCR ) Virtual Systems Integration Lab (VSIL) is a combination of Robotics Software Models and Tools used to stimulate Hardware or...other RDECOM labs and centers. JCR VSIL is focusing on supporting the Robotic Systems Joint Project Office (RS JPO) in testing of their developing

  11. National Flood Interoperability Experiment

    NASA Astrophysics Data System (ADS)

    Maidment, D. R.

    2014-12-01

    The National Flood Interoperability Experiment is led by the academic community in collaboration with the National Weather Service through the new National Water Center recently opened on the Tuscaloosa campus of the University of Alabama. The experiment will also involve the partners in IWRSS (Integrated Water Resources Science and Services), which include the USGS, the Corps of Engineers and FEMA. The experiment will address the following questions: (1) How can near-real-time hydrologic forecasting at high spatial resolution, covering the nation, be carried out using the NHDPlus or next generation geofabric (e.g. hillslope, watershed scales)? (2) How can this lead to improved emergency response and community resilience? (3) How can an improved interoperability framework support the first two goals and lead to sustained innovation in the research to operations process? The experiment will run from September 2014 through August 2015, in two phases. The mobilization phase from September 2014 until May 2015 will assemble the components of the interoperability framework. A Summer Institute to integrate the components will be held from June to August 2015 at the National Water Center involving faculty and students from the University of Alabama and other institutions coordinated by CUAHSI. It is intended that the insight that arises from this experiment will help lay the foundation for a new national scale, high spatial resolution, near-real-time hydrologic simulation system for the United States.

  12. Lemnos interoperable security project.

    SciTech Connect

    Halbgewachs, Ronald D.

    2010-03-01

    With the Lemnos framework, interoperability of control security equipment is straightforward. To obtain interoperability between proprietary security appliance units, one or both vendors must now write cumbersome 'translation code.' If one party changes something, the translation code 'breaks.' The Lemnos project is developing and testing a framework that uses widely available security functions and protocols like IPsec - to form a secure communications channel - and Syslog, to exchange security log messages. Using this model, security appliances from two or more different vendors can clearly and securely exchange information, helping to better protect the total system. Simplify regulatory compliance in a complicated security environment by leveraging the Lemnos framework. As an electric utility, are you struggling to implement the NERC CIP standards and other regulations? Are you weighing the misery of multiple management interfaces against committing to a ubiquitous single-vendor solution? When vendors build their security appliances to interoperate using the Lemnos framework, it becomes practical to match best-of-breed offerings from an assortment of vendors to your specific control systems needs. The Lemnos project is developing and testing a framework that uses widely available open-source security functions and protocols like IPsec and Syslog to create a secure communications channel between appliances in order to exchange security data.

  13. Principles of data integration and interoperability in the GEO Biodiversity Observation Network

    NASA Astrophysics Data System (ADS)

    Saarenmaa, Hannu; Ó Tuama, Éamonn

    2010-05-01

    The goal of the Global Earth Observation System of Systems (GEOSS) is to link existing information systems into a global and flexible network to address nine areas of critical importance to society. One of these "societal benefit areas" is biodiversity and it will be supported by a GEOSS sub-system known as the GEO Biodiversity Observation Network (GEO BON). In planning the GEO BON, it was soon recognised that there are already a multitude of existing networks and initiatives in place worldwide. What has been lacking is a coordinated framework that allows for information sharing and exchange between the networks. Traversing the various scales of biodiversity, in particular from the individual and species levels to the ecosystem level, has long been a challenge. Furthermore, some of the major regions of the world have already taken steps to coordinate their efforts, but links between the regions have not been a priority until now. Linking biodiversity data to that of the other GEO societal benefit areas, in particular ecosystems, climate, and agriculture, to produce useful information for the UN Conventions and other policy-making bodies is another need that calls for integration of information. Integration and interoperability are therefore a major theme of GEO BON, and a "system of systems" is very much needed. There are several approaches to integration that need to be considered. Data integration requires harmonising concepts, agreeing on vocabularies, and building ontologies. Semantic mediation of data using these building blocks is still not easy to achieve. Agreements on, or mappings between, the metadata standards that will be used across the networks are a major requirement that will need to be addressed early on. With interoperable metadata, service integration will be possible through registry of registries systems such as GBIF's forthcoming GBDRS and the GEO Clearinghouse. Chaining various services that build intermediate products using workflow
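
    The "mappings between metadata standards" step can be pictured as a field crosswalk: records described in one community's schema are renamed into another's. The field names and record below are invented for illustration (loosely echoing Darwin Core-style terms); this is not an actual GEO BON mapping.

```python
# Hypothetical metadata crosswalk between two observation-network schemas.
# Source and target field names are illustrative only.

CROSSWALK = {            # source field -> target field
    "taxon_name": "scientificName",
    "obs_date": "eventDate",
    "lat": "decimalLatitude",
    "lon": "decimalLongitude",
}

def map_record(record, crosswalk):
    """Rename the fields of one metadata record; unmapped fields are dropped."""
    return {crosswalk[k]: v for k, v in record.items() if k in crosswalk}

record = {"taxon_name": "Parus major", "obs_date": "2009-05-01",
          "lat": 60.2, "local_id": 17}
mapped = map_record(record, CROSSWALK)
```

    Agreeing such crosswalks early is what lets a registry federate catalogues that were never designed together.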

  14. Leveraging the Semantic Web for Adaptive Education

    ERIC Educational Resources Information Center

    Kravcik, Milos; Gasevic, Dragan

    2007-01-01

    In the area of technology-enhanced learning, reusability and interoperability issues essentially influence the productivity and efficiency of learning and authoring solutions. There are two basic approaches to overcoming these problems--one attempts to do it via standards and the other by means of the Semantic Web. In practice, these approaches…

  15. Managing Interoperability for GEOSS - A Report from the SIF

    NASA Astrophysics Data System (ADS)

    Khalsa, S. J.; Actur, D.; Nativi, S.; Browdy, S.; Eglitis, P.

    2009-04-01

    The Global Earth Observation System of Systems (GEOSS) is a coordinating and integrating framework for Earth observing and information systems, which are contributed on a voluntary basis by Members and Participating Organizations of the intergovernmental Group on Earth Observations (GEO). GEOSS exists to support informed decision making for the benefit of society, including the implementation of international environmental treaty obligations. GEO Members and Participating organizations use the GEOSS Common Infrastructure (GCI) to register their Earth observation resources, thereby making them discoverable and consumable by both humans and client applications. Essential to meeting GEO user needs is a process for supporting interoperability of observing, processing, modeling and dissemination capabilities. The GEO Standards and Interoperability Forum (SIF) was created to develop, implement and oversee this process. The SIF supports GEO organizations contributing resources to the GEOSS by helping them understand and work with the GEOSS interoperability guidelines and encouraging them to register their "interoperability arrangements" (standards or other ad hoc arrangements for interoperability) in the GEOSS standards registry, which is part of the GCI. These registered interoperability arrangements support the actual services used to achieve interoperability of systems. By making information about these interoperability arrangements available to users of the GEOSS the SIF enhances the understanding and utility of contributed resources. We describe the procedures that the SIF has enacted to carry out its work. To operate effectively the SIF uses a workflow system and is establishing a set of regional teams and domain experts. 
In the near term our work has focused on population and review of the GEOSS Standards Registry, but we are also developing approaches to achieving progressive convergence on, and uptake of, an optimal set of interoperability arrangements for all of

  16. Best Practices for Preparing Interoperable Geospatial Data

    NASA Astrophysics Data System (ADS)

    Wei, Y.; Santhana Vannan, S.; Cook, R. B.; Wilson, B. E.; Beaty, T. W.

    2010-12-01

    mechanisms to advertise, visualize, and distribute the standardized geospatial data. In this presentation, we summarize the experiences learned and the best practices for geospatial data standardization. The presentation will describe how diverse and historical data archived in the ORNL DAAC were converted into standard and non-proprietary formats; what tools were used to make the conversion; how the spatial and temporal information is properly captured in a consistent manner; how to name a data file or a variable to make it both human-friendly and semantically interoperable; how the NetCDF file format and CF convention can promote data usage in the ecosystem modeling user community; how those standardized geospatial data can be fed into OGC Web Services to support on-demand data visualization and access; and how the metadata should be collected and organized so that they can be discovered through standard catalog services.
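
    The CF-convention naming idea mentioned above amounts to recording, alongside each variable, the metadata any CF-aware tool expects: a controlled standard name, units, and dimensions. The sketch below is a schematic illustration in plain Python, not NetCDF code; the variable shown uses a CF-style standard name, but the dict layout and helper are invented here.

```python
# Hypothetical sketch of CF-style variable metadata: a short human-friendly
# name paired with a controlled standard_name, units, and dimensions, so any
# CF-aware tool interprets the data the same way.

variable = {
    "name": "gpp",  # short variable name, human-friendly
    "standard_name": "gross_primary_productivity_of_biomass_expressed_as_carbon",
    "units": "kg m-2 s-1",
    "dimensions": ("time", "lat", "lon"),
}

def is_cf_like(var):
    """Check for the minimal metadata a CF-aware tool would expect."""
    return all(k in var for k in ("standard_name", "units", "dimensions"))
```

    Because the `standard_name` comes from a shared controlled list rather than being free text, two archives that never coordinated can still be read by the same modeling tools.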

  17. Warfighter IT Interoperability Standards Study

    DTIC Science & Technology

    2012-07-22

    for Combat Casualty Care MCS Maneuver Control System MCWS Mission Command Workstation MDI Model, data, implement MDMP Military Decision Making ... make IT interoperability more efficient and effective by adoption of standards. By interoperability, we mean the definition given in the CJCSI...standards that will make their IT systems interoperable. Analysis of existing Army and DoD architectural and policy guidance has identified the

  18. Risk Management Considerations for Interoperable Acquisition

    DTIC Science & Technology

    2006-08-01

    the Scope of Interaction 4 Figure 3: IEEE Risk Management Process Model 11 Figure 4: Risk Management Process from AN/NZS 4360 11 Figure 5: Example...Identify risks • Analyze risks • Plan • Track • Control • Communicate10 | CMU/SEI-2006-TN-032 Figure 3: IEEE Risk Management Process Model Figure 4...statements made by Program-2? QUESTIONS ON PRACTICES What are the general implications for process models to achieve interoperable risk management

  19. Interoperability of heterogeneous distributed systems

    NASA Astrophysics Data System (ADS)

    Zaschke, C.; Essendorfer, B.; Kerth, C.

    2016-05-01

    To achieve knowledge superiority in today's operations, interoperability is key. Budget restrictions as well as the complexity and multiplicity of threats, combined with the fact that not single nations but whole areas are subject to attacks, force nations to collaborate and share information as appropriate. Multiple data and information sources produce different kinds of data, real time and non-real time, in different formats that are disseminated to the respective command and control level for further distribution. Most of the time the data is highly sensitive and restricted in terms of sharing. The question is how to make this data available to the right people at the right time with the right granularity. The Coalition Shared Data concept aims to provide a solution to these questions. It has been developed within several multinational projects and evolved over time. A continuous improvement process was established and resulted in the adaptation of the architecture as well as the technical solution and the processes it supports. Starting from the idea of making use of existing standards, sharing data through standardized interfaces and formats, and enabling metadata-based queries, the concept merged with a more sophisticated service-based approach. The paper addresses concepts for information sharing to facilitate interoperability between heterogeneous distributed systems. It introduces the methods that were used and the challenges that had to be overcome. Furthermore, the paper gives a perspective on how the concept could be used in the future and what measures have to be taken to successfully bring it into operations.

  20. A semantically rich and standardised approach enhancing discovery of sensor data and metadata

    NASA Astrophysics Data System (ADS)

    Kokkinaki, Alexandra; Buck, Justin; Darroch, Louise

    2016-04-01

    The marine environment plays an essential role in the earth's climate. To enhance the ability to monitor the health of this important system, innovative sensors are being produced and combined with state-of-the-art sensor technology. As the number of sensors deployed is continually increasing, it is a challenge for data users to find the data that meet their specific needs. Furthermore, users need to integrate diverse ocean datasets originating from the same or even different systems. Standards provide a solution to the above-mentioned challenges. The Open Geospatial Consortium (OGC) has created Sensor Web Enablement (SWE) standards that enable different sensor networks to establish syntactic interoperability. When combined with widely accepted controlled vocabularies, they become semantically rich and semantic interoperability is achievable. In addition, Linked Data is the recommended best practice for exposing, sharing and connecting information on the Semantic Web using Uniform Resource Identifiers (URIs), the Resource Description Framework (RDF) and the RDF Query Language (SPARQL). As part of the EU-funded SenseOCEAN project, the British Oceanographic Data Centre (BODC) is working on the standardisation of sensor metadata enabling 'plug and play' sensor integration. Our approach combines standards, controlled vocabularies and persistent URIs to publish sensor descriptions, their data and associated metadata as 5-star Linked Data and to the OGC SWE (SensorML, Observations & Measurements) standards. Thus sensors become readily discoverable, accessible and useable via the web. Content- and context-based searching is also enabled since sensor descriptions are understood by machines. Additionally, sensor data can be combined with other sensor or Linked Data datasets to form knowledge. This presentation will describe the work done at BODC to achieve syntactic and semantic interoperability in the sensor domain. It will illustrate the reuse and extension of the Semantic Sensor
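
    The Linked Data approach can be pictured as sensor metadata held as subject-predicate-object triples keyed by URIs, with a pattern match standing in for a SPARQL query. The sketch below is stdlib-only and every URI and term in it is invented for illustration; real deployments would use an RDF store and SPARQL.

```python
# Minimal hypothetical sketch of triples + pattern query (a stand-in for
# RDF/SPARQL). URIs and vocabulary terms are illustrative only.

triples = {
    ("http://example.org/sensor/42", "rdf:type", "sosa:Sensor"),
    ("http://example.org/sensor/42", "sosa:observes", "ex:sea_water_temperature"),
    ("http://example.org/sensor/42", "sosa:isHostedBy", "http://example.org/float/7"),
}

def query(pattern, store):
    """Match a (subject, predicate, object) pattern; None acts as a wildcard."""
    return [t for t in store
            if all(p is None or p == v for p, v in zip(pattern, t))]

# "What does sensor 42 observe?" -- the Linked Data analogue of a SPARQL query.
observed = query(("http://example.org/sensor/42", "sosa:observes", None), triples)
```

    Because every node is a URI, the same query mechanism works across datasets published by different institutions, which is the interoperability payoff the abstract describes.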

  1. Enhanced semantic interpretability by healthcare standards profiling.

    PubMed

    Lopez, Diego M; Blobel, Bernd G M E

    2008-01-01

    Several current healthcare standards support semantic interoperability. However, these standards are far from being completely adopted in health information system development. The objective of this paper is to provide a method and the necessary tooling for reusing healthcare standards by exploiting the extensibility mechanisms of UML, thereby supporting the development of semantically interoperable systems and components. The method first identifies the models and tasks in the software development process in which healthcare standards can be reused. Then, the selected standard is formalized as a UML profile. Finally, that profile is applied to system models, annotating them with the standard's semantics. The supporting tools are Eclipse-based UML modeling tools. The method is integrated into a comprehensive framework for health information systems development. The feasibility of the approach is exemplified by a scenario reusing HL7 RIM and DIMs specifications. The approach presented is also applicable for harmonizing different standard specifications.

  2. A study on heterogeneous distributed spatial information platform based on semantic Web services

    NASA Astrophysics Data System (ADS)

    Peng, Shuang-yun; Yang, Kun; Xu, Quan-li; Huang, Bang-mei

    2008-10-01

    With the development of Semantic Web technology, ontology-based spatial information services are an effective way to share and interoperate heterogeneous information resources in a distributed network environment. This paper discusses spatial information sharing and interoperability in the Semantic Web Services architecture. Using ontologies to record spatial information in a shared knowledge system makes implicit and hidden semantic information explicit and formal, providing the prerequisite for spatial information sharing and interoperability. Semantic Web Services technology then parses the ontologies and intelligently builds services in the network environment, forming a network of services. In order to realize practical applications of spatial information sharing and interoperation in different branches of the CDC system, a prototype system for HIV/AIDS information sharing based on geo-ontology has also been developed using the methods described above.

  3. The EuroGEOSS Brokering Framework for Multidisciplinary Interoperability

    NASA Astrophysics Data System (ADS)

    Santoro, M.; Nativi, S.; Craglia, M.; Boldrini, E.; Vaccari, L.; Papeschi, F.; Bigagli, L.

    2011-12-01

    The Global Earth Observation System of Systems (GEOSS), envisioned by the group of eight most industrialized countries (G-8) in 2003, provides the indispensable framework to integrate Earth observation efforts at a global level. The European Commission also contributes to the implementation of the GEOSS through research projects funded from its Framework Programme for Research & Development. The EuroGEOSS (A European Approach to GEOSS) project was launched in May 2009 for a three-year period with the aim of supporting existing Earth Observing systems and applications interoperability and use within the GEOSS and INSPIRE frameworks. EuroGEOSS developed a multidisciplinary interoperability infrastructure for the three strategic areas of Drought, Forestry and Biodiversity; this operating capacity is currently being extended to other scientific domains (e.g. Climate Change, Water, Ocean, Weather). Central to the multidisciplinary infrastructure is the "EuroGEOSS Brokering Framework", which is based on a Brokered SOA (Service Oriented Architecture) approach. This approach extends the typical SOA archetype by introducing "expert" components: the Brokers. The Brokers provide the mediation and distribution functionalities needed to interconnect the distributed and heterogeneous resources characterizing a System of Systems (SoS) environment. Such a solution addresses significant shortcomings of present SOA implementations for global frameworks, such as multiple protocols and data models interoperability. Currently, the EuroGEOSS multidisciplinary infrastructure is composed of the following brokering components: 1. The Discovery Broker: providing harmonized discovery functionalities by mediating and distributing user queries against tens of heterogeneous services. 2. The Semantic Discovery Augmentation Component: enhancing the capabilities of the discovery broker with semantic query-expansion. 3. The Data Access Broker: enabling users to seamlessly

  4. Extending the GI Brokering Suite to Support New Interoperability Specifications

    NASA Astrophysics Data System (ADS)

    Boldrini, E.; Papeschi, F.; Santoro, M.; Nativi, S.

    2014-12-01

    The GI brokering suite provides the discovery, access, and semantic Brokers (i.e. GI-cat, GI-axe, GI-sem) that empower a Brokering framework for multi-disciplinary and multi-organizational interoperability. The GI suite has been successfully deployed in the framework of several programmes and initiatives, such as European Union funded projects, NSF BCube, and the intergovernmental coordinated effort Global Earth Observation System of Systems (GEOSS). Each GI suite Broker facilitates interoperability for a particular functionality (i.e. discovery, access, semantic extension) among a set of brokered resources published by autonomous providers (e.g. data repositories, web services, semantic assets) and a set of heterogeneous consumers (e.g. client applications, portals, apps). A wide set of data models, encoding formats, and service protocols are already supported by the GI suite, such as the ones defined by international standardizing organizations like OGC and ISO (e.g. WxS, CSW, SWE, GML, netCDF) and by Community specifications (e.g. THREDDS, OpenSearch, OPeNDAP, ESRI APIs). Using the GI suite, resources published by a particular Community or organization through their specific technology (e.g. OPeNDAP/netCDF) can be transparently discovered, accessed, and used by different Communities utilizing their preferred tools (e.g. a GIS visualizing WMS layers). Since Information Technology is a moving target, new standards and technologies continuously emerge and are adopted in the Earth Science context too. Therefore, the GI Brokering suite was conceived to be flexible and to accommodate new interoperability protocols and data models. For example, the GI suite has recently added support for widely used specifications introduced to implement Linked Data, the Semantic Web, and specific community needs. Among others, these include: DCAT: an RDF vocabulary designed to facilitate interoperability between Web data catalogs. 
CKAN: a data management system for data distribution, particularly used by

  5. The Open Archives Initiative: Realizing Simple and Effective Digital Library Interoperability.

    ERIC Educational Resources Information Center

    Suleman, Hussein; Fox, Edward

    2001-01-01

    Explains the Open Archives Initiative (OAI) which was developed to solve problems of digital library interoperability on the World Wide Web. Topics include metadata; HTTP; XML; Dublin Core; the OAI Metadata Harvesting Protocol; data providers; service providers; reference linking; library policies; shared semantics; and name authority systems.…

  6. Interoperability between phenotype and anatomy ontologies

    PubMed Central

    Hoehndorf, Robert; Oellrich, Anika; Rebholz-Schuhmann, Dietrich

    2010-01-01

    Motivation: Phenotypic information is important for the analysis of the molecular mechanisms underlying disease. A formal ontological representation of phenotypic information can help to identify, interpret and infer phenotypic traits based on experimental findings. The methods that are currently used to represent data and information about phenotypes fail to make the semantics of the phenotypic trait explicit and do not interoperate with ontologies of anatomy and other domains. Therefore, valuable resources for the analysis of phenotype studies remain unconnected and inaccessible to automated analysis and reasoning. Results: We provide a framework to formalize phenotypic descriptions and make their semantics explicit. Based on this formalization, we provide the means to integrate phenotypic descriptions with ontologies of other domains, in particular anatomy and physiology. We demonstrate how our framework leads to the capability to represent disease phenotypes, perform powerful queries that were not possible before and infer additional knowledge. Availability: http://bioonto.de/pmwiki.php/Main/PheneOntology Contact: rh497@cam.ac.uk PMID:20971987

  7. Environmental Models as a Service: Enabling Interoperability ...

    EPA Pesticide Factsheets

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantage of streamlined deployment processes and affordable cloud access to move algorithms and data to the web for discoverability and consumption. In these deployments, environmental models can become available to end users through RESTful web services and consistent application program interfaces (APIs) that consume, manipulate, and store modeling data. RESTful modeling APIs also promote discoverability and guide usability through self-documentation. Embracing the RESTful paradigm allows models to be accessible via a web standard, and the resulting endpoints are platform- and implementation-agnostic while simultaneously presenting significant computational capabilities for spatial and temporal scaling. RESTful APIs present data in a simple verb-noun web request interface: the verb dictates how a resource is consumed using HTTP methods (e.g., GET, POST, and PUT) and the noun represents the URL reference of the resource on which the verb will act. The RESTful API can self-document in both the HTTP response and an interactive web page using the Open API standard. This lets models function as an interoperable service that promotes sharing, documentation, and discoverability. Here, we discuss the
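    The verb-noun request pattern described in this abstract can be sketched as a tiny in-memory dispatcher: the HTTP method (verb) selects the action, and the URL path (noun) names the resource. The endpoint paths and payloads below are hypothetical, not the EPA's actual API.

    ```python
    # Minimal sketch of the RESTful verb-noun pattern: the verb dictates
    # how a resource is consumed, the noun is the resource's URL.
    # Endpoint and model names are invented for illustration.

    store = {}  # in-memory stand-in for a model-run database

    def handle(verb: str, path: str, body=None):
        """Dispatch an HTTP-style request to a resource action."""
        if verb == "POST":          # create a new resource
            store[path] = body
            return 201, body
        if verb == "GET":           # read an existing resource
            if path in store:
                return 200, store[path]
            return 404, None
        if verb == "PUT":           # replace an existing resource
            store[path] = body
            return 200, body
        return 405, None            # method not allowed

    # A client would e.g. POST model inputs, then GET the stored run:
    status, _ = handle("POST", "/models/runoff/runs/1", {"rainfall_mm": 12.5})
    status2, run = handle("GET", "/models/runoff/runs/1")
    ```

    In a real deployment the same verb-noun contract is exposed over HTTP, so any client on any platform can drive the model without knowing its implementation.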

  8. Interoperability of wearable cuffless BP measuring devices.

    PubMed

    Liu, Jing; Zhang, Yuan-Ting

    2014-01-01

    While a traditional cuff-based blood pressure (BP) measuring device can only take a snapshot of BP, real-time and continuous measurement of BP without an occluding cuff is preferred; such cuffless measurement usually uses the pulse transit time (PTT) in combination with other physiological parameters to estimate or track BP over a certain period of time after an initial calibration. This article discusses some perspectives on the interoperability of wearable medical devices, based on the IEEE P1708 draft standard, which focuses on the objective performance evaluation of wearable cuffless BP measuring devices. The ISO/IEEE 11073 family of standards, supporting the plug-and-play feature, is intended to enable medical devices to interconnect and interoperate with other medical devices and with computerized healthcare information systems in a manner suitable for the clinical environment. In this paper, the possible adoption of ISO/IEEE 11073 for the interoperability of wearable cuffless BP devices is proposed. Because the continuous and cuffless BP measuring methods differ from the conventional ones, the existing device specialization standards of ISO/IEEE 11073 cannot be directly followed when designing a cuffless BP device. Specifically, this paper discusses how the domain information model (DIM), in which vital sign information is abstracted as objects, is used to structure the information about the device and the information generated by the device. Though attention should also be paid to adopting the communication standards for the other parts of the communication system, applying communication standards that enable the plug-and-play feature allows the interoperability of different cuffless BP measuring devices, with possibly different configurations, to be achieved.
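    The calibration step mentioned in this abstract can be sketched with one simple PTT-to-BP model from the literature, in which BP varies inversely with the square of the pulse transit time. This is an illustrative sketch only: the model form and the calibration values below are assumptions, not part of the IEEE P1708 draft or ISO/IEEE 11073.

    ```python
    # Illustrative sketch of PTT-based cuffless BP tracking after an
    # initial calibration. BP = a / PTT**2 + b is one simple model form
    # discussed in the literature; the coefficients are fitted here from
    # two hypothetical cuff readings, not taken from any standard.

    def calibrate(ptt1, bp1, ptt2, bp2):
        """Solve for a, b in BP = a / PTT**2 + b from two cuff readings."""
        a = (bp1 - bp2) / (1.0 / ptt1**2 - 1.0 / ptt2**2)
        b = bp1 - a / ptt1**2
        return a, b

    def estimate_bp(ptt, a, b):
        """Track BP continuously from PTT once calibrated."""
        return a / ptt**2 + b

    # Calibrate against two reference cuff measurements (PTT in s, BP in mmHg):
    a, b = calibrate(0.25, 120.0, 0.30, 105.0)
    ```

    After calibration, each beat's PTT yields a BP estimate without an occluding cuff; periodic recalibration against a cuff device compensates for drift.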

  9. Fusion is possible only with interoperability agreements; the GEOSS experience

    NASA Astrophysics Data System (ADS)

    Percivall, G.

    2008-12-01

    Data fusion is defined for this session as the merging of disparate data sources for multidisciplinary study. Implicit in this definition is that the data consumer may not be intimately familiar with the data sources. In order to achieve fusion of the data, there must be generalized concepts that apply to both the data sources and the consumer, and those concepts must be implemented in our information systems. The success of GEOSS depends on data and information providers accepting and implementing a set of interoperability arrangements, including technical specifications for collecting, processing, storing, and disseminating shared data, metadata, and products. GEOSS interoperability is based on non-proprietary standards, with preference given to formal international standards. GEOSS requires a scientific basis for the collection, processing and interpretation of the data; use of standards is a hallmark of a sound scientific basis. In order to communicate effectively and achieve data fusion, interoperability arrangements must be based upon sound scientific principles that have been implemented in efficient and effective tools. Establishing such interoperability arrangements depends upon social processes and technology. Through the use of interoperability arrangements based upon standards, GEOSS achieves data fusion in order to answer humanity's critical questions. Decision making in support of societal benefit areas depends upon data fusion in multidisciplinary settings.

  10. Interoperability of clinical decision-support systems and electronic health records using archetypes: a case study in clinical trial eligibility.

    PubMed

    Marcos, Mar; Maldonado, Jose A; Martínez-Salvador, Begoña; Boscá, Diego; Robles, Montserrat

    2013-08-01

    patient recruitment in the framework of a clinical trial for colorectal cancer screening. The utilisation of archetypes not only has proved satisfactory to achieve interoperability between CDSSs and EHRs but also offers various advantages, in particular from a data model perspective. First, the VHR/data models we work with are of a high level of abstraction and can incorporate semantic descriptions. Second, archetypes can potentially deal with different EHR architectures, due to their deliberate independence of the reference model. Third, the archetype instances we obtain are valid instances of the underlying reference model, which would enable e.g. feeding back the EHR with data derived by abstraction mechanisms. Lastly, the medical and technical validity of archetype models would be assured, since in principle clinicians should be the main actors in their development.

  11. Building a logical EHR architecture based on ISO 13606 standard and semantic web technologies.

    PubMed

    Santos, Marcelo R; Bax, Marcello P; Kalra, Dipak

    2010-01-01

    Among the existing patterns of EHR interoperability, the ISO 13606 standard is an important consideration. It is believed that the use of this norm, in conjunction with semantic technologies, may aid in the construction of a robust architecture, keeping in mind the challenges of semantic interoperability. The objective of this paper is to present a proposal for an EHR architecture, based on ISO 13606 and on the utilization of semantic technologies, for a real EHR scenario. In order to accomplish that, a real EHR scenario is described, as well as its main interoperability requirements and a candidate architecture is proposed to solve the presented challenges of interoperability. The ability of the ISO 13606 EHR reference model to accommodate the scenario was highlighted, together with the support provided by the use of the ontology specification languages--RDF and OWL--in respect to the maintenance of a controlled vocabulary.

  12. A Research on E-learning Resources Construction Based on Semantic Web

    NASA Astrophysics Data System (ADS)

    Rui, Liu; Maode, Deng

    Traditional e-learning platforms have the flaws that querying and locating resources is usually difficult and that cross-platform sharing and interoperability are hard to realize. In this paper, the Semantic Web and metadata standards are discussed, and an e-learning system framework based on the Semantic Web is put forward in an attempt to solve the flaws of traditional e-learning platforms.

  13. Approach for ontological modeling of database schema for the generation of semantic knowledge on the web

    NASA Astrophysics Data System (ADS)

    Rozeva, Anna

    2015-11-01

    Currently there is a large quantity of content on web pages that is generated from relational databases. Conceptual domain models provide for the integration of heterogeneous content on the semantic level. The use of an ontology as the conceptual model of relational data sources makes them available to web agents and services and provides for the employment of ontological techniques for data access, navigation and reasoning. The achievement of interoperability between relational databases and ontologies enriches the web with semantic knowledge. The establishment of a semantic database conceptual model based on ontology facilitates the development of data integration systems that use the ontology as a unified global view. An approach for the generation of an ontologically based conceptual model is presented. The ontology representing the database schema is obtained by matching schema elements to ontology concepts, and an algorithm for the matching process is designed. An infrastructure for the inclusion of mediation between database and ontology, bridging legacy data with formal semantic meaning, is presented. An implementation of the knowledge modeling approach on a sample database is performed.
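    The schema-element-to-ontology-concept matching step described above can be sketched with simple lexical similarity. Real matchers also use structural and instance-based evidence; the table and concept names below are hypothetical.

    ```python
    import difflib

    # Sketch of matching relational schema elements to ontology concepts
    # by lexical similarity alone (a deliberately simplified stand-in for
    # a full matching algorithm). All names are invented for illustration.

    ontology_concepts = ["Person", "Organization", "Address", "Publication"]
    db_tables = ["persons", "org", "addresses", "pubs"]

    def best_match(table: str, concepts, threshold=0.5):
        """Return the concept whose name is lexically closest to the table,
        or None if no concept is similar enough."""
        scored = [
            (difflib.SequenceMatcher(None, table.lower(), c.lower()).ratio(), c)
            for c in concepts
        ]
        score, concept = max(scored)
        return concept if score >= threshold else None

    mapping = {t: best_match(t, ontology_concepts) for t in db_tables}
    ```

    Tables left unmatched (returning None) would be handed to a human curator or to richer matching heuristics, which is why mediation infrastructure between database and ontology is needed.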

  14. D-ATM, a working example of health care interoperability: From dirt path to gravel road.

    PubMed

    DeClaris, John-William

    2009-01-01

    For many years, there have been calls for interoperability within health care systems. The technology currently exists and is being used in business areas like banking and commerce, to name a few. Yet the question remains, why has interoperability not been achieved in health care? This paper examines issues encountered and success achieved with interoperability during the development of the Digital Access To Medication (D-ATM) project, sponsored by the Substance Abuse and Mental Health Services Administration (SAMHSA). D-ATM is the first government funded interoperable patient management system. The goal of this paper is to provide lessons learned and propose one possible road map for health care interoperability within private industry and how government can help.

  15. Smart Grid Interoperability Maturity Model

    SciTech Connect

    Widergren, Steven E.; Levinson, Alex; Mater, J.; Drummond, R.

    2010-04-28

    The integration of automation associated with electricity resources (including transmission and distribution automation and demand-side resources operated by end-users) is key to supporting greater efficiencies and incorporating variable renewable resources and electric vehicles into the power system. The integration problems faced by this community are analogous to those faced in the health industry, emergency services, and other complex communities with many stakeholders. To highlight this issue and encourage communication and the development of a smart grid interoperability community, the GridWise Architecture Council (GWAC) created an Interoperability Context-Setting Framework. This "conceptual model" has been helpful to explain the importance of organizational alignment in addition to technical and informational interface specifications for "smart grid" devices and systems. As a next step to building a community sensitive to interoperability, the GWAC is investigating an interoperability maturity model (IMM) based on work done by others to address similar circumstances. The objective is to create a tool or set of tools that encourages a culture of interoperability in this emerging community. The tools would measure status and progress, analyze gaps, and prioritize efforts to improve the situation.

  16. Advancing Smart Grid Interoperability and Implementing NIST's Interoperability Roadmap

    SciTech Connect

    Basso,T.; DeBlasio, R.

    2010-04-01

    The IEEE American National Standards project P2030TM addressing smart grid interoperability and the IEEE 1547 series of standards addressing distributed resources interconnection with the grid have been identified in priority action plans in the Report to NIST on the Smart Grid Interoperability Standards Roadmap. This paper presents the status of the IEEE P2030 development, the IEEE 1547 series of standards publications and drafts, and provides insight on systems integration and grid infrastructure. The P2030 and 1547 series of standards are sponsored by IEEE Standards Coordinating Committee 21.

  17. Exploring NASA GES DISC Data with Interoperable Services

    NASA Technical Reports Server (NTRS)

    Zhao, Peisheng; Yang, Wenli; Hegde, Mahabal; Wei, Jennifer C.; Kempler, Steven; Pham, Long; Teng, William; Savtchenko, Andrey

    2015-01-01

    Overview of NASA GES DISC (NASA Goddard Earth Science Data and Information Services Center) data with interoperable services. Open-standard, interoperable services improve data discoverability, accessibility, and usability with metadata, catalogue and portal standards, and achieve data, information and knowledge sharing across applications with standardized interfaces and protocols. Open Geospatial Consortium (OGC) data services and specifications include: Web Coverage Service (WCS) for data; Web Map Service (WMS) for pictures of data; Web Map Tile Service (WMTS) for pictures of data tiles; and Styled Layer Descriptors (SLD) for rendered styles.
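    An OGC WMS request of the kind mentioned above ("pictures of data") is just a parameterized URL. The sketch below builds a WMS 1.3.0 GetMap request; the parameter names follow the WMS specification, while the endpoint and layer name are hypothetical, not GES DISC's actual service.

    ```python
    from urllib.parse import urlencode

    # Sketch of an OGC WMS 1.3.0 GetMap request. Parameter names come
    # from the WMS spec; the endpoint URL and layer name are invented.

    base_url = "http://example.org/wms"
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": "precipitation_daily",
        "CRS": "EPSG:4326",
        "BBOX": "-90,-180,90,180",  # lat/lon axis order for EPSG:4326 in WMS 1.3.0
        "WIDTH": "1024",
        "HEIGHT": "512",
        "FORMAT": "image/png",
    }

    getmap_url = base_url + "?" + urlencode(params)
    ```

    Because every conforming server accepts the same parameters, a client built against the standard can render layers from any WMS endpoint without custom code per provider.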

  18. Interoperability of Heliophysics Virtual Observatories

    NASA Technical Reports Server (NTRS)

    Thieman, J.; Roberts, A.; King, T.; King, J.; Harvey, C.

    2008-01-01

    If you'd like to find interrelated heliophysics (also known as space and solar physics) data for a research project that spans, for example, magnetic field data and charged particle data from multiple satellites located near a given place and at approximately the same time, how easy is this to do? There are probably hundreds of data sets scattered in archives around the world that might be relevant. Is there an optimal way to search these archives and find what you want? There are a number of virtual observatories (VOs) now in existence that maintain knowledge of the data available in subdisciplines of heliophysics. The data may be widely scattered among various data centers, but the VOs have knowledge of what is available and how to get to it. The problem is that research projects might require data from a number of subdisciplines. Is there a way to search multiple VOs at once and obtain what is needed quickly? To do this requires a common way of describing the data such that a search using a common term will find all data that relate to the common term. This common language is contained within a data model developed for all of heliophysics and known as the SPASE (Space Physics Archive Search and Extract) Data Model. NASA has funded the main part of the development of SPASE but other groups have put resources into it as well. How well is this working? We will review the use of SPASE and how well the goal of locating and retrieving data within the heliophysics community is being achieved. Can the VOs truly be made interoperable despite being developed by so many diverse groups?
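    The common-language idea behind SPASE can be sketched as a term-mapping search across several registries: each virtual observatory keeps its local vocabulary, but a shared mapping lets one query reach all of them. The registry contents and term names below are invented for illustration, not actual SPASE terms.

    ```python
    # Sketch of searching several virtual observatories through one
    # common vocabulary, as a SPASE-style data model enables.
    # Registries, holdings, and term names are hypothetical.

    # Each VO describes its holdings in a local vocabulary...
    vo_registries = {
        "VO-A": {"B-field": ["dataset-a1"], "ions": ["dataset-a2"]},
        "VO-B": {"MagneticField": ["dataset-b1"]},
    }

    # ...and maps each local term to a shared common-model term.
    to_common = {
        "B-field": "MagneticField",
        "MagneticField": "MagneticField",
        "ions": "ChargedParticles",
    }

    def search(common_term: str):
        """Find all datasets, in any VO, whose local term maps to common_term."""
        hits = []
        for vo, holdings in vo_registries.items():
            for local_term, datasets in holdings.items():
                if to_common.get(local_term) == common_term:
                    hits.extend(datasets)
        return sorted(hits)
    ```

    One search term thus retrieves matching data from every registry, regardless of what each subdiscipline calls the quantity locally.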

  19. Potential interoperability problems facing multi-site radiation oncology centers in The Netherlands

    NASA Astrophysics Data System (ADS)

    Scheurleer, J.; Koken, Ph; Wessel, R.

    2014-03-01

    Aim: To identify potential interoperability problems facing multi-site Radiation Oncology (RO) departments in the Netherlands and solutions for unambiguous multi-system workflows. Specific challenges confronting the RO department of VUmc (RO-VUmc), which is soon to open a satellite department, were characterized. Methods: A nationwide questionnaire survey was conducted to identify possible interoperability problems and solutions. Further detailed information was obtained by in-depth interviews at 3 Dutch RO institutes that already operate in more than one site. Results: The survey had a 100% response rate (n=21). Altogether 95 interoperability problems were described. Most reported problems were on a strategic and semantic level. The majority were DICOM(-RT) and HL7 related (n=65), primarily between treatment planning and verification systems or between departmental and hospital systems. Seven were identified as being relevant for RO-VUmc. Departments have overcome interoperability problems with their own, or with tailor-made vendor solutions. There was little knowledge about or utilization of solutions developed by Integrating the Healthcare Enterprise Radiation Oncology (IHE-RO). Conclusions: Although interoperability problems are still common, solutions have been identified. Awareness of IHE-RO needs to be raised. No major new interoperability problems are predicted as RO-VUmc develops into a multi-site department.

  20. Dynamic Business Networks: A Headache for Sustainable Systems Interoperability

    NASA Astrophysics Data System (ADS)

    Agostinho, Carlos; Jardim-Goncalves, Ricardo

    Collaborative networked environments emerged with the spread of the internet, helping to overcome past communication barriers and identifying interoperability as an essential property. When interoperability is achieved seamlessly, efficiency is increased across the entire product life cycle. Nowadays, most organizations try to attain interoperability by establishing peer-to-peer mappings with their different partners or, in optimized networks, by using international standard models as the core for information exchange. In current industrial practice, mappings are defined only once, and the morphisms that represent them are hardcoded in the enterprise systems. This solution has been effective for static environments, where enterprise and product models are valid for decades. However, with an increasingly complex and dynamic global market, models change frequently to answer new customer requirements. This paper draws concepts from complex systems science and proposes a framework for sustainable systems interoperability in dynamic networks, enabling different organizations to evolve at their own rate.

  1. Semantic Web for Manufacturing Web Services

    SciTech Connect

    Kulvatunyou, Boonserm; Ivezic, Nenad

    2002-06-01

    As markets become unexpectedly turbulent with a shortened product life cycle and a power shift towards buyers, the need for methods to rapidly and cost-effectively develop products, production facilities and supporting software is becoming urgent. The use of a virtual enterprise plays a vital role in surviving turbulent markets. However, its success requires reliable and large-scale interoperation among trading partners via a semantic web of trading partners' services whose properties, capabilities, and interfaces are encoded in an unambiguous as well as computer-understandable form. This paper demonstrates a promising approach to integration and interoperation between a design house and a manufacturer by developing semantic web services for business and engineering transactions. To this end, detailed activity and information flow diagrams are developed, in which the two trading partners exchange messages and documents. The properties and capabilities of the manufacturer sites are defined using DARPA Agent Markup Language (DAML) ontology definition language. The prototype development of semantic webs shows that enterprises can widely interoperate in an unambiguous and autonomous manner; hence, virtual enterprise is realizable at a low cost.

  2. Web-Based Policy Interoperability via a Semantic Policy Interlingua

    DTIC Science & Technology

    2012-09-12

    In Massachusetts, the Department of Elementary and Secondary Education (DESE) is responsible for the education of the approximately 550,000...the DESE to track the progress of students as they advance through the grades. Moreover, it is necessary to address the needs of children in early

  3. US and Coalition Forces Data (Semantic) Interoperability Study

    DTIC Science & Technology

    2010-05-01

    PhD Dissertation, Air Force Institute of Technology, August 2008. (6) Chomsky, Noam, Syntactic Structures (Mouton: The Hague/Paris, 1957). (7)..."Water boils at 100 degrees Fahrenheit" conveys (expresses) false information (contrary to a fact). Noam Chomsky's famous nonsense sentence

  4. Semantic Interoperability Framework (Cadre de l’Interoperabilite Semantique)

    DTIC Science & Technology

    2011-11-01

    ...the vital importance of semantic interoperability has been recognized on many occasions and a few reference projects have been created but, on the one hand, research

  5. An ontological system for interoperable spatial generalisation in biodiversity monitoring

    NASA Astrophysics Data System (ADS)

    Nieland, Simon; Moran, Niklas; Kleinschmit, Birgit; Förster, Michael

    2015-11-01

    Semantic heterogeneity remains a barrier to data comparability and standardisation of results in different fields of spatial research. Because of its thematic complexity, differing acquisition methods and national nomenclatures, interoperability of biodiversity monitoring information is especially difficult. Since data collection methods and interpretation manuals broadly vary there is a need for automatised, objective methodologies for the generation of comparable data-sets. Ontology-based applications offer vast opportunities in data management and standardisation. This study examines two data-sets of protected heathlands in Germany and Belgium which are based on remote sensing image classification and semantically formalised in an OWL2 ontology. The proposed methodology uses semantic relations of the two data-sets, which are (semi-)automatically derived from remote sensing imagery, to generate objective and comparable information about the status of protected areas by utilising kernel-based spatial reclassification. This automatised method suggests a generalisation approach, which is able to generate delineation of Special Areas of Conservation (SAC) of the European biodiversity Natura 2000 network. Furthermore, it is able to transfer generalisation rules between areas surveyed with varying acquisition methods in different countries by taking into account automated inference of the underlying semantics. The generalisation results were compared with the manual delineation of terrestrial monitoring. For the different habitats in the two sites an accuracy of above 70% was detected. However, it has to be highlighted that the delineation of the ground-truth data inherits a high degree of uncertainty, which is discussed in this study.
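    The kernel-based spatial reclassification used in this study can be sketched as a majority filter: each cell takes the most frequent class within a square kernel, generalising noisy per-pixel classifications into coherent areas. The class grid below is a toy example, not the heathland data.

    ```python
    from collections import Counter

    # Sketch of kernel-based spatial reclassification (majority filter).
    # The grid values are invented; in the study, classes would come from
    # remote sensing image classification.

    def majority_filter(grid, radius=1):
        """Assign each cell the majority class within its square kernel."""
        rows, cols = len(grid), len(grid[0])
        out = [row[:] for row in grid]
        for r in range(rows):
            for c in range(cols):
                votes = Counter(
                    grid[rr][cc]
                    for rr in range(max(0, r - radius), min(rows, r + radius + 1))
                    for cc in range(max(0, c - radius), min(cols, c + radius + 1))
                )
                out[r][c] = votes.most_common(1)[0][0]
        return out

    # A single misclassified pixel inside a homogeneous habitat patch...
    noisy = [
        [1, 1, 1],
        [1, 2, 1],
        [1, 1, 1],
    ]
    smoothed = majority_filter(noisy)  # ...is absorbed by its neighbourhood
    ```

    In the ontology-driven system, which classes may be merged by such a kernel is governed by the semantic relations in the OWL2 ontology rather than by pixel values alone.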

  6. CCP interoperability and system stability

    NASA Astrophysics Data System (ADS)

    Feng, Xiaobing; Hu, Haibo

    2016-09-01

    To control counterparty risk, financial regulations such as the Dodd-Frank Act are increasingly requiring standardized derivatives trades to be cleared by central counterparties (CCPs). It is anticipated that in the near future, CCPs across the world will be linked through interoperability agreements that facilitate risk sharing but also serve as a conduit for transmitting shocks. This paper theoretically studies a network of CCPs that are linked through interoperability arrangements. The major finding is that different configurations of the CCP network give rise to different properties of the cascading failures.
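    The shock-transmission mechanism this abstract alludes to can be sketched as a simple contagion process: a failed CCP imposes a loss on each CCP linked to it, which fails in turn if the loss exceeds its capital buffer. The topology, loss size, and buffer values below are illustrative assumptions, not the paper's actual model.

    ```python
    # Toy sketch of cascading failure through interoperability links
    # between CCPs. All numbers and the network shape are hypothetical.

    def cascade(links, buffers, initial_failure, loss=1.0):
        """Return the set of failed CCPs after shocks propagate."""
        failed = {initial_failure}
        frontier = [initial_failure]
        losses = {ccp: 0.0 for ccp in buffers}
        while frontier:
            ccp = frontier.pop()
            for neighbour in links.get(ccp, []):
                if neighbour in failed:
                    continue
                losses[neighbour] += loss      # shock transmitted over the link
                if losses[neighbour] > buffers[neighbour]:
                    failed.add(neighbour)      # buffer exhausted: CCP fails too
                    frontier.append(neighbour)
        return failed

    # A fully connected triangle of interoperable CCPs; "A" fails first.
    links = {"A": ["B", "C"], "B": ["A", "C"], "C": ["A", "B"]}
    buffers = {"A": 0.5, "B": 0.5, "C": 1.5}
    failed = cascade(links, buffers, "A")
    ```

    Changing only the link configuration or the buffers changes how far the cascade spreads, which is the qualitative point the paper makes about network topology.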

  7. Towards Multilingual Interoperability in Automatic Speech Recognition

    DTIC Science & Technology

    2000-08-01

    ...communication, we address multilingual interoperability aspects in speech recognition (DARPA) [39, 5, 12, 40, 14, 43]. After giving a tentative

  8. Improving Groundwater Data Interoperability: Results of the Second OGC Groundwater Interoperability Experiment

    NASA Astrophysics Data System (ADS)

    Lucido, J. M.; Booth, N.

    2014-12-01

    Interoperable sharing of groundwater data across international borders is essential for the proper management of global water resources. However, storage and management of groundwater data are often distributed across many agencies or organizations. Furthermore, these data may be represented in disparate proprietary formats, posing a significant challenge for integration. For this reason, standard data models are required to achieve interoperability across geographical and political boundaries. The GroundWater Markup Language 1.0 (GWML1) was developed in 2010 as an extension of the Geography Markup Language (GML) in order to support groundwater data exchange within Spatial Data Infrastructures (SDI). In 2013, development of GWML2 was initiated under the sponsorship of the Open Geospatial Consortium (OGC) for intended adoption by the international community as the authoritative standard for the transfer of groundwater feature data, including data about water wells, aquifers, and related entities. GWML2 harmonizes GWML1 and the EU's INSPIRE models related to geology and hydrogeology. Additionally, an interoperability experiment was initiated to test the model for commercial, technical, scientific, and policy use cases. The scientific use case focuses on the delivery of data required as input to computational flow modeling software used to determine the flow of groundwater within a particular aquifer system. It involves the delivery of properties associated with hydrogeologic units, observations related to those units, and information about the related aquifers. To test this use case, web services are being implemented using GWML2 and WaterML2, the authoritative standard for water time series observations, in order to serve USGS water well and hydrogeologic data via standard OGC protocols. Furthermore, integration of these data into a computational groundwater flow model will be tested. This submission will present the GWML2 information model and results.

  9. Interoperability in the CDS services.

    NASA Astrophysics Data System (ADS)

    Genova, F.; Allen, M.; Bonnarel, F.; Boch, T.; Derriere, S.; Egret, D.; Fernique, P.; Ochsenbein, F.; Schaaff, A.; Wenger, M.

    2002-12-01

    The Astrophysical Virtual Observatory Project (PI: P. Quinn, ESO) has three Work Areas: Science case (P. Benvenuti, ST-ECF), Interoperability (F. Genova, CDS) and Advanced technologies (A. Lawrence, AstroGrid). The development of an interoperability prototype, implementing a set of European archives into VizieR and Aladin in collaboration with all the AVO partners, has been a first-year milestone of the AVO. Interoperability standards are widely discussed in all VO projects, and in the Interoperability Working Group first set up by the European OPTICON Network; they are a main topic of the International Virtual Observatory Alliance. Specific developments and customizations have been integrated in SIMBAD, VizieR and Aladin. The adopted VOTable standard is used for the exchange of tabular data, and a VOTable parser, able to give rapid access to tables containing large numbers of objects, has been developed. The categorization of column contents in VizieR tables and catalogues has led to the definition of the Uniform Content Descriptors (UCDs). The UCDs have proven very powerful for building new functionalities such as checking of table contents, catalogue selection (e.g. finding tables which contain a specific information item), filtering (e.g. visualizing, through Aladin, objects of a specific magnitude or colour range) and data transformation and combination (e.g. computing a colour index).
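    The role of UCDs can be illustrated with a minimal VOTable fragment: because each column carries a `ucd` attribute drawn from a controlled vocabulary, software can locate, say, the magnitude column whatever the column is named. The table below is invented; the UCD strings follow the UCD1+ style, and namespaces are omitted for brevity.

    ```python
    import xml.etree.ElementTree as ET

    # Sketch of UCD-driven column discovery in a VOTable. The table and
    # column names are hypothetical; real VOTables also carry an XML
    # namespace, omitted here to keep the fragment minimal.

    votable = """\
    <VOTABLE>
      <RESOURCE>
        <TABLE>
          <FIELD name="RAJ2000" ucd="pos.eq.ra" datatype="double"/>
          <FIELD name="DEJ2000" ucd="pos.eq.dec" datatype="double"/>
          <FIELD name="Vmag" ucd="phot.mag;em.opt.V" datatype="float"/>
        </TABLE>
      </RESOURCE>
    </VOTABLE>
    """

    root = ET.fromstring(votable)
    # Map each UCD to the column name carrying that quantity.
    ucd_to_column = {
        field.get("ucd"): field.get("name")
        for field in root.iter("FIELD")
    }
    ```

    Filtering "all objects brighter than V = 12", for instance, only needs the UCD of the magnitude column, not knowledge of each catalogue's naming conventions.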

  10. Interoperability with Moby 1.0--it's better than sharing your toothbrush!

    PubMed

    Wilkinson, Mark D; Senger, Martin; Kawas, Edward; Bruskiewich, Richard; Gouzy, Jerome; Noirot, Celine; Bardou, Philippe; Ng, Ambrose; Haase, Dirk; Saiz, Enrique de Andres; Wang, Dennis; Gibbons, Frank; Gordon, Paul M K; Sensen, Christoph W; Carrasco, Jose Manuel Rodriguez; Fernández, José M; Shen, Lixin; Links, Matthew; Ng, Michael; Opushneva, Nina; Neerincx, Pieter B T; Leunissen, Jack A M; Ernst, Rebecca; Twigger, Simon; Usadel, Bjorn; Good, Benjamin; Wong, Yan; Stein, Lincoln; Crosby, William; Karlsson, Johan; Royo, Romina; Párraga, Iván; Ramírez, Sergio; Gelpi, Josep Lluis; Trelles, Oswaldo; Pisano, David G; Jimenez, Natalia; Kerhornou, Arnaud; Rosset, Roman; Zamacola, Leire; Tarraga, Joaquin; Huerta-Cepas, Jaime; Carazo, Jose María; Dopazo, Joaquin; Guigo, Roderic; Navarro, Arcadi; Orozco, Modesto; Valencia, Alfonso; Claros, M Gonzalo; Pérez, Antonio J; Aldana, Jose; Rojano, M Mar; Fernandez-Santa Cruz, Raul; Navas, Ismael; Schiltz, Gary; Farmer, Andrew; Gessler, Damian; Schoof, Heiko; Groscurth, Andreas

    2008-05-01

    The BioMoby project was initiated in 2001 from within the model organism database community. It aimed to standardize methodologies to facilitate information exchange and access to analytical resources, using a consensus driven approach. Six years later, the BioMoby development community is pleased to announce the release of the 1.0 version of the interoperability framework, registry Application Programming Interface and supporting Perl and Java code-bases. Together, these provide interoperable access to over 1400 bioinformatics resources worldwide through the BioMoby platform, and this number continues to grow. Here we highlight and discuss the features of BioMoby that make it distinct from other Semantic Web Service and interoperability initiatives, and that have been instrumental to its deployment and use by a wide community of bioinformatics service providers. The standard, client software, and supporting code libraries are all freely available at http://www.biomoby.org/.

  11. The interoperability force in the ERP field

    NASA Astrophysics Data System (ADS)

    Boza, Andrés; Cuenca, Llanos; Poler, Raúl; Michaelides, Zenon

    2015-04-01

    Enterprise resource planning (ERP) systems participate in interoperability projects, and this participation sometimes leads to new proposals for the ERP field. The aim of this paper is to identify the role that interoperability plays in the evolution of ERP systems. To this end, ERP systems have first been identified within interoperability frameworks. Second, the initiatives in the ERP field driven by interoperability requirements have been identified from two perspectives: technological and business. The ERP field is evolving from classical ERP as information system integrators to a new generation of fully interoperable ERP. Interoperability is changing the way of running business, and ERP systems are changing to adapt to the current stream of interoperability.

  12. Auto-Generated Semantic Processing Services

    NASA Technical Reports Server (NTRS)

    Davis, Rodney; Hupf, Greg

    2009-01-01

    Auto-Generated Semantic Processing (AGSP) Services is a suite of software tools for automated generation of other computer programs, denoted cross-platform semantic adapters, that support interoperability of computer-based communication systems that utilize a variety of both new and legacy communication software running in a variety of operating- system/computer-hardware combinations. AGSP has numerous potential uses in military, space-exploration, and other government applications as well as in commercial telecommunications. The cross-platform semantic adapters take advantage of common features of computer- based communication systems to enforce semantics, messaging protocols, and standards of processing of streams of binary data to ensure integrity of data and consistency of meaning among interoperating systems. The auto-generation aspect of AGSP Services reduces development time and effort by emphasizing specification and minimizing implementation: In effect, the design, building, and debugging of software for effecting conversions among complex communication protocols, custom device mappings, and unique data-manipulation algorithms is replaced with metadata specifications that map to an abstract platform-independent communications model. AGSP Services is modular and has been shown to be easily integrable into new and legacy NASA flight and ground communication systems.
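    The "specification over implementation" idea behind AGSP can be sketched in miniature: instead of hand-coding each binary-message adapter, generate the adapter from a declarative field-mapping specification. The message layout and field names below are hypothetical, not an actual AGSP specification format.

    ```python
    import struct

    # Sketch of generating a message adapter from a metadata spec rather
    # than hand-writing conversion code. Layouts and names are invented.

    def make_adapter(spec):
        """Build a function that unpacks a binary message per the spec.

        spec: list of (field_name, struct_format_code) pairs, in wire order.
        """
        fmt = ">" + "".join(code for _, code in spec)  # big-endian wire format
        names = [name for name, _ in spec]
        size = struct.calcsize(fmt)

        def adapter(payload: bytes) -> dict:
            values = struct.unpack(fmt, payload[:size])
            return dict(zip(names, values))

        return adapter

    # A metadata specification for a hypothetical telemetry message...
    telemetry_spec = [("msg_id", "H"), ("timestamp", "I"), ("temperature", "f")]
    decode_telemetry = make_adapter(telemetry_spec)

    # ...decodes a raw frame into named fields with no hand-written parser.
    frame = struct.pack(">HIf", 7, 1700000000, 21.5)
    record = decode_telemetry(frame)
    ```

    Changing a message format then means editing the specification, not reimplementing and re-debugging a parser, which is the development-time saving the abstract describes.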

  13. Opportunities for the Mashup of Heterogeneous Data Servers via Semantic Web Technology

    NASA Astrophysics Data System (ADS)

    Ritschel, Bernd; Seelus, Christoph; Neher, Günther; Iyemori, Toshihiko; Koyama, Yukinobu; Yatagai, Akiyo; Murayama, Yasuhiro; King, Todd; Hughes, John; Fung, Shing; Galkin, Ivan; Hapgood, Michael; Belehaki, Anna

    2015-04-01

    European Union ESPAS, Japanese IUGONET and GFZ ISDC data servers were developed for the ingestion, archiving and distribution of geo- and space-science domain data. The main parts of the data managed by these servers relate to near-Earth space and geomagnetic field data. A smart mashup of the data servers would allow seamless browsing of and access to data and related context information. However, achieving a high level of interoperability is a challenge because the data servers are based on different data models and software frameworks. This paper focuses on the latest experiments and results in mashing up the data servers using the Semantic Web approach. Besides the mashup of domain and terminological ontologies, it addresses in particular the options for connecting data managed by relational databases using D2R Server and SPARQL technology. A successful realization of the data-server mashup will have a positive impact not only on the data users of the specific scientific domain but also on related projects, such as the development of a new interoperable version of NASA's Planetary Data System (PDS) or ICSU's World Data System alliance. ESPAS data server: https://www.espas-fp7.eu/portal/ IUGONET data server: http://search.iugonet.org/iugonet/ GFZ ISDC data server (semantic Web based prototype): http://rz-vm30.gfz-potsdam.de/drupal-7.9/ NASA PDS: http://pds.nasa.gov ICSU-WDS: https://www.icsu-wds.org
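The mashup pattern described above can be sketched as follows (the endpoint payloads and variable names are invented, and the two "endpoints" are stubbed in memory; real servers would be queried over HTTP): results from heterogeneous SPARQL endpoints, e.g. a native triple store and a D2R-wrapped relational database, arrive in the common SPARQL 1.1 JSON results format and can be merged mechanically.

```python
# Merge SPARQL SELECT results (SPARQL 1.1 JSON results format) coming from
# two hypothetical endpoints into one result set for a unified portal view.
def merge_sparql_results(*results):
    """Union the variable lists and binding rows of several result sets."""
    variables, rows = [], []
    for res in results:
        for v in res["head"]["vars"]:
            if v not in variables:
                variables.append(v)
        rows.extend(res["results"]["bindings"])
    return {"head": {"vars": variables}, "results": {"bindings": rows}}

# Stubbed payloads standing in for two heterogeneous data servers.
espas_like = {"head": {"vars": ["dataset", "start"]},
              "results": {"bindings": [
                  {"dataset": {"type": "uri", "value": "http://example.org/ds/1"},
                   "start": {"type": "literal", "value": "2012-01-01"}}]}}
d2r_like = {"head": {"vars": ["dataset", "instrument"]},
            "results": {"bindings": [
                {"dataset": {"type": "uri", "value": "http://example.org/ds/2"},
                 "instrument": {"type": "literal", "value": "magnetometer"}}]}}

merged = merge_sparql_results(espas_like, d2r_like)
print(merged["head"]["vars"])  # ['dataset', 'start', 'instrument']
```

The point of the common results format is exactly this: once every server, whether natively RDF or D2R-mapped, answers in the same shape, the mashup layer needs no per-server code.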

  14. Enhancing NATO Interoperability

    DTIC Science & Technology

    2013-03-01

    into the base were observed by one of the Polish towers. They immediately determined the point of origin and called it to their company command...over 30 minutes to clear fires because the Polish Company did not understand the task and their visually identified point of origin was more than 400...meters from the electronically acquired point of origin. After clearance had been achieved, the call for fire mission was relayed back to the

  15. Safe and Principled Language Interoperation

    DTIC Science & Technology

    2005-01-01

    different safe languages may fail when the languages have different systems of computational effects: an exception raised by an ML function may have no...valid semantic interpretation in the context of a Safe-C caller. Sandboxing costs performance and still may violate the semantics if effects are not...taken into account. We show that effect annotations alone are insufficient to guarantee safety, and we present a type system with bounded effect

  16. A Reference Model for Semantic Peer-to-Peer Networks

    NASA Astrophysics Data System (ADS)

    Mawlood-Yunis, Abdul-Rahman; Weiss, Michael; Santoro, Nicola

    Today’s information systems are highly networked and need to operate in a global world. With this comes the problem of semantic heterogeneity of information representations. Semantic peer-to-peer networks have been proposed as a solution to this problem. They are based around two components: a peer-to-peer infrastructure for information exchange between information systems, and the use of ontologies to define application semantics. However, progress in this area is hampered by a lack of commonality between these approaches, which makes their comparison and translation into practical implementations difficult. In this paper, we describe a reference model for semantic peer-to-peer networks in an effort to remedy this problem. The reference model will (1) enable the establishment of a common terminology for describing semantic peer-to-peer networks, and (2) pave the way for an emerging standardized API that will promote information system interoperability.

  17. Profibus features intrinsic safety, interoperability

    SciTech Connect

    Bryant, M.

    1996-11-01

    The newest member of the Profibus (process fieldbus) family of interoperable fieldbus protocols is "PA", an intrinsically safe (IS) standard released more than a year ago. IS and non-IS plants using PA for process chemicals, energy production, and food manufacturing are coming online. PA was developed by vendor and user members of the Profibus standards community to meet the needs of customers in the process industries. PA complies with IEC 1158-2, which, among non-IS capabilities, specifies a low-speed, intrinsically safe fieldbus for automating explosive chemical manufacturing. PA thus provides all H1, or "hunk" 1, IS and non-IS services. Importantly, it also provides all H2, or "hunk" 2, services. As the newest segment of the site-proven system of fieldbus protocols, Profibus-PA defines by example the concepts of interoperability and interchangeability. It is a field instrument network that automatically interoperates with a large installed base of fieldbus nodes. As low-speed networks, PA and its competitor, Foundation Fieldbus H1, comply with the same standard. They do the same job: auxiliary power to the application, with a data rate of 31.25 kbit/sec. Similarities include a function-block-based architecture and a device description language (DDL). They use the same physical layer for digital data transfer. A casual observer would find PA and H1 virtually the same. The key differences are in the protocol implementations. Although PA and H1 could be wired together, the messages delivered by one would make no sense to the other, at least not yet. PA protocols are capable of both IS and non-IS operations. This opens the door to a wide range of interoperable process-manufacturing requirements. 1 fig., 1 tab.

  18. Extravehicular activity space suit interoperability

    NASA Astrophysics Data System (ADS)

    Skoog, A. Ingemar; McBarron, James W.; Severin, Guy I.

    1995-10-01

    The European Space Agency (ESA) and the Russian Space Agency (RKA) are jointly developing a new space suit system for improved extravehicular activity (EVA) capabilities in support of the MIR Space Station Programme, the EVA Suit 2000. Recent national policy agreements between the U.S. and Russia on planned cooperation in manned space also include joint extravehicular activity (EVA). With an increased number of space suit systems and a higher operational frequency towards the end of this century, improved interoperability for both routine and emergency operations is of eminent importance. It is thus timely to report the current status of ongoing work on international EVA interoperability being conducted by the Committee on EVA Protocols and Operations of the International Academy of Astronautics, initiated in 1991. This paper summarises the current EVA interoperability issues to be harmonised and presents quantified vehicle interface requirements for the current U.S. Shuttle EMU, the Russian MIR Orlan DMA, and the new European/Russian EVA Suit 2000 extravehicular systems. Major critical/incompatible interfaces for suit/mothercraft combinations are discussed, and recommendations for standardisation are given.

  19. Semantic SenseLab: implementing the vision of the Semantic Web in neuroscience

    PubMed Central

    Samwald, Matthias; Chen, Huajun; Ruttenberg, Alan; Lim, Ernest; Marenco, Luis; Miller, Perry; Shepherd, Gordon; Cheung, Kei-Hoi

    2011-01-01

    Objective: Integrative neuroscience research needs a scalable informatics framework that enables semantic integration of diverse types of neuroscience data. This paper describes the use of the Web Ontology Language (OWL) and other Semantic Web technologies for the representation and integration of molecular-level data provided by several databases of the SenseLab suite of neuroscience databases. Methods: Based on the original database structure, we semi-automatically translated the databases into OWL ontologies with manual addition of semantic enrichment. The SenseLab ontologies are extensively linked to other biomedical Semantic Web resources, including the Subcellular Anatomy Ontology, the Brain Architecture Management System, the Gene Ontology, BIRNLex and UniProt. The SenseLab ontologies have also been mapped to the Basic Formal Ontology and Relation Ontology, which helps ease interoperability with many other existing and future biomedical ontologies for the Semantic Web. In addition, approaches to representing contradictory research statements are described. The SenseLab ontologies are designed for use on the Semantic Web, which enables their integration into a growing collection of biomedical information resources. Conclusion: We demonstrate that our approach can yield significant potential benefits and that the Semantic Web is rapidly becoming mature enough to realize its anticipated promises. The ontologies are available online at http://neuroweb.med.yale.edu/senselab/ PMID:20006477

  20. Using ontological inference and hierarchical matchmaking to overcome semantic heterogeneity in remote sensing-based biodiversity monitoring

    NASA Astrophysics Data System (ADS)

    Nieland, Simon; Kleinschmit, Birgit; Förster, Michael

    2015-05-01

    Ontology-based applications hold promise in improving spatial data interoperability. In this work we use remote sensing-based biodiversity information and apply semantic formalisation and ontological inference to show improvements in data interoperability/comparability. The proposed methodology includes an observation-based, "bottom-up" engineering approach for remote sensing applications and gives a practical example of semantic mediation of geospatial products. We apply the methodology to three different nomenclatures used for remote sensing-based classification of two heathland nature conservation areas in Belgium and Germany. We analysed sensor nomenclatures with respect to their semantic formalisation and their bio-geographical differences. The results indicate that a hierarchical and transparent nomenclature is far more important for transferability than the sensor or study area. The inclusion of additional information, not necessarily belonging to a vegetation class description, is a key factor for the future success of using semantics for interoperability in remote sensing.

  1. Vocabulary services to support scientific data interoperability

    NASA Astrophysics Data System (ADS)

    Cox, Simon; Mills, Katie; Tan, Florence

    2013-04-01

    Shared vocabularies are a core element in interoperable systems. Vocabularies need to be available at run-time, and where the vocabularies are shared by a distributed community this implies the use of web technology to provide vocabulary services. Given the ubiquity of vocabularies or classifiers in systems, vocabulary services are effectively the base of the interoperability stack. In contemporary knowledge organization systems, a vocabulary item is considered a concept, with the "terms" denoting it appearing as labels. The Simple Knowledge Organization System (SKOS) formalizes this as an RDF Schema (RDFS) application, with a bridge to formal logic in the Web Ontology Language (OWL). For maximum utility, a vocabulary should be made available through the following interfaces:
    * the vocabulary as a whole - at an ontology URI corresponding to a vocabulary document;
    * each item in the vocabulary - at the item URI;
    * summaries, subsets, and resources derived by transformation;
    * the standard RDF web API - i.e. a SPARQL endpoint;
    * a query form for human users.
    However, the vocabulary data model may be leveraged directly in a standard vocabulary API that uses the semantics provided by SKOS. SISSvoc3 [1] accomplishes this as a standard set of URI templates for a vocabulary. Any URI conforming to the template selects a vocabulary subset based on the SKOS properties, including labels (skos:prefLabel, skos:altLabel, rdfs:label) and a subset of the semantic relations (skos:broader, skos:narrower, etc.). SISSvoc3 thus provides a RESTful SKOS API to query a vocabulary while hiding the complexity of SPARQL. It has been implemented using the Linked Data API (LDA) [2], which connects to a SPARQL endpoint. By using LDA, we also get content negotiation, alternative views, paging, metadata and other functionality provided in a standard way. A number of vocabularies have been formalized in SKOS and deployed by CSIRO, the Australian Bureau of Meteorology (BOM) and their
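A toy version of the SKOS-based lookup that a SISSvoc-style URI template resolves to might look like this (the vocabulary content, URIs, and template paths are invented; SISSvoc itself answers such requests by generating SPARQL against a real triple store):

```python
# A toy SKOS vocabulary held as (subject, predicate, object) triples, plus
# two lookups in the spirit of SISSvoc URI templates. All content invented.
SKOS = "http://www.w3.org/2004/02/skos/core#"
TRIPLES = [
    ("ex:basalt",  SKOS + "prefLabel", "basalt"),
    ("ex:basalt",  SKOS + "broader",   "ex:igneous-rock"),
    ("ex:granite", SKOS + "prefLabel", "granite"),
    ("ex:granite", SKOS + "altLabel",  "granitic rock"),
    ("ex:granite", SKOS + "broader",   "ex:igneous-rock"),
]

def concepts_with_label(triples, text):
    """Resolve a hypothetical /vocab/concept?anylabel=<text> request:
    match concepts by preferred or alternate label, case-insensitively."""
    label_preds = {SKOS + "prefLabel", SKOS + "altLabel"}
    return sorted({s for s, p, o in triples
                   if p in label_preds and text.lower() in o.lower()})

def broader(triples, concept):
    """Resolve a hypothetical /vocab/concept/<uri>/broader request."""
    return [o for s, p, o in triples
            if s == concept and p == SKOS + "broader"]

print(concepts_with_label(TRIPLES, "granit"))  # ['ex:granite']
print(broader(TRIPLES, "ex:basalt"))           # ['ex:igneous-rock']
```

The client sees only label text and concept URIs; the SKOS property machinery, like the SPARQL behind a real SISSvoc deployment, stays hidden behind the template.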

  2. Knowledge-oriented semantics modelling towards uncertainty reasoning.

    PubMed

    Mohammed, Abdul-Wahid; Xu, Yang; Liu, Ming

    2016-01-01

    Distributed reasoning in M2M leverages the expressive power of ontology to enable semantic interoperability between heterogeneous systems of connected devices. Ontology, however, lacks the built-in, principled support to effectively handle the uncertainty inherent in M2M application domains. Thus, efficient reasoning can be achieved by integrating the inferential reasoning power of probabilistic representations with the first-order expressiveness of ontology. But there remains a gap in current probabilistic ontologies, since the state of the art provides no compatible representation for simultaneous handling of discrete and continuous quantities in ontology. This requirement is paramount, especially in smart homes, where continuous quantities cannot be avoided and simply mapping continuous information to discrete states through quantization can cause a great deal of information loss. In this paper, we propose a hybrid probabilistic ontology that can simultaneously handle distributions over discrete and continuous quantities in ontology. We call this new framework HyProb-Ontology; it specifies distributions over properties of classes, which serve as templates for instances of classes to inherit as well as overwrite some aspects. Since there can be no restriction on the dependency topology of models that HyProb-Ontology can induce across different domains, we achieve a unified Ground Hybrid Probabilistic Model by conditional Gaussian fuzzification of the distributions of the continuous variables in ontology. The results of our experiments show that this unified model can achieve exact inference with better performance than classical Bayesian networks.
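The quantization problem motivating the hybrid model can be shown with a minimal sketch (the thresholds, state names, and Gaussian parameters are invented): two clearly different continuous readings collapse into the same discrete state, while a continuous likelihood still distinguishes them.

```python
import math

# Continuous handling: Gaussian likelihood of a temperature reading around a
# hypothetical comfort setpoint (mean and std chosen only for illustration).
def gaussian_pdf(x, mean, std):
    return math.exp(-((x - mean) ** 2) / (2 * std * std)) / (std * math.sqrt(2 * math.pi))

# Discrete handling: coarse quantization into three invented states.
def quantize(temp):
    return "cold" if temp < 18 else "hot" if temp > 24 else "comfortable"

# Two quite different readings collapse to the same discrete state...
print(quantize(18.1), quantize(23.9))  # comfortable comfortable
# ...while the continuous model still tells them apart relative to the setpoint:
print(gaussian_pdf(18.1, 21.0, 1.5) < gaussian_pdf(21.2, 21.0, 1.5))  # True
```

This is the information loss the abstract refers to: once 18.1 and 23.9 both become "comfortable", no downstream reasoning can recover the difference.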

  3. National electronic health record interoperability chronology.

    PubMed

    Hufnagel, Stephen P

    2009-05-01

    The federal initiative for electronic health record (EHR) interoperability began in 2000 and set the stage for the establishment of the 2004 Executive Order for EHR interoperability by 2014. This article discusses the chronology from the 2001 e-Government Consolidated Health Informatics (CHI) initiative through the current congressional mandates for an aligned, interoperable, and agile DoD AHLTA and VA VistA.

  4. Army S&T Investment in Interoperability

    DTIC Science & Technology

    2009-01-01

    interoperability-related investments required by additional code for programmable multi-channel radios or interfacing different Blue Force Tracking systems...the generic example given above poses a new vehicle voice radio system that will add interoperability simply by the numbers acquired. Another...For example, increasing the numbers of voice radio systems that include interoperable secure voice communications can be quantified against a target

  5. Interoperability for Space Mission Monitor and Control: Applying Technologies from Manufacturing Automation and Process Control Industries

    NASA Technical Reports Server (NTRS)

    Jones, Michael K.

    1998-01-01

    Various issues associated with interoperability for space mission monitor and control are presented in viewgraph form. Specific topics include: 1) Space Project Mission Operations Control Architecture (SuperMOCA) goals and methods for achieving them; 2) Specifics on the architecture: open standards and layering, enhancing interoperability, and promoting commercialization; 3) An advertisement; 4) Status of the task - government/industry cooperation and architecture and technology demonstrations; and 5) Key features of messaging services and virtual devices.

  6. HTML5 microdata as a semantic container for medical information exchange.

    PubMed

    Kimura, Eizen; Kobayashi, Shinji; Ishihara, Ken

    2014-01-01

    Achieving interoperability between clinical electronic medical records (EMR) systems and cloud computing systems is challenging because of the lack of a universal reference method as a standard for information exchange with a secure connection. Here we describe an information exchange scheme using HTML5 microdata, where the standard semantic container is an HTML document. We embed HL7 messages describing laboratory test results in the microdata. We also annotate items in the clinical research report with the microdata. We mapped the laboratory test result data into the clinical research report using an HL7 selector specified in the microdata. This scheme can provide secure cooperation between the cloud-based service and the EMR system.
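A minimal illustration of the microdata idea, using only the Python standard library (the embedded document and property names are invented and are not real HL7 content): a parser walks the HTML container and collects each itemprop name with its text value.

```python
from html.parser import HTMLParser

# Minimal microdata reader: collects itemprop names and their text content.
class MicrodataReader(HTMLParser):
    def __init__(self):
        super().__init__()
        self._prop = None   # itemprop name awaiting its text node
        self.items = {}     # extracted property -> value pairs

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if "itemprop" in attrs:
            self._prop = attrs["itemprop"]

    def handle_data(self, data):
        if self._prop:
            self.items[self._prop] = data.strip()
            self._prop = None

# Invented laboratory-result markup standing in for an embedded HL7 payload.
html_doc = """
<div itemscope itemtype="http://example.org/LabResult">
  <span itemprop="test">Hemoglobin</span>
  <span itemprop="value">13.8</span>
  <span itemprop="unit">g/dL</span>
</div>
"""

reader = MicrodataReader()
reader.feed(html_doc)
print(reader.items)  # {'test': 'Hemoglobin', 'value': '13.8', 'unit': 'g/dL'}
```

Because the container is ordinary HTML, the same document remains renderable in a browser while staying machine-readable for the EMR or cloud side.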

  7. Generative Semantics

    ERIC Educational Resources Information Center

    Bagha, Karim Nazari

    2011-01-01

    Generative semantics is (or perhaps was) a research program within linguistics, initiated by the work of George Lakoff, John R. Ross, Paul Postal and later McCawley. The approach developed out of transformational generative grammar in the mid 1960s, but stood largely in opposition to work by Noam Chomsky and his students. The nature and genesis of…

  8. An Assessment System on the Semantic Web

    NASA Astrophysics Data System (ADS)

    Radenković, Sonja; Krdžavac, Nenad; Devedžić, Vladan

    This paper presents a way to develop a modern assessment system on the Semantic Web. The system is based on the IMS QTI (Question and Test Interoperability) standard and designed by applying model-driven architecture software engineering standards. It uses the XML Metadata Interchange specification and ontologies. We propose a framework for assessment systems that is reusable and extensible, and that facilitates interoperability between its component systems. The central idea is to use description logic reasoning techniques for intelligent analysis of students' solutions to the problems they work on during assessment sessions with the system, in order to process open-ended questions. This innovative approach can be applied within the IMS QTI standard.

  9. Naval Logistics Integration Through Interoperable Supply Systems

    DTIC Science & Technology

    2014-06-13

    NAVAL LOGISTICS INTEGRATION THROUGH INTEROPERABLE SUPPLY SYSTEMS. A thesis presented to the Faculty of the U.S. Army Command...DATES COVERED: AUG 2013 – JUNE 2014...This research investigates how the Navy and the Marine Corps could increase Naval Logistics

  10. Interoperation Modeling for Intelligent Domotic Environments

    NASA Astrophysics Data System (ADS)

    Bonino, Dario; Corno, Fulvio

    This paper introduces an ontology-based model for domotic device interoperation. Starting from a previously published ontology (DogOnt), a refactoring and extension is described that explicitly represents device capabilities, states and commands, and supports abstract modeling of device interoperation.

  11. Integrated semantics service platform for the Internet of Things: a case study of a smart office.

    PubMed

    Ryu, Minwoo; Kim, Jaeho; Yun, Jaeseok

    2015-01-19

    The Internet of Things (IoT) allows machines and devices in the world to connect with each other and generate a huge amount of data, which has a great potential to provide useful knowledge across service domains. Combining the context of IoT with semantic technologies, we can build integrated semantic systems to support semantic interoperability. In this paper, we propose an integrated semantic service platform (ISSP) to support ontological models in various IoT-based service domains of a smart city. In particular, we address three main problems for providing integrated semantic services together with IoT systems: semantic discovery, dynamic semantic representation, and semantic data repository for IoT resources. To show the feasibility of the ISSP, we develop a prototype service for a smart office using the ISSP, which can provide a preset, personalized office environment by interpreting user text input via a smartphone. We also discuss a scenario to show how the ISSP-based method would help build a smart city, where services in each service domain can discover and exploit IoT resources that are wanted across domains. We expect that our method could eventually contribute to providing people in a smart city with more integrated, comprehensive services based on semantic interoperability.

  12. Integrated Semantics Service Platform for the Internet of Things: A Case Study of a Smart Office

    PubMed Central

    Ryu, Minwoo; Kim, Jaeho; Yun, Jaeseok

    2015-01-01

    The Internet of Things (IoT) allows machines and devices in the world to connect with each other and generate a huge amount of data, which has a great potential to provide useful knowledge across service domains. Combining the context of IoT with semantic technologies, we can build integrated semantic systems to support semantic interoperability. In this paper, we propose an integrated semantic service platform (ISSP) to support ontological models in various IoT-based service domains of a smart city. In particular, we address three main problems for providing integrated semantic services together with IoT systems: semantic discovery, dynamic semantic representation, and semantic data repository for IoT resources. To show the feasibility of the ISSP, we develop a prototype service for a smart office using the ISSP, which can provide a preset, personalized office environment by interpreting user text input via a smartphone. We also discuss a scenario to show how the ISSP-based method would help build a smart city, where services in each service domain can discover and exploit IoT resources that are wanted across domains. We expect that our method could eventually contribute to providing people in a smart city with more integrated, comprehensive services based on semantic interoperability. PMID:25608216

  13. The DebugIT core ontology: semantic integration of antibiotics resistance patterns.

    PubMed

    Schober, Daniel; Boeker, Martin; Bullenkamp, Jessica; Huszka, Csaba; Depraetere, Kristof; Teodoro, Douglas; Nadah, Nadia; Choquet, Remy; Daniel, Christel; Schulz, Stefan

    2010-01-01

    Antibiotics resistance development poses a significant problem in today's hospital care. Massive amounts of clinical data are being collected and stored in proprietary and unconnected systems in heterogeneous formats. The DebugIT EU project promises to make these data geographically and semantically interoperable for case-based knowledge analysis approaches aiming at the discovery of patterns that help to align antibiotics treatment schemes. The semantic glue for this endeavor is DCO, an application ontology that enables data miners to query distributed clinical information systems in a semantically rich and content-driven manner. DCO will hence serve as the core component of the interoperability platform for the DebugIT project. Here we present DCO and an approach that uses the Semantic Web query language SPARQL to bind and ontologically query hospital database content using DCO and information model mediators. We provide a query example indicating that ontological querying over heterogeneous information models is feasible via SPARQL CONSTRUCT and resource-mapping queries.

  14. Building interoperable health information systems using agent and workflow technologies.

    PubMed

    Koufi, Vassiliki; Malamateniou, Flora; Vassilacopoulos, George

    2009-01-01

    Healthcare is an increasingly collaborative enterprise involving many individuals and organizations that coordinate their efforts toward promoting quality and efficient delivery of healthcare through the use of interoperable healthcare information systems. This paper presents a mediator-based approach for achieving data and service interoperability among disparate and geographically dispersed healthcare information systems. The proposed system architecture enables decoupling of the client applications and the server-side implementations while it ensures security in all transactions. It is a distributed system architecture based on the agent-oriented paradigm for communication and life cycle management while interactions are described according to the workflow metaphor. Thus robustness, high flexibility and fault tolerance are provided in an environment as dynamic and heterogeneous as healthcare.

  15. Semantic Clustering of Search Engine Results

    PubMed Central

    Soliman, Sara Saad; El-Sayed, Maged F.; Hassan, Yasser F.

    2015-01-01

    This paper presents a novel approach for search engine results clustering that relies on the semantics of the retrieved documents rather than the terms in those documents. The proposed approach takes into consideration both lexical and semantic similarities among documents and applies an activation-spreading technique in order to generate semantically meaningful clusters. This approach allows documents that are semantically similar to be clustered together rather than clustering documents based on similar terms. A prototype is implemented and several experiments are conducted to test the proposed solution. The results of the experiments confirm that the proposed solution achieves remarkable results in terms of precision. PMID:26933673
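A toy sketch of activation spreading over a document-similarity graph (the documents, similarity weights, and decay/threshold parameters are all invented): activation flows from a seed document along weighted edges, and the documents left above threshold form one semantically coherent cluster.

```python
# Toy spreading-activation grouping over a document similarity graph.
def spread_activation(graph, seed, decay=0.5, threshold=0.2):
    """Activate `seed` fully, propagate along weighted edges, and return the
    set of documents whose final activation exceeds `threshold`."""
    activation = {node: 0.0 for node in graph}
    activation[seed] = 1.0
    frontier = [seed]
    while frontier:
        node = frontier.pop()
        for neighbour, weight in graph[node]:
            new = activation[node] * weight * decay
            if new > activation[neighbour]:
                activation[neighbour] = new
                if new > threshold:
                    frontier.append(neighbour)
    return {n for n, a in activation.items() if a > threshold}

# Invented results for the ambiguous query "jaguar": semantic similarity
# links the car documents strongly and the wildlife documents weakly.
graph = {
    "jaguar-car":  [("auto-review", 0.9), ("jaguar-cat", 0.2)],
    "auto-review": [("jaguar-car", 0.9)],
    "jaguar-cat":  [("jaguar-car", 0.2), ("wildlife", 0.8)],
    "wildlife":    [("jaguar-cat", 0.8)],
}
print(spread_activation(graph, "jaguar-car"))  # {'jaguar-car', 'auto-review'}
```

Documents sharing the term "jaguar" but not its sense receive too little activation to join the cluster, which is the behaviour term-based clustering cannot provide.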

  16. Interoperable Solar Data and Metadata via LISIRD 3

    NASA Astrophysics Data System (ADS)

    Wilson, A.; Lindholm, D. M.; Pankratz, C. K.; Snow, M. A.; Woods, T. N.

    2015-12-01

    LISIRD 3 is a major upgrade of the LASP Interactive Solar Irradiance Data Center (LISIRD), which serves several dozen space-based solar irradiance and related data products to the public. Through interactive plots, LISIRD 3 provides data browsing supported by data subsetting and aggregation. By incorporating a semantically enabled metadata repository, LISIRD 3 shows users current, vetted, consistent information about the datasets offered. Users can now also search for datasets based on metadata fields such as dataset type and/or spectral or temporal range. This semantic database enables metadata browsing, so users can discover the relationships between datasets, instruments, spacecraft, mission and PI. The database also enables creation and publication of metadata records in a variety of formats, such as SPASE or ISO, making these datasets more discoverable. The database also enables the possibility of a public SPARQL endpoint, making the metadata browsable in an automated fashion. LISIRD 3's data access middleware, LaTiS, provides dynamic, on-demand reformatting of data and timestamps, subsetting and aggregation, and other server-side functionality via a RESTful OPeNDAP-compliant API, enabling interoperability between LASP datasets and many common tools. LISIRD 3's templated front-end design, coupled with the uniform data interface offered by LaTiS, allows easy integration of new datasets. Consequently the number and variety of datasets offered by LISIRD has grown to encompass several dozen, with many more to come. This poster will discuss the design and implementation of LISIRD 3, including tools used, capabilities enabled, and issues encountered.
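Server-side subsetting and aggregation of the kind LaTiS-style middleware provides can be sketched in a few lines (the dataset, values, and operation names are invented; a real deployment exposes such operations through an OPeNDAP-style RESTful API):

```python
# Sketch of server-side subsetting and aggregation over a (time, value)
# series, as a data-access layer might apply them before returning results.
def serve(dataset, start=None, end=None, op=None):
    """Subset the series by time range; optionally aggregate the subset."""
    rows = [(t, v) for t, v in dataset
            if (start is None or t >= start) and (end is None or t <= end)]
    if op == "mean":
        return sum(v for _, v in rows) / len(rows)
    return rows

# Invented annual irradiance values, loosely W/m^2-scale for flavour only.
irradiance = [(2010, 1360.8), (2011, 1361.0), (2012, 1361.4), (2013, 1361.2)]
print(serve(irradiance, start=2011, end=2012))  # [(2011, 1361.0), (2012, 1361.4)]
print(serve(irradiance, start=2011, end=2012, op="mean"))
```

Doing the subsetting and aggregation on the server means clients transfer only the slice they asked for, which is what makes interactive plotting over large archives practical.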

  17. Open-Source Semantic and Schematic Mediation in Hydrogeologic Spatial Data Infrastructures

    NASA Astrophysics Data System (ADS)

    Boisvert, E.; Brodaric, B.

    2008-12-01

    A common task in cyber-based data environments, hydrogeologic or otherwise, is an initial search for data amongst distributed heterogeneous sources, followed by amalgamation of the multiple results into a single file organized using a common structure and perhaps standard content. For example, querying water well databases to obtain a list of the rock materials that occur beyond a certain ground depth, represented in some specific XML dialect. This task is often achieved with the aid of open geospatial technologies (OGC), which conveniently enable interoperability at the system and syntax levels by providing standard web service interfaces (WMS, WFS, WCS) and a standard data transfer language (GML). However, at present such technologies, which are mainly non-open source, provide minimal support for interoperating at the schematic and semantic levels, meaning it is difficult to query the data sources and obtain results in a common data structure populated with standard content. Classical data integration systems provide mediator and wrapper middleware to address this issue: mediators dispatch queries to distributed data repositories and integrate query results, while wrappers perform translation to common standards for both queries and results, and these actions are typically supported by ontologies. Under this classical scenario existing open geospatial services can be considered wrappers with minimal translation capacity, thus requiring a mediator to both integrate and translate. Consequently, we have used open source components to develop a re-usable mediator that operates as a virtual open geospatial web service (WFS), one that integrates and translates both query requests and results from OGC-wrapped data sources to common standards. The mediator is designed as a customizable XML processing pipeline that operates on declarative descriptions that support schematic and semantic translation. It is being implemented in virtual environments for hydrogeology to

  18. Interoperability Context-Setting Framework

    SciTech Connect

    Widergren, Steven E.; Hardin, Dave; Ambrosio, Ron; Drummond, R.; Gunther, E.; Gilchrist, Grant; Cohen, David

    2007-01-31

    ...air-conditioning (HVAC) unit up several degrees. The resulting load reduction becomes part of an aggregated response from the electricity service provider to the bulk system operator, who is now in a better position to manage total system load with available generation. Looking across the electric system, from generating plants, to transmission substations, to the distribution system, to factories, office parks, and buildings, automation is growing, and the opportunities for unleashing new value propositions are exciting. How can we facilitate this change and do so in a way that ensures the reliability of electric resources for the wellbeing of our economy and security? The GridWise Architecture Council (GWAC) mission is to enable interoperability among the many entities that interact with the electric power system. A good definition of interoperability is, "The capability of two or more networks, systems, devices, applications, or components to exchange information between them and to use the information so exchanged." As a step in the direction of enabling interoperability, the GWAC proposes a context-setting framework to organize concepts and terminology so that interoperability issues can be identified and debated, improvements to address issues articulated, and actions prioritized and coordinated across the electric power community.

  19. Before you make the data interoperable you have to make the people interoperable

    NASA Astrophysics Data System (ADS)

    Jackson, I.

    2008-12-01

    In February 2006 a deceptively simple concept was put forward. Could we use the International Year of Planet Earth 2008 as a stimulus to begin the creation of a digital geological map of the planet at a target scale of 1:1 million? Could we design and initiate a project that uniquely mobilises geological surveys around the world to act as the drivers and sustainable data providers of this global dataset? Further, could we synergistically use this geoscientist-friendly vehicle of creating a tangible geological map to accelerate progress of an emerging global geoscience data model and interchange standard? Finally, could we use the project to transfer know-how to developing countries and reduce the length and expense of their learning curve, while at the same time producing geoscience maps and data that could attract interest and investment? These aspirations, plus the chance to generate a global digital geological dataset to assist in the understanding of global environmental problems and the opportunity to raise the profile of geoscience as part of IYPE seemed more than enough reasons to take the proposal to the next stage. In March 2007, in Brighton, UK, 81 delegates from 43 countries gathered together to consider the creation of this global interoperable geological map dataset. The participants unanimously agreed the Brighton "Accord" and kicked off "OneGeology", an initiative that now has the support of more than 85 nations. Brighton was never designed to be a scientific or technical meeting: it was overtly about people and their interaction - would these delegates, with their diverse cultural and technical backgrounds, be prepared to work together to achieve something which, while technically challenging, was not complex in the context of leading edge geoscience informatics. Could we scale up what is a simple informatics model at national level, to deliver global coverage and access? The major challenges for OneGeology (and the deployment of interoperability

  20. Satellite-Terrestrial Network Interoperability

    NASA Technical Reports Server (NTRS)

    vonDeak, Thomas C.

    1998-01-01

The developing national and global information infrastructures (NII/GII) are being built upon the asynchronous transfer mode (ATM) telecommunications protocol and associated protocol standards. These protocols are themselves under development through the telecommunications standards process defined by the International Telecommunications Union (ITU), which as a body is sanctioned by the United Nations. All telecommunications manufacturers use these standards to create products that can interoperate. The ITU has recognized the ATM Forum as the instrument for the development of ATM protocols. This forum is a consortium of industry, academia, and government entities formed to quickly develop standards for the ATM infrastructure. However, because the participants represent a predominantly terrestrial network viewpoint, the use of satellites in the national and global information infrastructures could be severely compromised. Consequently, through an ongoing task order, the NASA Lewis Research Center asked Sterling Software, Inc., to communicate with the ATM Forum in support of the interoperability of satellite-terrestrial networks. This year, Dr. Raj Jain of the Ohio State University, under contract to Sterling, authored or coauthored 32 explanatory documents delivered to the ATM Forum in the areas of Guaranteed Frame Rate for Transmission Control Protocol/Internet Protocol (TCP/IP), Available Bit Rate, performance testing, Variable Bit Rate voice over ATM, TCP over Unspecified Bit Rate+, Virtual Source/Virtual Destination, and network management. These contributions have had a significant impact on the content of the standards that the ATM Forum is developing. Some of the more significant accomplishments have been: (1) The adoption by the ATM Forum of a new definition for Message-In, Message-Out latency; and (2) Improved text (clearer wording and newly defined terms) for measurement procedures, foreground and background traffic, and scalable configuration in the

  1. Provenance in Data Interoperability for Multi-Sensor Intercomparison

    NASA Technical Reports Server (NTRS)

    Lynnes, Chris; Leptoukh, Greg; Berrick, Steve; Shen, Suhung; Prados, Ana; Fox, Peter; Yang, Wenli; Min, Min; Holloway, Dan; Enloe, Yonsook

    2008-01-01

As our inventory of Earth science data sets grows, the ability to compare, merge and fuse multiple datasets grows in importance. This requires a deeper data interoperability than we have now. Efforts such as Open Geospatial Consortium and OPeNDAP (Open-source Project for a Network Data Access Protocol) have broken down format barriers to interoperability; the next challenge is the semantic aspects of the data. Consider the issues when satellite data are merged, cross-calibrated, validated, inter-compared and fused. We must match up data sets that are related, yet different in significant ways: the phenomenon being measured, measurement technique, location in space-time or quality of the measurements. If subtle distinctions between similar measurements are not clear to the user, results can be meaningless or lead to an incorrect interpretation of the data. Most of these distinctions trace to how the data came to be: sensors, processing and quality assessment. For example, monthly averages of satellite-based aerosol measurements often show significant discrepancies, which might be due to differences in spatio-temporal aggregation, sampling issues, sensor biases, algorithm differences or calibration issues. Provenance information must be captured in a semantic framework that allows data inter-use tools to incorporate it and aid in the interpretation of comparison or merged products. Semantic web technology allows us to encode our knowledge of measurement characteristics, phenomena measured, space-time representation, and data quality attributes in a well-structured, machine-readable ontology and rulesets. An analysis tool can use this knowledge to show users the provenance-related distinctions between two variables, advising on options for further data processing and analysis. An additional problem for workflows distributed across heterogeneous systems is retrieval and transport of provenance. Provenance may be either embedded within the data payload, or transmitted
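The provenance-aware comparison described above can be sketched in plain Python. All variable names, sensors, and attribute keys below are invented for illustration; they do not come from an actual ontology in the abstract, which would use a formal semantic framework rather than dictionaries.

```python
# Minimal provenance-record sketch (illustrative names, not a real ontology).
provenance = {
    "AOD_sensorA": {
        "phenomenon": "aerosol_optical_depth",
        "sensor": "SensorA",  # hypothetical sensor
        "aggregation": "daily mean, then monthly mean",
        "calibration_version": "v5.1",
    },
    "AOD_sensorB": {
        "phenomenon": "aerosol_optical_depth",
        "sensor": "SensorB",  # hypothetical sensor
        "aggregation": "monthly mean over all retrievals",
        "calibration_version": "v2.3",
    },
}

def provenance_distinctions(a: str, b: str) -> dict:
    """Return the provenance attributes on which two variables differ.

    A comparison tool could show these distinctions to the user before
    merging the variables, e.g. flagging differing aggregation methods."""
    ra, rb = provenance[a], provenance[b]
    return {k: (ra[k], rb[k]) for k in ra if ra[k] != rb[k]}

diffs = provenance_distinctions("AOD_sensorA", "AOD_sensorB")
```

Here the two variables measure the same phenomenon, so only the sensor, aggregation, and calibration attributes are surfaced as distinctions.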

  2. Report on the Second Catalog Interoperability Workshop

    NASA Technical Reports Server (NTRS)

    Thieman, James R.; James, Mary E.

    1988-01-01

The events, resolutions, and recommendations of the Second Catalog Interoperability Workshop, held at JPL in January, 1988, are discussed. This workshop dealt with the issues of standardization and communication among directories, catalogs, and inventories in the earth and space science data management environment. The Directory Interchange Format, being constructed as a standard for the exchange of directory information among participating data systems, is discussed. Involvement in the interoperability effort by NASA, NOAA, USGS, and NSF is described, and plans for future interoperability are considered. The NASA Master Directory prototype is presented and critiqued, and options for additional capabilities are debated.

  3. An architecture for interoperable GIS use in a local community environment

    NASA Astrophysics Data System (ADS)

    Stoimenov, Leonid; Djordjević-Kajan, Slobodanka

    2005-03-01

Many organizations in the local community environment use and produce geospatial data. An increasing number of geodata producers and users have expressed the need for the integration of geodata and for interoperable geographic information systems (GIS). Each of these user groups has a different view of the world, and available information is always distributed and mostly heterogeneous. Wrappers and mediation may resolve structural and syntactic heterogeneity. However, domain experts use the concepts and terminology specific to their fields of expertise, and use different parameters to express their model of a concept. Such semantic heterogeneity of data sources causes serious problems, which may be resolved by ontologies. The goal of our research is to define an architecture, called GeoNis, for semantic interoperability of distributed and heterogeneous GIS in a local community environment. The proposed architecture is based on mediation and ontologies. The GeoNis solution to the problem of semantic heterogeneity is to formally specify the meaning of the terminology of each community using a local ontology, and to define a translation between each community's terminology and an intermediate terminology represented by a top-level ontology and a common data model.
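The translation-via-intermediate-terminology idea can be sketched as follows. This is a deliberately naive dictionary-based stand-in; GeoNis itself uses formal ontologies and mediation, and the community and term names below are invented.

```python
from typing import Optional

# Each community maps its local terms to a shared top-level terminology
# (communities and terms are hypothetical examples).
local_to_top = {
    "water_utility": {"pipe": "Conduit", "valve": "FlowControl"},
    "telecom": {"duct": "Conduit", "splitter": "SignalControl"},
}

def translate(term: str, source: str, target: str) -> Optional[str]:
    """Translate a local term between communities via the top-level
    terminology; returns None when no shared concept exists."""
    top = local_to_top[source].get(term)
    if top is None:
        return None
    for local, t in local_to_top[target].items():
        if t == top:
            return local
    return None
```

The key design point mirrored here is that communities never map to each other directly, only to the shared intermediate terminology, so adding an N-th community requires one mapping rather than N-1.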

  4. Semantic Web Service Framework to Intelligent Distributed Manufacturing

    SciTech Connect

    Kulvatunyou, Boonserm

    2005-12-01

    As markets become unexpectedly turbulent with a shortened product life cycle and a power shift towards buyers, the need for methods to develop products, production facilities, and supporting software rapidly and cost-effectively is becoming urgent. The use of a loosely integrated virtual enterprise based framework holds the potential of surviving changing market needs. However, its success requires reliable and large-scale interoperation among trading partners via a semantic web of trading partners services whose properties, capabilities, and interfaces are encoded in an unambiguous as well as computer-understandable form. This paper demonstrates a promising approach to integration and interoperation between a design house and a manufacturer that may or may not have prior relationship by developing semantic web services for business and engineering transactions. To this end, detailed activity and information flow diagrams are developed, in which the two trading partners exchange messages and documents. The properties and capabilities of the manufacturer sites are defined using DARPA Agent Markup Language (DAML) ontology definition language. The prototype development of semantic webs shows that enterprises can interoperate widely in an unambiguous and autonomous manner. This contributes towards the realization of virtual enterprises at a low cost.

  5. Application-Level Interoperability Across Grids and Clouds

    NASA Astrophysics Data System (ADS)

    Jha, Shantenu; Luckow, Andre; Merzky, Andre; Erdely, Miklos; Sehgal, Saurabh

Application-level interoperability is defined as the ability of an application to utilize multiple distributed heterogeneous resources. Such interoperability is becoming increasingly important with increasing volumes of data, multiple sources of data as well as resource types. The primary aim of this chapter is to understand different ways in which application-level interoperability can be provided across distributed infrastructure. We achieve this by (i) using the canonical wordcount application, based on an enhanced version of MapReduce that scales-out across clusters, clouds, and HPC resources, (ii) establishing how SAGA enables the execution of the wordcount application using MapReduce and other programming models such as Sphere concurrently, and (iii) demonstrating the scale-out of ensemble-based biomolecular simulations across multiple resources. We show user-level control of the relative placement of compute and data and also provide simple performance measures and analysis of SAGA-MapReduce when using multiple, different, heterogeneous infrastructures concurrently for the same problem instance. Finally, we discuss Azure and some of the system-level abstractions that it provides and show how it is used to support ensemble-based biomolecular simulations.
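The canonical wordcount example can be sketched in plain Python to show the map and shuffle/reduce phases involved. This single-process sketch does not attempt what SAGA-MapReduce actually provides, namely distributing these phases across clusters, clouds, and HPC resources; the chunking here merely stands in for distributed partitions.

```python
from collections import Counter
from itertools import chain

def map_phase(chunk: str):
    # Map: emit (word, 1) pairs for one input chunk.
    return [(word.lower(), 1) for word in chunk.split()]

def reduce_phase(pairs):
    # Shuffle/reduce: sum the counts per word.
    counts = Counter()
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

# Stand-ins for partitions that a distributed runtime would assign to workers.
chunks = ["the quick brown fox", "the lazy dog", "the fox"]
result = reduce_phase(chain.from_iterable(map_phase(c) for c in chunks))
```

Because each chunk is mapped independently and reduction only needs the merged pair stream, the same logic scales out once a runtime handles partitioning and shuffling.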

  6. Service Knowledge Spaces for Semantic Collaboration in Web-based Systems

    NASA Astrophysics Data System (ADS)

    Bianchini, Devis; de Antonellis, Valeria; Melchiori, Michele

Semantic Web technologies have been applied to enable collaboration in open distributed systems, where interoperability issues arise due to the absence of a global view of the shared resources. Adoption of service-oriented technologies has improved interoperability at the application level by exporting systems functionalities as Web services. In fact, Service Oriented Architecture (SOA) constitutes an appropriate platform-independent approach to implement collaboration activities by means of automatic service discovery and composition. Recently, service discovery has been applied to collaborative environments such as the P2P one, where independent partners need to cooperate through resource sharing without a stable network configuration and adopting different semantic models. Model-based techniques relying on the Semantic Web need to be defined to generate semantic service descriptions, allowing collaborative partners to export their functionalities in a semantic way. Semantic-based service matchmaking techniques are in charge of effectively and efficiently evaluating similarity between service requests and service offers in a huge, dynamic distributed environment. The result is an evolving service knowledge space where collaborative partners that provide similar services are semantically related and constitute synergic service centres in a given domain. Specific modeling requirements related to Semantic Web, service-oriented and P2P technologies must be considered.

  7. Provenance-Based Approaches to Semantic Web Service Discovery and Usage

    ERIC Educational Resources Information Center

    Narock, Thomas William

    2012-01-01

    The World Wide Web Consortium defines a Web Service as "a software system designed to support interoperable machine-to-machine interaction over a network." Web Services have become increasingly important both within and across organizational boundaries. With the recent advent of the Semantic Web, web services have evolved into semantic…

  8. Putting semantics into the semantic web: how well can it capture biology?

    PubMed

    Kazic, Toni

    2006-01-01

    Could the Semantic Web work for computations of biological interest in the way it's intended to work for movie reviews and commercial transactions? It would be wonderful if it could, so it's worth looking to see if its infrastructure is adequate to the job. The technologies of the Semantic Web make several crucial assumptions. I examine those assumptions; argue that they create significant problems; and suggest some alternative ways of achieving the Semantic Web's goals for biology.

  9. Reminiscing about 15 years of interoperability efforts

    DOE PAGES

    Van de Sompel, Herbert; Nelson, Michael L.

    2015-11-01

Over the past fifteen years, our perspective on tackling information interoperability problems for web-based scholarship has evolved significantly. In this opinion piece, we look back at three efforts that we have been involved in that aptly illustrate this evolution: OAI-PMH, OAI-ORE, and Memento. Understanding that no interoperability specification is neutral, we attempt to characterize the perspectives and technical toolkits that provided the basis for these endeavors. In that regard, we consider repository-centric and web-centric interoperability perspectives, and the use of a Linked Data or a REST/HATEOAS technology stack, respectively. In addition, we lament the lack of interoperability across nodes that play a role in web-based scholarship, but end on a constructive note with some ideas regarding a possible path forward.

  10. Reminiscing about 15 years of interoperability efforts

    SciTech Connect

    Van de Sompel, Herbert; Nelson, Michael L.

    2015-11-01

Over the past fifteen years, our perspective on tackling information interoperability problems for web-based scholarship has evolved significantly. In this opinion piece, we look back at three efforts that we have been involved in that aptly illustrate this evolution: OAI-PMH, OAI-ORE, and Memento. Understanding that no interoperability specification is neutral, we attempt to characterize the perspectives and technical toolkits that provided the basis for these endeavors. In that regard, we consider repository-centric and web-centric interoperability perspectives, and the use of a Linked Data or a REST/HATEOAS technology stack, respectively. In addition, we lament the lack of interoperability across nodes that play a role in web-based scholarship, but end on a constructive note with some ideas regarding a possible path forward.

  11. Scalability and interoperability within glideinWMS

    SciTech Connect

    Bradley, D.; Sfiligoi, I.; Padhi, S.; Frey, J.; Tannenbaum, T.; /Wisconsin U., Madison

    2010-01-01

    Physicists have access to thousands of CPUs in grid federations such as OSG and EGEE. With the start-up of the LHC, it is essential for individuals or groups of users to wrap together available resources from multiple sites across multiple grids under a higher user-controlled layer in order to provide a homogeneous pool of available resources. One such system is glideinWMS, which is based on the Condor batch system. A general discussion of glideinWMS can be found elsewhere. Here, we focus on recent advances in extending its reach: scalability and integration of heterogeneous compute elements. We demonstrate that the new developments exceed the design goal of over 10,000 simultaneous running jobs under a single Condor schedd, using strong security protocols across global networks, and sustaining a steady-state job completion rate of a few Hz. We also show interoperability across heterogeneous computing elements achieved using client-side methods. We discuss this technique and the challenges in direct access to NorduGrid and CREAM compute elements, in addition to Globus based systems.

  12. Scalability and interoperability within glideinWMS

    NASA Astrophysics Data System (ADS)

    Bradley, D.; Sfiligoi, I.; Padhi, S.; Frey, J.; Tannenbaum, T.

    2010-04-01

    Physicists have access to thousands of CPUs in grid federations such as OSG and EGEE. With the start-up of the LHC, it is essential for individuals or groups of users to wrap together available resources from multiple sites across multiple grids under a higher user-controlled layer in order to provide a homogeneous pool of available resources. One such system is glideinWMS, which is based on the Condor batch system. A general discussion of glideinWMS can be found elsewhere. Here, we focus on recent advances in extending its reach: scalability and integration of heterogeneous compute elements. We demonstrate that the new developments exceed the design goal of over 10,000 simultaneous running jobs under a single Condor schedd, using strong security protocols across global networks, and sustaining a steady-state job completion rate of a few Hz. We also show interoperability across heterogeneous computing elements achieved using client-side methods. We discuss this technique and the challenges in direct access to NorduGrid and CREAM compute elements, in addition to Globus based systems.

  13. GEOSS interoperability for Weather, Ocean and Water

    NASA Astrophysics Data System (ADS)

    Richardson, David; Nyenhuis, Michael; Zsoter, Ervin; Pappenberger, Florian

    2013-04-01

"Understanding the Earth system — its weather, climate, oceans, atmosphere, water, land, geodynamics, natural resources, ecosystems, and natural and human-induced hazards — is crucial to enhancing human health, safety and welfare, alleviating human suffering including poverty, protecting the global environment, reducing disaster losses, and achieving sustainable development. Observations of the Earth system constitute critical input for advancing this understanding." With this in mind, the Group on Earth Observations (GEO) started implementing the Global Earth Observation System of Systems (GEOSS). GEOWOW, short for "GEOSS interoperability for Weather, Ocean and Water", is supporting this objective. GEOWOW's main challenge is to improve Earth observation data discovery, accessibility and exploitability, and to evolve GEOSS in terms of interoperability, standardization and functionality. One of the main goals behind the GEOWOW project is to demonstrate the value of the TIGGE archive in interdisciplinary applications, providing a vast amount of useful and easily accessible information to the users through the GEO Common Infrastructure (GCI). GEOWOW aims at developing functionalities that will allow easy discovery, access and use of TIGGE archive data and of in-situ observations, e.g. from the Global Runoff Data Centre (GRDC), to support applications such as river discharge forecasting. TIGGE (THORPEX Interactive Grand Global Ensemble) is a key component of THORPEX: a World Weather Research Programme to accelerate the improvements in the accuracy of 1-day to 2 week high-impact weather forecasts for the benefit of humanity. The TIGGE archive consists of ensemble weather forecast data from ten global NWP centres, starting from October 2006, which has been made available for scientific research. The TIGGE archive has been used to analyse hydro-meteorological forecasts of flooding in Europe as well as in China. In general the analysis has been favourable in terms of

  14. Forcing Interoperability: An Intentionally Fractured Approach

    NASA Astrophysics Data System (ADS)

    Gallaher, D. W.; Brodzik, M.; Scambos, T.; Stroeve, J.

    2008-12-01

The NSIDC is attempting to rebuild a significant portion of its public-facing cyberinfrastructure to better meet the needs expressed by the cryospheric community. The project initially addresses a specific science need - understanding Greenland's contribution to global sea level rise through comparison and analysis of variables such as temperature, albedo, melt, ice velocity and surface elevation. This project will ultimately be expanded to cover most of NSIDC's cryospheric data. Like many organizations, we need to provide users with data discovery interfaces, collaboration tools and mapping services. Complicating this effort is the need to reduce the volume of raw data delivered to the user. Data growth, especially with time-series data, will overwhelm our software, processors and network like never before. We need to provide the users the ability to perform first level analysis directly on our site. In order to accomplish this, the users should be free to modify the behavior of these tools as well as incorporate their own tools and analysis to meet their needs. Rather than building one monolithic project to build this system, we have chosen to build three semi-independent systems. One team is building a data discovery and web based distribution system, the second is building an advanced analysis and workflow system and the third is building a customized web mapping service. These systems will use the same underlying data structures and services but will employ different technologies and teams, with their own objectives, schedules and user interfaces. Obviously, we are adding complexity and risk to the overall project; however, this may be the best method to achieve interoperability because the development teams will be required to build off each other's work. The teams will be forced to design with other users in mind, as opposed to treating interoperability as an afterthought, which is a tendency in monolithic systems.
All three teams will take advantage of preexisting

  15. River Basin Standards Interoperability Pilot

    NASA Astrophysics Data System (ADS)

    Pesquer, Lluís; Masó, Joan; Stasch, Christoph

    2016-04-01

There are many water information resources and tools in Europe applicable to river basin management, but fragmentation and a lack of coordination between countries still exist. The European Commission and the member states have financed several research and innovation projects in support of the Water Framework Directive. Only a few of them are using the recently emerging hydrological standards, such as the OGC WaterML 2.0. WaterInnEU is a Horizon 2020 project focused on creating a marketplace to enhance the exploitation of EU funded ICT models, tools, protocols and policy briefs related to water, and to establish suitable conditions for new market opportunities based on these offerings. One of WaterInnEU's main goals is to assess the level of standardization and interoperability of these outcomes as a mechanism to integrate ICT-based tools, incorporate open data platforms and generate a palette of interchangeable components that are able to use the water data emerging from the recently proposed open data sharing processes and data models stimulated by initiatives such as the INSPIRE directive. As part of the standardization and interoperability activities in the project, the authors are designing an experiment (RIBASE, the present work) to demonstrate how current ICT-based tools and water data can work in combination with geospatial web services in the Scheldt river basin. The main structure of this experiment, which is the core of the present work, is composed of the following steps: - Extraction of information from river gauge data in OGC WaterML 2.0 format using SOS services (preferably compliant to the OGC SOS 2.0 Hydrology Profile Best Practice). - Modelling of floods using a WPS 2.0, WaterML 2.0 data and weather forecast models as input. - Evaluation of the applicability of Sensor Notification Services in water emergencies. - Open distribution of the input and output data as OGC web services WaterML / WCS / WFS and with visualization utilities: WMS.
The architecture
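The first step listed above, extracting gauge data through SOS, amounts to issuing a GetObservation request. A minimal sketch of building such a KVP (key-value pair) request follows; the endpoint URL and the offering and property identifiers are invented for illustration, and a real deployment would use the basin's own service with WaterML 2.0 as the response format.

```python
from urllib.parse import urlencode

# Hypothetical SOS 2.0 endpoint and identifiers (not from the abstract).
endpoint = "https://example.org/sos"
params = {
    "service": "SOS",
    "version": "2.0.0",
    "request": "GetObservation",
    "offering": "river-gauge-scheldt",     # hypothetical offering id
    "observedProperty": "water-level",     # hypothetical property id
    "responseFormat": "http://www.opengis.net/waterml/2.0",
}
url = endpoint + "?" + urlencode(params)
```

Fetching `url` over HTTP would, against a conformant service, return observations encoded in WaterML 2.0, ready to feed the flood-modelling WPS in the next step.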

  16. Joint Command and Control: Integration Not Interoperability

    DTIC Science & Technology

    2013-03-01

field and operate distinct, in effect, stand-alone C2 systems. However, every operational level event is Joint. In order to incorporate their...capabilities into Joint operations, the Services have developed distinct C2 systems with various levels of interoperability, but none of them are truly...separate computer and communication equipment. Besides having to engineer interoperability, the Services also must determine the level of

  17. Data interoperability software solution for emergency reaction in the Europe Union

    NASA Astrophysics Data System (ADS)

    Casado, R.; Rubiera, E.; Sacristan, M.; Schütte, F.; Peters, R.

    2015-07-01

Emergency management becomes more challenging in international crisis episodes because of cultural, semantic and linguistic differences between all stakeholders, especially first responders. Misunderstandings between first responders make decision making slower and more difficult. However, the spread and development of networks and IT-based emergency management systems (EMSs) have improved emergency responses, which have become more coordinated. Despite improvements made in recent years, EMSs have still not solved problems related to cultural, semantic and linguistic differences, which are the real cause of slower decision making. In addition, from a technical perspective, the consolidation of current EMSs and the different formats used to exchange information pose another problem to be solved in any solution proposed for information interoperability between heterogeneous EMSs in different contexts. To overcome these problems, we present a software solution based on semantic and mediation technologies. EMERGency ELements (EMERGEL) (Fundacion CTIC and AntwortING Ingenieurbüro PartG, 2013), a common and modular ontology shared by all the stakeholders, has been defined. It offers the best solution to gather all stakeholders' knowledge in a unique and flexible data model, taking into account different countries' cultural and linguistic issues. To deal with the diversity of data protocols and formats, we have designed a service-oriented architecture for data interoperability (named DISASTER: Data Interoperability Solution At STakeholders Emergency Reaction), providing a flexible, extensible solution to solve the mediation issues. Web services have been adopted as the specific technology to implement this paradigm, as they have the most significant academic and industrial visibility and attraction.
Contributions of this work have been validated through the design and development of a cross-border realistic prototype scenario, actively involving both emergency managers and emergency
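The mediation idea, normalizing each stakeholder's message format into the common model before re-emitting it for the receiver, can be sketched as below. The field names and the two message formats are invented for illustration; EMERGEL defines the actual common data model, and DISASTER implements the mediation as web services rather than in-process functions.

```python
import json

def from_ems_a(msg: dict) -> dict:
    # Hypothetical format A with German keys, as one national EMS might use.
    return {"event": msg["ereignis"], "location": msg["ort"]}

def to_ems_b(common: dict) -> str:
    # Hypothetical format B: flat JSON with English keys for the receiving EMS.
    return json.dumps({"incident": common["event"], "place": common["location"]})

# A message crossing the border is mapped A -> common model -> B.
wire = to_ems_b(from_ems_a({"ereignis": "flood", "ort": "Maastricht"}))
```

As with any pivot-model design, each EMS needs only one adapter pair to and from the common model, instead of one adapter per partner system.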

  18. Preserved Musical Semantic Memory in Semantic Dementia

    PubMed Central

    Weinstein, Jessica; Koenig, Phyllis; Gunawardena, Delani; McMillan, Corey; Bonner, Michael; Grossman, Murray

    2012-01-01

Objective: To understand the scope of semantic impairment in semantic dementia. Design: Case study. Setting: Academic medical center. Patient: A man with semantic dementia, as demonstrated by clinical, neuropsychological, and imaging studies. Main Outcome Measures: Music performance and magnetic resonance imaging results. Results: Despite profoundly impaired semantic memory for words and objects due to left temporal lobe atrophy, this semiprofessional musician was creative and expressive in demonstrating preserved musical knowledge. Conclusion: Long-term representations of words and objects in semantic memory may be dissociated from meaningful knowledge in other domains, such as music. PMID:21320991

  19. Maturity Model for Advancing Smart Grid Interoperability

    SciTech Connect

    Knight, Mark; Widergren, Steven E.; Mater, J.; Montgomery, Austin

    2013-10-28

Interoperability is about the properties of devices and systems to connect and work properly. Advancing interoperability eases integration and maintenance of the resulting interconnection. This leads to faster integration, lower labor and component costs, predictability of projects and the resulting performance, and evolutionary paths for upgrade. When specifications are shared and standardized, competition and novel solutions can bring new value streams to the community of stakeholders involved. Advancing interoperability involves reaching agreement for how things join at their interfaces. The quality of the agreements and the alignment of parties involved in the agreement present challenges that are best met with process improvement techniques. The GridWise® Architecture Council (GWAC) sponsored by the United States Department of Energy is supporting an effort to use concepts from capability maturity models used in the software industry to advance interoperability of smart grid technology. An interoperability maturity model has been drafted and experience is being gained through trials on various types of projects and community efforts. This paper describes the value and objectives of maturity models, the nature of the interoperability maturity model and how it compares with other maturity models, and experiences gained with its use.

  20. Building biomedical web communities using a semantically aware content management system.

    PubMed

    Das, Sudeshna; Girard, Lisa; Green, Tom; Weitzman, Louis; Lewis-Bowen, Alister; Clark, Tim

    2009-03-01

Web-based biomedical communities are becoming an increasingly popular vehicle for sharing information amongst researchers and are fast gaining an online presence. However, information organization and exchange in such communities is usually unstructured, rendering interoperability between communities difficult. Furthermore, specialized software to create such communities at low cost, targeted at the specific common information requirements of biomedical researchers, has been largely lacking. At the same time, a growing number of biological knowledge bases and biomedical resources are being structured for the Semantic Web. Several groups are creating reference ontologies for the biomedical domain, actively publishing controlled vocabularies and making data available in Resource Description Framework (RDF) language. We have developed the Science Collaboration Framework (SCF) as a reusable platform for advanced structured online collaboration in biomedical research that leverages these ontologies and RDF resources. SCF supports structured 'Web 2.0' style community discourse amongst researchers, makes heterogeneous data resources available to the collaborating scientist, captures the semantics of the relationship among the resources and structures discourse around the resources. The first instance of the SCF framework is being used to create an open-access online community for stem cell research-StemBook (http://www.stembook.org). We believe that such a framework is required to achieve optimal productivity and leveraging of resources in interdisciplinary scientific research. We expect it to be particularly beneficial in highly interdisciplinary areas, such as neurodegenerative disease and neurorepair research, as well as having broad utility across the natural sciences.

  1. Integrated Data Capturing Requirements for 3d Semantic Modelling of Cultural Heritage: the Inception Protocol

    NASA Astrophysics Data System (ADS)

    Di Giulio, R.; Maietti, F.; Piaia, E.; Medici, M.; Ferrari, F.; Turillazzi, B.

    2017-02-01

The generation of high quality 3D models can still be very time-consuming and expensive, and the outcome of digital reconstructions is frequently provided in formats that are not interoperable, and therefore cannot be easily accessed. This challenge is even more crucial for complex architectures and large heritage sites, which involve a large amount of data to be acquired, managed and enriched by metadata. In this framework, the ongoing EU funded project INCEPTION - Inclusive Cultural Heritage in Europe through 3D semantic modelling proposes a workflow aimed at the achievement of efficient 3D digitization methods, post-processing tools for an enriched semantic modelling, web-based solutions and applications to ensure a wide access to experts and non-experts. In order to face these challenges and to start solving the issue of the large amount of captured data and time-consuming processes in the production of 3D digital models, an Optimized Data Acquisition Protocol (DAP) has been set up. The purpose is to guide the processes of digitization of cultural heritage, respecting needs, requirements and specificities of cultural assets.

  2. SOLE: Applying Semantics and Social Web to Support Technology Enhanced Learning in Software Engineering

    NASA Astrophysics Data System (ADS)

    Colomo-Palacios, Ricardo; Jiménez-López, Diego; García-Crespo, Ángel; Blanco-Iglesias, Borja

eLearning educative processes are a challenge for educative institutions and education professionals. In an environment in which learning resources are being produced, catalogued and stored in innovative ways, SOLE provides a platform in which exam questions can be produced supported by Web 2.0 tools, catalogued and labeled via the semantic web, and stored and distributed using eLearning standards. This paper presents SOLE, a social network for sharing exam questions, particularized for the Software Engineering domain, based on semantics and built using semantic web and eLearning standards, such as the IMS Question and Test Interoperability specification 2.1.

  3. Latest developments for the IAGOS database: Interoperability and metadata

    NASA Astrophysics Data System (ADS)

    Boulanger, Damien; Gautron, Benoit; Thouret, Valérie; Schultz, Martin; van Velthoven, Peter; Broetz, Bjoern; Rauthe-Schöch, Armin; Brissebrat, Guillaume

    2014-05-01

In-service Aircraft for a Global Observing System (IAGOS, http://www.iagos.org) aims at the provision of long-term, frequent, regular, accurate, and spatially resolved in situ observations of the atmospheric composition. IAGOS observation systems are deployed on a fleet of commercial aircraft. The IAGOS database is an essential part of the global atmospheric monitoring network. Data access is handled by an open access policy based on the submission of research requests which are reviewed by the PIs. Users can access the data through the following web sites: http://www.iagos.fr or http://www.pole-ether.fr as the IAGOS database is part of the French atmospheric chemistry data centre ETHER (CNES and CNRS). The database is in continuous development and improvement. In the framework of the IGAS project (IAGOS for GMES/COPERNICUS Atmospheric Service), major achievements will be reached, such as metadata and format standardisation in order to interoperate with international portals and other databases, QA/QC procedures and traceability, CARIBIC (Civil Aircraft for the Regular Investigation of the Atmosphere Based on an Instrument Container) data integration within the central database, and real-time data transmission. IGAS work package 2 aims at providing the IAGOS data to users in a standardized format including the necessary metadata and information on data processing, data quality and uncertainties. We are currently redefining and standardizing the IAGOS metadata for interoperable use within GMES/Copernicus. The metadata are compliant with the ISO 19115, INSPIRE and NetCDF-CF conventions. IAGOS data will be provided to users in NetCDF or NASA Ames format. We are also implementing interoperability between all the involved IAGOS data services, including the central IAGOS database, the former MOZAIC and CARIBIC databases, Aircraft Research DLR database and the Jülich WCS web application JOIN (Jülich OWS Interface) which combines model outputs with in situ data for

  4. Code lists for interoperability - Principles and best practices in INSPIRE

    NASA Astrophysics Data System (ADS)

    Lutz, M.; Portele, C.; Cox, S.; Murray, K.

    2012-04-01

external vocabulary. In the former case, for each value, an external identifier, one or more labels (possibly in different languages), a definition and other metadata should be specified. In the latter case, the external vocabulary should be characterised, e.g. by specifying the version to be used, the format(s) in which the vocabulary is available, possible constraints (e.g. if only a specific part of the external list is to be used), rules for using values in the encoding of instance data, and the maintenance rules applied to the external vocabulary. This information is crucial for enabling implementation and interoperability in distributed systems (such as SDIs) and should be made available through a code list registry. While the information on allowed code list values is thus usually managed outside the UML application schema, we recommend inclusion of «codeList»-stereotyped classes in the model for semantic clarity. Information on the obligation, extensibility and a reference to the specified values should be provided through tagged values. Acknowledgements: The authors would like to thank the INSPIRE Thematic Working Groups, the Data Specifications Drafting Team and the JRC Contact Points for their contributions to the discussions on code lists in INSPIRE and to this abstract.
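The registry entry described above (identifier, labels, definitions, extensibility rule) can be sketched as a small data structure plus a validator. The field names and the code list itself are illustrative only, not the INSPIRE registry schema.

```python
# Sketch of a code-list registry entry; all names and values are invented
# for illustration, not taken from the INSPIRE registry.
code_list = {
    "id": "http://example.org/codelist/SurfaceWaterType",  # external identifier
    "extensibility": "narrower",  # whether data providers may add their own values
    "values": {
        "river": {"label": {"en": "River", "de": "Fluss"},
                  "definition": "A natural flowing watercourse."},
        "lake":  {"label": {"en": "Lake", "de": "See"},
                  "definition": "A body of standing water."},
    },
}

def is_allowed(clist, value, extended_values=()):
    """Check an instance-data value against the code list and its extensibility rule."""
    if value in clist["values"]:
        return True
    # extensible lists may accept provider-defined values
    return clist["extensibility"] != "none" and value in extended_values

print(is_allowed(code_list, "river"))  # True
```

A client resolving the `id` against a code list registry would retrieve exactly this kind of record, which is why publishing it is crucial for interoperability in distributed systems.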

  5. A web services choreography scenario for interoperating bioinformatics applications

    PubMed Central

    de Knikker, Remko; Guo, Youjun; Li, Jin-long; Kwan, Albert KH; Yip, Kevin Y; Cheung, David W; Cheung, Kei-Hoi

    2004-01-01

    Background Very often genome-wide data analysis requires the interoperation of multiple databases and analytic tools. A large number of genome databases and bioinformatics applications are available through the web, but it is difficult to automate interoperation because: 1) the platforms on which the applications run are heterogeneous, 2) their web interface is not machine-friendly, 3) they use a non-standard format for data input and output, 4) they do not exploit standards to define application interface and message exchange, and 5) existing protocols for remote messaging are often not firewall-friendly. To overcome these issues, web services have emerged as a standard XML-based model for message exchange between heterogeneous applications. Web services engines have been developed to manage the configuration and execution of a web services workflow. Results To demonstrate the benefit of using web services over traditional web interfaces, we compare the two implementations of HAPI, a gene expression analysis utility developed by the University of California San Diego (UCSD) that allows visual characterization of groups or clusters of genes based on the biomedical literature. This utility takes a set of microarray spot IDs as input and outputs a hierarchy of MeSH Keywords that correlates to the input and is grouped by Medical Subject Heading (MeSH) category. While the HTML output is easy for humans to visualize, it is difficult for computer applications to interpret semantically. To facilitate the capability of machine processing, we have created a workflow of three web services that replicates the HAPI functionality. These web services use document-style messages, which means that messages are encoded in an XML-based format. We compared three approaches to the implementation of an XML-based workflow: a hard coded Java application, Collaxa BPEL Server and Taverna Workbench. The Java program functions as a web services engine and interoperates with these web
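The three-service workflow described above can be sketched as a hard-coded pipeline of document-style message exchanges, analogous to the paper's hard-coded Java engine. The service functions below are stand-ins for the real HAPI services; the element names and the toy MeSH assignments are invented.

```python
# Hard-coded workflow sketch: three stand-in "services" exchanging
# XML documents, mirroring the spot IDs -> genes -> grouped MeSH keywords flow.
import xml.etree.ElementTree as ET

def spot_ids_to_genes(doc):
    # stand-in service 1: map microarray spot IDs to gene symbols
    out = ET.Element("genes")
    for spot in doc.iter("spot"):
        ET.SubElement(out, "gene").text = "GENE_" + spot.text
    return out

def genes_to_keywords(doc):
    # stand-in service 2: attach a (fabricated) MeSH keyword to each gene
    out = ET.Element("keywords")
    for gene in doc.iter("gene"):
        kw = ET.SubElement(out, "keyword", category="Diseases")
        kw.text = gene.text + ":Neoplasms"
    return out

def group_by_category(doc):
    # stand-in service 3: group keywords by MeSH category
    groups = {}
    for kw in doc.iter("keyword"):
        groups.setdefault(kw.get("category"), []).append(kw.text)
    return groups

request = ET.fromstring("<spots><spot>101</spot><spot>202</spot></spots>")
result = group_by_category(genes_to_keywords(spot_ids_to_genes(request)))
print(result)
```

In a BPEL or Taverna deployment each function would instead be a SOAP call carrying the XML document in the message body; the point of the sketch is that document-style messages make each step machine-interpretable, unlike the original HTML output.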

  6. Geo-Information Catalog Services Interoperability: an Experimented Tool

    NASA Astrophysics Data System (ADS)

    Nativi, S.; Bigagli, L.; Mazzetti, P.; Mattia, U.

    2006-12-01

Several geo-information communities (e.g. oceanography, atmospheric science, earth observation, etc.) have developed tailored metadata specifications for data discovery, evaluation and use. They conceived these models either by profiling standard models (e.g. the ISO 19115 metadata specification) or by enriching existing and well-accepted data models (e.g. the THREDDS/OPeNDAP/netCDF data model) in order to capture and describe more semantics. These metadata profiles have generated a set of related catalog services that characterize the different communities, initiatives and projects (e.g. INSPIRE, MERSEA, LEAD, etc.). In addition, specific catalog services have been generated by profiling standard catalog services which were designed to accomplish the general requirements coming from the geo-information community (e.g. OGC CS-W). Indeed, implementing catalog services interoperability is a near-term challenge in support of fully functional and useful discovery and sharing infrastructures for spatial data. Implementing catalog services interoperability requires metadata profile harmonization and discovery protocol adaptation and mediation. In an over-simplified way, these solutions may be considered catalogues of catalogues or catalogue broker components. We conceived a solution for making several well-accepted catalogue services interoperable (e.g. OGC services, THREDDS, ESA EOLI, MERSEA CDI, etc.). This solution has been experimented as a stand-alone application tool, called GI-go. More recently, we re-engineered this approach as a service-oriented framework of modular components. We implemented a caching brokering catalog service which acts as a broker towards heterogeneous catalogue services dealing with IGCD (Imagery Gridded and Coverage Data). This service is called GI-cat; it implements metadata harmonization and discovery protocol adaptation. GI-cat supports query distribution allowing its clients to discover and evaluate the datasets, managed by the federated
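The broker pattern described above, distributing a query across heterogeneous catalogues and harmonizing their records into a common model, can be sketched as follows. The catalogue adapters and record fields are invented stand-ins, not the actual GI-cat interfaces.

```python
# Toy catalogue broker in the spirit of GI-cat: fan a query out to
# heterogeneous catalogues, then map native records to a common model.
# Both "catalogues" below are fabricated stubs for illustration.
def thredds_search(text):
    return [{"dataset_name": "sst_2006", "abstract": "Sea surface temperature"}]

def csw_search(text):
    return [{"title": "SST gridded", "summary": "Sea surface temperature coverage"}]

ADAPTERS = [
    # (search function, mapping from native field names to the common model)
    (thredds_search, {"dataset_name": "title", "abstract": "description"}),
    (csw_search, {"title": "title", "summary": "description"}),
]

def broker_query(text):
    """Distribute a query to every federated catalogue and harmonize results."""
    results = []
    for search, mapping in ADAPTERS:
        for record in search(text):
            results.append({common: record[native]
                            for native, common in mapping.items()})
    return results

for rec in broker_query("temperature"):
    print(rec["title"], "-", rec["description"])
```

Each adapter pair plays the role of a protocol-and-metadata mediator; adding a new catalogue to the federation only means adding one more adapter, which is what makes the broker approach scale.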

  7. Differentiating Sense through Semantic Interaction Data

    PubMed Central

    Elizabeth Workman, T.; Weir, Charlene; Rindflesch, Thomas C.

    2016-01-01

    Words which have different representations but are semantically related, such as dementia and delirium, can pose difficult issues in understanding text. We explore the use of interaction frequency data between semantic elements as a means to differentiate concept pairs, using semantic predications extracted from the biomedical literature. We applied datasets of features drawn from semantic predications for semantically related pairs to two Expectation Maximization clustering processes (without, and with concept labels), then used all data to train and evaluate several concept classifying algorithms. For the unlabeled datasets, 80% displayed expected cluster count and similar or matching proportions; all labeled data exhibited similar or matching proportions when restricting cluster count to unique labels. The highest performing classifier achieved 89% accuracy, with F1 scores for individual concept classification ranging from 0.69 to 1. We conclude with a discussion on how these findings may be applied to natural language processing of clinical text. PMID:28269921
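The interaction-frequency features described above can be sketched by counting, for each concept, how often it participates in each (predicate, co-occurring concept) pair. The toy predications below are fabricated for illustration, not SemRep output.

```python
# Sketch of deriving interaction-frequency features from semantic
# predications; the predications themselves are invented examples.
from collections import Counter

predications = [
    ("Dementia", "ASSOCIATED_WITH", "Amyloid"),
    ("Dementia", "ASSOCIATED_WITH", "Aging"),
    ("Delirium", "CAUSED_BY", "Infection"),
    ("Delirium", "ASSOCIATED_WITH", "Aging"),
    ("Dementia", "CAUSED_BY", "Amyloid"),
]

def interaction_features(concept):
    """Frequency of (predicate, object) interactions for one subject concept."""
    return Counter((pred, obj)
                   for subj, pred, obj in predications if subj == concept)

print(interaction_features("Dementia"))
print(interaction_features("Delirium"))
```

Feature vectors like these, one per concept, are what a clustering or classification step (e.g. Expectation Maximization) would consume to separate semantically related pairs such as dementia and delirium.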

  8. Social Semantics for an Effective Enterprise

    NASA Technical Reports Server (NTRS)

    Berndt, Sarah; Doane, Mike

    2012-01-01

    An evolution of the Semantic Web, the Social Semantic Web (s2w), facilitates knowledge sharing with "useful information based on human contributions, which gets better as more people participate." The s2w reaches beyond the search box to move us from a collection of hyperlinked facts, to meaningful, real time context. When focused through the lens of Enterprise Search, the Social Semantic Web facilitates the fluid transition of meaningful business information from the source to the user. It is the confluence of human thought and computer processing structured with the iterative application of taxonomies, folksonomies, ontologies, and metadata schemas. The importance and nuances of human interaction are often deemphasized when focusing on automatic generation of semantic markup, which results in dissatisfied users and unrealized return on investment. Users consistently qualify the value of information sets through the act of selection, making them the de facto stakeholders of the Social Semantic Web. Employers are the ultimate beneficiaries of s2w utilization with a better informed, more decisive workforce; one not achieved with an IT miracle technology, but by improved human-computer interactions. Johnson Space Center Taxonomist Sarah Berndt and Mike Doane, principal owner of Term Management, LLC discuss the planning, development, and maintenance stages for components of a semantic system while emphasizing the necessity of a Social Semantic Web for the Enterprise. Identification of risks and variables associated with layering the successful implementation of a semantic system are also modeled.

  9. Scientific Digital Libraries, Interoperability, and Ontologies

    NASA Technical Reports Server (NTRS)

    Hughes, J. Steven; Crichton, Daniel J.; Mattmann, Chris A.

    2009-01-01

    Scientific digital libraries serve complex and evolving research communities. Justifications for the development of scientific digital libraries include the desire to preserve science data and the promises of information interconnectedness, correlative science, and system interoperability. Shared ontologies are fundamental to fulfilling these promises. We present a tool framework, some informal principles, and several case studies where shared ontologies are used to guide the implementation of scientific digital libraries. The tool framework, based on an ontology modeling tool, was configured to develop, manage, and keep shared ontologies relevant within changing domains and to promote the interoperability, interconnectedness, and correlation desired by scientists.

  10. An Interoperability Platform Enabling Reuse of Electronic Health Records for Signal Verification Studies

    PubMed Central

    Yuksel, Mustafa; Gonul, Suat; Laleci Erturkmen, Gokce Banu; Sinaci, Ali Anil; Invernizzi, Paolo; Facchinetti, Sara; Migliavacca, Andrea; Bergvall, Tomas; Depraetere, Kristof; De Roo, Jos

    2016-01-01

Depending mostly on voluntarily sent spontaneous reports, pharmacovigilance studies are hampered by low quantity and quality of patient data. Our objective is to improve postmarket safety studies by enabling safety analysts to seamlessly access a wide range of EHR sources for collecting deidentified medical data sets of selected patient populations and tracing the reported incidents back to original EHRs. We have developed an ontological framework where EHR sources and target clinical research systems can continue using their own local data models, interfaces, and terminology systems, while structural interoperability and semantic interoperability are handled through rule-based reasoning on formal representations of different models and terminology systems maintained in the SALUS Semantic Resource Set. The SALUS Common Information Model at the core of this set acts as the common mediator. We demonstrate the capabilities of our framework through one of the SALUS safety analysis tools, namely, the Case Series Characterization Tool, which has been deployed on top of the regional EHR Data Warehouse of the Lombardy Region, containing about 1 billion records from 16 million patients, and validated by several pharmacovigilance researchers with real-life cases. The results confirm significant improvements in signal detection and evaluation compared to traditional methods, which lack this background information. PMID:27123451
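The mediation pattern described above, where each side keeps its local terminology and a common information model bridges them, can be sketched with a rule table. The code systems, codes, and mappings below are invented; the actual SALUS resource set uses formal ontology representations and rule-based reasoning rather than a flat dictionary.

```python
# Sketch of terminology mediation through a common model. All code systems
# and codes here are hypothetical examples, not SALUS content.
TO_COMMON = {
    ("ehr-local", "MI-01"): "common:MyocardialInfarction",
    ("research-local", "HEART_ATTACK"): "common:MyocardialInfarction",
    ("ehr-local", "DM-02"): "common:DiabetesMellitus",
}

def translate(source_system, code, target_system):
    """Map a local code into the common model, then out to the target system."""
    common = TO_COMMON[(source_system, code)]
    for (system, local), concept in TO_COMMON.items():
        if system == target_system and concept == common:
            return local
    raise KeyError(f"no {target_system} code for {common}")

print(translate("ehr-local", "MI-01", "research-local"))  # HEART_ATTACK
```

Because every system maps only to and from the common mediator, adding an n-th terminology requires one new mapping rather than n-1 pairwise ones, the usual argument for a common information model.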

  11. The MED-SUV Multidisciplinary Interoperability Infrastructure

    NASA Astrophysics Data System (ADS)

    Mazzetti, Paolo; D'Auria, Luca; Reitano, Danilo; Papeschi, Fabrizio; Roncella, Roberto; Puglisi, Giuseppe; Nativi, Stefano

    2016-04-01

In accordance with the international Supersite initiative concept, the MED-SUV (MEDiterranean SUpersite Volcanoes) European project (http://med-suv.eu/) aims to enable long-term monitoring experiments in two relevant geologically active regions of Europe prone to natural hazards: Mt. Vesuvio/Campi Flegrei and Mt. Etna. This objective requires the integration of existing components, such as monitoring systems and databases, with novel sensors for the measurement of volcanic parameters. Moreover, MED-SUV is also a direct contribution to the Global Earth Observation System of Systems (GEOSS), as one of the volcano Supersites recognized by the Group on Earth Observation (GEO). To achieve its goal, MED-SUV set up an advanced e-infrastructure allowing the discovery of and access to heterogeneous data for multidisciplinary applications, and the integration with external systems like GEOSS. The MED-SUV overall infrastructure is conceived as a three-layer architecture, with the lower layer (Data level) including the identified relevant data sources, the mid-tier (Supersite level) including components for mediation and harmonization, and the upper tier (Global level) composed of the systems that MED-SUV must serve, such as GEOSS and possibly other global/community systems. The Data level is mostly composed of existing data sources, such as space agencies' satellite data archives, the UNAVCO system, and the INGV-Rome data service. They share data according to different specifications for metadata, data and service interfaces, and cannot be changed. Thus, the only relevant MED-SUV activity at this level was the creation of a MED-SUV local repository based on Web Accessible Folder (WAF) technology, deployed in the INGV site in Catania, and hosting in-situ data and products collected and generated during the project. 
The Supersite level is at the core of the MED-SUV architecture, since it must mediate between the disparate data sources in the layer below, and provide a harmonized view to

  12. 47 CFR 90.525 - Administration of interoperability channels.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

47 CFR, Telecommunication, Vol. 5 (2010-10-01). Frequencies in the 763-775 and 793-805 MHz Bands § 90.525 Administration of interoperability channels. (a) States are responsible for administration of the Interoperability channels in the 769-775 MHz and...

  13. 47 CFR 90.525 - Administration of interoperability channels.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

47 CFR, Telecommunication, Vol. 5 (2011-10-01). Frequencies in the 763-775 and 793-805 MHz Bands § 90.525 Administration of interoperability channels. (a) States are responsible for administration of the Interoperability channels in the 769-775 MHz and...

  14. WS/PIDS: standard interoperable PIDS in web services environments.

    PubMed

    Vasilescu, E; Dorobanţu, M; Govoni, S; Padh, S; Mun, S K

    2008-01-01

An electronic health record depends on the consistent handling of people's identities within and outside healthcare organizations. Currently, the Person Identification Service (PIDS), a CORBA specification, is the only well-researched standard that meets these needs. In this paper, we introduce WS/PIDS, a PIDS specification for Web Services (WS) that closely matches the original PIDS and improves on it by providing explicit support for medical multimedia attributes. WS/PIDS is currently supported by a test implementation, layered on top of a PIDS back-end, with Java-based, .NET-based, and Web clients. WS/PIDS is interoperable among platforms; it preserves PIDS semantics to a large extent, and it is intended to be fully compliant with established and emerging WS standards. The specification is open source and immediately usable in dynamic clinical systems participating in grid environments. WS/PIDS has been tested successfully with a comprehensive set of use cases, and it is being used in a clinical research setting.

  15. Semantics via Machine Translation

    ERIC Educational Resources Information Center

    Culhane, P. T.

    1977-01-01

    Recent experiments in machine translation have given the semantic elements of collocation in Russian more objective criteria. Soviet linguists in search of semantic relationships have attempted to devise a semantic synthesis for construction of a basic language for machine translation. One such effort is summarized. (CHK)

  16. SEMANTICS AND CRITICAL READING.

    ERIC Educational Resources Information Center

    FLANIGAN, MICHAEL C.

Proficiency in critical reading can be accelerated by making students aware of various semantic devices that help clarify meanings and purposes. Excerpts from the article "Teen-Age Corruption" from the ninth-grade semantics unit written by the Project English Demonstration Center at Euclid, Ohio, are used to illustrate how semantics relate to…

  17. Project Integration Architecture: Formulation of Semantic Parameters

    NASA Technical Reports Server (NTRS)

    Jones, William Henry

    2005-01-01

    One of several key elements of the Project Integration Architecture (PIA) is the intention to formulate parameter objects which convey meaningful semantic information. In so doing, it is expected that a level of automation can be achieved in the consumption of information content by PIA-consuming clients outside the programmatic boundary of a presenting PIA-wrapped application. This paper discusses the steps that have been recently taken in formulating such semantically-meaningful parameters.

  18. Stuart Sutton, Associate Professor, University of Washington iSchool: From Discourse Communities to the Semantic Web.

    ERIC Educational Resources Information Center

    Forsythe, Kathleen

    2002-01-01

    In this interview Professor Stuart Sutton discusses proliferation of metadata schemas as an outgrowth of various discourse communities as they find their niche on the semantic Web. Highlights include interoperability; cataloging tools, including GEMCat; and the role of librarians and information science education in the development of Internet…

  19. Smart Grid Interoperability Maturity Model Beta Version

    SciTech Connect

Widergren, Steven E.; Drummond, R.; Giroti, Tony; Houseman, Doug; Knight, Mark; Levinson, Alex; Longcore, Wayne; Lowe, Randy; Mater, J.; Oliver, Terry V.; Slack, Phil; Tolk, Andreas; Montgomery, Austin

    2011-12-02

    The GridWise Architecture Council was formed by the U.S. Department of Energy to promote and enable interoperability among the many entities that interact with the electric power system. This balanced team of industry representatives proposes principles for the development of interoperability concepts and standards. The Council provides industry guidance and tools that make it an available resource for smart grid implementations. In the spirit of advancing interoperability of an ecosystem of smart grid devices and systems, this document presents a model for evaluating the maturity of the artifacts and processes that specify the agreement of parties to collaborate across an information exchange interface. You are expected to have a solid understanding of large, complex system integration concepts and experience in dealing with software component interoperation. Those without this technical background should read the Executive Summary for a description of the purpose and contents of the document. Other documents, such as checklists, guides, and whitepapers, exist for targeted purposes and audiences. Please see the www.gridwiseac.org website for more products of the Council that may be of interest to you.

  20. GIS interoperability: current activities and military implications

    NASA Astrophysics Data System (ADS)

    Lam, Sylvia

    1997-07-01

    Geographic information systems (GIS) are gaining importance in military operations because of their capability to spatially and visually integrate various kinds of information. In an era of limited resources, geospatial data must be shared efficiently whenever possible. The military-initiated Global Geospatial Information and Services (GGI&S) Project aims at developing the infrastructure for GIS interoperability for the military. Current activities in standardization and new technology have strong implications on the design and development of GGI&S. To facilitate data interoperability at both the national and international levels, standards and specifications in geospatial data sharing are being studied, developed and promoted. Of particular interest to the military community are the activities related to the NATO DIGEST, ISO TC/211 Geomatics standardization and the industry-led Open Geodata Interoperability Specifications (OGIS). Together with new information technology, standardization provides the infrastructure for interoperable GIS for both civilian and military environments. The first part of this paper describes the major activities in standardization. The second part presents the technologies developed at DREV in support of the GGI&S. These include the Open Geospatial Datastore Interface (OGDI) and the geospatial data warehouse. DREV has been working closely with Defence Geomatics and private industry in the research and development of new technology for the GGI&S project.

  1. Specific interoperability problems of security infrastructure services.

    PubMed

    Pharow, Peter; Blobel, Bernd

    2006-01-01

    Communication and co-operation in healthcare and welfare require a well-defined set of security services based on a standards-based interoperable security infrastructure and provided by a Trusted Third Party. Generally, the services describe status and relation of communicating principals, corresponding keys and attributes, and the access rights to both applications and data. Legal, social, behavioral and ethical requirements demand securely stored patient information and well-established access tools and tokens. Electronic signatures as means for securing integrity of messages and files, certified time stamps and time signatures are important for accessing and storing data in Electronic Health Record Systems. The key for all these services is a secure and reliable procedure for authentication (identification and verification). While mentioning technical problems (e.g. lifetime of the storage devices, migration of retrieval and presentation software), this paper aims at identifying harmonization and interoperability requirements of securing data items, files, messages, sets of archived items or documents, and life-long Electronic Health Records based on a secure certificate-based identification. It's commonly known that just relying on existing and emerging security standards does not necessarily guarantee interoperability of different security infrastructure approaches. So certificate separation can be a key to modern interoperable security infrastructure services.

  2. EVA safety: Space suit system interoperability

    NASA Technical Reports Server (NTRS)

    Skoog, A. I.; McBarron, J. W.; Abramov, L. P.; Zvezda, A. O.

    1995-01-01

The results and the recommendations of the International Academy of Astronautics extravehicular activities (IAA EVA) Committee work are presented. The IAA EVA protocols and operation were analyzed for harmonization procedures and for the standardization of safety-critical and operationally important interfaces. The key role of EVA was considered, along with ways to improve the situation based on the identified EVA space suit system interoperability deficiencies.

  3. THE Interoperability Challenge for the Geosciences: Stepping up from Interoperability between Disciplinary Siloes to Creating Transdisciplinary Data Platforms.

    NASA Astrophysics Data System (ADS)

    Wyborn, L. A.; Evans, B. J. K.; Trenham, C.; Druken, K. A.; Wang, J.

    2015-12-01

The National Computational Infrastructure (NCI) at the Australian National University (ANU) has collocated over 10 PB of national and international data assets within a HPC facility to create the National Environmental Research Data Interoperability Platform (NERDIP). The data span a wide range of fields from the earth systems and environment (climate, coasts, oceans, and geophysics) through to astronomy, bioinformatics, and the social sciences. These diverse data collections are collocated on a major data storage node that is linked to a Petascale HPC and Cloud facility. Users can search across all of the collections and either log in and access the data directly, or they can access the data via standards-based web services. These collocated petascale data collections are theoretically a massive resource for interdisciplinary science at scales and resolutions never hitherto possible. But once collocated, multiple barriers became apparent that make cross-domain data integration very difficult and often so time consuming that either less ambitious research goals are attempted or the project is abandoned. Incompatible content is only one half of the problem: other showstoppers are differing access models, licences and issues of ownership of derived products. Brokers can enable interdisciplinary research, but in reality are we just delaying the inevitable? A call to action is required to adopt a transdisciplinary approach at the conception of development of new multi-disciplinary systems, whereby those across all the scientific domains, the humanities, social sciences and beyond work together to create a unity of informatics platforms that interoperate horizontally across the multiple discipline boundaries, and also operate vertically to enable a diversity of people to access data, from high-end researchers to undergraduates, school students and the general public. Once we master such a transdisciplinary approach to our vast global information assets, we will then achieve

  4. Enhancing Data Interoperability with Web Services

    NASA Astrophysics Data System (ADS)

    Shrestha, S. R.; Zimble, D. A.; Wang, W.; Herring, D.; Halpert, M.

    2014-12-01

    In an effort to improve data access and interoperability of climate and weather data, the National Oceanic and Atmospheric Administration's (NOAA) Climate.gov and Climate Prediction Center (CPC) are exploring various platform solutions to enhance a user's ability to locate, preview, and acquire the data. The Climate.gov and CPC data team faces multiple challenges including the various kinds of data and formats, inconsistency of metadata records, variety of data service implementations, very large volumes of data and geographically distributed locations. We have created the Data Access and Interoperability project to design a web-based platform, where interoperability between systems can be leveraged to allow greater data discovery, access, visualization and delivery. In the interoperable data platform, systems can integrate with each other to support the synthesis of climate and weather data. Interoperability is the ability for users to discover the available climate and weather data, preview and interact with the data, and acquire the data in common digital formats through a simple web-based interface. The goal of the interoperable data platform is to leverage existing web services, implement the established standards and integrate with existing solutions across the earth sciences domain instead of creating new technologies. Towards this effort to improve the interoperability of the platform, we are collaborating with ESRI Inc. to provide climate and weather data via web services. In this presentation, we will discuss and demonstrate how to use ArcGIS to author RESTful based scientific web services using open standards. These web services are able to encapsulate the logic required to handle and describe scientific data through a variety of service types including, image, map, feature, geoprocessing, and their respective service methods. Combining these types of services and leveraging well-documented APIs, including the ArcGIS JavaScript API, we can afford to
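A common building block of the RESTful services described above is a parameterized query URL. The sketch below assembles one with the standard library; the endpoint and parameter names follow common REST conventions but are hypothetical, not a specific NOAA or ArcGIS API.

```python
# Sketch of building a request URL for a hypothetical REST map/data service.
# The base URL, layer name, and parameter names are invented for illustration.
from urllib.parse import urlencode

def build_query_url(base, layer, bbox, out_format="json"):
    params = {
        "layer": layer,
        # bounding box as xmin, ymin, xmax, ymax
        "bbox": ",".join(str(v) for v in bbox),
        "f": out_format,
    }
    return base + "?" + urlencode(params)

url = build_query_url("https://example.org/services/climate/query",
                      "monthly_precip", (-125.0, 24.0, -66.0, 50.0))
print(url)
```

Because the request is a plain, self-describing URL, any client with an HTTP stack and a documented API reference can discover, preview, and acquire the data, which is the interoperability property the platform relies on.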

  5. Interoperability Outlook in the Big Data Future

    NASA Astrophysics Data System (ADS)

    Kuo, K. S.; Ramachandran, R.

    2015-12-01

The establishment of distributed active archive centers (DAACs) as data warehouses and the standardization of file format by NASA's Earth Observing System Data Information System (EOSDIS) doubtlessly propelled interoperability of NASA Earth science data to unprecedented heights in the 1990s. However, we obviously still feel wanting two decades later. We believe the inadequate interoperability we experience is a result of the current practice that data are first packaged into files before distribution and only the metadata of these files are cataloged into databases and become searchable. Data therefore cannot be efficiently filtered. Any extensive study thus requires downloading large volumes of data files to a local system for processing and analysis. The need to download data not only creates duplication and inefficiency but also further impedes interoperability, because the analysis has to be performed locally by individual researchers in individual institutions. Each institution or researcher often has its own preference in the choice of data management practice as well as programming languages. Analysis results (derived data) so produced are thus subject to the differences of these practices, which later form formidable barriers to interoperability. A number of Big Data technologies are currently being examined and tested to address Big Earth Data issues. These technologies share one common characteristic: exploiting compute and storage affinity to more efficiently analyze large volumes and great varieties of data. Distributed active "archive" centers are likely to evolve into distributed active "analysis" centers, which not only archive data but also provide analysis service right where the data reside. "Analysis" will become the more visible function of these centers. It is thus reasonable to expect interoperability to improve because analysis, in addition to data, becomes more centralized. Within a "distributed active analysis center

  6. Biomedical semantics in the Semantic Web.

    PubMed

    Splendiani, Andrea; Burger, Albert; Paschke, Adrian; Romano, Paolo; Marshall, M Scott

    2011-03-07

    The Semantic Web offers an ideal platform for representing and linking biomedical information, which is a prerequisite for the development and application of analytical tools to address problems in data-intensive areas such as systems biology and translational medicine. As for any new paradigm, the adoption of the Semantic Web offers opportunities and poses questions and challenges to the life sciences community: which technologies in the Semantic Web stack will be most beneficial for the life sciences? Is biomedical information too complex to benefit from simple interlinked representations? What are the implications of adopting a new paradigm for knowledge representation? What are the incentives for the adoption of the Semantic Web, and who are the facilitators? Is there going to be a Semantic Web revolution in the life sciences? We report here a few reflections on these questions, following discussions at the SWAT4LS (Semantic Web Applications and Tools for Life Sciences) workshop series, of which this Journal of Biomedical Semantics special issue presents selected papers from the 2009 edition, held in Amsterdam on November 20th.
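    The "simple interlinked representations" at the heart of the Semantic Web are, at bottom, sets of subject-predicate-object triples that can be joined across datasets. A rough sketch of that idea in plain Python follows; the gene, disease, and drug resource names are invented for illustration and are not drawn from any real biomedical ontology.

```python
# Minimal sketch of the RDF triple model using plain Python tuples.
# Resource names below are hypothetical examples, not real ontology terms.

# Each fact is a (subject, predicate, object) triple, as in RDF.
triples = {
    ("gene:BRCA1", "participates_in", "process:DNA_repair"),
    ("gene:BRCA1", "associated_with", "disease:breast_cancer"),
    ("disease:breast_cancer", "treated_by", "drug:tamoxifen"),
}

def query(triples, subject=None, predicate=None, obj=None):
    """Return all triples matching the given pattern (None = wildcard)."""
    return [
        (s, p, o)
        for (s, p, o) in triples
        if (subject is None or s == subject)
        and (predicate is None or p == predicate)
        and (obj is None or o == obj)
    ]

# Follow links across facts: diseases linked to BRCA1, then their treatments.
diseases = [o for _, _, o in query(triples, subject="gene:BRCA1",
                                   predicate="associated_with")]
treatments = [o for d in diseases
              for _, _, o in query(triples, subject=d, predicate="treated_by")]
```

    The two-step lookup at the end is the payoff of interlinking: a question spanning genes, diseases and drugs is answered by chaining triples rather than by joining siloed databases.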

  7. Processing biological literature with customizable Web services supporting interoperable formats.

    PubMed

    Rak, Rafal; Batista-Navarro, Riza Theresa; Carter, Jacob; Rowley, Andrew; Ananiadou, Sophia

    2014-01-01

    Web services have become a popular means of interconnecting solutions for processing a body of scientific literature. This has fuelled research on high-level data exchange formats suitable for a given domain and ensuring the interoperability of Web services. In this article, we focus on the biological domain and consider four interoperability formats, BioC, BioNLP, XMI and RDF, that represent domain-specific and generic representations and include well-established as well as emerging specifications. We use the formats in the context of customizable Web services created in our Web-based, text-mining workbench Argo that features an ever-growing library of elementary analytics and capabilities to build and deploy Web services straight from a convenient graphical user interface. We demonstrate a 2-fold customization of Web services: by building task-specific processing pipelines from a repository of available analytics, and by configuring services to accept and produce a combination of input and output data interchange formats. We provide qualitative evaluation of the formats as well as quantitative evaluation of automatic analytics. The latter was carried out as part of our participation in the fourth edition of the BioCreative challenge. Our analytics built into Web services for recognizing biochemical concepts in BioC collections achieved the highest combined scores out of 10 participating teams. Database URL: http://argo.nactem.ac.uk.

  8. Processing biological literature with customizable Web services supporting interoperable formats

    PubMed Central

    Rak, Rafal; Batista-Navarro, Riza Theresa; Carter, Jacob; Rowley, Andrew; Ananiadou, Sophia

    2014-01-01

    Web services have become a popular means of interconnecting solutions for processing a body of scientific literature. This has fuelled research on high-level data exchange formats suitable for a given domain and ensuring the interoperability of Web services. In this article, we focus on the biological domain and consider four interoperability formats, BioC, BioNLP, XMI and RDF, that represent domain-specific and generic representations and include well-established as well as emerging specifications. We use the formats in the context of customizable Web services created in our Web-based, text-mining workbench Argo that features an ever-growing library of elementary analytics and capabilities to build and deploy Web services straight from a convenient graphical user interface. We demonstrate a 2-fold customization of Web services: by building task-specific processing pipelines from a repository of available analytics, and by configuring services to accept and produce a combination of input and output data interchange formats. We provide qualitative evaluation of the formats as well as quantitative evaluation of automatic analytics. The latter was carried out as part of our participation in the fourth edition of the BioCreative challenge. Our analytics built into Web services for recognizing biochemical concepts in BioC collections achieved the highest combined scores out of 10 participating teams. Database URL: http://argo.nactem.ac.uk. PMID:25006225

  9. Interoperability in JPIP and its standardization in JPEG 2000 Part 9

    NASA Astrophysics Data System (ADS)

    Richter, Thomas; Brower, Bernard; Martucci, Stephen; Tzannes, Alexis

    2009-08-01

    The ISO standard JPEG 2000 Part 9 (15444-9) specifies a versatile and flexible image browsing and delivery protocol that allows the interactive selection of regions of large images and their transmission over a narrow-bandwidth connection. However, due to this enormous flexibility, achieving interoperability between software from differing vendors is not an easy task. To address this challenge, the JPEG committee started an initiative, in the form of an amendment to 15444-9, to establish common ground on which interoperability can be defined. The outcome of this work is a set of recommendations on which subsets of JPIP vendors should focus, hopefully easing the adoption of JPIP by identifying the options the committee found in widespread use. In this paper, the design and evolution of JPIP interoperability will be discussed, the grounds on which interoperability can be achieved - variants and profiles - will be introduced, and their design will be motivated. The paper closes with an outlook on how to extend this amendment for future applications.

  10. Data interoperability software solution for emergency reaction in the Europe Union

    NASA Astrophysics Data System (ADS)

    Casado, R.; Rubiera, E.; Sacristan, M.; Schütte, F.; Peters, R.

    2014-09-01

    Emergency management becomes more challenging in international crisis episodes because of cultural, semantic and linguistic differences between all stakeholders, especially first responders. Misunderstandings between first responders make decision-making slower and more difficult. However, the spread and development of networks and IT-based Emergency Management Systems (EMS) has made emergency responses more coordinated. Despite improvements made in recent years, EMS still have not solved the problems related to cultural, semantic and linguistic differences, which are the real cause of slower decision-making. In addition, from a technical perspective, the consolidation of current EMS and the different formats used to exchange information pose another problem to be solved by any solution proposed for information interoperability between heterogeneous EMS embedded in different contexts. To overcome these problems we present a software solution based on semantic and mediation technologies. EMERGency ELements (EMERGEL) (Fundacion CTIC and AntwortING Ingenieurbüro PartG 2013), a common and modular ontology shared by all the stakeholders, has been defined. It offers the best solution to gather all stakeholders' knowledge in a unique and flexible data model, taking into account the cultural and linguistic issues of different countries. To deal with the diversity of data protocols and formats, we have designed a Service Oriented Architecture for Data Interoperability (named DISASTER), providing a flexible, extensible solution to the mediation issues. Web services, the technology with the most significant academic and industrial visibility and attraction, have been adopted to implement this paradigm. Contributions of this work have been validated through the design and development of a realistic cross-border prototype scenario, actively involving both emergency managers and emergency first responders: a fire at the Netherlands-Germany border.
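    The mediation role of a shared ontology such as EMERGEL can be sketched in a few lines: each national vocabulary maps its local terms onto common concepts, and cross-border translation goes through the concept rather than term-to-term. The concept IDs and Dutch/German terms below are invented for the example and are not taken from EMERGEL itself.

```python
# Illustrative sketch of ontology-based mediation between two emergency
# management systems; vocabularies and concept IDs are hypothetical.

# Each national vocabulary maps local terms onto shared ontology concepts.
NL_TERMS = {"brandweer": "concept:FireBrigade", "ambulance": "concept:Ambulance"}
DE_TERMS = {"Feuerwehr": "concept:FireBrigade", "Rettungswagen": "concept:Ambulance"}

def translate(term, source_vocab, target_vocab):
    """Map a local term to the shared concept, then into the target vocabulary."""
    concept = source_vocab[term]
    inverse = {c: t for t, c in target_vocab.items()}
    return inverse[concept]

# A German report of "Feuerwehr" arrives at a Dutch EMS as "brandweer".
dutch_term = translate("Feuerwehr", DE_TERMS, NL_TERMS)
```

    With N systems, mediation through a common ontology needs only N term mappings instead of N(N-1) pairwise translation tables, which is the usual argument for this design.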

  11. The Semantic eScience Framework

    NASA Astrophysics Data System (ADS)

    Fox, P. A.; McGuinness, D. L.

    2009-12-01

    The goal of this effort is to design and implement a configurable and extensible semantic eScience framework (SESF). Configuration requires research into accommodating different levels of semantic expressivity and user requirements from use cases. Extensibility is being achieved through a modular approach to the semantic encodings (i.e. ontologies) performed in community settings, i.e. an ontology framework into which specific applications, all the way up to communities, can extend the semantics for their needs. We report on how we are accommodating the rapid advances in semantic technologies and tools and the sustainable software path for future (certain) technical advances. In addition to a generalization of the current data science interface, we will present plans for an upper-level interface suitable for use by clearinghouses, educational portals, digital libraries, and other disciplines. SESF builds upon previous work in the Virtual Solar-Terrestrial Observatory (VSTO). The VSTO utilizes leading-edge knowledge representation, query and reasoning techniques to support knowledge-enhanced search, data access, integration, and manipulation. It encodes term meanings and their inter-relationships in ontologies and uses these ontologies and associated inference engines to semantically enable the data services. The Semantically-Enabled Science Data Integration (SESDI) project implemented data integration capabilities among three sub-disciplines (solar radiation, volcanic outgassing and atmospheric structure) using extensions to existing modular ontologies, and used the VSTO data framework while adding smart faceted search and semantic data registration tools. The Semantic Provenance Capture in Data Ingest Systems (SPCDIS) project has added explanation provenance capabilities to an observational data ingest pipeline for images of the Sun, providing a set of tools to answer diverse end-user questions such as "Why does this image look bad?"

  12. The Semantic eScience Framework

    NASA Astrophysics Data System (ADS)

    McGuinness, Deborah; Fox, Peter; Hendler, James

    2010-05-01

    The goal of this effort is to design and implement a configurable and extensible semantic eScience framework (SESF). Configuration requires research into accommodating different levels of semantic expressivity and user requirements from use cases. Extensibility is being achieved through a modular approach to the semantic encodings (i.e. ontologies) performed in community settings, i.e. an ontology framework into which specific applications, all the way up to communities, can extend the semantics for their needs. We report on how we are accommodating the rapid advances in semantic technologies and tools and the sustainable software path for future (certain) technical advances. In addition to a generalization of the current data science interface, we will present plans for an upper-level interface suitable for use by clearinghouses, educational portals, digital libraries, and other disciplines. SESF builds upon previous work in the Virtual Solar-Terrestrial Observatory (VSTO). The VSTO utilizes leading-edge knowledge representation, query and reasoning techniques to support knowledge-enhanced search, data access, integration, and manipulation. It encodes term meanings and their inter-relationships in ontologies and uses these ontologies and associated inference engines to semantically enable the data services. The Semantically-Enabled Science Data Integration (SESDI) project implemented data integration capabilities among three sub-disciplines (solar radiation, volcanic outgassing and atmospheric structure) using extensions to existing modular ontologies, and used the VSTO data framework while adding smart faceted search and semantic data registration tools. The Semantic Provenance Capture in Data Ingest Systems (SPCDIS) project has added explanation provenance capabilities to an observational data ingest pipeline for images of the Sun, providing a set of tools to answer diverse end-user questions such as "Why does this image look bad?" http://tw.rpi.edu/portal/SESF

  13. Semantic Networks and Social Networks

    ERIC Educational Resources Information Center

    Downes, Stephen

    2005-01-01

    Purpose: To illustrate the need for social network metadata within semantic metadata. Design/methodology/approach: Surveys properties of social networks and the semantic web, suggests that social network analysis applies to semantic content, argues that semantic content is more searchable if social network metadata is merged with semantic web…

  14. BioC interoperability track overview.

    PubMed

    Comeau, Donald C; Batista-Navarro, Riza Theresa; Dai, Hong-Jie; Doğan, Rezarta Islamaj; Yepes, Antonio Jimeno; Khare, Ritu; Lu, Zhiyong; Marques, Hernani; Mattingly, Carolyn J; Neves, Mariana; Peng, Yifan; Rak, Rafal; Rinaldi, Fabio; Tsai, Richard Tzong-Han; Verspoor, Karin; Wiegers, Thomas C; Wu, Cathy H; Wilbur, W John

    2014-01-01

    BioC is a new, simple XML format for sharing biomedical text and annotations, together with libraries to read and write that format. This promotes the development of interoperable tools for natural language processing (NLP) of biomedical text. The interoperability track at the BioCreative IV workshop featured contributions using or highlighting the BioC format. These contributions included additional implementations of BioC, many new corpora in the format, biomedical NLP tools consuming and producing the format, and online services using the format. The ease of use, broad support and rapidly growing number of tools demonstrate the need for and value of the BioC format. Database URL: http://bioc.sourceforge.net/.
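    BioC's interoperability comes largely from how little structure it imposes: a collection of documents, each split into passages, with stand-off annotations carrying key-value "infon" metadata. The snippet below is a hand-written, simplified illustration of that layout (it omits parts of the full BioC DTD) and shows that the standard library is enough to consume it.

```python
# Reading a minimal BioC-style XML collection with the standard library.
# The XML is a simplified hand-written example, not output of a real BioC tool.
import xml.etree.ElementTree as ET

bioc_xml = """
<collection>
  <source>example</source>
  <document>
    <id>doc1</id>
    <passage>
      <offset>0</offset>
      <text>Aspirin inhibits COX-2.</text>
      <annotation id="T1">
        <infon key="type">chemical</infon>
        <location offset="0" length="7"/>
        <text>Aspirin</text>
      </annotation>
    </passage>
  </document>
</collection>
"""

root = ET.fromstring(bioc_xml)
annotations = []
for doc in root.iter("document"):
    doc_id = doc.findtext("id")
    for ann in doc.iter("annotation"):
        # The annotation's type lives in an <infon> key-value pair.
        ann_type = ann.find("infon[@key='type']").text
        annotations.append((doc_id, ann_type, ann.findtext("text")))

# annotations -> [("doc1", "chemical", "Aspirin")]
```

    Because annotations are stand-off (anchored by offset and length rather than inline markup), tools can add layers of annotations without ever modifying each other's output, which is what makes chaining heterogeneous NLP services practical.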

  15. Network effects, cascades and CCP interoperability

    NASA Astrophysics Data System (ADS)

    Feng, Xiaobing; Hu, Haibo; Pritsker, Matthew

    2014-03-01

    To control counterparty risk, financial regulations such as the Dodd-Frank Act increasingly require standardized derivatives trades to be cleared by central counterparties (CCPs). It is anticipated that in the near future, CCPs across the world will be linked through interoperability agreements that facilitate risk-sharing but also serve as a conduit for transmitting shocks. This paper theoretically studies a network of CCPs linked through interoperability arrangements, and examines the properties of the network that contribute to cascading failures. The magnitude of the cascade is theoretically related to the strength of network linkages, the size of the network, the logistic mapping coefficient, a stochastic effect and the CCPs' defense lines. Simulations indicate that larger network effects increase systemic risk from cascading failures. The size of the network N raises the threshold value of shock sizes required to generate cascades. Hence, the larger the network, the more robust it will be.
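    The basic mechanism of a failure cascade through interoperability links can be illustrated with a simple threshold-contagion process: a CCP fails when accumulated losses exceed its defense lines, and its failure passes losses to linked CCPs. This is a toy model for intuition only, not the stochastic logistic-map dynamics analyzed in the paper.

```python
# Toy threshold-contagion sketch of shock propagation among interoperable CCPs.
# Illustrative only; not the model from the paper.

def cascade(exposures, buffers, shocked, shock):
    """exposures[i][j]: loss passed from a failed CCP i to a linked CCP j.
    buffers[i]: loss CCP i can absorb before failing (its 'defense lines').
    Returns the set of CCPs that ultimately fail."""
    losses = {i: 0.0 for i in buffers}
    losses[shocked] = shock
    failed = set()
    frontier = {i for i in losses if losses[i] > buffers[i]}
    while frontier:
        failed |= frontier
        nxt = set()
        for i in frontier:
            for j, loss in exposures.get(i, {}).items():
                if j not in failed:
                    losses[j] += loss
                    if losses[j] > buffers[j]:
                        nxt.add(j)
        frontier = nxt - failed
    return failed

# Three CCPs in a chain: a shock to CCP 0 topples CCP 1, whose failure
# then overwhelms CCP 2 -- a cascade through the interoperability links.
exposures = {0: {1: 0.6}, 1: {2: 0.8}}
buffers = {0: 0.5, 1: 0.5, 2: 0.5}
failures = cascade(exposures, buffers, shocked=0, shock=1.0)  # all three fail
```

    Raising the buffers (or weakening the link strengths) in this sketch stops the cascade at the first node, mirroring the paper's point that linkage strength and defense lines jointly determine how far a shock travels.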

  16. BioC interoperability track overview

    PubMed Central

    Comeau, Donald C.; Batista-Navarro, Riza Theresa; Dai, Hong-Jie; Islamaj Doğan, Rezarta; Jimeno Yepes, Antonio; Khare, Ritu; Lu, Zhiyong; Marques, Hernani; Mattingly, Carolyn J.; Neves, Mariana; Peng, Yifan; Rak, Rafal; Rinaldi, Fabio; Tsai, Richard Tzong-Han; Verspoor, Karin; Wiegers, Thomas C.; Wu, Cathy H.; Wilbur, W. John

    2014-01-01

    BioC is a new simple XML format for sharing biomedical text and annotations and libraries to read and write that format. This promotes the development of interoperable tools for natural language processing (NLP) of biomedical text. The interoperability track at the BioCreative IV workshop featured contributions using or highlighting the BioC format. These contributions included additional implementations of BioC, many new corpora in the format, biomedical NLP tools consuming and producing the format and online services using the format. The ease of use, broad support and rapidly growing number of tools demonstrate the need for and value of the BioC format. Database URL: http://bioc.sourceforge.net/ PMID:24980129

  17. Coalition Interoperability Measurement Frameworks Literature Survey

    DTIC Science & Technology

    2011-08-01

    DRDC CORA CR 2011-132, August 2011. Coalition Interoperability Measurement Frameworks Literature Survey. Philip Bury, BEng, MBA, CMC... the contemporary battle and mission environment; that empirical and field research be undertaken as soon as possible in order to take advantage of... They also proposed a number of measures of performance, but almost without exception they were not truly measures, as the authors did not propose

  18. Future Interoperability of Camp Protection Systems (FICAPS)

    NASA Astrophysics Data System (ADS)

    Caron, Sylvie; Gündisch, Rainer; Marchand, Alain; Stahl, Karl-Hermann

    2013-05-01

    The FICAPS Project was established as a project of the European Defence Agency based on an initiative of Germany and France. The goal of this project was to derive guidelines which, through proper implementation in future developments, improve Camp Protection Systems (CPS) by enabling and improving interoperability between camp protection systems and the equipment of the different nations involved in multinational missions. These guidelines shall allow for: • Real-time information exchange between equipment and systems of different suppliers and nations (even via SatCom), • Quick and easy replacement of equipment (even of different nations) at run-time in the field by means of plug-and-play capability, thus lowering operational and logistic costs and making the system highly available, • Enhancement of system capabilities (open and modular systems) by adding new equipment with new capabilities (just plug in; automatic adjustment of the HMI, Human Machine Interface) without costly and time-consuming validation and test at the system level (validation and test can be done at the equipment level). Four scenarios have been identified to summarize the interoperability requirements from an operational viewpoint. To prove the definitions given in the guideline document, a French and a German demonstration system, based on existing national assets, were realized. Demonstrations showing the capabilities given by the defined interoperability requirements with respect to the operational scenarios were performed, including remote control of a CPS by another CPS, remote sensor control (Electro-Optic/InfraRed, EO/IR) and remote effector control. This capability can be applied to extend the protection area or to protect distant infrastructural assets. The required interoperability functionality was shown successfully. Even if the focus of the FICAPS project was on camp protection, the solution found is also appropriate for other

  19. Testbed for Satellite and Terrestrial Interoperability (TSTI)

    NASA Technical Reports Server (NTRS)

    Gary, J. Patrick

    1998-01-01

    Various issues associated with the "Testbed for Satellite and Terrestrial Interoperability (TSTI)" are presented in viewgraph form. Specific topics include: 1) General and specific scientific technical objectives; 2) ACTS experiment No. 118: 622 Mbps network tests between ATDNet and MAGIC via ACTS; 3) ATDNet SONET/ATM gigabit network; 4) Testbed infrastructure, collaborations and end sites in TSTI based evaluations; 5) the Trans-Pacific digital library experiment; and 6) ESDCD on-going network projects.

  20. Multiband multimode ultracommunications node for improved interoperability

    NASA Astrophysics Data System (ADS)

    Clingempeel, Michael F.

    1999-01-01

    This paper addresses advanced software radio architecture and technologies that promise to improve the communications aspect of the interoperability problem. Its intent is to inform the reader of technologies under development today that could provide the means to solve many of the issues that limit communication between agencies and jurisdictions of the public safety sector. It is not intended to address issues such as spectrum allocation or operational coordination. Basic elements of interoperability are addressed and basic definitions of the problem are provided. Issues such as multijurisdiction and multiagency coordination are addressed. The issue of the wide range of frequencies allocated for public safety use is of particular importance. A software radio architecture presentation follows with a discussion of the basic components and the benefits of this approach. The primary software radio benefit is the use of a single radio to provide communication on a number of different frequencies with different demodulation schemes. Unique approaches to radio frequency preselection, broad band signal down conversion, high-speed digital sampling techniques, and multichannel signals processing are discussed. Enabling technologies will then be introduced, briefly covering microelectromechanical systems radio frequency devices, silicon germanium radio frequency application specific integrated circuits, advanced analog to digital converters, state-of-the-art digital signal processing, and vertical cavity surface emitting lasers. The conclusion of the paper is that an advanced technology software radio is a viable, cost-effective solution for the interoperability problem in the public safety sector.

  1. Designing Interoperable Data Products with Community Conventions

    NASA Astrophysics Data System (ADS)

    Habermann, T.; Jelenak, A.; Lee, H.

    2015-12-01

    The HDF Product Designer (HPD) is a cloud-based client-server collaboration tool that can bring existing netCDF-3/4/CF, HDF4/5, and HDF-EOS2/5 products together to create new interoperable data products that serve the needs of the Earth Science community. The tool is designed to reduce the burden of creating and storing data in standards-compliant, interoperable HDF5 files and lower the technical and programming skill threshold needed to design such products by providing a user interface that combines the netCDF-4/HDF5 interoperable feature set with applicable metadata conventions. Users can collaborate quickly to devise new HDF5 products while at the same time seamlessly incorporating the latest best practices and conventions in their community by importing existing data products. The tool also incorporates some expert system features through CLIPS, allowing custom approaches in the file design, as well as easy transfer of preferred conventions as they are being developed. The current state of the tool and the plans for future development will be presented. Constructive input from any interested parties is always welcome.

  2. The Challenges of Interoperable Data Discovery

    NASA Technical Reports Server (NTRS)

    Meaux, Melanie F.

    2005-01-01

    The Global Change Master Directory (GCMD) assists the oceanographic community in data discovery and access through its online metadata directory. The directory also offers data holders a means to post and search their oceanographic data through the GCMD portals, i.e. online customized subset metadata directories. The Gulf of Maine Ocean Data Partnership (GoMODP) has expressed interest in using the GCMD portals to increase the visibility of their data holdings throughout the Gulf of Maine region and beyond. The purpose of GoMODP is to "promote and coordinate the sharing, linking, electronic dissemination, and use of data on the Gulf of Maine region". The participants have decided that a "coordinated effort is needed to enable users throughout the Gulf of Maine region and beyond to discover and put to use the vast and growing quantities of data in their respective databases". GoMODP members have invited the GCMD to discuss further collaborations in view of this effort. This presentation will focus on the GCMD GoMODP portal, demonstrating its content and use for data discovery, and will discuss the challenges of interoperable data discovery. Interoperability among metadata standards and vocabularies will be discussed. A short overview of the lessons learned at the Marine Metadata Interoperability (MMI) metadata workshop held in Boulder, Colorado on August 9-11, 2005 will be given.

  3. Interoperability of satellite-based augmentation systems for aircraft navigation

    NASA Astrophysics Data System (ADS)

    Dai, Donghai

    The Federal Aviation Administration (FAA) is pioneering a transformation of the national airspace system from its present ground based navigation and landing systems to a satellite based system using the Global Positioning System (GPS). To meet the critical safety-of-life aviation positioning requirements, a Satellite-Based Augmentation System (SBAS), the Wide Area Augmentation System (WAAS), is being implemented to support navigation for all phases of flight, including Category I precision approach. The system is designed to be used as a primary means of navigation, capable of meeting the Required Navigation Performance (RNP), and therefore must satisfy the accuracy, integrity, continuity and availability requirements. In recent years there has been international acceptance of Global Navigation Satellite Systems (GNSS), spurring widespread growth in the independent development of SBASs. Besides the FAA's WAAS, the European Geostationary Navigation Overlay Service System (EGNOS) and the Japan Civil Aviation Bureau's MTSAT-Satellite Augmentation System (MSAS) are also being actively developed. Although all of these SBASs can operate as stand-alone, regional systems, there is increasing interest in linking these SBASs together to reduce costs while improving service coverage. This research investigated the coverage and availability improvements due to cooperative efforts among regional SBAS networks. The primary goal was to identify the optimal interoperation strategies in terms of performance, complexity and practicality. The core algorithms associated with the most promising concepts were developed and demonstrated. Experimental verification of the most promising concepts was conducted using data collected from a joint international test between the National Satellite Test Bed (NSTB) and the EGNOS System Test Bed (ESTB). This research clearly shows that a simple switch between SBASs made by the airborne equipment is the most effective choice for achieving the

  4. IHE cross-enterprise document sharing for imaging: interoperability testing software

    PubMed Central

    2010-01-01

    Background With the deployment of Electronic Health Records (EHR), interoperability testing in healthcare is becoming crucial. The EHR enables access to prior diagnostic information in order to assist in health decisions. It is a virtual system that results from the cooperation of several heterogeneous distributed systems, so interoperability between peers is essential. Achieving interoperability requires various types of testing: implementations need to be tested using software that simulates communication partners and that provides test data and test plans. Results In this paper we describe software that is used to test systems involved in sharing medical images within the EHR. Our software is used as part of the Integrating the Healthcare Enterprise (IHE) testing process to test the Cross-Enterprise Document Sharing for Imaging (XDS-I) integration profile. We describe its architecture and functionalities; we also expose the challenges encountered and discuss the elected design solutions. Conclusions The EHR is being deployed in several countries, and the EHR infrastructure will continuously evolve to embrace advances in the information technology domain. Our software is built on a web framework to allow for easy evolution with web technology. The testing software is publicly available; it can be used by system implementers to test their implementations. It can also be used by site integrators to verify and test the interoperability of systems, or by developers to understand specification ambiguities or to resolve implementation difficulties. PMID:20858241

  5. Telemedicine system interoperability architecture: concept description and architecture overview.

    SciTech Connect

    Craft, Richard Layne, II

    2004-05-01

    In order for telemedicine to realize the vision of anywhere, anytime access to care, it must address the question of how to create a fully interoperable infrastructure. This paper describes the reasons for pursuing interoperability, outlines operational requirements that any interoperability approach needs to consider, proposes an abstract architecture for meeting these needs, identifies candidate technologies that might be used for rendering this architecture, and suggests a path forward that the telemedicine community might follow.

  6. Timing Aspects of GPS-Galileo Interoperability: Challenges and Solutions

    DTIC Science & Technology

    2004-12-01

    Interoperability is a complex problem that has been extensively analyzed during Galileo definition studies (e.g., the EU-funded projects GALILEI and GEM, ESA-funded... 36th Annual Precise Time and Time Interval (PTTI) Meeting... the drivers for Galileo definition and design. This paper is dedicated to the timing aspects of interoperability, related challenges, and

  7. A Prototype Ontology Tool and Interface for Coastal Atlas Interoperability

    NASA Astrophysics Data System (ADS)

    Wright, D. J.; Bermudez, L.; O'Dea, L.; Haddad, T.; Cummins, V.

    2007-12-01

    While significant capacity has been built in the field of web-based coastal mapping and informatics in the last decade, little has been done to take stock of the implications of these efforts or to identify best practice in terms of taking lessons learned into consideration. This study reports on the second of two transatlantic workshops that bring together key experts from Europe, the United States and Canada to examine state-of-the-art developments in coastal web atlases (CWA), based on web-enabled geographic information systems (GIS), along with future needs in mapping and informatics for the coastal practitioner community. While multiple benefits are derived from these tailor-made atlases (e.g. speedy access to multiple sources of coastal data and information; economic use of time by avoiding individual contact with different data holders), the potential exists to derive added value from the integration of disparate CWAs, to optimize decision-making at a variety of levels and across themes. The second workshop focused on the development of a strategy to make coastal web atlases interoperable by way of controlled vocabularies and ontologies. The strategy is based on a web-service-oriented architecture and an implementation of Open Geospatial Consortium (OGC) web services, such as Web Feature Services (WFS) and Web Map Services (WMS). The atlases publish Catalog Web Services (CSW) using ISO 19115 metadata and controlled vocabularies encoded as Uniform Resource Identifiers (URIs). URIs allow the terminology of each atlas to be uniquely identified and facilitate the mapping of terminologies using semantic web technologies. A domain ontology was also created to formally represent coastal erosion terminology as a use case, with a test linkage of those terms between the Marine Irish Digital Atlas and the Oregon Coastal Atlas. A web interface is being developed to discover coastal hazard themes in distributed coastal atlases as part of a broader International Coastal
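    In a service-oriented architecture like the one described, a client discovers and retrieves atlas content purely by constructing standard OGC requests. The sketch below builds two such request URLs with the standard library; the endpoint is hypothetical, and only a few of the query parameters defined by the WMS 1.3.0 and CSW 2.0.2 specifications are shown.

```python
# Sketch of forming OGC web service requests for a coastal atlas client.
# The endpoint URL is hypothetical; parameter names follow the OGC KVP bindings.
from urllib.parse import urlencode

ATLAS_ENDPOINT = "https://atlas.example.org/ows"  # hypothetical server

def wms_get_capabilities(endpoint):
    """Build a WMS GetCapabilities request listing the atlas's map layers."""
    params = {"service": "WMS", "version": "1.3.0",
              "request": "GetCapabilities"}
    return f"{endpoint}?{urlencode(params)}"

def csw_get_records(endpoint, keyword):
    """Build a CSW GetRecords request to discover metadata records by keyword
    (KVP encoding; real clients often POST an OGC Filter document instead)."""
    params = {"service": "CSW", "version": "2.0.2",
              "request": "GetRecords", "typeNames": "csw:Record",
              "constraintLanguage": "CQL_TEXT",
              "constraint": f"AnyText LIKE '%{keyword}%'"}
    return f"{endpoint}?{urlencode(params)}"

caps_url = wms_get_capabilities(ATLAS_ENDPOINT)
discovery_url = csw_get_records(ATLAS_ENDPOINT, "coastal erosion")
```

    Because every atlas answers the same request vocabulary, a single interface can federate discovery across distributed atlases; the controlled-vocabulary URIs then reconcile what each atlas means by a term such as "coastal erosion".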

  8. Toward an E-Government Semantic Platform

    NASA Astrophysics Data System (ADS)

    Sbodio, Marco Luca; Moulin, Claude; Benamou, Norbert; Barthès, Jean-Paul

    This chapter describes the major aspects of an e-government platform in which semantics underpins more traditional technologies in order to enable new capabilities and to overcome technical and cultural challenges. The design and development of such an e-government Semantic Platform has been conducted with the financial support of the European Commission through the Terregov research project: "Impact of e-government on Territorial Government Services" (Terregov 2008). The goal of this platform is to let local governments and government agencies offer online access to their services in an interoperable way, and to allow them to participate in orchestrated processes involving services provided by multiple agencies. Implementing a business process through an electronic procedure is indeed a core goal in any networked organization. However, the field of e-government brings specific constraints to the operations allowed in procedures, especially concerning the flow of private citizens' data: for legal reasons, in most countries such data are allowed to circulate only from agency to agency directly. In order to promote transparency and responsibility in e-government while respecting these constraints on data flows, Terregov supports the creation of centrally controlled orchestrated processes: the orchestration logic is centrally managed, while the data themselves flow directly across agencies.

  9. Semantically linking in silico cancer models.

    PubMed

    Johnson, David; Connor, Anthony J; McKeever, Steve; Wang, Zhihui; Deisboeck, Thomas S; Quaiser, Tom; Shochat, Eliezer

    2014-01-01

    Multiscale models are commonplace in cancer modeling, where individual models acting on different biological scales are combined within a single, cohesive modeling framework. However, model composition gives rise to challenges in understanding the interfaces and interactions between component models. These computational models are typically developed by separate research groups, each drawing on its own domain expertise and using different methodologies, programming languages, and parameters. This paper introduces a graph-based model for semantically linking computational cancer models via domain graphs that can help us better understand and explore combinations of models spanning multiple biological scales. We take the data model encoded by TumorML, an XML-based markup language for storing cancer models in online repositories, and transpose its model description elements into a graph-based representation. By taking such an approach, we can link domain models, such as controlled vocabularies, taxonomic schemes, and ontologies, with cancer model descriptions to better understand and explore relationships between models. The union of these graphs creates a connected property graph that links cancer models by categorization, by computational compatibility, and by semantic interoperability, yielding a framework in which opportunities for exploration and discovery of combinations of models become possible.
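    The idea of a property graph linking models through shared vocabulary terms can be sketched as follows. The node names, properties, and the edge type are illustrative assumptions, not TumorML's actual schema; the point is only that two models annotated with the same term become discoverable as a candidate composition.

```python
# Toy property graph (illustrative names, not the TumorML schema): cancer
# models and vocabulary terms as nodes, typed edges as (subject, predicate,
# object) tuples.

nodes = {
    "modelA": {"kind": "model", "scale": "cellular", "language": "C++"},
    "modelB": {"kind": "model", "scale": "tissue", "language": "Python"},
    "NCIT:C12917": {"kind": "term", "label": "tumor tissue"},  # assumed term id
}
edges = [
    ("modelA", "annotatedWith", "NCIT:C12917"),
    ("modelB", "annotatedWith", "NCIT:C12917"),
]

def linked_models(term):
    """Models annotated with a given term -> candidate model combinations."""
    return sorted(s for (s, p, o) in edges
                  if p == "annotatedWith" and o == term
                  and nodes[s]["kind"] == "model")

print(linked_models("NCIT:C12917"))  # both models share the annotation
```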

  10. Semantics-informed cartography: the case of Piemonte Geological Map

    NASA Astrophysics Data System (ADS)

    Piana, Fabrizio; Lombardo, Vincenzo; Mimmo, Dario; Giardino, Marco; Fubelli, Giandomenico

    2016-04-01

    In modern digital geological maps, namely those supported by a large geo-database and devoted to dynamic, interactive representation through WMS-WebGIS services, there is a need to make explicit the geological assumptions used in the design and compilation of the Map database, and to define or adopt semantic representations and taxonomies in order to achieve a formal and interoperable representation of the geologic knowledge. These approaches are fundamental for the integration and harmonisation of geological information and services across cultural (e.g. different scientific disciplines) and/or physical barriers (e.g. administrative boundaries). Initiatives such as the GeoScience Markup Language (latest version GeoSciML 4.0, 2015, http://www.geosciml.org) and the INSPIRE "Data Specification on Geology" (an operative simplification of GeoSciML, latest version 3.0 rc3, 2013, http://inspire.jrc.ec.europa.eu/documents/Data_Specifications/INSPIRE_DataSpecification_GE_v3.0rc3.pdf), as well as the recent terminological shepherding of the Geoscience Terminology Working Group (GTWG), have been promoting the exchange of geologic knowledge. Grounded on these standard vocabularies, schemas and data models, we provide a shared semantic classification of geological data referring to the study case of the synthetic digital geological map of the Piemonte region (NW Italy), named "GEOPiemonteMap", developed by the CNR Institute of Geosciences and Earth Resources, Torino (CNR IGG TO) and hosted as a dynamic interactive map on the geoportal of the ARPA Piemonte Environmental Agency. The Piemonte Geological Map is grounded on a regional-scale geo-database consisting of some hundreds of GeologicUnits, whose thousands of instances (Mapped Features, polygon geometry) widely occur in the Piemonte region, each bounded by GeologicStructures (Mapped Features, line geometry). GeologicUnits and GeologicStructures have been spatially

  11. 47 CFR 0.192 - Emergency Response Interoperability Center.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... communications systems. These requirements and procedures may involve such issues as interoperability, roaming, priority access, gateway functions and interfaces, interconnectivity of public safety broadband...

  12. The Semantic Learning Organization

    ERIC Educational Resources Information Center

    Sicilia, Miguel-Angel; Lytras, Miltiadis D.

    2005-01-01

    Purpose: The aim of this paper is introducing the concept of a "semantic learning organization" (SLO) as an extension of the concept of "learning organization" in the technological domain. Design/methodology/approach: The paper takes existing definitions and conceptualizations of both learning organizations and Semantic Web technology to develop…

  13. Aging and Semantic Activation.

    ERIC Educational Resources Information Center

    Howard, Darlene V.

    Three studies tested the theory that long term memory consists of a semantically organized network of concept nodes interconnected by leveled associations or relations, and that when a stimulus is processed, the corresponding concept node is assumed to be temporarily activated and this activation spreads to nearby semantically related nodes. In…

  14. Medical Device Plug-and-Play Interoperability Standards and Technology Leadership

    DTIC Science & Technology

    2010-10-01

    available resources, and to build collaborations to achieve MD PnP objectives. TATRC’s commitment has enabled us to attract additional program... Building on what has been accomplished to date, we have sought to leverage areas of traction around five key themes identified for this work...device interoperability for December 2009 or Q1 2010. Program Development and Management  Continue to build collaborations with patient safety and

  15. Order Theoretical Semantic Recommendation

    SciTech Connect

    Joslyn, Cliff A.; Hogan, Emilie A.; Paulson, Patrick R.; Peterson, Elena S.; Stephan, Eric G.; Thomas, Dennis G.

    2013-07-23

    Mathematical concepts of order and ordering relations play multiple roles in semantic technologies. Discrete totally ordered data characterize both input streams and top-k rank-ordered recommendations and query output, while temporal attributes establish numerical total orders, either over time points or, in the more complex case, over start-end temporal intervals. Also of note are fully partially ordered data, including both lattices and non-lattices, which actually dominate the semantic structure of ontological systems. Scalar semantic similarities over partially ordered semantic data are traditionally used to return rank-ordered recommendations, but these require complementation with true metrics available over partially ordered sets. In this paper we report on our work in the foundations of partial order measurement in ontologies, with application to top-k semantic recommendation in workflows.
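    One simple way to obtain a true metric over a partially ordered vocabulary (not necessarily the construction used in the paper above) is the size of the symmetric difference of two concepts' up-sets, which satisfies the metric axioms since it is a Hamming distance on indicator vectors. The toy is-a hierarchy below is an assumption for illustration.

```python
# Illustrative poset metric: distance(a, b) = |up_set(a) XOR up_set(b)|,
# on a toy is-a hierarchy (assumed concepts, for illustration only).

parents = {
    "dog": {"mammal"}, "cat": {"mammal"},
    "mammal": {"animal"}, "animal": set(),
}

def up_set(concept):
    """All ancestors of a concept, including itself (its up-set)."""
    seen, stack = {concept}, [concept]
    while stack:
        for p in parents[stack.pop()]:
            if p not in seen:
                seen.add(p)
                stack.append(p)
    return seen

def poset_distance(a, b):
    return len(up_set(a) ^ up_set(b))

print(poset_distance("dog", "cat"))  # up-sets differ only in the two leaves
print(poset_distance("dog", "dog"))  # identical concepts are at distance 0
```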

  16. Enhancing medical database semantics.

    PubMed Central

    Leão, B. de F.; Pavan, A.

    1995-01-01

    Medical databases deal with dynamic, heterogeneous and fuzzy data. Modeling such a complex domain demands powerful semantic data modeling methodologies. This paper describes GSM-Explorer, a CASE tool that allows for the creation of relational databases using semantic data modeling techniques. GSM-Explorer fully incorporates the Generic Semantic Data Model (GSM), enabling knowledge engineers to model the application domain with the abstraction mechanisms of generalization/specialization, association and aggregation. The tool generates a structure that implements persistent database objects through the automatic generation of customized ANSI SQL scripts that sustain the semantics defined at the higher level. This paper emphasizes the system architecture and the mapping of the semantic model into relational tables. The present status of the project and its further developments are discussed in the Conclusions. PMID:8563288

  17. S3QL: A distributed domain specific language for controlled semantic integration of life sciences data

    PubMed Central

    2011-01-01

    Background The value and usefulness of data increases when it is explicitly interlinked with related data. This is the core principle of Linked Data. For life sciences researchers, harnessing the power of Linked Data to improve biological discovery is still challenged by a need to keep pace with rapidly evolving domains and requirements for collaboration and control as well as with the reference semantic web ontologies and standards. Knowledge organization systems (KOSs) can provide an abstraction for publishing biological discoveries as Linked Data without complicating transactions with contextual minutia such as provenance and access control. We have previously described the Simple Sloppy Semantic Database (S3DB) as an efficient model for creating knowledge organization systems using Linked Data best practices with explicit distinction between domain and instantiation and support for a permission control mechanism that automatically migrates between the two. In this report we present a domain specific language, the S3DB query language (S3QL), to operate on its underlying core model and facilitate management of Linked Data. Results Reflecting the data driven nature of our approach, S3QL has been implemented as an application programming interface for S3DB systems hosting biomedical data, and its syntax was subsequently generalized beyond the S3DB core model. This achievement is illustrated with the assembly of an S3QL query to manage entities from the Simple Knowledge Organization System. The illustrative use cases include gastrointestinal clinical trials, genomic characterization of cancer by The Cancer Genome Atlas (TCGA) and molecular epidemiology of infectious diseases. Conclusions S3QL was found to provide a convenient mechanism to represent context for interoperation between public and private datasets hosted at biomedical research institutions and linked data formalisms. PMID:21756325

  18. Semantic search via concept annealing

    NASA Astrophysics Data System (ADS)

    Dunkelberger, Kirk A.

    2007-04-01

    Annealing, in metallurgy and materials science, is a heat treatment wherein the microstructure of a material is altered, causing changes in its properties such as strength and hardness. We define concept annealing as a lexical, syntactic, and semantic expansion capability (the removal of defects and the internal stresses that cause term- and phrase-based search failure) coupled with a directed contraction capability (semantically related terms, queries, and concepts nucleate and grow to replace those originally deformed by internal stresses). These two capabilities are tied together in a control loop mediated by the information retrieval precision and recall metrics coupled with intuition provided by the operator. The specific representations developed have been targeted at facilitating highly efficient and effective semantic indexing and searching. This new generation of Find capability enables additional processing (i.e. all-source tracking, relationship extraction, and total system resource management) at rates, precisions, and accuracies previously considered infeasible. In a recent experiment, an order-of-magnitude reduction in time to actionable intelligence and a nearly three-orders-of-magnitude reduction in false alarm rate were achieved.

  19. Intelligent Discovery for Learning Objects Using Semantic Web Technologies

    ERIC Educational Resources Information Center

    Hsu, I-Ching

    2012-01-01

    The concept of learning objects has been applied in the e-learning field to promote the accessibility, reusability, and interoperability of learning content. Learning Object Metadata (LOM) was developed to achieve these goals by describing learning objects in order to provide meaningful metadata. Unfortunately, the conventional LOM lacks the…

  20. 78 FR 46582 - Federal Advisory Committee Act; Communications Security, Reliability, and Interoperability Council

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-01

    ... COMMISSION Federal Advisory Committee Act; Communications Security, Reliability, and Interoperability Council... Communications Commission's (FCC or Commission) Communications Security, Reliability, and Interoperability... to ensure the security, reliability, and interoperability of communications systems. On March...

  1. 77 FR 67815 - Federal Advisory Committee Act; Communications Security, Reliability, and Interoperability Council

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-14

    ... COMMISSION Federal Advisory Committee Act; Communications Security, Reliability, and Interoperability Council... Communications Commission's (FCC) Communications Security, Reliability, and Interoperability Council (CSRIC) will..., reliability, and interoperability of communications systems. On March 19, 2011, the FCC, pursuant to...

  2. 77 FR 48153 - Federal Advisory Committee Act; Communications Security, Reliability, and Interoperability Council

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-13

    ... COMMISSION Federal Advisory Committee Act; Communications Security, Reliability, and Interoperability Council... Communications Commission's (FCC or Commission) Communications Security, Reliability, and Interoperability... practices and actions the FCC can take to ensure the security, reliability, and interoperability...

  3. 75 FR 63462 - Smart Grid Interoperability Standards; Notice of Docket Designation for Smart Grid...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-15

    ... Energy Regulatory Commission Smart Grid Interoperability Standards; Notice of Docket Designation for Smart Grid Interoperability Standards October 7, 2010. 1. The Energy Independence and Security Act of... interoperability of smart grid devices and systems, including protocols and model standards for...

  4. Semantics, Pragmatics, and the Nature of Semantic Theories

    ERIC Educational Resources Information Center

    Spewak, David Charles, Jr.

    2013-01-01

    The primary concern of this dissertation is determining the distinction between semantics and pragmatics and how context sensitivity should be accommodated within a semantic theory. I approach the question over how to distinguish semantics from pragmatics from a new angle by investigating what the objects of a semantic theory are, namely…

  5. 77 FR 19575 - Promoting Interoperability in the 700 MHz Commercial Spectrum; Interoperability of Mobile User...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-02

    ... meaningful access to a wide range of advanced devices. The result, they argue, is that this spectrum is being... interoperability requirement in the 700 MHz band is necessary to obtain affordable, advanced mobile devices to... on the Deployment of Advanced Broadband Services. The record to date suggests that, unless...

  6. Chandrayaan-1 Data Interoperability using PDAP

    NASA Astrophysics Data System (ADS)

    Thakkar, Navita; Crichton, Daniel; Heather, David; Gopala Krishna, Barla; Srinivasan, T. P.; Prashar, Ajay

    The Indian Space Science Data Center (ISSDC) at Bangalore is the custodian of all the data sets of the current and future science missions of ISRO. Chandrayaan-1 is the first among the planetary missions launched by ISRO. The data collected from all the instruments during the lifetime of Chandrayaan-1 are peer-reviewed and archived as a Long Term Archive (LTA) using the Planetary Data System standards (PDS 3) at the ISSDC. In order to increase the use of the archived data, they need to be made accessible to the scientific community and academia in a seamless manner across the globe. Among its objectives, the IPDA (International Planetary Data Alliance) aims to allow the interoperability and interchange of planetary scientific data within the planetary community. It has recommended PDAP (Planetary Data Access Protocol) v1.0 for implementation as an interoperability protocol for accessing planetary data archives. PDAP is a simple protocol for retrieving planetary data from repositories through a uniform interface. PDAP compliance requires an access web service to be maintained with the characteristics of the Metadata Query and Data Retrieval web methods. The PDAP interface will provide metadata services for Chandrayaan-1 datasets and return a list of candidate hits formatted as a VOTable. For each candidate hit, an access reference URL is used to retrieve the real data. This will be integrated with the IPDA Registry and Search Services. This paper presents a prototype of interoperable systems for Chandrayaan-1 planetary datasets using PDAP.
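    The "list of candidate hits formatted as a VOTable" can be illustrated with a short parsing sketch. The field names, granule identifier, and URL below are invented for illustration and the VOTable is heavily simplified; a real PDAP response carries more structure, and real code would fetch it over HTTP rather than from an inline string.

```python
# Hedged sketch: extracting access-reference URLs from a (much simplified)
# VOTable, as a PDAP metadata query might return. Field names and values
# here are illustrative assumptions, not the actual PDAP response format.
import xml.etree.ElementTree as ET

votable = """<VOTABLE><RESOURCE><TABLE>
  <FIELD name="granule_id"/><FIELD name="access_url"/>
  <DATA><TABLEDATA>
    <TR><TD>TMC_001</TD><TD>http://issdc.example/data/TMC_001.img</TD></TR>
  </TABLEDATA></DATA>
</TABLE></RESOURCE></VOTABLE>"""

root = ET.fromstring(votable)
fields = [f.get("name") for f in root.iter("FIELD")]          # column names
rows = [[td.text for td in tr] for tr in root.iter("TR")]     # row values
records = [dict(zip(fields, row)) for row in rows]            # hits as dicts

print(records[0]["access_url"])  # URL used to retrieve the real data
```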

  7. Semantic Visualization Mapping for Illustrative Volume Visualization

    NASA Astrophysics Data System (ADS)

    Rautek, P.; Bruckner, S.; Gröller, M. E.

    2009-04-01

    Measured and simulated data are usually divided into several meaningful intervals that are relevant to the domain expert. Examples from medicine are the specific semantics for different measuring modalities. A PET scan of a brain measures brain activity. It shows regions of homogeneous activity that are labeled by experts with semantic values such as low brain activity or high brain activity. Diffusion MRI data provide information about the healthiness of tissue regions and are classified by experts with semantic values like healthy, diseased, or necrotic. Medical CT data encode the measured density values in Hounsfield units. Specific intervals of the Hounsfield scale refer to different tissue types like air, soft tissue, bone, contrast-enhanced vessels, etc. However, the semantic parameters from expert domains are not necessarily used to describe a mapping between the volume attributes and visual appearance. Volume rendering techniques commonly map attributes of the underlying data to visual appearance via a transfer function. Transfer functions are a powerful tool to achieve various visualization mappings. The specification of transfer functions is a complex task. The user has to have expert knowledge about the underlying rendering technique to achieve the desired results. Especially the specification of higher-dimensional transfer functions is challenging. Common user interfaces provide methods to brush in two dimensions. While brushing is an intuitive method to select regions of interest or to specify features, user interfaces for higher dimensions are more challenging and often non-intuitive. For seismic data the situation is even more difficult since the data typically consist of many more volumetric attributes than, for example, medical datasets. Scientific illustrators are experts in conveying information by visual means. They also make use of semantics in a natural way describing visual abstractions such as shading, tone, rendering style, saturation
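    A minimal sketch of such a semantics-driven transfer function, using the Hounsfield intervals mentioned above: interval boundaries are approximate textbook values, and the labels and RGBA colors are arbitrary choices for illustration, not the paper's mapping.

```python
# Sketch of a semantic transfer function: Hounsfield-unit intervals mapped to
# tissue labels and illustrative RGBA colors (approximate HU bounds; colors
# are arbitrary assumptions).

SEMANTIC_TF = [  # (lower bound in HU, semantic label, (r, g, b, alpha))
    (-1024, "air",         (0.0, 0.0, 0.0, 0.0)),  # fully transparent
    (-200,  "soft tissue", (0.8, 0.5, 0.4, 0.3)),
    (300,   "bone",        (1.0, 1.0, 0.9, 1.0)),  # opaque
]

def classify(hu):
    """Return (label, rgba) for the last interval whose lower bound <= hu."""
    label, rgba = None, None
    for lower, lab, col in SEMANTIC_TF:
        if hu >= lower:
            label, rgba = lab, col
    return label, rgba

print(classify(-700)[0])  # air
print(classify(50)[0])    # soft tissue
print(classify(1200)[0])  # bone
```

    A real renderer would interpolate colors and opacities across interval boundaries rather than switch discretely, but the lookup structure is the same.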

  8. Electronic Healthcare Record and clinical research in cardiovascular radiology. HL7 CDA and CDISC ODM interoperability.

    PubMed

    El Fadly, A; Daniel, C; Bousquet, C; Dart, T; Lastic, P-Y; Degoulet, P

    2007-10-11

    Integrating clinical research data entry with patient care data entry is a challenging issue. At the G. Pompidou European Hospital (HEGP), cardiovascular radiology reports are captured twice, first in the Electronic Health Record (EHR) and then in a national clinical research server. Informatics standards are different for EHR (HL7 CDA) and clinical research (CDISC ODM). The objective of this work is to feed both the EHR and a Clinical Research Data Management System (CDMS) from a single multipurpose form. We adopted and compared two approaches. First approach consists in implementing the single "care-research" form within the EHR and aligning XML structures of HL7 CDA document and CDISC ODM message to export relevant data from EHR to CDMS. Second approach consists in displaying a single "care-research" XForms form within the EHR and generating both HL7 CDA document and CDISC message to feed both EHR and CDMS. The solution based on XForms avoids overloading both EHR and CDMS with irrelevant information. Beyond syntactic interoperability, a perspective is to address the issue of semantic interoperability between both domains.

  9. Interoperability Standards for Medical Simulation Systems

    NASA Technical Reports Server (NTRS)

    Tolk, Andreas; Diallo, Saikou Y.; Padilla, Jose J.

    2012-01-01

    The Modeling and Simulation Community successfully developed and applied interoperability standards like the Distributed Interactive Simulation (DIS) protocol (IEEE 1278) and the High Level Architecture (HLA) (IEEE 1516). These standards were applied for world-wide distributed simulation events for several years. However, this paper shows that some of the assumptions and constraints underlying the philosophy of these current standards are not valid for Medical Simulation Systems. This paper describes the standards, the philosophy and the limits for medical applications and recommends necessary extensions of the standards to support medical simulation.

  10. Master data directories and Catalog Interoperability

    NASA Technical Reports Server (NTRS)

    Thieman, J. R.

    1990-01-01

    While the 'Catalog Interoperability' (CI) project began as a NASA effort to facilitate identification, location, and access to data of interest to space and earth sciences researchers, it now has a membership encompassing numerous U.S. and international agencies as well as academic institutions. CI is creating a global network of interconnected directory, catalog, and inventory systems. Its directories contain brief summary information about data sets, and can either furnish automated links to other information systems yielding greater detail on matters of interest or indicate to whom requests for additional information can go.

  11. OTF CCSDS SM and C Interoperability Prototype

    NASA Technical Reports Server (NTRS)

    Reynolds, Walter F.; Lucord, Steven A.; Stevens, John E.

    2008-01-01

    A presentation is provided to demonstrate the interoperability between two space flight Mission Operation Centers (MOCs) and to emulate telemetry, action, and alert flows between the two centers. One framework uses a COTS C3I system that uses CORBA to interface to the local OTF data network. The second framework relies on current Houston MCC frameworks and ad hoc clients. Messaging relies on SM and C MAL, Core and Common Service formats, while the transport layer uses AMS. A centralized SM and C Registry uses HTTP/XML for transport/encoding. The project's status and progress are reviewed.

  12. Interoperable PKI Data Distribution in Computational Grids

    SciTech Connect

    Pala, Massimiliano; Cholia, Shreyas; Rea, Scott A.; Smith, Sean W.

    2008-07-25

    One of the most successful working examples of virtual organizations, computational grids need authentication mechanisms that inter-operate across domain boundaries. Public Key Infrastructures(PKIs) provide sufficient flexibility to allow resource managers to securely grant access to their systems in such distributed environments. However, as PKIs grow and services are added to enhance both security and usability, users and applications must struggle to discover available resources-particularly when the Certification Authority (CA) is alien to the relying party. This article presents how to overcome these limitations of the current grid authentication model by integrating the PKI Resource Query Protocol (PRQP) into the Grid Security Infrastructure (GSI).

  13. A health analytics semantic ETL service for obesity surveillance.

    PubMed

    Poulymenopoulou, M; Papakonstantinou, D; Malamateniou, F; Vassilacopoulos, G

    2015-01-01

    The increasingly large amount of data produced in healthcare (e.g. collected through health information systems such as electronic medical records (EMRs), or through novel data sources such as personal health records (PHRs), social media and web resources) enables the creation of detailed records about people's health, sentiments and activities (e.g. physical activity, diet, sleep quality) that can be used in the public health area, among others. However, despite the transformative potential of big data in public health surveillance, there are several challenges in integrating big data. In this paper, the interoperability challenge is tackled and a semantic Extract Transform Load (ETL) service is proposed that semantically annotates big data to yield valuable data for analysis. This service is conceived as part of a health analytics engine on the cloud that interacts with existing healthcare information exchange networks, like Integrating the Healthcare Enterprise (IHE), PHRs, sensors, mobile applications, and other web resources to retrieve patient health, behavioral and daily activity data. The semantic ETL service aims at semantically integrating big data for use by analytic mechanisms. An illustrative implementation of the service on big data potentially relevant to human obesity enables the use of appropriate analytic techniques (e.g. machine learning, text mining) that are expected to assist in identifying patterns and contributing factors (e.g. genetic background, social, environmental) for this social phenomenon and, hence, drive health policy changes and promote healthy behaviors where residents live, work, learn, shop and play.
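    The extract-transform-load pattern with semantic annotation can be sketched compactly. The term URIs, source names, and keyword-matching transform below are illustrative assumptions; a real service would use ontology services and NLP rather than substring matching.

```python
# Illustrative semantic ETL sketch: heterogeneous activity records are
# annotated with concept URIs from a toy ontology before loading. Term URIs
# and sources are assumptions, not the paper's actual vocabulary.

ONTOLOGY = {  # surface form -> concept URI (hypothetical identifiers)
    "walking": "obo:ExerciseActivity",
    "running": "obo:ExerciseActivity",
    "soda":    "obo:SugaryDrink",
}

def extract():
    """Extract step: pull raw records from different sources."""
    yield {"source": "phr", "text": "30 min walking"}
    yield {"source": "social", "text": "had a soda at lunch"}

def transform(record):
    """Transform step: attach ontology annotations found in the text."""
    tags = sorted({uri for word, uri in ONTOLOGY.items()
                   if word in record["text"]})
    return {**record, "annotations": tags}

# Load step: materialize annotated records for the analytics engine.
warehouse = [transform(r) for r in extract()]
print(warehouse[0]["annotations"])
print(warehouse[1]["annotations"])
```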

  14. Heterogeneity and Context in Semantic-Web-Enabled HCLS Systems

    NASA Astrophysics Data System (ADS)

    Zimmermann, Antoine; Sahay, Ratnesh; Fox, Ronan; Polleres, Axel

    The need for semantics preserving integration of complex data has been widely recognized in the healthcare domain. While standards such as Health Level Seven (HL7) have been developed in this direction, they have mostly been applied in limited, controlled environments, still being used incoherently across countries, organizations, or hospitals. In a more mobile and global society, data and knowledge are going to be commonly exchanged between various systems at Web scale. Specialists in this domain have increasingly argued in favor of using Semantic Web technologies for modeling healthcare data in a well formalized way. This paper provides a reality check in how far current Semantic Web standards can tackle interoperability issues arising in such systems driven by the modeling of concrete use cases on exchanging clinical data and practices. Recognizing the insufficiency of standard OWL to model our scenario, we survey theoretical approaches to extend OWL by modularity and context towards handling heterogeneity in Semantic-Web-enabled health care and life sciences (HCLS) systems. We come to the conclusion that none of these approaches addresses all of our use case heterogeneity aspects in its entirety. We finally sketch paths on how better approaches could be devised by combining several existing techniques.

  15. Investigating the capabilities of semantic enrichment of 3D CityEngine data

    NASA Astrophysics Data System (ADS)

    Solou, Dimitra; Dimopoulou, Efi

    2016-08-01

    In recent years, the development of technology and the lifting of several technical limitations have brought the third dimension to the fore. The complexity of urban environments and the strong need for land administration intensify the need for a three-dimensional cadastral system. Despite the progress in the field of geographic information systems and 3D modeling techniques, there is no fully digital 3D cadastre. Existing geographic information systems and the different methods of three-dimensional modeling allow for better management, visualization and dissemination of information. Nevertheless, these opportunities cannot be fully exploited because of deficiencies in standardization and interoperability in these systems. Within this context, CityGML was developed as an international standard of the Open Geospatial Consortium (OGC) for the representation and exchange of 3D city models. CityGML defines geometry and topology for city modeling, also focusing on the semantic aspects of 3D city information. The aim of CityGML is to establish a common terminology, also addressing the imperative need for interoperability and data integration, given the number of available geographic information systems and modeling techniques. The aim of this paper is to develop an application for managing the semantic information of a model generated by procedural modeling. The model was initially implemented in ESRI's CityEngine software and then imported into the ArcGIS environment. The final goal was to semantically enrich the original model and then convert it to CityGML format. Semantic information management and interoperability proved feasible using the ESRI 3DCities Project tools, whose database structure supports adding semantic information to the CityEngine model and automatically converting it to CityGML for advanced analysis and visualization in different application areas.
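    Semantic enrichment of this kind amounts to attaching typed attributes to city objects and serializing them. The fragment below is a loose, namespace-free sketch only: element names are only informally modeled on CityGML generic attributes, and most required CityGML structure is omitted.

```python
# Simplified sketch of attaching a semantic attribute to a building and
# serializing a CityGML-like fragment. Namespaces and required CityGML
# structure are omitted; element names are illustrative, not the standard's.
import xml.etree.ElementTree as ET

model = ET.Element("CityModel")
member = ET.SubElement(model, "cityObjectMember")
bldg = ET.SubElement(member, "Building")

# Semantic enrichment: a generic string attribute on the building.
attr = ET.SubElement(bldg, "stringAttribute", name="usage")
ET.SubElement(attr, "value").text = "residential"

xml = ET.tostring(model, encoding="unicode")
print(xml)
```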

  16. A Semantic Graph Query Language

    SciTech Connect

    Kaplan, I L

    2006-10-16

    Semantic graphs can be used to organize large amounts of information from a number of sources into one unified structure. A semantic query language provides a foundation for extracting information from the semantic graph. The graph query language described here provides a simple, powerful method for querying semantic graphs.
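    A single triple-pattern match with variables conveys the core idea of such a query language. The triples and the `?`-prefixed variable convention below are illustrative assumptions, far simpler than the language described in the abstract.

```python
# Minimal semantic-graph query sketch: a triple store plus one-pattern
# matching with variables (terms starting with "?"). Data is illustrative.

triples = {
    ("alice", "worksAt", "LLNL"),
    ("bob", "worksAt", "LLNL"),
    ("alice", "knows", "bob"),
}

def match(pattern):
    """Return one variable-binding dict per triple matching the pattern."""
    results = []
    for triple in sorted(triples):
        binding, ok = {}, True
        for p, v in zip(pattern, triple):
            if p.startswith("?"):
                if binding.setdefault(p, v) != v:  # same var must rebind equal
                    ok = False
                    break
            elif p != v:                           # constant must match exactly
                ok = False
                break
        if ok:
            results.append(binding)
    return results

print(match(("?who", "worksAt", "LLNL")))  # bindings for both employees
```

    Chaining several such patterns and joining their bindings gives conjunctive queries, the basic building block of most graph query languages.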

  17. A Defense of Semantic Minimalism

    ERIC Educational Resources Information Center

    Kim, Su

    2012-01-01

    Semantic Minimalism is a position about the semantic content of declarative sentences, i.e., the content that is determined entirely by syntax. It is defined by the following two points: "Point 1": The semantic content is a complete/truth-conditional proposition. "Point 2": The semantic content is useful to a theory of…

  18. Towards Automatic Semantic Labelling of 3D City Models

    NASA Astrophysics Data System (ADS)

    Rook, M.; Biljecki, F.; Diakité, A. A.

    2016-10-01

    The lack of semantic information in many 3D city models is a considerable limiting factor in their use, as many applications rely on semantics. Such information is not always available, since it is not collected at all times, it might be lost during data transformation, or its lack may be caused by non-interoperability in data integration from other sources. This research is a first step in creating an automatic workflow that labels a plain 3D city model, represented by a soup of polygons, with semantic and thematic information as defined in the CityGML standard. The first step involves the reconstruction of the topology, which is used in a region growing algorithm that clusters upward-facing adjacent triangles. Heuristic rules, embedded in a decision tree, are used to compute a likeliness score for these regions as representing either the ground (terrain) or a RoofSurface. Regions with a high likeliness score for one of the two classes are used to create a decision space, which is used in a support vector machine (SVM). Next, topological relations are utilised to select seeds that serve as starting points in a region growing algorithm that creates regions of triangles of other semantic classes. The topological relationships of the regions are used in the aggregation of the thematic building features. Finally, the level of detail is detected to generate the correct output in CityGML. The results show an accuracy between 85 % and 99 % in the automatic semantic labelling of four different test datasets. The paper concludes by indicating problems and difficulties that imply the next steps in the research.
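    The first step described above, clustering upward-facing adjacent triangles by region growing, can be sketched on a toy mesh. The adjacency structure, normal values, and the 0.9 threshold are assumptions for illustration, not the paper's parameters.

```python
# Sketch of region growing over a triangle adjacency graph: cluster adjacent
# upward-facing triangles. Mesh data and threshold are illustrative.

triangles = {  # id -> (z component of unit normal, neighbouring triangle ids)
    0: (0.99, {1}),      # roughly horizontal
    1: (0.97, {0, 2}),
    2: (0.10, {1, 3}),   # a wall: not upward-facing, excluded from growing
    3: (0.95, {2}),      # horizontal again, but separated by the wall
}
UP = 0.9  # threshold on the normal's z component (arbitrary choice)

def grow_regions():
    unvisited = {t for t, (nz, _) in triangles.items() if nz >= UP}
    regions = []
    while unvisited:
        region, stack = set(), [min(unvisited)]  # deterministic seed choice
        while stack:
            t = stack.pop()
            if t in unvisited:
                unvisited.remove(t)
                region.add(t)
                stack.extend(n for n in triangles[t][1] if n in unvisited)
        regions.append(sorted(region))
    return regions

print(grow_regions())  # the wall splits the flat triangles into two regions
```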

  19. Interoperability of Demand Response Resources Demonstration in NY

    SciTech Connect

    Wellington, Andre

    2014-03-31

    The Interoperability of Demand Response Resources Demonstration in NY (Interoperability Project) was awarded to Con Edison in 2009. The objective of the project was to develop and demonstrate methodologies to enhance the ability of customer sited Demand Response resources to integrate more effectively with electric delivery companies and regional transmission organizations.

  20. 47 CFR 27.75 - Basic interoperability requirement.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 2 2014-10-01 2014-10-01 false Basic interoperability requirement. 27.75 Section 27.75 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES MISCELLANEOUS WIRELESS COMMUNICATIONS SERVICES Technical Standards § 27.75 Basic interoperability...

  1. 47 CFR 90.547 - Narrowband Interoperability channel capability requirement.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 5 2010-10-01 2010-10-01 false Narrowband Interoperability channel capability... Frequencies in the 763-775 and 793-805 MHz Bands § 90.547 Narrowband Interoperability channel capability... channels in the 769-775 MHz and 799-805 MHz frequency bands must be capable of operating on all of...

  2. 47 CFR 90.547 - Narrowband Interoperability channel capability requirement.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 5 2011-10-01 2011-10-01 false Narrowband Interoperability channel capability... Frequencies in the 763-775 and 793-805 MHz Bands § 90.547 Narrowband Interoperability channel capability... channels in the 769-775 MHz and 799-805 MHz frequency bands must be capable of operating on all of...

  3. Class Translator for the Federation Interoperability Object Model (FIOM)

    DTIC Science & Technology

    2002-03-01

    Melnik et al. Introducing the Generic Interoperability Framework, Working Draft, 1999. [http://www-diglib.stanford.edu/diglib/ginf/WD/ginf-overview...Melnik+] Sergey Melnik et al. Generic Interoperability Framework (GINF) Middleware. [http://www-diglib.stanford.edu/diglib/ginf/WD/ginf-

  4. Development of a Ground Water Data Portal for Interoperable Data Exchange within the U.S. National Ground Water Monitoring Network and Beyond

    NASA Astrophysics Data System (ADS)

    Booth, N. L.; Brodaric, B.; Lucido, J. M.; Kuo, I.; Boisvert, E.; Cunningham, W. L.

    2011-12-01

    using the OGC Sensor Observation Service (SOS) standard. Ground Water Markup Language (GWML) encodes well log, lithology and construction information and is exchanged using the OGC Web Feature Service (WFS) standard. Within the NGWMN Data Portal, data exchange between distributed data provider repositories is achieved through the use of these web services and a central mediation hub, which performs both format (syntactic) and nomenclature (semantic) mediation, conforming heterogeneous inputs into common standards-based outputs. Through these common standards, interoperability between the U.S. NGWMN and Canada's Groundwater Information Network (GIN) is achieved, advancing a ground water virtual observatory across North America.
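
    The mediation hub's two roles, format (syntactic) mediation and nomenclature (semantic) mediation, can be illustrated with a minimal sketch. The field names and vocabulary below are invented for illustration and are not the actual NGWMN/GIN schemas:

    ```python
    # Hypothetical nomenclature mapping: provider-specific lithology terms
    # to a shared vocabulary (illustrative values only).
    SEMANTIC_MAP = {
        "sand & gravel": "sand_and_gravel",
        "sable et gravier": "sand_and_gravel",
        "limestone": "limestone",
        "calcaire": "limestone",
    }

    def mediate(record, field_map):
        """Conform one provider record to a common schema.

        field_map handles the syntactic step (renaming provider fields);
        SEMANTIC_MAP handles the semantic step (term translation).
        """
        out = {}
        for src_field, dst_field in field_map.items():
            value = record.get(src_field)
            if isinstance(value, str):
                value = SEMANTIC_MAP.get(value.lower(), value)
            out[dst_field] = value
        return out

    us_record = {"WellDepth": 42.0, "Lithology": "Sand & Gravel"}
    common = mediate(us_record, {"WellDepth": "depth_m", "Lithology": "lithology"})
    ```

    Heterogeneous inputs from each repository are conformed this way into one standards-based output format before being served through the portal.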

  5. Food product tracing technology capabilities and interoperability.

    PubMed

    Bhatt, Tejas; Zhang, Jianrong Janet

    2013-12-01

    Despite the best efforts of food safety and food defense professionals, contaminated food continues to enter the food supply. It is imperative that contaminated food be removed from the supply chain as quickly as possible to protect public health and stabilize markets. To solve this problem, scores of technology companies purport to have the most effective, economical product tracing system. This study sought to compare and contrast the effectiveness of these systems at analyzing product tracing information to identify the contaminated ingredient and likely source, as well as distribution of the product. It also determined if these systems can work together to better secure the food supply (their interoperability). Institute of Food Technologists (IFT) hypothesized that when technology providers are given a full set of supply-chain data, even for a multi-ingredient product, their systems will generally be able to trace a contaminated product forward and backward through the supply chain. However, when provided with only a portion of supply-chain data, even for a product with a straightforward supply chain, it was expected that interoperability of the systems will be lacking and that there will be difficulty collaborating to identify sources and/or recipients of potentially contaminated product. IFT provided supply-chain data for one complex product to 9 product tracing technology providers, and then compared and contrasted their effectiveness at analyzing product tracing information to identify the contaminated ingredient and likely source, as well as distribution of the product. A vertically integrated foodservice restaurant agreed to work with IFT to secure data from its supply chain for both a multi-ingredient and a simpler product. Potential multi-ingredient products considered included canned tuna, supreme pizza, and beef tacos. IFT ensured that all supply-chain data collected did not include any proprietary information or information that would otherwise

  6. NASA's Geospatial Interoperability Office(GIO)Program

    NASA Technical Reports Server (NTRS)

    Weir, Patricia

    2004-01-01

    NASA produces vast amounts of information about the Earth from satellites, supercomputer models, and other sources. These data are most useful when made easily accessible to NASA researchers and scientists, to NASA's partner Federal Agencies, and to society as a whole. A NASA goal is to apply its data for knowledge gain, decision support and understanding of Earth, and other planetary systems. The NASA Earth Science Enterprise (ESE) Geospatial Interoperability Office (GIO) Program leads the development, promotion and implementation of information technology standards that accelerate and expand the delivery of NASA's Earth system science research through integrated systems solutions. Our overarching goal is to make it easy for decision-makers, scientists and citizens to use NASA's science information. NASA's Federal partners currently participate with NASA and one another in the development and implementation of geospatial standards to ensure the most efficient and effective access to one another's data. Through the GIO, NASA participates with its Federal partners in implementing interoperability standards in support of E-Gov and the associated President's Management Agenda initiatives by collaborating on standards development. Through partnerships with government, private industry, education and communities the GIO works towards enhancing the ESE Applications Division in the area of National Applications and decision support systems. The GIO provides geospatial standards leadership within NASA, represents NASA on the Federal Geographic Data Committee (FGDC) Coordination Working Group and chairs the FGDC's Geospatial Applications and Interoperability Working Group (GAI) and supports development and implementation efforts such as Earth Science Gateway (ESG), Space Time Tool Kit and Web Map Services (WMS) Global Mosaic. 
The GIO supports NASA in the collection and dissemination of geospatial interoperability standards needs and progress throughout the agency including

  7. NASA and Industry Benefits of ACTS High Speed Network Interoperability Experiments

    NASA Technical Reports Server (NTRS)

    Zernic, M. J.; Beering, D. R.; Brooks, D. E.

    2000-01-01

    This paper provides synopses of the design, implementation, and results of key high data rate communications experiments utilizing the technologies of NASA's Advanced Communications Technology Satellite (ACTS). Specifically, the network protocol and interoperability performance aspects will be highlighted. The objectives of these key experiments will be discussed in their relevant context to NASA missions, as well as to the broader communications industry. Discussion of the experiment implementation will highlight the technical aspects of hybrid network connectivity, a variety of high-speed interoperability architectures, a variety of network node platforms, protocol layers, internet-based applications, and new work focused on distinguishing between link errors and congestion. In addition, this paper describes the impact of leveraging government-industry partnerships to achieve technical progress and forge synergistic relationships. These relationships will be the key to success as NASA seeks to combine commercially available technology with its own internal technology developments to realize more robust and cost-effective communications for space operations.

  8. Trusting Crowdsourced Geospatial Semantics

    NASA Astrophysics Data System (ADS)

    Goodhue, P.; McNair, H.; Reitsma, F.

    2015-08-01

    The degree of trust one can place in information is one of the foremost limitations of crowdsourced geospatial information. As with the development of web technologies, the increased prevalence of semantics associated with geospatial information has increased accessibility and functionality. Semantics also provides an opportunity to extend indicators of trust for crowdsourced geospatial information, which have largely focused on spatio-temporal and social aspects of that information. Comparing a feature's intrinsic and extrinsic properties to associated ontologies provides a means of semantically assessing the trustworthiness of crowdsourced geospatial information. The application of this approach to unconstrained semantic submissions then allows for a detailed assessment of the trust of these features whilst maintaining the descriptive thoroughness this mode of information submission affords. The resulting trust rating then becomes an attribute of the feature, providing not only an indication of the trustworthiness of a specific feature but also a value that can be aggregated across multiple features to illustrate the overall trustworthiness of a dataset.
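
    A minimal sketch of such an ontology-based trust assessment might score a feature by the fraction of expected class properties it supplies with valid values. The class schema, validators, and feature below are hypothetical, not drawn from the paper:

    ```python
    def trust_score(feature, ontology_class):
        """Score a crowdsourced feature against the properties its ontology
        class expects (hypothetical schema: property name -> validator)."""
        expected = ontology_class["properties"]
        if not expected:
            return 0.0
        hits = 0
        for prop, is_valid in expected.items():
            value = feature.get(prop)
            if value is not None and is_valid(value):
                hits += 1
        return hits / len(expected)

    # Illustrative ontology fragment for a made-up 'DrinkingFountain' class.
    fountain_class = {
        "properties": {
            "name": lambda v: isinstance(v, str) and bool(v.strip()),
            "potable": lambda v: v in (True, False),
            "access": lambda v: v in ("public", "private"),
        }
    }

    feature = {"name": "Park fountain", "potable": True, "access": "everyone"}
    score = trust_score(feature, fountain_class)  # 2 of 3 checks pass
    ```

    Averaging such per-feature ratings over a dataset gives the aggregated trustworthiness indicator mentioned above.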

  9. Are Meaningful Use Stage 2 certified EHRs ready for interoperability? Findings from the SMART C-CDA Collaborative

    PubMed Central

    D'Amore, John D; Mandel, Joshua C; Kreda, David A; Swain, Ashley; Koromia, George A; Sundareswaran, Sumesh; Alschuler, Liora; Dolin, Robert H; Mandl, Kenneth D; Kohane, Isaac S; Ramoni, Rachel B

    2014-01-01

    Background and objective Upgrades to electronic health record (EHR) systems scheduled to be introduced in the USA in 2014 will advance document interoperability between care providers. Specifically, the second stage of the federal incentive program for EHR adoption, known as Meaningful Use, requires use of the Consolidated Clinical Document Architecture (C-CDA) for document exchange. In an effort to examine and improve C-CDA based exchange, the SMART (Substitutable Medical Applications and Reusable Technology) C-CDA Collaborative brought together a group of certified EHR and other health information technology vendors. Materials and methods We examined the machine-readable content of collected samples for semantic correctness and consistency. This included parsing with the open-source BlueButton.js tool, testing with a validator used in EHR certification, scoring with an automated open-source tool, and manual inspection. We also conducted group and individual review sessions with participating vendors to understand their interpretation of C-CDA specifications and requirements. Results We contacted 107 health information technology organizations and collected 91 C-CDA sample documents from 21 distinct technologies. Manual and automated document inspection led to 615 observations of errors and data expression variation across represented technologies. Based upon our analysis and vendor discussions, we identified 11 specific areas that represent relevant barriers to the interoperability of C-CDA documents. Conclusions We identified errors and permissible heterogeneity in C-CDA documents that will limit semantic interoperability. Our findings also point to several practical opportunities to improve C-CDA document quality and exchange in the coming years. PMID:24970839

  10. Deep Aesthetic Quality Assessment With Semantic Information.

    PubMed

    Kao, Yueying; He, Ran; Huang, Kaiqi

    2017-03-01

    Human beings often assess the aesthetic quality of an image coupled with the identification of the image's semantic content. This paper addresses the correlation between automatic aesthetic quality assessment and semantic recognition. We cast the assessment problem as the main task within a multi-task deep model, and argue that the semantic recognition task offers the key to addressing this problem. Based on convolutional neural networks, we employ a single, simple multi-task framework to efficiently utilize the supervision of aesthetic and semantic labels. A correlation item between these two tasks is further introduced to the framework by incorporating inter-task relationship learning. This item not only provides useful insight about the correlation but also improves the assessment accuracy of the aesthetic task. In particular, an effective strategy is developed to keep a balance between the two tasks, which facilitates optimizing the parameters of the framework. Extensive experiments on the challenging Aesthetic Visual Analysis dataset and the Photo.net dataset validate the importance of semantic recognition in aesthetic quality assessment, and demonstrate that multi-task deep models can discover an effective aesthetic representation that achieves state-of-the-art results.
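
    In simplified form, a balanced multi-task objective of this kind can be sketched as a weighted sum of the two task losses plus a penalty tying the task-specific parameters together. This stands in for, and does not reproduce, the paper's learned inter-task correlation item; all weights and shapes below are illustrative:

    ```python
    import numpy as np

    def softmax(z):
        z = z - z.max(axis=1, keepdims=True)
        e = np.exp(z)
        return e / e.sum(axis=1, keepdims=True)

    def cross_entropy(logits, labels):
        p = softmax(logits)
        return -np.mean(np.log(p[np.arange(len(labels)), labels] + 1e-12))

    def multitask_loss(aes_logits, aes_labels, sem_logits, sem_labels,
                       w_aes, w_sem, task_weights=(1.0, 0.5), corr_lambda=0.1):
        """Weighted sum of the aesthetic and semantic task losses plus a
        simple penalty coupling the two task heads' weight matrices
        (a crude stand-in for learned inter-task relationships)."""
        l_aes = cross_entropy(aes_logits, aes_labels)
        l_sem = cross_entropy(sem_logits, sem_labels)
        corr = np.sum((w_aes - w_sem) ** 2)
        return task_weights[0] * l_aes + task_weights[1] * l_sem + corr_lambda * corr
    ```

    The `task_weights` ratio plays the role of the balancing strategy between the two tasks; in the paper it is tuned rather than fixed.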

  11. Secure Interoperable Open Smart Grid Demonstration Project

    SciTech Connect

    Magee, Thoman

    2014-12-28

    The Consolidated Edison, Inc., of New York (Con Edison) Secure Interoperable Open Smart Grid Demonstration Project (SGDP), sponsored by the United States (US) Department of Energy (DOE), demonstrated that the reliability, efficiency, and flexibility of the grid can be improved through a combination of enhanced monitoring and control capabilities using systems and resources that interoperate within a secure services framework. The project demonstrated the capability to shift, balance, and reduce load where and when needed in response to system contingencies or emergencies by leveraging controllable field assets. The range of field assets includes curtailable customer loads, distributed generation (DG), battery storage, electric vehicle (EV) charging stations, building management systems (BMS), home area networks (HANs), high-voltage monitoring, and advanced metering infrastructure (AMI). The SGDP enables the seamless integration and control of these field assets through a common, cyber-secure, interoperable control platform, which integrates a number of existing legacy control and data systems, as well as new smart grid (SG) systems and applications. By integrating advanced technologies for monitoring and control, the SGDP helps target and reduce peak load growth, improves the reliability and efficiency of Con Edison’s grid, and increases the ability to accommodate the growing use of distributed resources. Con Edison is dedicated to lowering costs, improving reliability and customer service, and reducing its impact on the environment for its customers. These objectives also align with the policy objectives of New York State as a whole. To help meet these objectives, Con Edison’s long-term vision for the distribution grid relies on the successful integration and control of a growing penetration of distributed resources, including demand response (DR) resources, battery storage units, and DG. For example, Con Edison is expecting significant long-term growth of DG

  12. Interoperability of electronic health records and personal health records: key interoperability issues associated with information exchange.

    PubMed

    Pringle, Simone; Lippitt, Alex

    2009-01-01

    As patients receive medical care, their clinical history may be tracked and recorded by multiple electronic systems developed by independent vendors. Medical providers might use electronic health record (EHR) software tailored to the needs of trained medical personnel, whereas patients may interact with personal health records (PHR). The purpose of this essay is to identify the key interoperability issues associated with the information exchange between these two types of systems and offer an approach for enhancing interoperability. This article is part of a series of unpublished essays titled A Community View on How Personal Health Records Can Improve Patient Care and Outcomes in Many Healthcare Settings, a collaborative project of Northern Illinois Physicians For Connectivity and the Coalition for Quality and Patient Safety of Chicagoland. For further information on how you can obtain copies of the complete work, contact the principal, Dr. Stasia Kahn, at Stash5@sbcglobal.net.

  13. Frame semantics-based study of verbs across medical genres.

    PubMed

    Wandji Tchami, Ornella; L'Homme, Marie-Claude; Grabar, Natalia

    2014-01-01

    The field of medicine gathers actors with different levels of expertise. These actors must interact, although their mutual understanding is not always completely successful. We propose to study corpora (with high and low levels of expertise) in order to observe their specificities. More specifically, we perform a contrastive analysis of verbs, and of the syntactic and semantic features of their participants, based on the Frame Semantics framework and the methodology implemented in FrameNet. In order to achieve this, we use an existing medical terminology to automatically annotate the semantic classes of the participants of verbs, which we assume are indicative of semantic roles. Our results indicate that verbs show similar or very close semantics in some contexts, while in other contexts they behave differently. These results are important for studying the understanding of medical information by patients and for improving the communication between patients and medical doctors.

  14. Optimizing QoS-Aware Semantic Web Service Composition

    NASA Astrophysics Data System (ADS)

    Lécué, Freddy

    Ranking and optimization of web service compositions are some of the most interesting challenges at present. Since web services can be enhanced with formal semantic descriptions, forming the "semantic web services", it becomes conceivable to exploit the quality of semantic links between services (of any composition) as one of the optimization criteria. For this we propose to use the semantic similarities between output and input parameters of web services. Coupling this with other criteria such as quality of service (QoS) allows us to rank and optimize compositions achieving the same goal. Here we suggest an innovative and extensible optimization model designed to balance semantic fit (or functional quality) with non-functional QoS metrics. To allow the use of this model in the context of a large number of services, as foreseen by the strategic EC-funded project SOA4All, we propose and test the use of Genetic Algorithms.
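
    A genetic algorithm over such a weighted semantic-fit/QoS objective can be sketched as follows. The candidate scores, the weighting scheme, and the GA parameters are invented for illustration and are not those of the SOA4All work:

    ```python
    import random

    # Hypothetical candidates: for each abstract task in the composition,
    # alternative services with (semantic_fit, qos) scores in [0, 1].
    CANDIDATES = [
        [(0.9, 0.4), (0.6, 0.9)],    # task 0
        [(0.8, 0.7), (0.5, 0.95)],   # task 1
        [(0.7, 0.6), (0.95, 0.5)],   # task 2
    ]

    def fitness(genome, alpha=0.6):
        """Balance semantic link quality against QoS (alpha is a tunable weight)."""
        return sum(alpha * CANDIDATES[t][g][0] + (1 - alpha) * CANDIDATES[t][g][1]
                   for t, g in enumerate(genome))

    def evolve(generations=50, pop_size=20, mut_rate=0.2, seed=1):
        """Elitist GA: keep the top half, refill with crossover + mutation."""
        rng = random.Random(seed)
        pop = [[rng.randrange(len(c)) for c in CANDIDATES] for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=fitness, reverse=True)
            parents = pop[:pop_size // 2]
            children = []
            while len(children) < pop_size - len(parents):
                a, b = rng.sample(parents, 2)
                cut = rng.randrange(1, len(CANDIDATES))  # one-point crossover
                child = a[:cut] + b[cut:]
                if rng.random() < mut_rate:              # point mutation
                    t = rng.randrange(len(CANDIDATES))
                    child[t] = rng.randrange(len(CANDIDATES[t]))
                children.append(child)
            pop = parents + children
        return max(pop, key=fitness)

    best = evolve()
    ```

    Each genome picks one concrete service per abstract task; the fitness trades off the semantic quality of the links against aggregate QoS, as in the optimization model described above.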

  15. Semantics in NETMAR (open service NETwork for MARine environmental data)

    NASA Astrophysics Data System (ADS)

    Leadbetter, Adam; Lowry, Roy; Clements, Oliver

    2010-05-01

    Over recent years, there has been a proliferation of environmental data portals utilising a wide range of systems and services, many of which cannot interoperate. The European Union Framework 7 project NETMAR (which commenced February 2010) aims to provide a toolkit for building such portals in a coherent manner through the use of chained Open Geospatial Consortium Web Services (WxS), OPeNDAP file access and W3C standards controlled by a Business Process Execution Language workflow. As such, the end product will be configurable by user communities interested in developing a portal for marine environmental data, and will offer search, download and integration tools for a range of satellite, model and observed data from open ocean and coastal areas. Further processing of these data will also be available in order to provide statistics and derived products suitable for decision making in the chosen environmental domain. In order to make the resulting portals truly interoperable, the NETMAR programme requires a detailed definition of the semantics of the services being called and the data which are being requested. A key goal of the NETMAR programme is, therefore, to develop a multi-domain and multilingual ontology of marine data and services. This will allow searches across both human languages and across scientific domains. The approach taken will be to analyse existing semantic resources and provide mappings between them, gluing together the definitions, semantics and workflows of the WxS services. The mappings between terms aim to be more general than the standard "narrower than", "broader than" type seen in the thesauri or simple ontologies implemented by previous programmes. Tools for the development and population of ontologies will also be provided by NETMAR, as there will be instances in which existing resources cannot sufficiently describe newly encountered data or services.

  16. Wrapping and interoperating bioinformatics resources using CORBA.

    PubMed

    Stevens, R; Miller, C

    2000-02-01

    Bioinformaticians seeking to provide services to working biologists are faced with the twin problems of distribution and diversity of resources. Bioinformatics databases are distributed around the world and exist in many kinds of storage forms, platforms and access paradigms. To provide adequate services to biologists, these distributed and diverse resources have to interoperate seamlessly within single applications. The Common Object Request Broker Architecture (CORBA) offers one technical solution to these problems. The key component of CORBA is its use of object orientation as an intermediate form to translate between different representations. This paper concentrates on an explanation of object orientation and how it can be used to overcome the problems of distribution and diversity by describing the interfaces between objects.
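
    The role object orientation plays as an intermediate form can be illustrated, by rough analogy to a CORBA IDL interface, with a common abstract interface wrapping two differently stored sources. The source classes, accessions, and sequences below are invented for illustration:

    ```python
    from abc import ABC, abstractmethod

    class SequenceSource(ABC):
        """Common interface (playing the role of a CORBA IDL definition)."""
        @abstractmethod
        def get_sequence(self, accession: str) -> str: ...

    class FlatFileSource(SequenceSource):
        """Wraps a source that stores records as 'accession:sequence' lines."""
        def __init__(self, lines):
            self._index = dict(line.split(":", 1) for line in lines)
        def get_sequence(self, accession):
            return self._index[accession]

    class DictDbSource(SequenceSource):
        """Wraps a source exposing a dict-like lookup with its own key scheme."""
        def __init__(self, table):
            self._table = table
        def get_sequence(self, accession):
            return self._table[accession.lower()]

    sources = [
        FlatFileSource(["P12345:MKTAYIAK"]),
        DictDbSource({"q99999": "MVLSPADK"}),
    ]
    # A client works against the interface, not the storage form.
    found = [s.get_sequence(a) for s, a in zip(sources, ["P12345", "Q99999"])]
    ```

    In CORBA proper, the interface is declared once in IDL and each wrapped resource implements it behind an Object Request Broker, so diverse and distributed databases present the same face to every client.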

  17. SHARP/PRONGHORN Interoperability: Mesh Generation

    SciTech Connect

    Avery Bingham; Javier Ortensi

    2012-09-01

    Progress toward collaboration between the SHARP and MOOSE computational frameworks has been demonstrated through sharing of mesh generation and ensuring mesh compatibility of both tools with MeshKit. MeshKit was used to build a three-dimensional, full-core very high temperature reactor (VHTR) reactor geometry with 120-degree symmetry, which was used to solve a neutron diffusion critical eigenvalue problem in PRONGHORN. PRONGHORN is an application of MOOSE that is capable of solving coupled neutron diffusion, heat conduction, and homogenized flow problems. The results were compared to a solution found on a 120-degree, reflected, three-dimensional VHTR mesh geometry generated by PRONGHORN. The ability to exchange compatible mesh geometries between the two codes is instrumental for future collaboration and interoperability. The results were found to be in good agreement between the two meshes, thus demonstrating the compatibility of the SHARP and MOOSE frameworks. This outcome makes future collaboration possible.

  18. Flexible solution for interoperable cloud healthcare systems.

    PubMed

    Vida, Mihaela Marcella; Lupşe, Oana Sorina; Stoicu-Tivadar, Lăcrămioara; Bernad, Elena

    2012-01-01

    It is extremely important for the healthcare domain to have standardized communication, because it will improve the quality of information and, in the end, the resulting benefits will improve the quality of patients' lives. The standards proposed to be used are HL7 CDA and CCD. For better access to the medical data, a solution based on cloud computing (CC) is investigated. CC is a technology that supports flexibility, seamless care, and reduced costs of the medical act. To ensure interoperability between healthcare information systems, a solution creating a Web Custom Control is presented. The control shows the database tables and fields used to configure the two standards. This control will facilitate the work of the medical staff and hospital administrators, because they can configure the local system easily and prepare it for communication with other systems. The resulting information will have higher quality and will provide knowledge that supports better patient management and diagnosis.

  19. Towards virtual knowledge broker services for semantic integration of life science literature and data sources.

    PubMed

    Harrow, Ian; Filsell, Wendy; Woollard, Peter; Dix, Ian; Braxenthaler, Michael; Gedye, Richard; Hoole, David; Kidd, Richard; Wilson, Jabe; Rebholz-Schuhmann, Dietrich

    2013-05-01

    Research in the life sciences requires ready access to primary data, derived information and relevant knowledge from a multitude of sources. Integration and interoperability of such resources are crucial for sharing content across research domains relevant to the life sciences. In this article we present a perspective review of data integration with emphasis on a semantics driven approach to data integration that pushes content into a shared infrastructure, reduces data redundancy and clarifies any inconsistencies. This enables much improved access to life science data from numerous primary sources. The Semantic Enrichment of the Scientific Literature (SESL) pilot project demonstrates feasibility for using already available open semantic web standards and technologies to integrate public and proprietary data resources, which span structured and unstructured content. This has been accomplished through a precompetitive consortium, which provides a cost effective approach for numerous stakeholders to work together to solve common problems.

  20. Biodiversity information platforms: From standards to interoperability

    PubMed Central

    Berendsohn, W. G.; Güntsch, A.; Hoffmann, N.; Kohlbecker, A.; Luther, K.; Müller, A.

    2011-01-01

    Abstract One of the most serious bottlenecks in the scientific workflows of the biodiversity sciences is the need to integrate data from different sources, software applications, and services for analysis, visualisation and publication. For more than a quarter of a century the TDWG Biodiversity Information Standards organisation has had a central role in defining and promoting data standards and protocols supporting interoperability between disparate and locally distributed systems. Although often not sufficiently recognized, TDWG standards are the foundation of many popular Biodiversity Informatics applications and infrastructures ranging from small desktop software solutions to large scale international data networks. However, individual scientists and groups of collaborating scientists have difficulties in fully exploiting the potential of standards that are often notoriously complex, lack non-technical documentation, and use different representations and underlying technologies. In the last few years, a series of initiatives such as Scratchpads, the EDIT Platform for Cybertaxonomy, and biowikifarm have started to implement and set up virtual work platforms for the biodiversity sciences which shield their users from the complexity of the underlying standards. Apart from being practical work-horses for numerous working processes related to biodiversity sciences, they can be seen as information brokers mediating information between multiple data standards and protocols. The ViBRANT project will further strengthen the flexibility and power of virtual biodiversity working platforms by building software interfaces between them, thus facilitating essential information flows needed for comprehensive data exchange, data indexing, web-publication, and versioning. This work will make an important contribution to the shaping of an international, interoperable, and user-oriented biodiversity information infrastructure. PMID:22207807

  1. Semantic Services for Wikipedia

    NASA Astrophysics Data System (ADS)

    Wang, Haofen; Penin, Thomas; Fu, Linyun; Liu, Qiaoling; Xue, Guirong; Yu, Yong

    Wikipedia, a killer application in Web 2.0, has embraced the power of collaborative editing to harness collective intelligence. It features many attractive characteristics, like an entity-based link graph, abundant categorization and semi-structured layout, and can serve as an ideal data source from which to extract high quality and well-structured data. In this chapter, we first propose several solutions to extract knowledge from Wikipedia. We not only consider information from the relational summaries of articles (infoboxes) but also semi-automatically extract it from the article text using the structured content available. Due to differences with information extraction from the Web, it is necessary to tackle new problems, like the lack of redundancy in Wikipedia, which is dealt with by extending traditional machine learning algorithms to work with few labeled data. Furthermore, we also exploit the widespread categories as a complementary way to discover additional knowledge. Benefiting from both structured and textual information, we additionally provide a suggestion service for Wikipedia authoring. With the aim of facilitating semantic reuse, our proposal provides users with facilities such as link, category and infobox content suggestions. The proposed enhancements can be applied to attract more contributors and lighten the burden of professional editors. Finally, we developed an enhanced search system, which can ease the process of exploiting Wikipedia. To provide a user-friendly interface, it extends the faceted search interface with relation navigation and lets the user easily express complex information needs in an interactive way. In order to achieve efficient query answering, it extends scalable IR engines to index and search both the textual and structured information with integrated ranking support.

  2. SMASH: A Data-driven Informatics Method to Assist Experts in Characterizing Semantic Heterogeneity among Data Elements

    PubMed Central

    Brown, William; Weng, Chunhua; Vawdrey, David K.; Carballo-Diéguez, Alex; Bakken, Suzanne

    2016-01-01

    Semantic heterogeneity (SH) is detrimental to data interoperability and integration in healthcare. Assessing SH is difficult, yet fundamental to addressing the problem. Using expert-based and data-driven methods we assessed SH among HIV-associated data elements (DEs). Using Clinicaltrials.gov, we identified and obtained eight data dictionaries, and created a DE inventory. We vectorized DEs by study, and developed a new method, String Metric-assisted Assessment of Semantic Heterogeneity (SMASH), to find DEs: similar in An and Bn, unique to An, and unique to Bn. An HIV expert assessed pairs for semantic equivalence. Heterogeneous DEs were either semantically-equivalent/syntactically-different (HIV-positive/HIV+/Seropositive), or syntactically-equivalent/semantically-different (“Partner” [sexual]/“Partner”[relationship]). Context of usage was considered. SMASH aided identification of SH. Of 1,175 DE from pairs, 1,048 (87%) were semantically heterogeneous and 127 (13%) were homogeneous. Most heterogeneous pairs (97%) were semantically-equivalent/syntactically-different. Expert-based and data-driven methods are complementary for assessing SH, especially among semantically-equivalent/syntactically-different DE. Similar expert-based/data-driven solutions are recommended for resolving SH. PMID:28269930
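
    A string-metric triage step in the spirit of SMASH (though not its actual metric) can be sketched with a normalized edit-based similarity; the data element names and threshold below are illustrative only:

    ```python
    from difflib import SequenceMatcher

    def string_similarity(a, b):
        """Normalized similarity via difflib's Ratcliff-Obershelp matcher
        (one of many possible string metrics; SMASH's may differ)."""
        return SequenceMatcher(None, a.lower(), b.lower()).ratio()

    def triage_pairs(pairs, threshold=0.6):
        """Split DE pairs into syntactically similar candidates (likely
        semantically equivalent, pending expert confirmation) and
        dissimilar pairs needing closer review."""
        similar, dissimilar = [], []
        for a, b in pairs:
            bucket = similar if string_similarity(a, b) >= threshold else dissimilar
            bucket.append((a, b))
        return similar, dissimilar

    pairs = [("HIV positive", "HIV-positive"), ("Partner", "Viral load")]
    similar, dissimilar = triage_pairs(pairs)
    ```

    The expert then judges each candidate pair for semantic equivalence, combining the data-driven triage with human review as the abstract recommends.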

  3. Enabling interoperability in Geoscience with GI-suite

    NASA Astrophysics Data System (ADS)

    Boldrini, Enrico; Papeschi, Fabrizio; Santoro, Mattia; Nativi, Stefano

    2015-04-01

    GI-suite is a brokering framework targeting interoperability of heterogeneous systems in the Geoscience domain. The framework is composed of different brokers, each one focusing on a specific functionality: discovery, access and semantics (i.e. GI-cat, GI-axe, GI-sem). The brokering takes place between a set of heterogeneous publishing services and a set of heterogeneous consumer applications: the brokering target is represented by resources (e.g. coverages, features, or metadata information) required to flow seamlessly from the providers to the consumers. Different international and community standards are now supported by GI-suite, making possible the successful deployment of GI-suite in many international projects and initiatives (such as GEOSS, NSF BCube and several EU funded projects). On the publisher side, more than 40 standards and implementations are supported (e.g. Dublin Core, OAI-PMH, OGC W*S, Geonetwork, THREDDS Data Server, Hyrax Server, etc.). Support for each individual standard is provided by means of specific GI-suite components, called accessors. On the consumer application side, more than 15 standards and implementations are supported (e.g. ESRI ArcGIS, Openlayers, OGC W*S, OAI-PMH clients, etc.). Support for each individual standard is provided by means of specific profiler components. GI-suite can be used in different scenarios by different actors: - A data provider with a pre-existing data repository can deploy and configure GI-suite to broker it, thus making its data resources available through different protocols to many different users (e.g. for data discovery and/or data access) - A data consumer can use GI-suite to discover and/or access resources from a variety of publishing services that already publish data according to well-known standards. - A community can deploy and configure GI-suite to build a community (or project-specific) broker: GI-suite can broker a set of community related repositories and

  4. Non-semantic contributions to "semantic" redundancy gain.

    PubMed

    Shepherdson, Peter; Miller, Jeff

    2016-01-01

    Recently, two groups of researchers have reported redundancy gains (enhanced performance with multiple, redundant targets) in tasks requiring semantic categorization. Here we report two experiments aimed at determining whether the gains found by one of these groups resulted from some form of semantic coactivation. We asked undergraduate psychology students to complete choice RT tasks requiring the semantic categorization of visually presented words, and compared performance with redundant targets from the same semantic category to performance with redundant targets from different semantic categories. If the redundancy gains resulted from the combination of information at a semantic level, they should have been greater in the former than the latter situation. However, our results showed no significant differences in redundancy gain (for latency and accuracy) between same-category and different-category conditions, despite gains appearing in both conditions. Thus, we suggest that redundancy gain in the semantic categorization task may result entirely from statistical facilitation or combination of information at non-semantic levels.
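
    The statistical-facilitation account can be illustrated with a toy race-model simulation: even with fully independent channels and no semantic pooling, the faster of two redundant targets is on average quicker than a single target. The Gaussian RT parameters below are arbitrary illustrative choices, not values from the experiments.

```python
import random

def simulate_race(n=100_000, seed=0):
    """Mean RT for single targets vs the faster of two redundant targets.

    Statistical facilitation: with two independent channels, responding
    to whichever finishes first (min) is on average faster than either
    channel alone, with no coactivation of any kind.
    """
    rng = random.Random(seed)
    single, redundant = [], []
    for _ in range(n):
        rt1 = rng.gauss(500, 50)  # channel 1 RT in ms
        rt2 = rng.gauss(500, 50)  # channel 2 RT in ms
        single.append(rt1)
        redundant.append(min(rt1, rt2))
    mean = lambda xs: sum(xs) / len(xs)
    return mean(single), mean(redundant)

single_mean, redundant_mean = simulate_race()
# The redundant-target mean comes out reliably faster than the
# single-target mean, without any semantic-level combination.
```

    This is why the null result above matters: if the gain were semantic, same-category pairs should beat different-category pairs, whereas a pure race predicts equal gains in both conditions.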

  5. Interoperability framework for communication between processes running on different mobile operating systems

    NASA Astrophysics Data System (ADS)

    Gal, A.; Filip, I.; Dragan, F.

    2016-02-01

    As we live in an era where mobile communication is everywhere around us, the need to communicate between the variety of devices we have available becomes ever more pressing. The major impediment to achieving communication between the available devices is the incompatibility between the operating systems running on them. In the present paper we propose a framework that makes it possible for processes running on different mobile operating systems to inter-operate. The interoperability process will make use of any communication environment made available by the mobile devices where the processes are installed. The communication environment is chosen so that the transfer of data between the mobile devices is optimal. The paper defines the architecture of the framework, expanding on the functionality of and interrelation between the modules that make it up. As a proof of concept, we propose to use three different mobile operating systems installed on three different types of mobile devices. Depending on various factors related to the structure of the mobile devices and the type of data to be transferred, the framework will establish the data transfer protocol to be used. The framework automates the interoperability process, user intervention being limited to a simple selection from the options that the framework suggests based on a full analysis of the structural and functional elements of the mobile devices involved.

  6. Enabling Interoperable Space Robots With the Joint Technical Architecture for Robotic Systems (JTARS)

    NASA Technical Reports Server (NTRS)

    Bradley, Arthur; Dubowsky, Steven; Quinn, Roger; Marzwell, Neville

    2005-01-01

    Robots that operate independently of one another will not be adequate to accomplish the future exploration tasks of long-distance autonomous navigation, habitat construction, resource discovery, and material handling. Such activities will require that systems widely share information, plan and divide complex tasks, share common resources, and physically cooperate to manipulate objects. Recognizing the need for interoperable robots to accomplish the new exploration initiative, NASA's Office of Exploration Systems Research & Technology recently funded the development of the Joint Technical Architecture for Robotic Systems (JTARS). JTARS' charter is to identify the interface standards necessary to achieve interoperability among space robots. A JTARS working group (JTARS-WG) has been established comprising recognized leaders in the field of space robotics including representatives from seven NASA centers along with academia and private industry. The working group's early accomplishments include addressing key issues required for interoperability, defining which systems are within the project's scope, and framing the JTARS manuals around classes of robotic systems.

  7. COEUS: “semantic web in a box” for biomedical applications

    PubMed Central

    2012-01-01

    Background As the “omics” revolution unfolds, the growth in data quantity and diversity is bringing about the need for pioneering bioinformatics software, capable of significantly improving the research workflow. To cope with these computer science demands, biomedical software engineers are adopting emerging semantic web technologies that better suit the life sciences domain. The latter’s complex relationships are easily mapped into semantic web graphs, enabling a superior understanding of collected knowledge. Despite increased awareness of semantic web technologies in bioinformatics, their use is still limited. Results COEUS is a new semantic web framework, aiming at a streamlined application development cycle and following a “semantic web in a box” approach. The framework provides a single package including advanced data integration and triplification tools, base ontologies, a web-oriented engine and a flexible exploration API. Resources can be integrated from heterogeneous sources, including CSV and XML files or SQL and SPARQL query results, and mapped directly to one or more ontologies. Advanced interoperability features include REST services, a SPARQL endpoint and LinkedData publication. These enable the creation of multiple applications for web, desktop or mobile environments, and empower a new knowledge federation layer. Conclusions The platform, targeted at biomedical application developers, provides a complete skeleton ready for rapid application deployment, enhancing the creation of new semantic information systems. COEUS is available as open source at http://bioinformatics.ua.pt/coeus/. PMID:23244467
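
    The triplification step that frameworks like COEUS automate can be sketched in miniature: mapping tabular rows to subject-predicate-object triples. The base URI and property scheme below are invented for illustration and are not COEUS's actual vocabulary mapping, which targets real ontologies.

```python
import csv
import io

def triplify(csv_text, base="http://example.org/"):
    """Turn CSV rows into (subject, predicate, object) triples.

    Toy stand-in for CSV-to-RDF triplification: one subject per row,
    one triple per column. Hypothetical namespace, plain tuples
    instead of a real RDF store.
    """
    reader = csv.DictReader(io.StringIO(csv_text))
    triples = []
    for i, row in enumerate(reader):
        subject = f"{base}item/{i}"
        for column, value in row.items():
            triples.append((subject, f"{base}property/{column}", value))
    return triples

data = "gene,disease\nBRCA1,breast cancer\n"
for t in triplify(data):
    print(t)
```

    Once resources are expressed as triples, the SPARQL endpoint and LinkedData publication features described above become straightforward to layer on top.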

  8. Global Interoperability Using Semantics, Standards, Science and Technology (GIS3T)

    DTIC Science & Technology

    2009-01-01

    multiple data elements, for example, declarative statements. Knowledge goes beyond the data level to include a mixture of concepts and the...Proceedings of the 2004 International Knowledge Discovery and Ontology Workshop (KDO 2004), Pisa, Italy, Sep. 2004, pp. 20-24. [12] M.G. Ceruti, A

  9. Semantic Interoperable Electronic Patient Records: The Unfolding of Consensus based Archetypes.

    PubMed

    Pedersen, Rune; Wynn, Rolf; Ellingsen, Gunnar

    2015-01-01

    This paper is a status report from a large-scale openEHR-based EPR project of the North Norway Regional Health Authority, encouraged by the unfolding of a national repository for openEHR archetypes. Clinicians need to engage in, and be responsible for, the production of archetypes. The consensus processes have so far been challenged by a low number of active clinicians, a lack of the critical specialties needed to reach consensus, and a cumbersome review process (3 or 4 review rounds) for each archetype. The goal is to have several clinicians from each specialty as a backup if one is unable to participate. Archetypes and their importance for structured data and information sharing must become more visible to clinicians through sharper information practices.

  10. Mediation, Alignment, and Information Services for Semantic interoperability (MAISSI): A Trade Study

    DTIC Science & Technology

    2007-06-01

    Java Messaging Service.................................................................................37 5.2.2 ICE Internet Communications Engine...16. Context lattice for Internet Movie Database. ............................................................. 30 Figure 18. The OMG Object... things like servers and software.) Simple vs. complex mappings. An important distinction between matchers is whether or not complex mappings are

  11. Towards Data Repository Interoperability: The Data Conservancy Data Packaging Specification

    NASA Astrophysics Data System (ADS)

    DiLauro, T.; Duerr, R.; Thessen, A. E.; Rippin, M.; Pralle, B.; Choudhury, G. S.

    2013-12-01

    description, the DCS instance will be able to provide default mappings for the directories and files within the package payload and enable support for deposited content at a lower level of service. Internally, the DCS will map these hybrid package serializations to its own internal business objects and their properties. Thus, this approach is highly extensible, as other packaging formats could be mapped in a similar manner. In addition, this scheme supports establishing the fixity of the payload while still supporting update of the semantic overlay data. This allows a data producer with scarce resources or an archivist who acquires a researcher's data to package the data for deposit with the intention of augmenting the resource description in the future. The Data Conservancy is partnering with the Sustainable Environment Actionable Data[4] project to test the interoperability of this new packaging mechanism. [1] Data Conservancy: http://dataconservancy.org/ [2] BagIt: https://datatracker.ietf.org/doc/draft-kunze-bagit/ [3] OAI-ORE: http://www.openarchives.org/ore/1.0/ [4] SEAD: http://sead-data.net/

  12. Standardization and Interoperability Problems of European Electronic Tolling Service (EETS)

    NASA Astrophysics Data System (ADS)

    Nowacki, Gabriel; Mitraszewska, Izabella; Kamiński, Tomasz; Potapczuk, Włodzimierz; Kallweit, Thomas

    The paper refers to some standardization and interoperability problems of implementing the European Electronic Toll Service (EETS) in the European Union. The existing EETS systems in the EU member states are not interoperable due to the many differences among them. The European Commission has taken bold steps to address that issue. The first was the 2004/52/EC Directive on interoperability in the Community. The second was the decision to launch Europe's own Galileo system. The third was the EC decision of 6 October 2009, based on the Road Charging Interoperability (RCI) and the Common Electronic Fee Collection System for a Road Tolling European Service (CESARE) projects. Furthermore, research by the Motor Transport Institute concerning these matters is also presented.

  13. Interoperability of Repositories: The Simple Query Interface in ARIADNE

    ERIC Educational Resources Information Center

    Ternier, Stefaan; Duval, Erik

    2006-01-01

    This article reports on our experiences in providing interoperability between the ARIADNE knowledge pool system (KPS) (Duval, Forte, Cardinaels, Verhoeven, Van Durm, Hendrickx et al., 2001) and several other heterogeneous learning object repositories and referatories.

  14. Reuse and Interoperability of Avionics for Space Systems

    NASA Technical Reports Server (NTRS)

    Hodson, Robert F.

    2007-01-01

    The space environment presents unique challenges for avionics. Launch survivability, thermal management, radiation protection, and other factors are important for successful space designs. Many existing avionics designs use custom hardware and software to meet the requirements of space systems. Although some space vendors have moved more towards a standard product line approach to avionics, the space industry still lacks similar standards and common practices for avionics development. This lack of commonality manifests itself in limited reuse and a lack of interoperability. To address NASA's need for interoperable avionics that facilitate reuse, several hardware and software approaches are discussed. Experiences with existing space boards and the application of terrestrial standards are outlined. Enhancements and extensions to these standards are considered. A modular stack-based approach to space avionics is presented. Software and reconfigurable logic cores are considered for extending interoperability and reuse. Finally, some of the issues associated with the design of reusable, interoperable avionics are discussed.

  15. iPad: Semantic annotation and markup of radiological images.

    PubMed

    Rubin, Daniel L; Rodriguez, Cesar; Shah, Priyanka; Beaulieu, Chris

    2008-11-06

    Radiological images contain a wealth of information, such as anatomy and pathology, which is often not explicit and computationally accessible. Information schemes are being developed to describe the semantic content of images, but such schemes can be unwieldy to operationalize because there are few tools to enable users to capture structured information easily as part of the routine research workflow. We have created iPad, an open source tool enabling researchers and clinicians to create semantic annotations on radiological images. iPad hides the complexity of the underlying image annotation information model from users, permitting them to describe images and image regions using a graphical interface that maps their descriptions to structured ontologies semi-automatically. Image annotations are saved in a variety of formats, enabling interoperability among medical records systems, image archives in hospitals, and the Semantic Web. Tools such as iPad can help reduce the burden of collecting structured information from images, and could ultimately enable researchers and physicians to exploit images on a very large scale and glean the biological and physiological significance of image content.

  16. Temporal Representation in Semantic Graphs

    SciTech Connect

    Levandoski, J J; Abdulla, G M

    2007-08-07

    A wide range of knowledge discovery and analysis applications, ranging from business to biological, make use of semantic graphs when modeling relationships and concepts. Most of the semantic graphs used in these applications are assumed to be static pieces of information, meaning temporal evolution of concepts and relationships are not taken into account. Guided by the need for more advanced semantic graph queries involving temporal concepts, this paper surveys the existing work involving temporal representations in semantic graphs.

  17. Image sharing: evolving solutions in the age of interoperability.

    PubMed

    Mendelson, David S; Erickson, Bradley J; Choy, Garry

    2014-12-01

    Interoperability is a major focus of the quickly evolving world of Health IT. Easy, yet secure and confidential exchange of imaging exams and the associated reports must be a part of the solutions that are implemented. The availability of historical exams is essential in providing a quality interpretation and reducing inappropriate utilization of imaging services. Today, the exchange of imaging exams is most often achieved via a compact disc. We describe the virtues of this solution as well as challenges that have surfaced. Internet- and cloud-based technologies employed for many consumer services can provide a better solution. Vendors are making these solutions available. Standards for Internet-based exchange are emerging. Just as radiology converged on DICOM as a standard to store and view images, we need a common exchange standard. We will review the existing standards and how they are organized into useful workflows through Integrating the Healthcare Enterprise profiles. Integrating the Healthcare Enterprise and standards development processes are discussed. Health care and the domain of radiology must stay current with quickly evolving Internet standards. The successful use of the "cloud" will depend on both the technologies and the policies put into place around them, both of which we discuss. The radiology community must lead the way and provide a solution that works for radiologists and clinicians with use of the electronic medical record. We describe features we believe radiologists should consider when adding Internet-based exchange solutions to their practice.

  18. A Proposed Information Architecture for Telehealth System Interoperability

    SciTech Connect

    Warren, S.; Craft, R.L.; Parks, R.C.; Gallagher, L.K.; Garcia, R.J.; Funkhouser, D.R.

    1999-04-07

    Telemedicine technology is rapidly evolving. Whereas early telemedicine consultations relied primarily on video conferencing, consultations today may utilize video conferencing, medical peripherals, store-and-forward capabilities, electronic patient record management software, and/or a host of other emerging technologies. These remote care systems rely increasingly on distributed, collaborative information technology during the care delivery process, in its many forms. While these leading-edge systems are bellwethers for highly advanced telemedicine, the remote care market today is still immature. Most telemedicine systems are custom-designed and do not interoperate with other commercial offerings. Users are limited to a set of functionality that a single vendor provides and must often pay high prices to obtain this functionality, since vendors in this marketplace must deliver entire systems in order to compete. Besides increasing corporate research and development costs, this inhibits the ability of the user to make intelligent purchasing decisions regarding best-of-breed technologies. We propose a secure, object-oriented information architecture for telemedicine systems that promotes plug-and-play interaction between system components through standardized interfaces, communication protocols, messaging formats, and data definitions. In this architecture, each component functions as a black box, and components plug together in a lego-like fashion to achieve the desired device or system functionality. The architecture will support various ongoing standards work in the medical device arena.

  19. A Proposed Information Architecture for Telehealth System Interoperability

    SciTech Connect

    Craft, R.L.; Funkhouser, D.R.; Gallagher, L.K.; Garica, R.J.; Parks, R.C.; Warren, S.

    1999-04-20

    We propose an object-oriented information architecture for telemedicine systems that promotes secure 'plug-and-play' interaction between system components through standardized interfaces, communication protocols, messaging formats, and data definitions. In this architecture, each component functions as a black box, and components plug together in a ''lego-like'' fashion to achieve the desired device or system functionality. Introduction: Telemedicine systems today rely increasingly on distributed, collaborative information technology during the care delivery process. While these leading-edge systems are bellwethers for highly advanced telemedicine, most are custom-designed and do not interoperate with other commercial offerings. Users are limited to a set of functionality that a single vendor provides and must often pay high prices to obtain this functionality, since vendors in this marketplace must deliver entire systems in order to compete. Besides increasing corporate research and development costs, this inhibits the ability of the user to make intelligent purchasing decisions regarding best-of-breed technologies. This paper proposes a reference architecture for plug-and-play telemedicine systems that addresses these issues.

  20. An incremental database access method for autonomous interoperable databases

    NASA Technical Reports Server (NTRS)

    Roussopoulos, Nicholas; Sellis, Timos

    1994-01-01

    We investigated a number of design and performance issues of interoperable database management systems (DBMSs). The major results of our investigation were obtained in the areas of client-server database architectures for heterogeneous DBMSs, incremental computation models, buffer management techniques, and query optimization. We finished a prototype of an advanced client-server workstation-based DBMS which allows access to multiple heterogeneous commercial DBMSs. Experiments and simulations were then run to compare its performance with the standard client-server architectures. The focus of this research was on adaptive optimization methods for heterogeneous database systems. Adaptive buffer management accounts for the random and object-oriented access methods for which no known characterization of the access patterns exists. Adaptive query optimization means that value distributions and selectivities, which play the most significant role in query plan evaluation, are continuously refined to reflect the actual values as opposed to static ones computed off-line. Query feedback is a concept that was first introduced to the literature by our group. We employed query feedback both for adaptive buffer management and for computing value distributions and selectivities. For adaptive buffer management, we use the page faults of prior executions to achieve more 'informed' management decisions. For the estimation of the distributions and selectivities, we use curve-fitting techniques, such as least squares and splines, regressing on these values.
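
    The curve-fitting step in query-feedback-driven selectivity estimation can be sketched with closed-form simple linear regression (the paper also mentions splines; the feedback points below are invented for illustration).

```python
def least_squares_fit(points):
    """Closed-form simple linear regression y = a + b*x.

    Illustrates refining a selectivity estimate from query feedback:
    each point is (query constant, observed selectivity) from a prior
    execution, and the fitted line predicts selectivities for new
    constants.
    """
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    return a, b

# Hypothetical feedback from prior query executions.
feedback = [(10, 0.11), (20, 0.19), (30, 0.31), (40, 0.42)]
a, b = least_squares_fit(feedback)
estimate = a + b * 25  # predicted selectivity for a new query constant
```

    Because the fit is recomputed as new feedback arrives, the optimizer's statistics track the actual data distribution instead of stale off-line estimates.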

  1. Ensuring Sustainable Data Interoperability Across the Natural and Social Sciences

    NASA Astrophysics Data System (ADS)

    Downs, R. R.; Chen, R. S.

    2015-12-01

    Both the natural and social science data communities are attempting to address the long-term sustainability of their data infrastructures in rapidly changing research, technological, and policy environments. Many parts of these communities are also considering how to improve the interoperability and integration of their data and systems across natural, social, health, and other domains. However, these efforts have generally been undertaken in parallel, with little thought about how different sustainability approaches may impact long-term interoperability from scientific, legal, or economic perspectives, or vice versa, i.e., how improved interoperability could enhance—or threaten—infrastructure sustainability. Scientific progress depends substantially on the ability to learn from the legacy of previous work available for current and future scientists to study, often by integrating disparate data not previously assembled. Digital data are less likely than scientific publications to be usable in the future unless they are managed by science-oriented repositories that can support long-term data access with the documentation and services needed for future interoperability. We summarize recent discussions in the social and natural science communities on emerging approaches to sustainability and relevant interoperability activities, including efforts by the Belmont Forum E-Infrastructures project to address global change data infrastructure needs; the Group on Earth Observations to further implement data sharing and improve data management across diverse societal benefit areas; and the Research Data Alliance to develop legal interoperability principles and guidelines and to address challenges faced by domain repositories. We also examine emerging needs for data interoperability in the context of the post-2015 development agenda and the expected set of Sustainable Development Goals (SDGs), which set ambitious targets for sustainable development, poverty reduction, and

  2. Robotics Systems Joint Project Office (RSJPO) Interoperability Profiles (IOPS) 101

    DTIC Science & Technology

    2012-07-01

    Statement A. Approved for public release. How does UGV IOP relate to Navy AEODRS Program? • The Advanced EOD Robotic System (AEODRS) Inc. I program...IOPS) 101 Mark Mazzara, Interoperability Lead Robotic Systems Joint Project Office (RS JPO) UNCLASSIFIED: Distribution Statement A. Approved for...09-07-2012 4. TITLE AND SUBTITLE Robotics Systems Joint Project Office (RSJPO) Interoperability Profiles (IOPS) 101 5a. CONTRACT NUMBER 5b. GRANT

  3. Moving Towards Unmanned Systems Live Virtual & Constructive Interoperability

    DTIC Science & Technology

    2009-01-01

    to standard message formats B406-005-0049 5 (STANAG 4586), common data links (STANAG 7085), and a plethora of other open format standards...promotion of interoperability for Unmanned Systems and has defined the STANAG 4586 protocol. This protocol identifies the standard interfaces required...Using the STANAG 4586 interoperability standard, IBST's Air Combat Environment Test and Evaluation Facility (ACETEF) has integrated a generic

  4. Semantic Search of Web Services

    ERIC Educational Resources Information Center

    Hao, Ke

    2013-01-01

    This dissertation addresses semantic search of Web services using natural language processing. We first survey various existing approaches, focusing on the fact that the expensive costs of current semantic annotation frameworks result in limited use of semantic search for large scale applications. We then propose a vector space model based service…
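
    A minimal vector-space ranking over service descriptions can be sketched with bag-of-words cosine similarity. The service descriptions and whitespace tokenization below are invented for illustration; a real system would add TF-IDF weighting and the NLP processing the dissertation describes.

```python
from collections import Counter
from math import sqrt

def cosine(a_text, b_text):
    """Cosine similarity between bag-of-words term vectors."""
    a = Counter(a_text.lower().split())
    b = Counter(b_text.lower().split())
    dot = sum(a[t] * b[t] for t in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

query = "send email message"
services = [
    "MailService: send an email message to a recipient",
    "WeatherService: get the current weather forecast",
]
# Rank candidate service descriptions by similarity to the query.
ranked = sorted(services, key=lambda d: cosine(query, d), reverse=True)
```

    The appeal of this approach is that it needs no manual semantic annotation of each service, which is the cost bottleneck identified in the survey.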

  5. ISAIA: Interoperable Systems for Archival Information Access

    NASA Technical Reports Server (NTRS)

    Hanisch, Robert J.

    2002-01-01

    The ISAIA project was originally proposed in 1999 as a successor to the informal AstroBrowse project. AstroBrowse, which provided a data location service for astronomical archives and catalogs, was a first step toward data system integration and interoperability. The goals of ISAIA were ambitious: '...To develop an interdisciplinary data location and integration service for space science. Building upon existing data services and communications protocols, this service will allow users to transparently query hundreds or thousands of WWW-based resources (catalogs, data, computational resources, bibliographic references, etc.) from a single interface. The service will collect responses from various resources and integrate them in a seamless fashion for display and manipulation by the user.' Funding was approved only for a one-year pilot study, a decision that in retrospect was wise given the rapid changes in information technology in the past few years and the emergence of the Virtual Observatory initiatives in the US and worldwide. Indeed, the ISAIA pilot study was influential in shaping the science goals, system design, metadata standards, and technology choices for the virtual observatory. The ISAIA pilot project also helped to cement working relationships among the NASA data centers, US ground-based observatories, and international data centers. The ISAIA project was formed as a collaborative effort between thirteen institutions that provided data to astronomers, space physicists, and planetary scientists. Among the fruits we hoped would ultimately come from this project was a central site on the Web that any space scientist could use to efficiently locate existing data relevant to a particular scientific question. Furthermore, we hoped that the needed technology would be general enough that smaller, more focused communities within space science could use the same technologies and standards to provide more specialized services.
A major challenge to searching

  6. Differential equation dynamical system based assessment model in GNSS interoperability

    NASA Astrophysics Data System (ADS)

    Han, Tao; Lu, XiaoChun; Wang, Xue; Rao, YongNan; Zou, DeCai; Yang, JianFei; Wu, YangYang

    2011-06-01

    With the development of Global Navigation Satellite Systems (GNSS), the idea of GNSS interoperability was born and has become a focus of study in the field of satellite navigation. The desire for a new GNSS to augment interoperability with existing systems necessitates the study of assessment algorithms for this idea. In this paper, an assessment algorithm for the comprehensive benefits of interoperability, based on a differential equation dynamical system, is discussed. Interoperability affects two important aspects of GNSS: one is the performance advancement; the other is the cost of adopting interoperability. While researching the complex relationship between performance and cost, we found this relationship is similar to that between prey and predator in biomathematics, so the Lotka-Volterra model used to depict the prey-predator relationship is a felicitous tool. After building a differential dynamical model, we analyze the existence and stability of the positive equilibrium in the model. Then a Cost-Effective Function of GNSS is constructed based on the positive equilibrium, which is employed to assess interoperability both qualitatively and quantitatively. Finally, the paper demonstrates the significance of the model and its application by citing a numerical example.
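
    The positive equilibrium of the classical Lotka-Volterra system dx/dt = x(a - b*y), dy/dt = y(-c + d*x) can be written down directly as (x*, y*) = (c/d, a/b). The coefficients below are illustrative only, and the paper's Cost-Effective Function is not reproduced here.

```python
def lv_equilibrium(a, b, c, d):
    """Positive equilibrium of dx/dt = x(a - b*y), dy/dt = y(-c + d*x).

    By analogy with the paper's reading, x stands in for the performance
    benefit of interoperability and y for its cost (prey/predator roles).
    """
    return c / d, a / b

def euler_step(x, y, a, b, c, d, h=0.01):
    """One explicit-Euler step of the Lotka-Volterra system."""
    return x + h * x * (a - b * y), y + h * y * (-c + d * x)

a, b, c, d = 1.0, 0.5, 0.75, 0.25  # illustrative coefficients
x_eq, y_eq = lv_equilibrium(a, b, c, d)
# Starting exactly at the equilibrium, an Euler step leaves the state fixed,
# since both derivatives vanish there.
```

    Stability analysis around this fixed point is what lets the equilibrium serve as the anchor for a cost-effectiveness assessment.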

  7. Key pillars of data interoperability in Earth Sciences - INSPIRE and beyond

    NASA Astrophysics Data System (ADS)

    Tomas, Robert; Lutz, Michael

    2013-04-01

    The well-known heterogeneity and fragmentation of the data models, formats and controlled vocabularies of environmental data limit potential data users from utilising the wealth of environmental information available today across Europe. The main aim of INSPIRE1 is to improve this situation and give users the possibility to access, use and correctly interpret environmental data. Over the past years, a number of INSPIRE technical guidelines (TG) and implementing rules (IR) for interoperability have been developed, involving hundreds of domain experts from across Europe. The data interoperability specifications, which have been developed for all 34 INSPIRE spatial data themes2, are the central component of the TG and IR. Several of these themes are related to the earth sciences, e.g. geology (including hydrogeology, geophysics and geomorphology), mineral and energy resources, soil science, natural hazards, meteorology, oceanography, hydrology and land cover. The following main pillars for data interoperability and harmonisation were identified during the development of the specifications: Conceptual data models describe the spatial objects and their properties and relationships for the different spatial data themes. To achieve cross-domain harmonisation, the data models for all themes are based on a common modelling framework (the INSPIRE Generic Conceptual Model3) and managed in a common UML repository. Harmonised vocabularies (or code lists) are to be used in data exchange in order to overcome interoperability issues caused by heterogeneous free-text and/or multi-lingual content. Since a mapping to a harmonised vocabulary can be difficult, the INSPIRE data models typically allow the provision of more specific terms from local vocabularies in addition to the harmonised terms, utilising either the extensibility options or additional terminological attributes. Encoding.
Currently, specific XML profiles of the Geography Markup Language (GML) are promoted as the standard
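The harmonised-vocabulary pattern described above, where a provider supplies a reference into the harmonised code list together with a more specific local term, can be sketched roughly as follows. The element and attribute names here are invented for illustration; real INSPIRE application schemas define their own GML namespaces and types.

```python
import xml.etree.ElementTree as ET

# Hypothetical element/attribute names; real INSPIRE GML application
# schemas define their own namespaces, types, and code-list registers.
def lithology_element(harmonised_uri, local_term):
    """Encode a harmonised code-list value plus a more specific local term."""
    elem = ET.Element("lithology")
    ref = ET.SubElement(elem, "harmonisedValue")
    ref.set("href", harmonised_uri)           # URI into the harmonised code list
    local = ET.SubElement(elem, "localTerm")  # extensibility: local vocabulary
    local.text = local_term
    return elem

e = lithology_element(
    "http://inspire.ec.europa.eu/codelist/LithologyValue/sandstone",
    "quartzarenite")
xml = ET.tostring(e, encoding="unicode")
```

A consumer that only understands the harmonised list can follow the `href`, while domain specialists retain the finer-grained local term.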

  8. Towards a Semantic Web of Things: A Hybrid Semantic Annotation, Extraction, and Reasoning Framework for Cyber-Physical System.

    PubMed

    Wu, Zhenyu; Xu, Yuan; Yang, Yunong; Zhang, Chunhong; Zhu, Xinning; Ji, Yang

    2017-02-20

    Web of Things (WoT) facilitates the discovery and interoperability of Internet of Things (IoT) devices in a cyber-physical system (CPS). Moreover, a uniform knowledge representation of physical resources is quite necessary for further composition, collaboration, and decision-making processes in CPS. Though several efforts have integrated semantics with WoT, such as knowledge engineering methods based on semantic sensor networks (SSN), these still cannot represent the complex relationships between devices when dynamic composition and collaboration occur, and they depend entirely on manual construction of a knowledge base, which limits scalability. In this paper, to address these limitations, we propose the semantic Web of Things (SWoT) framework for CPS (SWoT4CPS). SWoT4CPS provides a hybrid solution with both ontological engineering methods, by extending SSN, and machine learning methods based on an entity linking (EL) model. To assess feasibility and performance, we demonstrate the framework by implementing a temperature anomaly diagnosis and automatic control use case in a building automation system. Evaluation results on the EL method show that linking domain knowledge to DBpedia has a relatively high accuracy and that the time complexity is at a tolerable level. Advantages and disadvantages of SWoT4CPS, along with future work, are also discussed.

  9. Towards a Semantic Web of Things: A Hybrid Semantic Annotation, Extraction, and Reasoning Framework for Cyber-Physical System

    PubMed Central

    Wu, Zhenyu; Xu, Yuan; Yang, Yunong; Zhang, Chunhong; Zhu, Xinning; Ji, Yang

    2017-01-01

    Web of Things (WoT) facilitates the discovery and interoperability of Internet of Things (IoT) devices in a cyber-physical system (CPS). Moreover, a uniform knowledge representation of physical resources is quite necessary for further composition, collaboration, and decision-making processes in CPS. Though several efforts have integrated semantics with WoT, such as knowledge engineering methods based on semantic sensor networks (SSN), these still cannot represent the complex relationships between devices when dynamic composition and collaboration occur, and they depend entirely on manual construction of a knowledge base, which limits scalability. In this paper, to address these limitations, we propose the semantic Web of Things (SWoT) framework for CPS (SWoT4CPS). SWoT4CPS provides a hybrid solution with both ontological engineering methods, by extending SSN, and machine learning methods based on an entity linking (EL) model. To assess feasibility and performance, we demonstrate the framework by implementing a temperature anomaly diagnosis and automatic control use case in a building automation system. Evaluation results on the EL method show that linking domain knowledge to DBpedia has a relatively high accuracy and that the time complexity is at a tolerable level. Advantages and disadvantages of SWoT4CPS, along with future work, are also discussed. PMID:28230725

  10. Semantator: semantic annotator for converting biomedical text to linked data.

    PubMed

    Tao, Cui; Song, Dezhao; Sharma, Deepak; Chute, Christopher G

    2013-10-01

    More than 80% of biomedical data is embedded in plain text. The unstructured nature of these text-based documents makes it challenging to easily browse and query the data of interest in them. One approach to facilitate browsing and querying biomedical text is to convert the plain text to a linked web of data, i.e., converting data originally in free text to structured formats with defined meta-level semantics. In this paper, we introduce Semantator (Semantic Annotator), a semantic-web-based environment for annotating data of interest in biomedical documents, browsing and querying the annotated data, and interactively refining annotation results if needed. Through Semantator, information of interest can be either annotated manually or semi-automatically using plug-in information extraction tools. The annotated results will be stored in RDF and can be queried using the SPARQL query language. In addition, semantic reasoners can be directly applied to the annotated data for consistency checking and knowledge inference. Semantator has been released online and has been used by the biomedical ontology community, which provided positive feedback. Our evaluation results indicated that (1) Semantator can perform the annotation functionalities as designed; (2) Semantator can be adopted in real applications in clinical and translational research; and (3) the annotated results using Semantator can be easily used in Semantic-Web-based reasoning tools for further inference.
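The annotate-then-query workflow described above can be illustrated with a minimal, library-free sketch. A real deployment such as Semantator stores RDF and queries it with SPARQL; the triple patterns and vocabulary below are invented stand-ins, not Semantator's actual schema.

```python
# Toy triple store illustrating the annotate-then-query pattern; a real
# system would store RDF and use SPARQL (names here are illustrative).
triples = set()

def annotate(subject, predicate, obj):
    triples.add((subject, predicate, obj))

def query(s=None, p=None, o=None):
    """Return triples matching a pattern; None acts as a wildcard,
    like an unbound variable in a SPARQL basic graph pattern."""
    return sorted(t for t in triples
                  if (s is None or t[0] == s)
                  and (p is None or t[1] == p)
                  and (o is None or t[2] == o))

annotate("doc1:span42", "annotatedAs", "Disease:Melanoma")
annotate("doc1:span57", "annotatedAs", "Drug:Imatinib")
annotate("Disease:Melanoma", "hasTreatment", "Drug:Imatinib")

# Which document spans were annotated with some concept?
spans = query(p="annotatedAs")
```

Once annotations live as triples, downstream reasoners can join them with background knowledge (e.g., the `hasTreatment` assertion) without reparsing the source text.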

  11. Semantic Web meets Integrative Biology: a survey.

    PubMed

    Chen, Huajun; Yu, Tong; Chen, Jake Y

    2013-01-01

    Integrative Biology (IB) uses experimental or computational quantitative technologies to characterize biological systems at the molecular, cellular, tissue and population levels. IB typically involves the integration of data, knowledge and capabilities across disciplinary boundaries in order to solve complex problems. We identify a series of bioinformatics problems posed by interdisciplinary integration: (i) data integration that interconnects structured data across related biomedical domains; (ii) ontology integration that brings jargons, terminologies and taxonomies from various disciplines into a unified network of ontologies; (iii) knowledge integration that integrates disparate knowledge elements from multiple sources; (iv) service integration that builds applications out of services provided by different vendors. We argue that IB can benefit significantly from the integration solutions enabled by Semantic Web (SW) technologies. The SW enables scientists to share content beyond the boundaries of applications and websites, resulting in a web of data that is meaningful and understandable to computers. In this review, we provide insight into how SW technologies can be used to build open, standardized and interoperable solutions for interdisciplinary integration on a global basis. We present a rich set of case studies in systems biology, integrative neuroscience, bio-pharmaceutics and translational medicine, to highlight the technical features and benefits of SW applications in IB.

  12. Interoperable Data Sharing for Diverse Scientific Disciplines

    NASA Astrophysics Data System (ADS)

    Hughes, John S.; Crichton, Daniel; Martinez, Santa; Law, Emily; Hardman, Sean

    2016-04-01

    For diverse scientific disciplines to interoperate they must be able to exchange information based on a shared understanding. To capture this shared understanding, we have developed a knowledge representation framework using ontologies and ISO-level archive and metadata registry reference models. This framework provides multi-level governance, evolves independent of implementation technologies, and promotes agile development, namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. The knowledge representation framework is populated through knowledge acquisition from discipline experts. It is also extended to meet specific discipline requirements. The result is a formalized and rigorous knowledge base that addresses data representation, integrity, provenance, context, quantity, and their relationships within the community. The contents of the knowledge base are translated and written to files in appropriate formats to configure system software and services, provide user documentation, validate ingested data, and support data analytics. This presentation will provide an overview of the framework, present the Planetary Data System's PDS4 as a use case that has been adopted by the international planetary science community, describe how the framework is being applied to other disciplines, and share some important lessons learned.

  13. The advanced microgrid. Integration and interoperability

    SciTech Connect

    Bower, Ward Isaac; Ton, Dan T.; Guttromson, Ross; Glover, Steven F; Stamp, Jason Edwin; Bhatnagar, Dhruv; Reilly, Jim

    2014-02-01

    This white paper focuses on "advanced microgrids," but sections do, out of necessity, reference today's commercially available systems and installations in order to clearly distinguish the differences and advances. Advanced microgrids have been identified as a necessary part of the modern electrical grid through two DOE microgrid workshops, the National Institute of Standards and Technology's Smart Grid Interoperability Panel, and other related sources. With their grid-interconnectivity advantages, advanced microgrids will improve system energy efficiency and reliability and provide enabling technologies for grid-independence to end-user sites. One popular definition that has evolved and is used in multiple references is that a microgrid is a group of interconnected loads and distributed-energy resources within clearly defined electrical boundaries that acts as a single controllable entity with respect to the grid. A microgrid can connect to and disconnect from the grid, enabling it to operate in either grid-connected or island mode. Further, an advanced microgrid can then be loosely defined as a dynamic microgrid.

  14. Documenting Models for Interoperability and Reusability ...

    EPA Pesticide Factsheets

    Many modeling frameworks compartmentalize science via individual models that link sets of small components to create larger modeling workflows. Developing integrated watershed models increasingly requires coupling multidisciplinary, independent models, as well as collaboration between scientific communities, since component-based modeling can integrate models from different disciplines. Integrated Environmental Modeling (IEM) systems focus on transferring information between components by capturing a conceptual site model; establishing local metadata standards for input/output of models and databases; managing data flow between models and throughout the system; facilitating quality control of data exchanges (e.g., checking units, unit conversions, transfers between software languages); warning and error handling; and coordinating sensitivity/uncertainty analyses. Although many computational software systems facilitate communication between, and execution of, components, there are no common approaches, protocols, or standards for turn-key linkages between software systems and models, especially if modifying components is not the intent. Using a standard ontology, this paper reviews how models can be described for discovery, understanding, evaluation, access, and implementation to facilitate interoperability and reusability. In the proceedings of the International Environmental Modelling and Software Society (iEMSs), 8th International Congress on Environmental Mod

  15. Embedding of Semantic Predications.

    PubMed

    Cohen, Trevor; Widdows, Dominic

    2017-03-08

    This paper concerns the generation of distributed vector representations of biomedical concepts from structured knowledge, in the form of subject-relation-object triplets known as semantic predications. Specifically, we evaluate the extent to which a representational approach we have developed for this purpose previously, known as Predication-based Semantic Indexing (PSI), might benefit from insights gleaned from neural-probabilistic language models, which have enjoyed a surge in popularity in recent years as a means to generate distributed vector representations of terms from free text. To do so, we develop a novel neural-probabilistic approach to encoding predications, called Embedding of Semantic Predications (ESP), by adapting aspects of the Skipgram with Negative Sampling (SGNS) algorithm to this purpose. We compare ESP and PSI across a number of tasks including recovery of encoded information, estimation of semantic similarity and relatedness, and identification of potentially therapeutic and harmful relationships using both analogical retrieval and supervised learning. We find advantages for ESP in some, but not all of these tasks, revealing the contexts in which the additional computational work of neural-probabilistic modeling is justified.
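The SGNS-style training idea behind approaches like the one above can be sketched in a toy form: a subject vector is bound elementwise with a relation vector, the true object of the predication is pulled toward the bound vector, and a sampled negative object is pushed away. This is an illustrative stand-in, not the authors' ESP implementation; the vocabulary, binding operator, dimensionality, and hyperparameters are all invented, and relation vectors are held fixed for simplicity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy subject-relation-object predications (illustrative only).
concepts = ["aspirin", "headache", "fever", "ibuprofen"]
triples = [("aspirin", "TREATS", "headache"),
           ("ibuprofen", "TREATS", "fever")]

dim = 8
C = {w: i for i, w in enumerate(concepts)}
vec_in = rng.normal(0, 0.1, (len(concepts), dim))   # subject vectors
vec_out = rng.normal(0, 0.1, (len(concepts), dim))  # object vectors
rel = {"TREATS": rng.normal(0, 0.1, dim)}           # fixed relation vector

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def step(lr=0.5):
    """One pass over the predications: bind subject with relation, pull the
    true object closer, push one sampled negative away. Returns total loss."""
    loss = 0.0
    for s, r, o in triples:
        neg = rng.choice([c for c in concepts if c != o])  # negative sample
        bound = vec_in[C[s]] * rel[r]        # elementwise binding of s and r
        for obj, label in ((o, 1.0), (neg, 0.0)):
            v = vec_out[C[obj]].copy()
            p = sigmoid(bound @ v)           # predicted "is true object" prob.
            loss += -(label * np.log(p) + (1 - label) * np.log(1 - p))
            grad = p - label
            vec_out[C[obj]] -= lr * grad * bound
            vec_in[C[s]] -= lr * grad * rel[r] * v
    return loss

losses = [step() for _ in range(200)]
```

After training, objects of encoded predications score higher against their bound subject-relation cue than sampled non-objects, which is the property the recovery and analogical-retrieval tasks exploit.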

  16. Environmental Attitudes Semantic Differential.

    ERIC Educational Resources Information Center

    Mehne, Paul R.; Goulard, Cary J.

    This booklet is an evaluation instrument which utilizes semantic differential data to assess environmental attitudes. Twelve concepts are included: regulated access to beaches, urban planning, dune vegetation, wetlands, future cities, reclaiming wetlands for building development, city parks, commercial development of beaches, existing cities,…

  17. Semantically Grounded Briefings

    DTIC Science & Technology

    2005-12-01

occurring relations. AeroText and consequently AeroDAML can be tailored to particular domains through training sessions with annotated corpuses...the complexities of semantic markup by using mnemonic names for URIs, hiding unnamed intermediate objects (represented by “GenSym” identifiers), and

  18. Semantic and Lexical Coherence.

    ERIC Educational Resources Information Center

    Fahnestock, Jeanne

    Helping students understand coherence in terms of the lexical ties and semantic relations possible between clauses and sentences formalizes an area of writing instruction that has been somewhat vague before and makes the process of creating a coherent paragraph less mysterious. Many students do not have the intuitive knowledge base for absorbing…

  19. "Dyslexia": Toward Semantical Clarification.

    ERIC Educational Resources Information Center

    Manzo, Anthony V.; Duffelmeyer, Fred

    A formulated definition of the term dyslexia is proposed in this paper in order to clarify the semantical confusion which exists among both specialists and the general public. Dyslexia is explained as a generic term for severe and puzzling reading disability, found to be both acute (where reading-age lags 25 percent or more below mental age) and…

  20. Latent Semantic Analysis.

    ERIC Educational Resources Information Center

    Dumais, Susan T.

    2004-01-01

    Presents a literature review that covers the following topics related to Latent Semantic Analysis (LSA): (1) LSA overview; (2) applications of LSA, including information retrieval (IR), information filtering, cross-language retrieval, and other IR-related LSA applications; (3) modeling human memory, including the relationship of LSA to other…

  1. Semantic Web Development

    DTIC Science & Technology

    2006-09-01

many documents are not expressible in logic at all, and many in logic but not in N3. However, we are building a system for which a prime goal is the...demonstrate that conventional logic programming tools are efficient and straightforwardly adapted to semantic web work. • Jena RDF toolkit now accepts N3 as

  2. Semantic Shot Classification in Sports Video

    NASA Astrophysics Data System (ADS)

    Duan, Ling-Yu; Xu, Min; Tian, Qi

    2003-01-01

    In this paper, we present a unified framework for semantic shot classification in sports videos. Unlike previous approaches, which focus on clustering by aggregating shots with similar low-level features, the proposed scheme makes use of domain knowledge of a specific sport to perform a top-down video shot classification, including identification of video shot classes for each sport, and supervised learning and classification of the given sports video with low-level and middle-level features extracted from the sports video. It is observed that for each sport we can predefine a small number of semantic shot classes, about 5~10, which cover 90~95% of sports broadcasting video. With the supervised learning method, we can map the low-level features to middle-level semantic video shot attributes such as dominant object motion (a player), camera motion patterns, and court shape. On the basis of the appropriate fusion of those middle-level shot attributes, we classify video shots into the predefined video shot classes, each of which has a clear semantic meaning. The proposed method has been tested over 4 types of sports videos: tennis, basketball, volleyball and soccer. Good classification accuracy of 85~95% has been achieved. With correctly classified sports video shots, further structural and temporal analysis, such as event detection, video skimming, and table-of-contents generation, will be greatly facilitated.
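The fusion of middle-level attributes into semantic shot classes can be illustrated with a minimal rule-based sketch for tennis. The attribute names, thresholds, and rules below are invented stand-ins for the learned mappings the paper describes.

```python
# Illustrative fusion of middle-level attributes into semantic shot classes
# for tennis; attributes and rules are hypothetical, not the paper's model.
def classify_tennis_shot(court_visible, camera_motion, dominant_object_size):
    """Map middle-level shot attributes to a predefined semantic class."""
    if court_visible and camera_motion == "static":
        return "court-view"   # full-court rally shot
    if dominant_object_size > 0.3:
        return "close-up"     # player fills a large fraction of the frame
    if camera_motion == "pan":
        return "audience"     # crowd / panning shot
    return "other"

label = classify_tennis_shot(court_visible=True,
                             camera_motion="static",
                             dominant_object_size=0.05)
```

In the paper's framework the attribute values themselves come from supervised learning over low-level features; only the final fusion step is caricatured here.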

  3. Academic Research Library as Broker in Addressing Interoperability Challenges for the Geosciences

    NASA Astrophysics Data System (ADS)

    Smith, P., II

    2015-12-01

    Data capture is an important process in the research lifecycle. Complete descriptive and representative information about the data or database is necessary during data collection, whether in the field or in the research lab. The National Science Foundation's (NSF) Public Access Plan (2015) mandates that federally funded projects make their research data more openly available. Developing, implementing, and integrating metadata workflows into the research process of the data lifecycle facilitates improved data access while also addressing interoperability challenges for the geosciences such as data description and representation. Lack of metadata or data curation can contribute to (1) semantic, (2) ontology, and (3) data integration issues within and across disciplinary domains and projects. Some researchers of EarthCube funded projects have identified these issues as gaps. These gaps can contribute to data access, discovery, and integration issues between domain-specific and general data repositories. Academic research libraries have expertise in providing long-term discovery and access through the use of metadata standards and provision of access to research data, datasets, and publications via institutional repositories. Metadata crosswalks, open archival information systems (OAIS), trusted repositories, the Data Seal of Approval, persistent URLs, and the linking of data, objects, resources, and publications in institutional repositories and digital content management systems are common components in the library discipline. These components contribute to a library perspective on data access and discovery that can benefit the geosciences. The USGS Community for Data Integration (CDI) has developed the Science Support Framework (SSF) for data management and integration within its community of practice for contribution to improved understanding of the Earth's physical and biological systems. 
The USGS CDI SSF can be used as a reference model to map to Earth
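The metadata-crosswalk component mentioned above can be illustrated with a minimal sketch. The field names and the mapping are invented examples, not an authoritative Dublin Core-to-ISO 19115 crosswalk.

```python
# A minimal metadata crosswalk sketch: renaming Dublin Core-style fields to
# ISO 19115-style field names. The mapping table is illustrative only.
DC_TO_ISO19115 = {
    "title": "citation.title",
    "creator": "citedResponsibleParty",
    "description": "abstract",
    "coverage": "extent",
}

def crosswalk(record, mapping):
    """Rename the keys of a metadata record according to a crosswalk table;
    unmapped keys pass through unchanged."""
    return {mapping.get(k, k): v for k, v in record.items()}

dc = {"title": "Borehole logs, 2014", "description": "Gamma-ray logs"}
iso = crosswalk(dc, DC_TO_ISO19115)
```

Repositories apply tables like this at ingest so that one stored record can be exposed under multiple metadata standards for harvesting.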

  4. Moving Controlled Vocabularies into the Semantic Web

    NASA Astrophysics Data System (ADS)

    Thomas, R.; Lowry, R. K.; Kokkinaki, A.

    2015-12-01

Having placed Linked Data tooling over a single SPARQL endpoint, the obvious future development for this system is to support semantic interoperability outside NVS by incorporating federated SPARQL endpoints in the USA and Australia during the ODIP II project. 1https://vocab.nerc.ac.uk/sparql 2 https://www.bodc.ac.uk/data/codes_and_formats/vocabulary_search/

  5. Semantic similarity measure in biomedical domain leverage web search engine.

    PubMed

    Chen, Chi-Huang; Hsieh, Sheau-Ling; Weng, Yung-Ching; Chang, Wen-Yung; Lai, Feipei

    2010-01-01

    Semantic similarity measurement plays an essential role in Information Retrieval and Natural Language Processing. In this paper we propose a page-count-based semantic similarity measure and apply it in biomedical domains. Previous research on semantic-web-related applications has deployed various semantic similarity measures. Despite the usefulness of these measures in those applications, measuring the semantic similarity between two terms remains a challenging task. The proposed method exploits page counts returned by a Web search engine. We define various similarity scores for two given terms P and Q, using the page counts for querying P, Q, and P AND Q. Moreover, we propose a novel approach to compute semantic similarity using lexico-syntactic patterns with page counts. These different similarity scores are integrated by adapting support vector machines, to leverage the robustness of semantic similarity measures. Experimental results achieve correlation coefficients of 0.798 on the dataset provided by A. Hliaoutakis, 0.705 on the dataset provided by T. Pedersen et al. with physician scores, and 0.496 on the dataset provided by T. Pedersen et al. with expert scores.
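One standard page-count score of the kind combined above is WebJaccard, sketched below as an illustration of the general approach; the paper's exact score definitions and SVM combination are not reproduced here, and the hit counts are made up.

```python
# WebJaccard: a page-count-based similarity score for two terms P and Q,
# computed from search-engine hit counts for P, Q, and "P AND Q".
def web_jaccard(count_p, count_q, count_pq, threshold=5):
    """Jaccard-style overlap of page counts; tiny co-occurrence counts
    are treated as noise and mapped to zero."""
    if count_pq < threshold:
        return 0.0
    return count_pq / (count_p + count_q - count_pq)

# Hypothetical hit counts for two biomedical terms.
score = web_jaccard(count_p=120_000, count_q=80_000, count_pq=40_000)
```

Scores like this (along with coefficient-, dice-, and PMI-style variants) become features that the SVM weighs against the lexico-syntactic pattern evidence.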

  6. Context-Aware Adaptive Hybrid Semantic Relatedness in Biomedical Science

    NASA Astrophysics Data System (ADS)

    Emadzadeh, Ehsan

    Text mining of biomedical literature and clinical notes is a very active field of research in biomedical science. Semantic analysis is one of the core modules for different Natural Language Processing (NLP) solutions. Methods for calculating the semantic relatedness of two concepts can be very useful in solutions to different problems such as relationship extraction, ontology creation and question answering [1--6]. Several techniques exist for calculating the semantic relatedness of two concepts. These techniques utilize different knowledge sources and corpora. So far, researchers have attempted to find the best hybrid method for each domain by combining semantic relatedness techniques and data sources manually. This work aims to eliminate the need to manually combine semantic relatedness methods for each new context or resource by proposing an automated method that finds the combination of semantic relatedness techniques and resources achieving the best semantic relatedness score in every context. This may help the research community find the best hybrid method for each context given the available algorithms and resources.
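The automated-combination idea can be sketched as a small search over subsets of candidate measures, keeping the subset whose averaged scores correlate best with human judgments on a benchmark for the target context. The benchmark values and measure names are invented, and real systems would use richer combination schemes than a plain average.

```python
import numpy as np
from itertools import combinations

# Hypothetical benchmark: human relatedness scores for five term pairs,
# plus the scores of three candidate measures on the same pairs.
human = np.array([0.9, 0.1, 0.7, 0.3, 0.5])
measures = {
    "path":    np.array([0.8, 0.3, 0.6, 0.4, 0.5]),
    "context": np.array([0.7, 0.2, 0.8, 0.2, 0.6]),
    "corpus":  np.array([0.2, 0.9, 0.1, 0.8, 0.4]),  # poorly suited here
}

def best_combination(human, measures):
    """Average every non-empty subset of measures and keep the one whose
    Pearson correlation with the human scores is highest."""
    best, best_r = None, -2.0
    names = list(measures)
    for k in range(1, len(names) + 1):
        for subset in combinations(names, k):
            combo = np.mean([measures[n] for n in subset], axis=0)
            r = np.corrcoef(combo, human)[0, 1]
            if r > best_r:
                best, best_r = subset, r
    return best, best_r

subset, r = best_combination(human, measures)
```

On this toy benchmark the search correctly excludes the ill-suited corpus measure; with more candidates the subset enumeration would be replaced by a learned weighting.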

  7. GEO Standard and Interoperability Forum (SIF) European Team

    NASA Astrophysics Data System (ADS)

    Nativi, Stefano

    2010-05-01

    The European GEO SIF has been initiated by the GIGAS project in an effort to better coordinate European requirements for GEO and GEOSS related activities, and is recognised by GEO as a regional SIF. To help advance the interoperability goals of the Global Earth Observing System of Systems (GEOSS), the Group on Earth Observations (GEO) Architecture and Data Committee (ADC) has established a Standards and Interoperability Forum (SIF) to support GEO organizations offering components and services to GEOSS. The SIF will help GEOSS contributors understand how to work with the GEOSS interoperability guidelines and how to enter their "interoperability arrangements" (standards or other ad hoc arrangements for interoperability) into the GEOSS registries. This will greatly facilitate the utility of GEOSS and encourage a significant increase in participation. To carry out its work most effectively, the SIF promotes the formation of Regional Teams. These will help organize and optimize the support coming from different parts of the world and reach out to regional and multi-disciplinary scientific communities, allowing true global representation in supporting GEOSS interoperability. A SIF European Team is foreseen. The main role of the SIF is facilitating interoperability and working with members and participating organizations as they offer data and information services to the users of GEOSS. In this framework, the purpose of having a European Regional Team is to increase efficiency in carrying out the work of the SIF. Experts can join the SIF European Team by registering at the SIF European Team wiki site: http://www.thegigasforum.eu/sif/

  8. A Conceptual Framework to Enhance the Interoperability of Observatories among Countries, Continents and the World

    NASA Astrophysics Data System (ADS)

    Loescher, H.; Fundamental Instrument Unit

    2013-05-01

    , GEO-BON, NutNet, etc.) and domestically, (e.g., NSF-CZO, USDA-LTAR, DOE-NGEE, Soil Carbon Network, etc.), there is a strong and mutual desire to assure interoperability of data. Interoperability can be defined as the degree to which each of the following is mapped between observatories (entities): i) science requirements linked with science questions, ii) traceability of measurements to nationally and internationally accepted standards, iii) how data products are derived, i.e., algorithms, procedures, and methods, and iv) the bioinformatics, which broadly includes data formats, metadata, controlled vocabularies, and semantics. Here, we explore the rationale and focus areas for interoperability, the governance and work structures, example projects (NSF-NEON, EU-ICOS, and AU-TERN), and the emergent roles of scientists in these endeavors.

  9. Semantic Web integration of Cheminformatics resources with the SADI framework

    PubMed Central

    2011-01-01

    Background The diversity and the largely independent nature of chemical research efforts over the past half century are, most likely, the major contributors to the current poor state of chemical computational resource and database interoperability. While open software for chemical format interconversion and database entry cross-linking has partially addressed database interoperability, computational resource integration is hindered by the great diversity of software interfaces, languages, access methods, and platforms, among others. This has, in turn, translated into limited reproducibility of computational experiments and the need for application-specific computational workflow construction and semi-automated enactment by human experts, especially where emerging interdisciplinary fields, such as systems chemistry, are pursued. Fortunately, the advent of the Semantic Web, and the very recent introduction of RESTful Semantic Web Services (SWS), may present an opportunity to integrate all of the existing computational and database resources in chemistry into a machine-understandable, unified system that draws on the entirety of the Semantic Web. Results We have created a prototype framework of Semantic Automated Discovery and Integration (SADI) SWS that exposes the QSAR descriptor functionality of the Chemistry Development Kit. Since each of these services has formal ontology-defined input and output classes, and each service consumes and produces RDF graphs, clients can automatically reason about the services and available reference information necessary to complete a given overall computational task specified through a simple SPARQL query. We demonstrate this capability by carrying out QSAR analysis backed by a simple formal ontology to determine whether a given molecule is drug-like. Further, we discuss parameter-based control over the execution of SADI SWS. 
Finally, we demonstrate the value of computational resource envelopment as SADI services through
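The drug-likeness determination in the paper is backed by a formal ontology and SADI services over CDK descriptors; as a plain-Python stand-in for the underlying test, here is the classic Lipinski "rule of five" applied to precomputed descriptor values. The thresholds are Lipinski's well-known ones; the descriptor dictionary and tolerance of one violation are illustrative choices.

```python
# Lipinski "rule of five" sketch over precomputed molecular descriptors
# (a stand-in for the ontology-backed check described in the paper).
def is_drug_like(desc):
    violations = sum([
        desc["mol_weight"] > 500,   # molecular weight <= 500 Da
        desc["logp"] > 5,           # octanol-water logP <= 5
        desc["h_donors"] > 5,       # <= 5 hydrogen-bond donors
        desc["h_acceptors"] > 10,   # <= 10 hydrogen-bond acceptors
    ])
    return violations <= 1          # tolerate at most one violation

# Illustrative descriptor values roughly matching aspirin.
aspirin = {"mol_weight": 180.2, "logp": 1.2, "h_donors": 1, "h_acceptors": 4}
ok = is_drug_like(aspirin)
```

In the SADI setting the descriptor values would arrive as RDF from the CDK-backed services, and the rule itself would be expressed against the ontology's classes rather than hard-coded.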

  10. Interoperable atlases of the human brain.

    PubMed

    Amunts, K; Hawrylycz, M J; Van Essen, D C; Van Horn, J D; Harel, N; Poline, J-B; De Martino, F; Bjaalie, J G; Dehaene-Lambertz, G; Dehaene, S; Valdes-Sosa, P; Thirion, B; Zilles, K; Hill, S L; Abrams, M B; Tass, P A; Vanduffel, W; Evans, A C; Eickhoff, S B

    2014-10-01

    The last two decades have seen an unprecedented development of human brain mapping approaches at various spatial and temporal scales. Together, these have provided a large fundus of information on many different aspects of the human brain including micro- and macrostructural segregation, regional specialization of function, connectivity, and temporal dynamics. Atlases are central in order to integrate such diverse information in a topographically meaningful way. It is noteworthy that the brain mapping field has developed along several major lines such as structure vs. function, postmortem vs. in vivo, individual features of the brain vs. population-based aspects, or slow vs. fast dynamics. In order to understand human brain organization, however, it seems inevitable that these different lines are integrated and combined into a multimodal human brain model. To this end, we held a workshop to determine the constraints of a multi-modal human brain model that are needed to enable (i) an integration of different spatial and temporal scales and data modalities into a common reference system, and (ii) efficient data exchange and analysis. As detailed in this report, arriving at fully interoperable atlases of the human brain will still require much work at the frontiers of data acquisition, analysis, and representation. Among these, the latter may provide the most challenging task, in particular when it comes to representing features of vastly different scales of space, time and abstraction. The potential benefits of such an endeavor, however, clearly outweigh the problems, as only such a multi-modal human brain atlas may provide a starting point from which the complex relationships between structure, function, and connectivity may be explored.

  11. Advances in Multi-disciplinary Interoperability

    NASA Astrophysics Data System (ADS)

    Pearlman, J.; Nativi, S.; Craglia, M.; Huerta, J.; Rubio-Iglesias, J. M.; Serrano, J. J.

    2012-04-01

    The challenge in addressing issues such as climate change, food security or ecosystem sustainability is that they require multi-disciplinary collaboration and the ability to integrate information across scientific domains. Multidisciplinary collaborations are difficult because each discipline has its own "language", protocols and formats for communicating within its community and handling data and information. EuroGEOSS demonstrates the added value to the scientific community and to society of making existing systems and applications interoperable and useful within the GEOSS and INSPIRE frameworks. In 2010, the project built an initial operating capacity of a multi-disciplinary Information System addressing three areas: drought, forestry and biodiversity. It is now furthering this development into an advanced operating capacity (http://www.eurogeoss.eu). The key to this capability is the creation of a broker that supports access to multiple resources through a common user interface and the automation of data search and access using state-of-the-art information technology. EuroGEOSS hosted a conference on information systems and multi-disciplinary applications of science and technology. "EuroGEOSS: advancing the vision of GEOSS" provided a forum for developers, users and decision-makers working with advanced multi-disciplinary information systems to improve science and decisions for complex societal issues. In particular, the Conference addressed: information systems for supporting multi-disciplinary research; information systems and modeling for biodiversity, drought, forestry and related societal benefit areas; and case studies of multi-disciplinary applications and outcomes. This paper will discuss the major findings of the conference and the directions for future development.

  12. From Data to Semantic Information

    NASA Astrophysics Data System (ADS)

    Floridi, Luciano

    2003-06-01

    There is no consensus yet on the definition of semantic information. This paper contributes to the current debate by criticising and revising the Standard Definition of semantic Information (SDI) as meaningful data, in favour of the Dretske-Grice approach: meaningful and well-formed data constitute semantic information only if they also qualify as contingently truthful. After a brief introduction, SDI is criticised for providing necessary but insufficient conditions for the definition of semantic information. SDI is incorrect because truth-values do not supervene on semantic information, and misinformation (that is, false semantic information) is not a type of semantic information, but pseudo-information, that is not semantic information at all. This is shown by arguing that none of the reasons for interpreting misinformation as a type of semantic information is convincing, whilst there are compelling reasons to treat it as pseudo-information. As a consequence, SDI is revised to include a necessary truth-condition. The last section summarises the main results of the paper and indicates the important implications of the revised definition for the analysis of the deflationary theories of truth, the standard definition of knowledge and the classic, quantitative theory of semantic information.

  13. Latent Semantic Indexing of medical diagnoses using UMLS semantic structures.

    PubMed Central

    Chute, C. G.; Yang, Y.; Evans, D. A.

    1991-01-01

    The relational files within the UMLS Metathesaurus contain rich semantic associations to main concepts. We invoked the technique of Latent Semantic Indexing to generate information matrices based on these relationships and created "semantic vectors" using singular value decomposition. Evaluations were made on the complete set and subsets of Metathesaurus main concepts with the semantic type "Disease or Syndrome." Real number matrices were created with main concepts, lexical variants, synonyms, and associated expressions. Ancestors, children, siblings, and related terms were added to alternative matrices, preserving the hierarchical direction of the relation as the imaginary component of a complex number. Preliminary evaluation suggests that this technique is robust. A major advantage is the exploitation of semantic features which derive from a statistical decomposition of UMLS structures, possibly reducing dependence on the tedious construction of semantic frames by humans. PMID:1807584
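
    The "semantic vectors" via singular value decomposition described in this record can be sketched in a few lines. The matrix below is purely illustrative (the actual matrices were built from UMLS Metathesaurus relational files); this is a minimal Latent Semantic Indexing sketch, not the authors' implementation.

```python
import numpy as np

# Illustrative term-by-concept matrix (values hypothetical): rows are
# lexical terms, columns are Metathesaurus main concepts.
A = np.array([
    [2.0, 0.0, 1.0],
    [1.0, 1.0, 0.0],
    [0.0, 2.0, 1.0],
    [1.0, 0.0, 2.0],
])

# Latent Semantic Indexing: truncate the SVD to the k largest singular
# values to obtain low-dimensional "semantic vectors".
k = 2
U, s, Vt = np.linalg.svd(A, full_matrices=False)
term_vectors = U[:, :k] * s[:k]        # one k-dim vector per term
concept_vectors = Vt[:k, :].T * s[:k]  # one k-dim vector per concept

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Concepts are compared in the reduced space rather than by raw counts.
similarity = cosine(concept_vectors[0], concept_vectors[2])
```

    The paper additionally encoded the hierarchical direction of a relation as the imaginary component of a complex number, which NumPy supports directly via complex-valued arrays.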

  14. Restructuring an EHR system and the Medical Markup Language (MML) standard to improve interoperability by archetype technology.

    PubMed

    Kobayashi, Shinji; Kume, Naoto; Yoshihara, Hiroyuki

    2015-01-01

    In 2001, we developed an EHR system for regional healthcare information inter-exchange and to provide individual patient data to patients. This system was adopted in three regions in Japan. We also developed a Medical Markup Language (MML) standard for inter- and intra-hospital communications. The system was built on a legacy platform, however, and had not been appropriately maintained or updated to meet clinical requirements. To reduce future maintenance costs, we reconstructed the EHR system using archetype technology on the Ruby on Rails platform, and generated MML equivalent forms from archetypes. The system was deployed as a cloud-based system for preliminary use as a regional EHR. The system now has the capability to catch up with new requirements, maintaining semantic interoperability with archetype technology. It is also more flexible than the legacy EHR system.

  15. Model-based semantic dictionaries for medical language understanding.

    PubMed Central

    Rassinoux, A. M.; Baud, R. H.; Ruch, P.; Trombert-Paviot, B.; Rodrigues, J. M.

    1999-01-01

    Semantic dictionaries are emerging as a major cornerstone towards achieving sound natural language understanding. Indeed, they constitute the main bridge between words and conceptual entities that reflect their meanings. Nowadays, more and more wide-coverage lexical dictionaries are electronically available in the public domain. However, associating a semantic content with lexical entries is not a straightforward task as it is subordinate to the existence of a fine-grained concept model of the treated domain. This paper presents the benefits and pitfalls in building and maintaining multilingual dictionaries, the semantics of which is directly established on an existing concept model. Concrete cases, handled through the GALEN-IN-USE project, illustrate the use of such semantic dictionaries for the analysis and generation of multilingual surgical procedures. PMID:10566333

  16. Complex Semantic Networks

    NASA Astrophysics Data System (ADS)

    Teixeira, G. M.; Aguiar, M. S. F.; Carvalho, C. F.; Dantas, D. R.; Cunha, M. V.; Morais, J. H. M.; Pereira, H. B. B.; Miranda, J. G. V.

    Verbal language is a dynamic mental process. Ideas emerge by means of the selection of words from subjective and individual characteristics throughout the oral discourse. The goal of this work is to characterize the complex network of word associations that emerges from an oral discourse on a given topic. To this end, the concepts of associative incidence and fidelity were elaborated; they represent the probability of occurrence of pairs of words in the same sentence over the whole oral discourse. Semantic networks of word associations were constructed, where words are represented as nodes and edges are created when the incidence-fidelity index between a pair of words exceeds a numerical limit (0.001). Twelve oral discourses were studied. The networks generated from these oral discourses exhibit typical complex-network behavior; their indices were calculated and their topologies characterized. The indices of these networks, obtained for each incidence-fidelity limit, exhibit a critical value at which the semantic network has maximum conceptual information and minimum residual associations. Semantic networks generated at this incidence-fidelity limit depict a pattern of hierarchical classes that represent the different contexts used in the oral discourse.
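
    The network construction this record describes can be sketched as follows. The abstract does not give the exact incidence-fidelity formula, so a plain sentence-level co-occurrence probability stands in for it here; the sentences are toy data.

```python
from collections import Counter
from itertools import combinations

# Toy "oral discourse" split into sentences (data is illustrative).
sentences = [
    ["language", "is", "a", "dynamic", "process"],
    ["ideas", "emerge", "from", "words", "in", "discourse"],
    ["words", "form", "a", "network", "in", "discourse"],
]

# Count how many sentences contain each (unordered) word pair.
pair_counts = Counter()
for sentence in sentences:
    for a, b in combinations(sorted(set(sentence)), 2):
        pair_counts[(a, b)] += 1

# Simplified stand-in for the incidence-fidelity index: the probability
# that a pair occurs in the same sentence over the whole discourse.
limit = 0.001  # the numerical limit used in the paper
edges = {pair for pair, count in pair_counts.items()
         if count / len(sentences) > limit}
```

    Sweeping `limit` upward and recomputing the network's indices at each value is the kind of procedure the paper uses to locate the critical incidence-fidelity limit.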

  17. Attention trees and semantic paths

    NASA Astrophysics Data System (ADS)

    Giusti, Christian; Pieroni, Goffredo G.; Pieroni, Laura

    2007-02-01

    In the last few decades several techniques for image content extraction, often based on segmentation, have been proposed. It has been suggested that under the assumption of very general image content, segmentation becomes unstable and classification becomes unreliable. According to recent psychological theories, certain image regions attract the attention of human observers more than others and, generally, the image main meaning appears concentrated in those regions. Initially, regions attracting our attention are perceived as a whole and hypotheses on their content are formulated; successively the components of those regions are carefully analyzed and a more precise interpretation is reached. It is interesting to observe that an image decomposition process performed according to these psychological visual attention theories might present advantages with respect to a traditional segmentation approach. In this paper we propose an automatic procedure generating image decomposition based on the detection of visual attention regions. A new clustering algorithm taking advantage of the Delaunay-Voronoi diagrams for achieving the decomposition target is proposed. By applying that algorithm recursively, starting from the whole image, a transformation of the image into a tree of related meaningful regions is obtained (Attention Tree). Successively, a semantic interpretation of the leaf nodes is carried out by using a structure of Neural Networks (Neural Tree) assisted by a knowledge base (Ontology Net). Starting from leaf nodes, paths toward the root node across the Attention Tree are attempted. The task of the path consists in relating the semantics of each child-parent node pair and, consequently, in merging the corresponding image regions. The relationship detected in this way between two tree nodes generates, as a result, the extension of the interpreted image area through each step of the path.
The construction of several Attention Trees has been performed and partial

  18. Data Access, Discovery and Interoperability in the European Context

    NASA Astrophysics Data System (ADS)

    Genova, Francoise

    2015-12-01

    European Virtual Observatory (VO) activities have been coordinated by a series of projects funded by the European Commission. Three pillars were identified: support to data providers for implementing their data in the VO framework; support to the astronomical community in using VO-enabled data and tools; and technological work for updating the VO framework of interoperability standards and tools. A new phase is beginning with the ASTERICS cluster project. The ASTERICS Work Package "Data Access, Discovery and Interoperability" aims at making the data from the ESFRI projects and their pathfinders available for discovery and usage, interoperable in the VO framework and accessible with VO-enabled common tools. VO teams and representatives of ESFRI and pathfinder projects and of EGO/VIRGO are engaged together in the Work Package. ESO is associated with the project, which is also working closely with ESA. The three pillars identified for coordinating European VO activities are all tackled.

  19. Metadata behind the Interoperability of Wireless Sensor Networks

    PubMed Central

    Ballari, Daniela; Wachowicz, Monica; Callejo, Miguel Angel Manso

    2009-01-01

    Wireless Sensor Networks (WSNs) produce changes of status that are frequent, dynamic and unpredictable, and cannot be represented using a linear cause-effect approach. Consequently, a new approach is needed to handle these changes in order to support dynamic interoperability. Our approach is to introduce the notion of context as an explicit representation of changes of a WSN status inferred from metadata elements, which, in turn, leads towards a decision-making process about how to maintain dynamic interoperability. This paper describes the developed context model to represent and reason over different WSN status based on four types of contexts, which have been identified as sensing, node, network and organisational contexts. The reasoning has been addressed by developing contextualising and bridging rules. As a result, we were able to demonstrate how contextualising rules have been used to reason on changes of WSN status as a first step towards maintaining dynamic interoperability. PMID:22412330

  20. Metadata behind the Interoperability of Wireless Sensor Networks.

    PubMed

    Ballari, Daniela; Wachowicz, Monica; Callejo, Miguel Angel Manso

    2009-01-01

    Wireless Sensor Networks (WSNs) produce changes of status that are frequent, dynamic and unpredictable, and cannot be represented using a linear cause-effect approach. Consequently, a new approach is needed to handle these changes in order to support dynamic interoperability. Our approach is to introduce the notion of context as an explicit representation of changes of a WSN status inferred from metadata elements, which, in turn, leads towards a decision-making process about how to maintain dynamic interoperability. This paper describes the developed context model to represent and reason over different WSN status based on four types of contexts, which have been identified as sensing, node, network and organisational contexts. The reasoning has been addressed by developing contextualising and bridging rules. As a result, we were able to demonstrate how contextualising rules have been used to reason on changes of WSN status as a first step towards maintaining dynamic interoperability.
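
    The rule structure this record describes — contextualising rules that infer a context from metadata, and rules that bridge from a context to an interoperability decision — can be sketched as below. All field names, thresholds and decisions are hypothetical, not taken from the paper.

```python
def node_context(metadata):
    """Contextualising rule: infer a node's context from metadata
    elements (hypothetical fields and thresholds)."""
    if metadata.get("seconds_since_report", 0) > 300:
        return "unreachable"
    if metadata.get("battery_level", 1.0) < 0.2:
        return "low-power"
    return "operational"

# Bridging rule: map an inferred context to an interoperability decision.
BRIDGE_RULES = {
    "operational": "use node directly",
    "low-power": "reduce sampling rate",
    "unreachable": "route around node",
}

def decide(metadata):
    return BRIDGE_RULES[node_context(metadata)]
```

    The paper's actual model reasons over four richer context types (sensing, node, network, organisational); the point of the sketch is only the two-stage contextualise-then-bridge structure.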

  1. Interoperable and standard e-Health solution over Bluetooth.

    PubMed

    Martinez, I; Del Valle, P; Munoz, P; Trigo, J D; Escayola, J; Martínez-Espronceda, M; Muñoz, A; Serrano, L; Garcia, J

    2010-01-01

    The new paradigm of e-Health demands open sensors and middleware components that permit transparent integration and end-to-end interoperability of new personal health devices. The use of standards seems to be the internationally adopted way to solve these problems. This paper presents the implementation of an end-to-end standards-based e-Health solution. This includes ISO/IEEE11073 standard for the interoperability of the medical devices in the patient environment and EN13606 standard for the interoperable exchange of the Electronic Healthcare Record. The design strictly fulfills all the technical features of the most recent versions of both standards. The implemented prototype has been tested in a laboratory environment to demonstrate its feasibility for its further transfer to the healthcare system.

  2. Operational Interoperability Challenges on the Example of GEOSS and WIS

    NASA Astrophysics Data System (ADS)

    Heene, M.; Buesselberg, T.; Schroeder, D.; Brotzer, A.; Nativi, S.

    2015-12-01

    The following poster highlights the operational interoperability challenges using the example of the Global Earth Observation System of Systems (GEOSS) and the World Meteorological Organization Information System (WIS). At the heart of both systems is a catalogue of earth observation data, products and services, but with different metadata management concepts. While WIS has strong governance, with its own metadata profile for hundreds of thousands of metadata records, GEOSS adopted a more open approach for its ten million records. Furthermore, the development of WIS - as an operational system - follows a roadmap with committed backwards compatibility, while the GEOSS development process is more agile. The poster discusses how interoperability can be reached across these different metadata management concepts and how a proxy concept helps to couple two systems that follow different development methodologies. Furthermore, the poster highlights the importance of monitoring and backup concepts as a verification method for operational interoperability.

  3. OGC and Grid Interoperability in enviroGRIDS Project

    NASA Astrophysics Data System (ADS)

    Gorgan, Dorian; Rodila, Denisa; Bacu, Victor; Giuliani, Gregory; Ray, Nicolas

    2010-05-01

    EnviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is a 4-year FP7 project aiming to address the subjects of ecologically unsustainable development and inadequate resource management. The project develops a Spatial Data Infrastructure of the Black Sea Catchment region. The geospatial technologies offer very specialized functionality for Earth Science oriented applications, as does the Grid oriented technology that is able to support distributed and parallel processing. One challenge of the enviroGRIDS project is the interoperability between geospatial and Grid infrastructures by providing the basic and the extended features of both technologies. The geospatial interoperability technology has been promoted as a way of dealing with large volumes of geospatial data in distributed environments through the development of interoperable Web service specifications proposed by the Open Geospatial Consortium (OGC), with applications spread across multiple fields but especially in Earth observation research. Due to the huge volumes of data available in the geospatial domain and the additional issues they introduce (data management, secure data transfer, data distribution and data computation), the need for an infrastructure capable of managing all those problems becomes an important aspect. The Grid promotes and facilitates the secure interoperation of geospatial heterogeneous distributed data within a distributed environment, the creation and management of large distributed computational jobs, and assures a security level for communication and transfer of messages based on certificates. This presentation analyses and discusses the most significant use cases for enabling the OGC Web services interoperability with the Grid environment and focuses on the description and implementation of the most promising one.
In these use cases we give a special attention to issues such as: the relations between computational grid and

  4. OMOGENIA: A Semantically Driven Collaborative Environment

    NASA Astrophysics Data System (ADS)

    Liapis, Aggelos

    Ontology creation can be thought of as a social procedure. Indeed the concepts involved in general need to be elicited from communities of domain experts and end-users by teams of knowledge engineers. Many problems in ontology creation appear to resemble certain problems in software design, particularly with respect to the setup of collaborative systems. For instance, the resolution of conceptual conflicts between formalized ontologies is a major engineering problem as ontologies move into widespread use on the semantic web. Such conflict resolution often requires human collaboration and cannot be achieved by automated methods with the exception of simple cases. In this chapter we discuss research in the field of computer-supported cooperative work (CSCW) that focuses on classification and which throws light on ontology building. Furthermore, we present a semantically driven collaborative environment called OMOGENIA as a natural way to display and examine the structure of an evolving ontology in a collaborative setting.

  5. Metaworkflows and Workflow Interoperability for Heliophysics

    NASA Astrophysics Data System (ADS)

    Pierantoni, Gabriele; Carley, Eoin P.

    2014-06-01

    Heliophysics is a relatively new branch of physics that investigates the relationship between the Sun and the other bodies of the solar system. To investigate such relationships, heliophysicists can rely on various tools developed by the community. Some of these tools are on-line catalogues that list events (such as Coronal Mass Ejections, CMEs) and their characteristics as they were observed on the surface of the Sun or on the other bodies of the Solar System. Other tools offer on-line data analysis and access to images and data catalogues. During their research, heliophysicists often perform investigations that need to coordinate several of these services and to repeat these complex operations until the phenomena under investigation are fully analyzed. Heliophysicists combine the results of these services; this service orchestration is best suited for workflows. This approach has been investigated in the HELIO project. The HELIO project developed an infrastructure for a Virtual Observatory for Heliophysics and implemented service orchestration using TAVERNA workflows. HELIO developed a set of workflows that proved to be useful but lacked flexibility and re-usability. The TAVERNA workflows also needed to be executed directly in the TAVERNA workbench, and this forced all users to learn how to use the workbench. Within the SCI-BUS and ER-FLOW projects, we have started an effort to re-think and re-design the heliophysics workflows with the aim of fostering re-usability and ease of use. We base our approach on two key concepts, that of meta-workflows and that of workflow interoperability. We have divided the produced workflows into three different layers. The first layer is Basic Workflows, developed both in the TAVERNA and WS-PGRADE languages. They are building blocks that users compose to address their scientific challenges. They implement well-defined Use Cases that usually involve only one service. The second layer is Science Workflows usually developed in TAVERNA.
They

  6. 77 FR 28387 - Federal Advisory Committee Act; Communications Security, Reliability, and Interoperability Council

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-14

    ... COMMISSION Federal Advisory Committee Act; Communications Security, Reliability, and Interoperability Council... Communications Commission's (FCC) Communications Security, Reliability, and Interoperability Council (CSRIC) will... emergency alerting systems such as promoting E9-1-1 reliability and alerting platforms--Emergency...

  7. 76 FR 4102 - Smart Grid Interoperability Standards; Supplemental Notice of Technical Conference

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-24

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Smart Grid Interoperability Standards; Supplemental Notice of Technical... Technical Conference on Smart Grid Interoperability Standards will be held on Monday, January 31,...

  8. 76 FR 72922 - Federal Advisory Committee Act; Communications Security, Reliability, and Interoperability Council

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-28

    ... location accuracy, and network security. The FCC will attempt to accommodate as many attendees as possible... COMMISSION Federal Advisory Committee Act; Communications Security, Reliability, and Interoperability Council... Communications Commission's (FCC) third Communications Security, Reliability, and Interoperability Council...

  9. 78 FR 10169 - Federal Advisory Committee Act; Communications Security, Reliability, and Interoperability Council

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-13

    ... COMMISSION Federal Advisory Committee Act; Communications Security, Reliability, and Interoperability Council... Communications Commission's (FCC) Communications Security, Reliability, and Interoperability Council (CSRIC) will hold its final meeting. Working groups Next Generation Alerting, E9-1-1 Location Accuracy,...

  10. Development of high performance scientific components for interoperability of computing packages

    SciTech Connect

    Gulabani, Teena Pratap

    2008-01-01

    Three major high performance quantum chemistry computational packages, NWChem, GAMESS and MPQC, have been developed by different research efforts following different design patterns. The goal is to achieve interoperability among these packages by overcoming the challenges caused by the different communication patterns and software designs of these packages. Chemistry algorithms are hard and time-consuming to develop; integrating large quantum chemistry packages therefore allows resource sharing and avoids reinventing the wheel. Creating connections between these incompatible packages is the major motivation of the proposed work. This interoperability is achieved by bringing the benefits of Component Based Software Engineering through a plug-and-play component framework called the Common Component Architecture (CCA). In this thesis, I present a strategy and process used for interfacing two widely used and important computational chemistry methodologies: Quantum Mechanics and Molecular Mechanics. To show the feasibility of the proposed approach, the Tuning and Analysis Utility (TAU) has been coupled with the NWChem code and its CCA components. Results show that the overhead is negligible when compared to the ease and potential of organizing and coping with large-scale software applications.

  11. Interoperable Archetypes With a Three Folded Terminology Governance.

    PubMed

    Pederson, Rune; Ellingsen, Gunnar

    2015-01-01

    The use of openEHR archetypes increases the interoperability of clinical terminology and, in doing so, improves the availability of clinical terminology for both primary and secondary purposes. Where clinical terminology is employed in EPR systems, research reports conflicting results on the use of structuring and standardization as measures of success. To elucidate this, the paper focuses on the effort to establish a national repository for openEHR-based archetypes in Norway, where clinical terminology could be included under a three-folded terminology governance, to the benefit of interoperability.

  12. Bringing Interoperability to Warfighters Intelligent Unmanned Systems: Air, Land, and Sea

    DTIC Science & Technology

    2012-10-30

    Navy EOD robot MOCU display Unmanned surface vehicle MOCU display Interoperability across UxV domains 2005 Interoperability Demo...control van • More coordinated target acquisition/engagement • … not true interoperability Stowed ISR Payload Robotic Driver Teleoperated HMMWV...Multiple Robot Host Architecture (MRHA) command-and- control (C2) architecture robotic -security • C2 architecture to oversee multiple UGVs

  13. UGV Interoperability Profile (IOP) - Overarching Profile JAUS Profiling Rules, Version 0

    DTIC Science & Technology

    2011-12-21

    Attribute [Selectable, Multiple] ..................26 4.3.7 Leader Follower (LF) Interoperability Attribute [Selectable...27 Table 4.3-12: Components and Services for Leader Follower Interoperability...28 Table 4.3-13: Components and Services for Leader Follower Interoperability Attribute .............28 Table 4.3-14

  14. Postmarketing Safety Study Tool: A Web Based, Dynamic, and Interoperable System for Postmarketing Drug Surveillance Studies

    PubMed Central

    Sinaci, A. Anil; Laleci Erturkmen, Gokce B.; Gonul, Suat; Yuksel, Mustafa; Invernizzi, Paolo; Thakrar, Bharat; Pacaci, Anil; Cinar, H. Alper; Cicekli, Nihan Kesim

    2015-01-01

    Postmarketing drug surveillance is a crucial aspect of the clinical research activities in pharmacovigilance and pharmacoepidemiology. Successful utilization of available Electronic Health Record (EHR) data can complement and strengthen postmarketing safety studies. In terms of the secondary use of EHRs, access and analysis of patient data across different domains are a critical factor; we address this data interoperability problem between EHR systems and clinical research systems in this paper. We demonstrate that this problem can be solved at an upper level with the use of common data elements in a standardized fashion, so that clinical researchers can work with different EHR systems independently of the underlying information model. The Postmarketing Safety Study Tool lets clinical researchers extract data from different EHR systems by designing data collection set schemas through common data elements. The tool interacts with a semantic metadata registry through the IHE data element exchange profile. The Postmarketing Safety Study Tool and its supporting components have been implemented and deployed on the central data warehouse of the Lombardy region, Italy, which contains anonymized records of about 16 million patients with over 10-year longitudinal data on average. Clinical researchers at Roche validated the tool with real-life use cases. PMID:26543873

  15. Macrocognition in Teams - Macrocognition in Collaboration and Knowledge Interoperability

    DTIC Science & Technology

    2007-01-01

    tracing, automated latent semantic analysis, automated communication flow analysis, and dynamic modeling of communication data MACROCOGNITION IN...Team revises solution option if option does not meet goal. • Cognitive maps • Discourse analysis • Think aloud • Latent Semantic Analysis

  16. Interoperability in hospital information systems: a return-on-investment study comparing CPOE with and without laboratory integration.

    PubMed

    Meyer, Rodolphe; Lovis, Christian

    2011-01-01

    Despite its many advantages, using a computerized patient record is still considered as a time consuming activity for care providers. In numerous situations, time is wasted because of the lack of interoperability between systems. In this study, we aim to assess the time gains that nursing teams could achieve with a tightly integrated computerized order entry system. Using a time-motion method, we compared expected versus effective time spent managing laboratory orders for two different computerized systems: one integrated, the other not integrated. Our results tend to show that nurses will complete their task an average of five times faster than their expected performance (p<0.001). We also showed that a tightly integrated system provides a threefold speed gain for nurses compared to a non-integrated CPOE with the laboratory information system (p<0.001). We evaluated the economic benefit of this gain, therefore arguing for a strong interoperability of systems, in addition to patient safety benefits.

  17. Semantic enrichment of medical forms - semi-automated coding of ODM-elements via web services.

    PubMed

    Breil, Bernhard; Watermann, Andreas; Haas, Peter; Dziuballe, Philipp; Dugas, Martin

    2012-01-01

    Semantic interoperability is an unsolved problem which occurs while working with medical forms from different information systems or institutions. Standards like ODM or CDA assure structural homogenization, but in order to compare elements from different data models it is necessary to use semantic concepts and codes at the item level of those structures. We developed and implemented a web-based tool which enables a domain expert to perform semi-automated coding of ODM-files. For each item it is possible to query web services that return unique concept codes without leaving the context of the document. Although fully automated coding was not feasible, we implemented a dialog-based method to perform an efficient coding of all data elements in the context of the whole document. The proportion of codable items was comparable to results from previous studies.

  18. The UMLS Semantic Network and the Semantic Web.

    PubMed

    Kashyap, Vipul

    2003-01-01

    The Unified Medical Language System is an extensive source of biomedical knowledge developed and maintained by the US National Library of Medicine (NLM) and is currently used in a wide variety of biomedical applications. The Semantic Network, a component of the UMLS, is a structured description of core biomedical knowledge consisting of well-defined semantic types and relationships between them. We investigate the expressiveness of DAML+OIL, a markup language proposed for ontologies on the Semantic Web, for representing the knowledge contained in the Semantic Network. Requirements specific to the Semantic Network, such as polymorphic relationships and blocking of relationship inheritance, are discussed and approaches to represent these in DAML+OIL are presented. Finally, conclusions are presented along with a discussion of ongoing and future work.

  19. EarthCube - Earth System Bridge: Spanning Scientific Communities with Interoperable Modeling Frameworks

    NASA Astrophysics Data System (ADS)

    Peckham, S. D.; DeLuca, C.; Gochis, D. J.; Arrigo, J.; Kelbert, A.; Choi, E.; Dunlap, R.

    2014-12-01

    In order to better understand and predict environmental hazards of weather/climate, ecology and deep earth processes, geoscientists develop and use physics-based computational models. These models are used widely both in academic and federal communities. Because of the large effort required to develop and test models, there is widespread interest in component-based modeling, which promotes model reuse and simplified coupling to tackle problems that often cross discipline boundaries. In component-based modeling, the goal is to make relatively small changes to models that make it easy to reuse them as "plug-and-play" components. Sophisticated modeling frameworks exist to rapidly couple these components to create new composite models. They allow component models to exchange variables while accommodating different programming languages, computational grids, time-stepping schemes, variable names and units. Modeling frameworks have arisen in many modeling communities. CSDMS (Community Surface Dynamics Modeling System) serves the academic earth surface process dynamics community, while ESMF (Earth System Modeling Framework) serves many federal Earth system modeling projects. Others exist in both the academic and federal domains and each satisfies design criteria that are determined by the community they serve. While they may use different interface standards or semantic mediation strategies, they share fundamental similarities. The purpose of the Earth System Bridge project is to develop mechanisms for interoperability between modeling frameworks, such as the ability to share a model or service component. This project has three main goals: (1) Develop a Framework Description Language (ES-FDL) that allows modeling frameworks to be described in a standard way so that their differences and similarities can be assessed. (2) Demonstrate that if a model is augmented with a framework-agnostic Basic Model Interface (BMI), then simple, universal adapters can go from BMI to a

  20. Semantic Feature Distinctiveness and Frequency

    ERIC Educational Resources Information Center

    Lamb, Katherine M.

    2012-01-01

    Lexical access is the process in which basic components of meaning in language, the lexical entries (words), are activated. This activation is based on the organization and representational structure of the lexical entries. Semantic features of words, which are the prominent semantic characteristics of a word concept, provide important information…

  1. Semantic Tools in Information Retrieval.

    ERIC Educational Resources Information Center

    Rubinoff, Morris; Stone, Don C.

    This report discusses the problem of the meanings of words used in information retrieval systems, and shows how semantic tools can aid in the communication which takes place between indexers and searchers via index terms. After treating the differing use of semantic tools in different types of systems, two tools (classification tables and…

  2. Indexing by Latent Semantic Analysis.

    ERIC Educational Resources Information Center

    Deerwester, Scott; And Others

    1990-01-01

    Describes a new method for automatic indexing and retrieval called latent semantic indexing (LSI). Problems with matching query words with document words in term-based information retrieval systems are discussed, semantic structure is examined, singular value decomposition (SVD) is explained, and the mathematics underlying the SVD model is…
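
    As a hedged illustration of the abstract above (not code from the paper), the core of LSI can be sketched in a few lines of NumPy: factor a term-document matrix with SVD, keep the top k dimensions, and compare queries to documents in that latent space. The vocabulary and documents here are invented:

```python
import numpy as np

# Tiny term-document matrix: rows = terms, columns = documents.
terms = ["car", "auto", "engine", "flower", "petal"]
docs = np.array([
    [1, 1, 0],   # "car"
    [0, 1, 0],   # "auto"
    [1, 1, 0],   # "engine"
    [0, 0, 1],   # "flower"
    [0, 0, 1],   # "petal"
], dtype=float)

# SVD factorises the matrix; truncating to k dimensions gives the latent
# semantic space in which co-occurring terms ("car"/"auto") fall close together.
U, s, Vt = np.linalg.svd(docs, full_matrices=False)
k = 2
doc_vecs = (np.diag(s[:k]) @ Vt[:k]).T          # documents in latent space

def fold_in(query_terms):
    """Project a query's term vector into the same k-dimensional space."""
    q = np.array([1.0 if t in query_terms else 0.0 for t in terms])
    return q @ U[:, :k]

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

q = fold_in({"auto"})
sims = [cosine(q, d) for d in doc_vecs]
# Document 0 mentions "car" but never "auto"; LSI still ranks it above doc 2.
```

    This is the term-matching problem the abstract refers to: a purely lexical system would give document 0 a zero score for the query "auto", while the latent space recovers the association.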

  3. Semantic Processing of Mathematical Gestures

    ERIC Educational Resources Information Center

    Lim, Vanessa K.; Wilson, Anna J.; Hamm, Jeff P.; Phillips, Nicola; Iwabuchi, Sarina J.; Corballis, Michael C.; Arzarello, Ferdinando; Thomas, Michael O. J.

    2009-01-01

    Objective: To examine whether or not university mathematics students semantically process gestures depicting mathematical functions (mathematical gestures) similarly to the way they process action gestures and sentences. Semantic processing was indexed by the N400 effect. Results: The N400 effect elicited by words primed with mathematical gestures…

  4. Information tables with neighborhood semantics

    NASA Astrophysics Data System (ADS)

    Yao, Yiyu

    2000-04-01

    Information tables provide a convenient and useful tool for representing a set of objects using a group of attributes. This notion is enriched by introducing neighborhood systems on attribute values. The neighborhood systems represent the semantic relationships between, and knowledge about, attribute values. With added semantics, neighborhood-based information tables may provide a more general framework for knowledge discovery, data mining, and information retrieval.
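
    The neighborhood-systems idea can be sketched minimally (a hypothetical toy table, not Yao's formalism): each attribute value carries a set of values deemed semantically close to it, and retrieval matches against the neighborhood instead of requiring an exact value:

```python
# Hypothetical mini information table: objects described by attributes.
table = {
    "o1": {"size": "small",  "color": "red"},
    "o2": {"size": "medium", "color": "crimson"},
    "o3": {"size": "large",  "color": "blue"},
}

# Neighborhood systems on attribute values: each value maps to the set of
# values considered close to it (including itself).
neighborhoods = {
    "size":  {"small": {"small", "medium"},
              "medium": {"small", "medium", "large"},
              "large": {"medium", "large"}},
    "color": {"red": {"red", "crimson"},
              "crimson": {"red", "crimson"},
              "blue": {"blue"}},
}

def retrieve(attribute, value):
    """Return objects whose attribute value lies in the query value's neighborhood."""
    near = neighborhoods[attribute].get(value, {value})
    return sorted(o for o, row in table.items() if row[attribute] in near)

# Exact matching would return only o1; neighborhood semantics also admit o2.
print(retrieve("color", "red"))  # ['o1', 'o2']
```

    The added semantics lives entirely in the `neighborhoods` structure; the table itself is unchanged, which is what makes the framework more general than plain exact-match retrieval.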

  5. The semantic planetary data system

    NASA Technical Reports Server (NTRS)

    Hughes, J. Steven; Crichton, Daniel; Kelly, Sean; Mattmann, Chris

    2005-01-01

    This paper will provide a brief overview of the PDS data model and the PDS catalog. It will then describe the implementation of the Semantic PDS, including the development of the formal ontology, the generation of RDFS/XML and RDF/XML data sets, and the building of the semantic search application.

  6. Exploring Interoperability as a Multidimensional Challenge for Effective Emergency Response

    ERIC Educational Resources Information Center

    Santisteban, Hiram

    2010-01-01

    Purpose. The purpose of this research was to further an understanding of how the federal government is addressing the challenges of interoperability for emergency response or crisis management (FEMA, 2009) by informing the development of standards through the review of current congressional law, commissions, studies, executive orders, and…

  7. Global Interoperability of Broadband Networks (GIBN): Project Overview

    NASA Technical Reports Server (NTRS)

    DePaula, Ramon P.

    1998-01-01

    Various issues associated with the Global Interoperability of Broadband Networks (GIBN) are presented in viewgraph form. Specific topics include GIBN principles, objectives and goals, and background. GIBN/NASA status, the Transpacific High Definition Video experiment, GIBN experiment selection criteria, satellite industry involvement, and current experiments associated with GIBN are also discussed.

  8. The role of markup for enabling interoperability in health informatics

    PubMed Central

    McKeever, Steve; Johnson, David

    2015-01-01

    Interoperability is the faculty of making information systems work together. In this paper we will distinguish a number of different forms that interoperability can take and show how they are realized on a variety of physiological and health care use cases. The last 15 years has seen the rise of very cheap digital storage both on and off site. With the advent of the Internet of Things people's expectations are for greater interconnectivity and seamless interoperability. The potential impact these technologies have on healthcare are dramatic: from improved diagnoses through immediate access to a patient's electronic health record, to in silico modeling of organs and early stage drug trials, to predictive medicine based on top-down modeling of disease progression and treatment. We will begin by looking at the underlying technology, classify the various kinds of interoperability that exist in the field, and discuss how they are realized. We conclude with a discussion on future possibilities that big data and further standardizations will enable. PMID:26042043

  9. Sensor Web Interoperability Testbed Results Incorporating Earth Observation Satellites

    NASA Technical Reports Server (NTRS)

    Frye, Stuart; Mandl, Daniel J.; Alameh, Nadine; Bambacus, Myra; Cappelaere, Pat; Falke, Stefan; Derezinski, Linda; Zhao, Piesheng

    2007-01-01

    This paper describes an Earth Observation Sensor Web scenario based on the Open Geospatial Consortium s Sensor Web Enablement and Web Services interoperability standards. The scenario demonstrates the application of standards in describing, discovering, accessing and tasking satellites and groundbased sensor installations in a sequence of analysis activities that deliver information required by decision makers in response to national, regional or local emergencies.

  10. 75 FR 28206 - Establishment of an Emergency Response Interoperability Center

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-20

    ... the 700 MHz public safety broadband wireless network will be fully operable and interoperable on a... 700 MHz public safety broadband wireless network. The Commission also anticipates that over time, ERIC... public safety broadband wireless network and other public safety communications systems....

  11. Putting the School Interoperability Framework to the Test

    ERIC Educational Resources Information Center

    Mercurius, Neil; Burton, Glenn; Hopkins, Bill; Larsen, Hans

    2004-01-01

    The Jurupa Unified School District in Southern California recently partnered with Microsoft, Dell and the Zone Integration Group for the implementation of a School Interoperability Framework (SIF) database repository model throughout the district (Magner 2002). A two-week project--the Integrated District Education Applications System, better known…

  12. Interoperability Gap Challenges for Learning Object Repositories & Learning Management Systems

    ERIC Educational Resources Information Center

    Mason, Robert T.

    2011-01-01

    An interoperability gap exists between Learning Management Systems (LMSs) and Learning Object Repositories (LORs). Learning Objects (LOs) and the associated Learning Object Metadata (LOM) that is stored within LORs adhere to a variety of LOM standards. A common LOM standard found in LORs is the Sharable Content Object Reference Model (SCORM)…

  13. Nexus: An interoperability layer for parallel and distributed computer systems

    SciTech Connect

    Foster, I.; Kesselman, C.; Olson, R.; Tuecke, S.

    1994-05-01

    Nexus is a set of services that can be used to implement various task-parallel languages, data-parallel languages, and message-passing libraries. Nexus is designed to permit the efficient portable implementation of individual parallel programming systems and the interoperability of programs developed with different tools. Nexus supports lightweight threading and active message technology, allowing integration of message passing and threads.
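
    The combination Nexus targets, lightweight threads plus active messages in which a message names the handler to run on arrival, can be illustrated with an assumption-laden Python sketch (standard threads and queues standing in for Nexus's services; all names are invented):

```python
import threading
import queue

# Sketch of the active-message idea: each endpoint owns an inbox, and a
# message carries the tag of the handler to run on arrival, so computation
# is dispatched at the receiver rather than blocking on explicit receives.

class Endpoint:
    def __init__(self, name):
        self.name = name
        self.inbox = queue.Queue()
        self.handlers = {}
        self.results = []
        self.thread = threading.Thread(target=self._run, daemon=True)

    def register(self, tag, fn):
        self.handlers[tag] = fn

    def start(self):
        self.thread.start()

    def _run(self):
        while True:
            tag, payload = self.inbox.get()
            if tag == "stop":
                break
            # Active-message dispatch: the tag selects the handler to run.
            self.results.append(self.handlers[tag](payload))

    def send(self, target, tag, payload=None):
        target.inbox.put((tag, payload))

a, b = Endpoint("a"), Endpoint("b")
b.register("square", lambda x: x * x)
b.start()
a.send(b, "square", 7)
a.send(b, "stop")
b.thread.join()
print(b.results)  # [49]
```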

  14. The next generation of interoperability agents in healthcare.

    PubMed

    Cardoso, Luciana; Marins, Fernando; Portela, Filipe; Santos, Manuel; Abelha, António; Machado, José

    2014-05-16

    Interoperability in health information systems is increasingly a requirement rather than an option. Standards and technologies, such as multi-agent systems, have proven to be powerful tools in interoperability issues. In the last few years, the authors have worked on developing the Agency for Integration, Diffusion and Archive of Medical Information (AIDA), which is an intelligent, agent-based platform to ensure interoperability in healthcare units. It is increasingly important to ensure the high availability and reliability of systems. The functions provided by the systems that treat interoperability cannot fail. This paper shows the importance of monitoring and controlling intelligent agents as a tool to anticipate problems in health information systems. The interaction between humans and agents through an interface that allows the user to create new agents easily and to monitor their activities in real time is also an important feature, as health systems evolve by adopting more features and solving new problems. A module was installed in Centro Hospitalar do Porto, increasing the functionality and the overall usability of AIDA.

  15. The Next Generation of Interoperability Agents in Healthcare

    PubMed Central

    Cardoso, Luciana; Marins, Fernando; Portela, Filipe; Santos, Manuel; Abelha, António; Machado, José

    2014-01-01

    Interoperability in health information systems is increasingly a requirement rather than an option. Standards and technologies, such as multi-agent systems, have proven to be powerful tools in interoperability issues. In the last few years, the authors have worked on developing the Agency for Integration, Diffusion and Archive of Medical Information (AIDA), which is an intelligent, agent-based platform to ensure interoperability in healthcare units. It is increasingly important to ensure the high availability and reliability of systems. The functions provided by the systems that treat interoperability cannot fail. This paper shows the importance of monitoring and controlling intelligent agents as a tool to anticipate problems in health information systems. The interaction between humans and agents through an interface that allows the user to create new agents easily and to monitor their activities in real time is also an important feature, as health systems evolve by adopting more features and solving new problems. A module was installed in Centro Hospitalar do Porto, increasing the functionality and the overall usability of AIDA. PMID:24840351

  16. A Space Acquisition Leading Indicator Based on System Interoperation Maturity

    DTIC Science & Technology

    2010-12-01

    mute or deaf, the interoperability between the conversational parties could be described… Patterson Air Force Base, Ohio. APPROVED FOR PUBLIC RELEASE; DISTRIBUTION UNLIMITED. The views expressed in this thesis are those…

  17. The role of markup for enabling interoperability in health informatics.

    PubMed

    McKeever, Steve; Johnson, David

    2015-01-01

    Interoperability is the faculty of making information systems work together. In this paper we will distinguish a number of different forms that interoperability can take and show how they are realized on a variety of physiological and health care use cases. The last 15 years has seen the rise of very cheap digital storage both on and off site. With the advent of the Internet of Things people's expectations are for greater interconnectivity and seamless interoperability. The potential impact these technologies have on healthcare are dramatic: from improved diagnoses through immediate access to a patient's electronic health record, to in silico modeling of organs and early stage drug trials, to predictive medicine based on top-down modeling of disease progression and treatment. We will begin by looking at the underlying technology, classify the various kinds of interoperability that exist in the field, and discuss how they are realized. We conclude with a discussion on future possibilities that big data and further standardizations will enable.

  18. 47 CFR 64.621 - Interoperability and portability.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 3 2014-10-01 2014-10-01 false Interoperability and portability. 64.621 Section 64.621 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES (CONTINUED) MISCELLANEOUS RULES RELATING TO COMMON CARRIERS Telecommunications Relay Services and...

  19. 47 CFR 64.621 - Interoperability and portability.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 3 2013-10-01 2013-10-01 false Interoperability and portability. 64.621 Section 64.621 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES (CONTINUED) MISCELLANEOUS RULES RELATING TO COMMON CARRIERS Telecommunications Relay Services and...

  20. Interoperability: The Standards Challenge for the 1990s.

    ERIC Educational Resources Information Center

    Lynch, Clifford

    1993-01-01

    Discusses information retrieval standards as tools for interoperability among systems and traces the development of de jure and de facto standards in the U.S. and international telecommunications environment. Implementation of the Open Systems Interconnection (OSI) and Z39.50 standards is described. (EA)

  1. [Semantic information. Internal language. Thinking].

    PubMed

    Azcoaga, J E

    1993-06-01

    Semantic information has reached an objective condition after a lengthy history of semantic inquiries that instrumental neurophysiological devices--such as event-related potentials, electroencephalographic spectral analysis, regional brain circulation, PET scan, deep brain electrodes, and others--have made easier. In turn, internal language, as screened according to Vigotsky's perspective, is considered a product of semantic information circulation understood as neurosemae interconnection. Finally, in normal adults, thinking processes are assumed to be made up of both sensoperceptive information (proprioceptive information included) and semantic information. Thus, an "extraverbal thinking" can be distinguished, whose activity is hardly describable in healthy adults but should be considered as a condition of non-educated deaf persons, and a "verbal thinking", or internal language, made up of semantic information.

  2. Hierarchical abstract semantic model for image classification

    NASA Astrophysics Data System (ADS)

    Ye, Zhipeng; Liu, Peng; Zhao, Wei; Tang, Xianglong

    2015-09-01

    Semantic gap limits the performance of bag-of-visual-words. To deal with this problem, a hierarchical abstract semantics method that builds abstract semantic layers, generates semantic visual vocabularies, measures semantic gap, and constructs classifiers using the Adaboost strategy is proposed. First, abstract semantic layers are proposed to narrow the semantic gap between visual features and their interpretation. Then semantic visual words are extracted as features to train semantic classifiers. One popular form of measurement is used to quantify the semantic gap. The Adaboost training strategy is used to combine weak classifiers into strong ones to further improve performance. For a testing image, the category is estimated layer-by-layer. Corresponding abstract hierarchical structures for popular datasets, including Caltech-101 and MSRC, are proposed for evaluation. The experimental results show that the proposed method is capable of narrowing semantic gaps effectively and performs better than other categorization methods.

  3. Semantic Mediation via Access Broker: the OWS-9 experiment

    NASA Astrophysics Data System (ADS)

    Santoro, Mattia; Papeschi, Fabrizio; Craglia, Massimo; Nativi, Stefano

    2013-04-01

    Even with the use of common data models standards to publish and share geospatial data, users may still face semantic inconsistencies when they use Spatial Data Infrastructures - especially in multidisciplinary contexts. Several semantic mediation solutions exist to address this issue; they span from simple XSLT documents to transform from one data model schema to another, to more complex services based on the use of ontologies. This work presents the activity done in the context of the OGC Web Services Phase 9 (OWS-9) Cross Community Interoperability to develop a semantic mediation solution by enhancing the GEOSS Discovery and Access Broker (DAB). This is a middleware component that provides harmonized access to geospatial datasets according to client applications' preferred service interface (Nativi et al. 2012, Vaccari et al. 2012). Given a set of remote feature data encoded in different feature schemas, the objective of the activity was to use the DAB to enable client applications to transparently access the feature data according to one single schema. Due to the flexible architecture of the Access Broker, it was possible to introduce a new transformation type in the configured chain of transformations. In fact, the Access Broker already provided the following transformations: Coordinate Reference System (CRS), spatial resolution, spatial extent (e.g., a subset of a data set), and data encoding format. A new software module was developed to invoke the needed external semantic mediation service and harmonize the accessed features. In OWS-9 the Access Broker invokes a SPARQL WPS to retrieve mapping rules for the OWS-9 schemas: USGS, and NGA schema. The solution implemented to address this problem shows the flexibility and extensibility of the brokering framework underpinning the GEO DAB: new services can be added to augment the number of supported schemas without the need to modify other components and/or software modules. 
Moreover, all other transformations (CRS
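
    The chain-of-transformations design the abstract describes can be sketched abstractly (the field names and mapping rules below are invented stand-ins, not the actual OWS-9 USGS/NGA schemas): the broker applies each configured transformation in turn, and semantic mediation is simply one more function appended to the chain:

```python
# Hypothetical sketch of a broker's transformation chain: each step is a
# function over a feature record, and schema mapping is one more link
# added after the existing CRS/resolution/format transformations.

# Illustrative mapping rules (stand-ins for the SPARQL-retrieved ones).
SCHEMA_MAP = {"STREAM_NM": "riverName", "LEN_KM": "lengthKm"}

def reproject(record):
    """Pretend CRS conversion: only the CRS label changes in this sketch."""
    record = dict(record)
    record["crs"] = "EPSG:4326"
    return record

def map_schema(record):
    """Semantic mediation step: rename fields via the mapping rules."""
    return {SCHEMA_MAP.get(k, k): v for k, v in record.items()}

def broker_access(record, chain):
    for transform in chain:
        record = transform(record)
    return record

feature = {"STREAM_NM": "Gila", "LEN_KM": 1044, "crs": "EPSG:26912"}
out = broker_access(feature, [reproject, map_schema])
print(out)  # {'riverName': 'Gila', 'lengthKm': 1044, 'crs': 'EPSG:4326'}
```

    The extensibility point the abstract highlights is visible here: supporting a new schema means adding one mapping function to the chain, with no change to the broker loop or the other transformations.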

  4. Requirements Development for Interoperability Simulation Capability for Law Enforcement

    SciTech Connect

    Holter, Gregory M.

    2004-05-19

    The National Counterdrug Center (NCC) was initially authorized by Congress in FY 1999 appropriations to create a simulation-based counterdrug interoperability training capability. As the lead organization for Research and Analysis to support the NCC, the Pacific Northwest National Laboratory (PNNL) was responsible for developing the requirements for this interoperability simulation capability. These requirements were structured to address the hardware and software components of the system, as well as the deployment and use of the system. The original set of requirements was developed through a process of conducting a user-based survey of requirements for the simulation capability, coupled with an analysis of similar development efforts. The user-based approach ensured that existing concerns with respect to interoperability within the law enforcement community would be addressed. Law enforcement agencies within the designated pilot area of Cochise County, Arizona, were surveyed using interviews and ride-alongs during actual operations. The results of this survey were then accumulated, organized, and validated with the agencies to ensure the accuracy of the results. These requirements were then supplemented by adapting operational requirements from existing systems to ensure system reliability and operability. The NCC adopted a development approach providing incremental capability through the fielding of a phased series of progressively more capable versions of the system. This allowed for feedback from system users to be incorporated into subsequent revisions of the system requirements, and also allowed the addition of new elements as needed to adapt the system to broader geographic and geopolitical areas, including areas along the southwest and northwest U.S. borders. This paper addresses the processes used to develop and refine requirements for the NCC interoperability simulation capability, as well as the response of the law enforcement community to the use of

  5. Semantic-based sound retrieval by ERP in rapid serial auditory presentation paradigm.

    PubMed

    Jiang, Lei; Cai, Bangyu; Xiao, Siyuan; Wang, Yiwen; Chen, Weidong; Zheng, Xiaoxiang

    2013-01-01

    "Semantic gap" is the major bottleneck of semantic-based multimedia retrieval technique in the field of information retrieval. Studies have shown that robust semantic-based image retrieval can be achieved by single-trial visual evoked event related potential (ERP) detection. However, the question remains whether auditory evoked ERP can be utilized to achieve semantic-based sound retrieval. In this paper, we investigated this question in the rapid serial auditory presentation (RSAP) paradigm. Eight BCI-naïve participants were instructed to perform target detection in RSAP sequences with the vocalizations of 8 familiar animals as sound stimuli, and we compared ERP components and single-trial ERP classification performance between two conditions, the target was a predefined specific one, and the targets were different but belonged to the same semantic category (i.e., semantic-based sound retrieval). Although the amplitudes of ERP components (e.g., N2 and P3) and classification performance decreased a little due to the difficulty of the semantic-based sound retrieval tasks, the best two participants still achieved the area under the receive operating characteristic curve (AUC) of single-trial ERP detection more than 0.77. It suggested that semantic-based sound retrieval by auditory evoked ERP was potentially feasible.

  6. Distributed Semantic Overlay Networks

    NASA Astrophysics Data System (ADS)

    Doulkeridis, Christos; Vlachou, Akrivi; Nørvåg, Kjetil; Vazirgiannis, Michalis

    Semantic Overlay Networks (SONs) have been recently proposed as a way to organize content in peer-to-peer (P2P) networks. The main objective is to discover peers with similar content and then form thematically focused peer groups. Efficient content retrieval can be performed by having queries selectively forwarded only to the groups of peers relevant to the query. As a result, fewer peers need to be contacted in order to answer a query. In this context, the challenge is to generate SONs in a decentralized and distributed manner, as the centralized assembly of global information is not feasible. Different approaches for exploiting the generated SONs for content retrieval have been proposed in the literature, which are examined in this chapter, with a particular focus on SON interconnections for efficient search. Several applications, such as P2P document and image retrieval, can be deployed over generated SONs, motivating the need for distributed and truly scalable SON creation. Several recent research papers therefore focus on SONs, as reflected in our comprehensive overview of related work in the field of semantic overlay networks. A classification of existing algorithms according to a set of qualitative criteria is also provided. In spite of the rich existing work in the field of SONs, several challenges have not yet been adequately addressed; therefore, promising future research directions are pointed out and discussed at the end of this chapter.
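
    A minimal sketch of the SON routing idea, under strong simplifying assumptions (centralised construction, invented topic histograms), shows why fewer peers are contacted per query:

```python
# Sketch: each peer summarises its content as a topic histogram, joins the
# overlay for its dominant topic, and a query is forwarded only to that
# overlay's members instead of being flooded to the whole network.

peers = {
    "p1": {"music": 9, "sports": 1},
    "p2": {"music": 7, "movies": 2},
    "p3": {"sports": 8},
    "p4": {"movies": 5, "music": 1},
}

def dominant_topic(histogram):
    return max(histogram, key=histogram.get)

# Decentralised in the real setting; built centrally here for brevity.
overlays = {}
for peer, hist in peers.items():
    overlays.setdefault(dominant_topic(hist), []).append(peer)

def route(query_topic):
    """Contact only the thematically focused group, not the whole network."""
    return overlays.get(query_topic, [])

print(route("music"))   # ['p1', 'p2']
print(route("sports"))  # ['p3']
```

    The open problems the chapter surveys start exactly where this sketch cheats: forming `overlays` without any central coordinator, and interconnecting the overlays so queries spanning topics can still be answered.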

  7. Semantic Workflows and Provenance

    NASA Astrophysics Data System (ADS)

    Gil, Y.

    2011-12-01

    While sharing and disseminating data is widely practiced across scientific communities, we have yet to recognize the importance of sharing and disseminating the analytic processes that lead to published data. Data retrieved from shared repositories and archives is often hard to interpret because we lack documentation about those processes: what models were used, what assumptions were made, what calibrations were carried out, etc. This process documentation is also key to aggregating data in a meaningful way, whether aggregating shared third party data or aggregating shared data with local sensor data collected by individual investigators. We suggest that augmenting published data with process documentation would greatly enhance our ability to find, reuse, interpret, and aggregate data and therefore have a significant impact on the utility of data repositories and archives. We will show that semantic workflows and provenance provide key technologies for capturing process documentation. Semantic workflows describe the kinds of data transformation and analysis steps used to create new data products, and can include useful constraints about why specific models were selected or parameters chosen. Provenance records can be used to publish workflow descriptions in standard formats that can be reused to enable verification and reproducibility of data products.
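
    One way to picture such process documentation (a hypothetical sketch, not a provenance standard such as W3C PROV) is a record appended per workflow step, naming its inputs, parameters, and a digest of its output:

```python
import hashlib
import json

# Minimal sketch: each workflow step records what it consumed, what
# parameters it used, and a digest of what it produced, so a published
# dataset can be traced back through the steps that created it.

def run_step(name, inputs, params, fn, provenance):
    output = fn(inputs, params)
    provenance.append({
        "step": name,
        "inputs": sorted(inputs),
        "params": params,
        "output_digest": hashlib.sha256(
            json.dumps(output, sort_keys=True).encode()).hexdigest()[:12],
    })
    return output

provenance = []
raw = {"raw.csv": [3.0, 4.1, 5.2]}
calibrated = run_step("calibrate", raw, {"offset": -0.1},
                      lambda d, p: [v + p["offset"] for v in d["raw.csv"]],
                      provenance)
print([p["step"] for p in provenance])  # ['calibrate']
```

    The digest is what enables the verification the abstract mentions: rerunning the step on the same inputs and parameters should reproduce the same digest.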

  8. "Pre-Semantic" Cognition Revisited: Critical Differences between Semantic Aphasia and Semantic Dementia

    ERIC Educational Resources Information Center

    Jefferies, Elizabeth; Rogers, Timothy T.; Hopper, Samantha; Lambon Ralph, Matthew A.

    2010-01-01

    Patients with semantic dementia show a specific pattern of impairment on both verbal and non-verbal "pre-semantic" tasks, e.g., reading aloud, past tense generation, spelling to dictation, lexical decision, object decision, colour decision and delayed picture copying. All seven tasks are characterised by poorer performance for items that are…

  9. Flexible procedural interoperability across security and coalition boundaries using rapidly reconfigurable boundary protection definitions

    NASA Astrophysics Data System (ADS)

    Peach, Nicholas

    2013-05-01

    Existing configuration of boundary protection devices, which validate the content and context of information crossing between security domains, uses a set of accreditor-agreed steps individually agreed for every situation. This has traditionally been a slow and exacting process of negotiation between integrators and accreditors. The Decentralized Operation Procedure (DOP) technique allows interoperability definitions of system interactions to be created as XML files and deployed across the battlefield environment. By extending the security information definitions within the DOP technique, it is intended to provide sufficient incorporated information to allow boundary protection devices to also immediately load and utilize a DOP XML file and then apply established standards of security. This allows boundary devices to be updated with the same dynamism as the deployment of new DOPs, and allows DOP interoperability definitions to exploit coalition capabilities that have crossed security boundaries. The proposal describes an open and published boundary definition to support the aims of the MOD 23-13 Generic Base Architecture Defense Standard when working with coalition partners. The research aims are: a) to identify each element within a DOP that requires security characteristics to be described; b) to create a means to define security characteristics using XML; c) to determine whether external validation of an approved DOP requires additional authentication; d) to determine the actions that end users will have to perform on boundary protection devices in support of these aims. The paper will present the XML security extensions and the results of a practical implementation achieved through the modification of an existing accredited barrier device.

  10. The 3rd DBCLS BioHackathon: improving life science data integration with Semantic Web technologies

    PubMed Central

    2013-01-01

    Background BioHackathon 2010 was the third in a series of meetings hosted by the Database Center for Life Sciences (DBCLS) in Tokyo, Japan. The overall goal of the BioHackathon series is to improve the quality and accessibility of life science research data on the Web by bringing together representatives from public databases, analytical tool providers, and cyber-infrastructure researchers to jointly tackle important challenges in the area of in silico biological research. Results The theme of BioHackathon 2010 was the 'Semantic Web', and all attendees gathered with the shared goal of producing Semantic Web data from their respective resources, and/or consuming or interacting with those data using their tools and interfaces. We discussed topics including guidelines for designing semantic data and the interoperability of resources. We consequently developed tools and clients for analysis and visualization. Conclusion We provide a meeting report from BioHackathon 2010, in which we describe the discussions, decisions, and breakthroughs made as we moved towards compliance with Semantic Web technologies - from source provider, through middleware, to the end-consumer. PMID:23398680

  11. Semantic Representation and Naming in Young Children.

    ERIC Educational Resources Information Center

    McGregor, Karla K.; Friedman, Rena M.; Reilly, Renee M.; Newman, Robyn M.

    2002-01-01

    Two experiments examined children's semantic representations and semantic naming errors. Results suggested that functional and physical properties are core aspects of object representations in the semantic lexicon and that the degree of semantic knowledge makes words more or less vulnerable to retrieval failure. Discussion focuses on the dynamic…

  12. The Semantic Distance Model of Relevance Assessment.

    ERIC Educational Resources Information Center

    Brooks, Terrence A.

    1998-01-01

    Presents the Semantic Distance Model (SDM) of Relevance Assessment, a cognitive model of the relationship between semantic distance and relevance assessment. Discusses premises of the model such as the subjective nature of information and the metaphor of semantic distance. Empirical results illustrate the effects of semantic distance and semantic…

  13. Mapping the Structure of Semantic Memory

    ERIC Educational Resources Information Center

    Morais, Ana Sofia; Olsson, Henrik; Schooler, Lael J.

    2013-01-01

    Aggregating snippets from the semantic memories of many individuals may not yield a good map of an individual's semantic memory. The authors analyze the structure of semantic networks that they sampled from individuals through a new snowball sampling paradigm during approximately 6 weeks of 1-hr daily sessions. The semantic networks of individuals…

  14. Semantic Modeling of Requirements: Leveraging Ontologies in Systems Engineering

    ERIC Educational Resources Information Center

    Mir, Masood Saleem

    2012-01-01

    The interdisciplinary nature of "Systems Engineering" (SE), having "stakeholders" from diverse domains with orthogonal facets, and need to consider all stages of "lifecycle" of system during conception, can benefit tremendously by employing "Knowledge Engineering" (KE) to achieve semantic agreement among all…

  15. Exploiting Recurring Structure in a Semantic Network

    NASA Technical Reports Server (NTRS)

    Wolfe, Shawn R.; Keller, Richard M.

    2004-01-01

    With the growing popularity of the Semantic Web, an increasing amount of information is becoming available in machine interpretable, semantically structured networks. Within these semantic networks are recurring structures that could be mined by existing or novel knowledge discovery methods. The mining of these semantic structures represents an interesting area that focuses on mining both for and from the Semantic Web, with surprising applicability to problems confronting the developers of Semantic Web applications. In this paper, we present representative examples of recurring structures and show how these structures could be used to increase the utility of a semantic repository deployed at NASA.

  16. Workspaces in the Semantic Web

    NASA Technical Reports Server (NTRS)

Wolfe, Shawn R.; Keller, Richard M.

    2005-01-01

Due to the recency and relatively limited adoption of Semantic Web technologies, practical issues related to technology scaling have received less attention than foundational issues. Nonetheless, these issues must be addressed if the Semantic Web is to realize its full potential. In particular, we concentrate on the lack of scoping methods that reduce the size of semantic information spaces so they are more efficient to work with and more relevant to an agent's needs. We provide some intuition to motivate the need for such reduced information spaces, called workspaces, give a formal definition, and suggest possible methods of deriving them.

  17. High Performance Descriptive Semantic Analysis of Semantic Graph Databases

    SciTech Connect

    Joslyn, Cliff A.; Adolf, Robert D.; al-Saffar, Sinan; Feo, John T.; Haglin, David J.; Mackey, Greg E.; Mizell, David W.

    2011-06-02

As semantic graph database technology grows to address components ranging from extant large triple stores to SPARQL endpoints over SQL-structured relational databases, it will become increasingly important to be able to understand their inherent semantic structure, whether codified in explicit ontologies or not. Our group is researching novel methods for what we call descriptive semantic analysis of RDF triplestores, to serve purposes of analysis, interpretation, visualization, and optimization. But data size and computational complexity make it increasingly necessary to bring high performance computational resources to bear on this task. Our research group built a novel high performance hybrid system comprising computational capability for semantic graph database processing utilizing the large multi-threaded architecture of the Cray XMT platform, conventional servers, and large data stores. In this paper we describe that architecture and our methods, and present the results of our analyses of basic properties, connected components, namespace interaction, and typed paths for the Billion Triple Challenge 2010 dataset.
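One ingredient of such descriptive analysis, namespace interaction, amounts to profiling which vocabularies a triplestore's predicates come from. A minimal stdlib sketch over N-Triples lines follows; the example triples are invented, and the real analyses ran at billion-triple scale on the Cray XMT.

```python
from collections import Counter

def predicate_namespaces(ntriples):
    """Count predicate namespaces across N-Triples lines: a toy version
    of the namespace-interaction profiling described above."""
    counts = Counter()
    for line in ntriples:
        parts = line.split()
        if len(parts) < 3:
            continue  # skip blank or malformed lines
        pred = parts[1].strip("<>")
        # Namespace = IRI up to and including the last '#' or '/'.
        cut = max(pred.rfind("#"), pred.rfind("/"))
        counts[pred[:cut + 1]] += 1
    return counts

triples = [
    "<http://ex.org/a> <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <http://ex.org/T> .",
    "<http://ex.org/a> <http://xmlns.com/foaf/0.1/knows> <http://ex.org/b> .",
]
```

A histogram of these counts already reveals which ontologies a store actually uses, independent of any declared schema.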

  18. The 2nd DBCLS BioHackathon: interoperable bioinformatics Web services for integrated applications

    PubMed Central

    2011-01-01

    Background The interaction between biological researchers and the bioinformatics tools they use is still hampered by incomplete interoperability between such tools. To ensure interoperability initiatives are effectively deployed, end-user applications need to be aware of, and support, best practices and standards. Here, we report on an initiative in which software developers and genome biologists came together to explore and raise awareness of these issues: BioHackathon 2009. Results Developers in attendance came from diverse backgrounds, with experts in Web services, workflow tools, text mining and visualization. Genome biologists provided expertise and exemplar data from the domains of sequence and pathway analysis and glyco-informatics. One goal of the meeting was to evaluate the ability to address real world use cases in these domains using the tools that the developers represented. This resulted in i) a workflow to annotate 100,000 sequences from an invertebrate species; ii) an integrated system for analysis of the transcription factor binding sites (TFBSs) enriched based on differential gene expression data obtained from a microarray experiment; iii) a workflow to enumerate putative physical protein interactions among enzymes in a metabolic pathway using protein structure data; iv) a workflow to analyze glyco-gene-related diseases by searching for human homologs of glyco-genes in other species, such as fruit flies, and retrieving their phenotype-annotated SNPs. Conclusions Beyond deriving prototype solutions for each use-case, a second major purpose of the BioHackathon was to highlight areas of insufficiency. We discuss the issues raised by our exploration of the problem/solution space, concluding that there are still problems with the way Web services are modeled and annotated, including: i) the absence of several useful data or analysis functions in the Web service "space"; ii) the lack of documentation of methods; iii) lack of compliance with the SOAP

  19. Community-Based Services that Facilitate Interoperability and Intercomparison of Precipitation Datasets from Multiple Sources

    NASA Technical Reports Server (NTRS)

    Liu, Zhong; Kempler, Steven; Teng, William; Leptoukh, Gregory; Ostrenga, Dana

    2010-01-01

    perform the complicated data access and match-up processes. In addition, PDISC tool and service capabilities being adapted for GPM data will be described, including the Google-like Mirador data search and access engine; semantic technology to help manage large amounts of multi-sensor data and their relationships; data access through various Web services (e.g., OPeNDAP, GDS, WMS, WCS); conversion to various formats (e.g., netCDF, HDF, KML (for Google Earth)); visualization and analysis of Level 2 data profiles and maps; parameter and spatial subsetting; time and temporal aggregation; regridding; data version control and provenance; continuous archive verification; and expertise in data-related standards and interoperability. The goal of providing these services is to further the progress towards a common framework by which data analysis/validation can be more easily accomplished.
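Among the services listed, spatial subsetting reduces to selecting grid cells inside a bounding box. A toy sketch with invented coordinates and values; the real services operate on HDF/netCDF granules via OPeNDAP and similar protocols.

```python
def spatial_subset(grid, lats, lons, lat_range, lon_range):
    """Keep only the cells of a 2-D grid whose latitude/longitude fall
    inside the requested bounding box (inclusive on both ends)."""
    subset = []
    for lat, row in zip(lats, grid):
        if not lat_range[0] <= lat <= lat_range[1]:
            continue
        subset.append([v for lon, v in zip(lons, row)
                       if lon_range[0] <= lon <= lon_range[1]])
    return subset

# Invented 3x3 grid with one value per (lat, lon) cell.
lats = [10.0, 20.0, 30.0]
lons = [100.0, 110.0, 120.0]
grid = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
```

Server-side subsetting of this kind is what spares users from downloading whole granules before intercomparison.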

  20. Does semantic redundancy gain result from multiple semantic priming?

    PubMed

    Schröter, Hannes; Bratzke, Daniel; Fiedler, Anja; Birngruber, Teresa

    2015-10-01

    Fiedler, Schröter, and Ulrich (2013) reported faster responses to a single written word when the semantic content of this word (e.g., "elephant") matched both targets (e.g., "animal", "gray") as compared to a single target (e.g., "animal", "brown"). This semantic redundancy gain was explained by statistical facilitation due to a race of independent memory retrieval processes. The present experiment addresses one alternative explanation, namely that semantic redundancy gain results from multiple pre-activation of words that match both targets. In different blocks of trials, participants performed a redundant-targets task and a lexical decision task. The targets of the redundant-targets task served as primes in the lexical decision task. Replicating the findings of Fiedler et al., a semantic redundancy gain was observed in the redundant-targets task. Crucially, however, there was no evidence of a multiple semantic priming effect in the lexical decision task. This result suggests that semantic redundancy gain cannot be explained by multiple pre-activation of words that match both targets.
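The race-model account of statistical facilitation can be illustrated with a small Monte-Carlo simulation: the minimum of two independent retrieval times is on average faster than a single retrieval time. The distribution parameters below are arbitrary, chosen only to show the effect.

```python
import random

def mean_rt(n_trials, n_racers, seed=0):
    """Mean finishing time when n_racers independent retrieval
    processes race and the fastest one determines the response."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_trials):
        total += min(rng.gauss(500, 50) for _ in range(n_racers))
    return total / n_trials

single = mean_rt(20000, 1)     # one retrieval process
redundant = mean_rt(20000, 2)  # two processes racing: faster on average
```

The gap between the two means is the redundancy gain predicted by a race of independent processes, with no pre-activation required.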

  1. linkedISA: semantic representation of ISA-Tab experimental metadata

    PubMed Central

    2014-01-01

Background Reporting and sharing experimental metadata, such as the experimental design, characteristics of the samples, and procedures applied, along with the analysis results, in a standardised manner ensures that datasets are comprehensible and, in principle, reproducible, comparable and reusable. Furthermore, sharing datasets in formats designed for consumption by humans and machines will also maximize their use. The Investigation/Study/Assay (ISA) open source metadata tracking framework facilitates standards-compliant collection, curation, visualization, storage and sharing of datasets, leveraging other platforms to enable analysis and publication. The ISA software suite includes several components used in an increasingly diverse set of life science and biomedical domains; it is underpinned by a general-purpose format, ISA-Tab, and conversions exist into formats required by public repositories. While ISA-Tab works well mainly as a human-readable format, we have also implemented a linked data approach to semantically define the ISA-Tab syntax. Results We present a semantic web representation of the ISA-Tab syntax that complements ISA-Tab's syntactic interoperability with semantic interoperability. We introduce the linkedISA conversion tool from ISA-Tab to the Resource Description Framework (RDF), supporting mappings from the ISA syntax to multiple community-defined, open ontologies and capitalising on user-provided ontology annotations in the experimental metadata. We describe insights from the implementation and how annotations can be expanded, driven by the metadata. We applied the conversion tool as part of Bio-GraphIIn, a web-based application supporting integration of the semantically-rich experimental descriptions.
Designed in a user-friendly manner, the Bio-GraphIIn interface hides most of the complexities from the users, exposing a familiar tabular view of the experimental description to allow seamless interaction with the RDF representation, and visualising
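The core of such a conversion, mapping annotated ISA-Tab columns to RDF triples, can be sketched as follows. The base IRI, sample row and predicate mapping are invented for illustration; linkedISA itself maps to community-defined ontologies rather than a hand-written dictionary.

```python
def row_to_triples(base, row, predicates):
    """Map one ISA-Tab sample row to N-Triples strings, one triple per
    annotated column found in the predicate mapping."""
    subject = f"<{base}/sample/{row['Sample Name']}>"
    return [f'{subject} <{predicates[col]}> "{val}" .'
            for col, val in row.items() if col in predicates]

# Invented example: one sample row and one column-to-predicate mapping.
row = {"Sample Name": "s1", "Characteristics[organism]": "Homo sapiens"}
preds = {"Characteristics[organism]":
         "http://purl.obolibrary.org/obo/RO_0002162"}
```

Emitting triples per annotated column is what turns the tabular syntax into a graph that downstream RDF tooling can query.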

  2. Interoperability between Publications, Reference Data and Visualisation Tools

    NASA Astrophysics Data System (ADS)

    Allen, Mark G.; Ocvirk, Pierre; Genova, Francoise

    2015-08-01

Astronomy research is becoming more and more inter-connected, and there is a high expectation for our publications, reference data and tools to be interoperable. Publications are the hard-earned final results of scientific endeavour, and technology allows us to enable publications as usable resources, going beyond their traditional role as a readable document. There is strong demand for simple access to the data associated with publications, and that links and references in publications are strongly connected to online resources, and are useable in visualisation tools. We highlight the capabilities of the CDS reference services for interoperability between the reference data obtained from publications, the connections between Journal and literature services, and combination of these data and information in Aladin and other CDS services. (In support of the abstract submitted by P. Ocvirk)

  3. Foundations of reusable and interoperable facet models using category theory.

    PubMed

    Harris, Daniel R

    2016-10-01

    Faceted browsing has become ubiquitous with modern digital libraries and online search engines, yet the process is still difficult to abstractly model in a manner that supports the development of interoperable and reusable interfaces. We propose category theory as a theoretical foundation for faceted browsing and demonstrate how the interactive process can be mathematically abstracted. Existing efforts in facet modeling are based upon set theory, formal concept analysis, and light-weight ontologies, but in many regards, they are implementations of faceted browsing rather than a specification of the basic, underlying structures and interactions. We will demonstrate that category theory allows us to specify faceted objects and study the relationships and interactions within a faceted browsing system. Resulting implementations can then be constructed through a category-theoretic lens using these models, allowing abstract comparison and communication that naturally support interoperability and reuse.
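The category-theoretic reading can be hinted at in code: facet selections act as morphisms on collections of items, and refining by several facets is ordinary function composition, which for filters is order-independent. This is a toy rendering with invented items, not the paper's formal construction.

```python
from functools import reduce

def facet(predicate):
    """A facet selection as a morphism on item sets: it maps a set to
    the subset satisfying the predicate."""
    return lambda items: {x for x in items if predicate(x)}

def compose(*selections):
    """Compose facet selections left to right, as in refining a search
    by several facets in sequence."""
    return reduce(lambda f, g: lambda items: g(f(items)), selections)

# Invented item set: (topic, year) pairs standing in for documents.
books = {("Logic", 1990), ("Logic", 2012), ("Topology", 2012)}
by_topic = facet(lambda b: b[0] == "Logic")
by_year = facet(lambda b: b[1] >= 2000)
```

Because composition is defined abstractly, two interfaces that expose the same selections can be compared without reference to their implementations, which is the interoperability argument in miniature.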

  4. A STANAG for NATO imagery interoperable data links

    NASA Astrophysics Data System (ADS)

    Peckham, H. M.

    1993-12-01

NATO, under the direction of Air Group IV (A/C 224) of the Air Force Armament Group, is writing a Standardization Agreement (STANAG) for an Imagery Interoperable Data Link. This is the last segment of the NATO Imagery Interoperable Architecture (NIIA) to be completed. This paper will briefly describe the background of the development of the NIIA and the inter-relationship of the three segments, and then describe the approach being taken to the preparation of the data link STANAG. The concept of the data link, described by a layered model using Open Systems Interconnect concepts to define interfaces between the layers, will be discussed, and then the specific interfaces being used for the STANAG development will be described.

  5. Problem Solving with General Semantics.

    ERIC Educational Resources Information Center

    Hewson, David

    1996-01-01

    Discusses how to use general semantics formulations to improve problem solving at home or at work--methods come from the areas of artificial intelligence/computer science, engineering, operations research, and psychology. (PA)

  6. Nine Principles of Semantic Harmonization.

    PubMed

    Cunningham, James A; Van Speybroeck, Michel; Kalra, Dipak; Verbeeck, Rudi

    2016-01-01

Medical data is routinely collected, stored and recorded across different institutions and in a range of different formats. Semantic harmonization is the process of collating this data into a singular consistent logical view, with many approaches to harmonizing both possible and valid. The broad scope of possibilities for undertaking semantic harmonization does, however, lead to the development of bespoke and ad-hoc systems; this is particularly the case when it comes to cohort data, the format of which is often specific to a cohort's area of focus. Guided by work we have undertaken in developing the 'EMIF Knowledge Object Library', a semantic harmonization framework underpinning the collation of pan-European Alzheimer's cohort data, we have developed a set of nine generic guiding principles for developing semantic harmonization frameworks, the application of which will establish a solid base for constructing similar frameworks.

  7. Nine Principles of Semantic Harmonization

    PubMed Central

    Cunningham, James A.; Van Speybroeck, Michel; Kalra, Dipak; Verbeeck, Rudi

    2016-01-01

Medical data is routinely collected, stored and recorded across different institutions and in a range of different formats. Semantic harmonization is the process of collating this data into a singular consistent logical view, with many approaches to harmonizing both possible and valid. The broad scope of possibilities for undertaking semantic harmonization does, however, lead to the development of bespoke and ad-hoc systems; this is particularly the case when it comes to cohort data, the format of which is often specific to a cohort’s area of focus. Guided by work we have undertaken in developing the ‘EMIF Knowledge Object Library’, a semantic harmonization framework underpinning the collation of pan-European Alzheimer’s cohort data, we have developed a set of nine generic guiding principles for developing semantic harmonization frameworks, the application of which will establish a solid base for constructing similar frameworks. PMID:28269840

  8. Distributed semantic networks and CLIPS

    NASA Technical Reports Server (NTRS)

    Snyder, James; Rodriguez, Tony

    1991-01-01

    Semantic networks of frames are commonly used as a method of reasoning in many problems. In most of these applications the semantic network exists as a single entity in a single process environment. Advances in workstation hardware provide support for more sophisticated applications involving multiple processes, interacting in a distributed environment. In these applications the semantic network may well be distributed over several concurrently executing tasks. This paper describes the design and implementation of a frame based, distributed semantic network in which frames are accessed both through C Language Integrated Production System (CLIPS) expert systems and procedural C++ language programs. The application area is a knowledge based, cooperative decision making model utilizing both rule based and procedural experts.
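A frame in such a network is a named collection of slots with inheritance along "is-a" links. Below is a minimal single-process sketch; the paper's system distributes frames across CLIPS and C++ processes, and all names here are invented.

```python
class Frame:
    """A frame: named slots plus inheritance through an 'is-a' parent."""

    def __init__(self, name, parent=None, **slots):
        self.name, self.parent, self.slots = name, parent, slots

    def get(self, slot):
        if slot in self.slots:
            return self.slots[slot]
        if self.parent is not None:
            return self.parent.get(slot)  # inherit along the is-a link
        raise KeyError(slot)

vehicle = Frame("vehicle", wheels=4, powered=True)
car = Frame("car", parent=vehicle, doors=4)  # inherits wheels, powered
```

In a distributed setting the interesting design question, addressed by the paper, is which process owns each frame and how slot lookups cross process boundaries.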

  9. Interoperability-Reforger 80 Illustrates the Importance.

    DTIC Science & Technology

    1982-04-16

… and the actual activity. Using the description as a basis, he then focuses on these areas essential to … the successful conduct of the activity. Much of the success was achieved as a result of efforts beyond what … required from all NATO Allies. One area where positive progress is being made to overcome differences is multi-national exercises such as REFORGER

  10. The Role of Standards in Cloud-Computing Interoperability

    DTIC Science & Technology

    2012-10-01

… Ahronovitz 2010, Harding 2010, Badger 2011, Kundra 2011]. Risks of vendor lock-in include reduced negotiation power in reaction to price increases and … use cases classified into three groups: cloud management, cloud interoperability, and cloud security [Badger 2010]. These use cases are listed below … [Badger 2010]: Cloud Management Use Cases: Open an Account; Close an Account; Terminate an Account; Copy Data Objects into a Cloud; Copy …

  11. Towards an Interoperability Ontology for Software Development Tools

    DTIC Science & Technology

    2003-03-01

… This efficiency (high productivity with fewer software faults) results from best practices in building, managing and testing software projects via the … interoperability and enhanced communication. Subject terms: Software Engineering, Computer Science, Management

  12. TRIGA: Telecommunications Protocol Processing Subsystem Using Reconfigurable Interoperable Gate Arrays

    NASA Technical Reports Server (NTRS)

    Pang, Jackson; Pingree, Paula J.; Torgerson, J. Leigh

    2006-01-01

    We present the Telecommunications protocol processing subsystem using Reconfigurable Interoperable Gate Arrays (TRIGA), a novel approach that unifies fault tolerance, error correction coding and interplanetary communication protocol off-loading to implement CCSDS File Delivery Protocol and Datalink layers. The new reconfigurable architecture offers more than one order of magnitude throughput increase while reducing footprint requirements in memory, command and data handling processor utilization, communication system interconnects and power consumption.

  13. Enabling Medical Device Interoperability for the Integrated Clinical Environment

    DTIC Science & Technology

    2016-02-01

Keywords: interoperability, patient safety, device control, health care, standards, data logger, clinical scenario, integrated clinical environment … Performance: 30 July 2013 – 30 July 2014. Introduction: Health Information Technology (HIT) systems should facilitate the collection and point-of-care … device identifiers (UDIs) in administrative health care claims. As part of the UDI Implementation Work Group, we have been in discussions with FDA and

  14. Technical Data Interoperability (TDI) Pathfinder Via Emerging Standards

    NASA Technical Reports Server (NTRS)

    Conroy, Mike; Gill, Paul; Hill, Bradley; Ibach, Brandon; Jones, Corey; Ungar, David; Barch, Jeffrey; Ingalls, John; Jacoby, Joseph; Manning, Josh; Bengtsson, Kjell; Falls, Mark; Kent, Peter; Heath, Shaun; Kennedy, Steven

    2014-01-01

The Technical Data Interoperability (TDI) project investigates trending technical data standards for applicability to NASA vehicles, space stations, payloads, facilities, and equipment. TDI tested COTS software compatible with a certain suite of related industry standards, assessing both individual capabilities and interoperability. These standards not only enable Information Technology (IT) efficiencies, but also address efficient structures and standard content for business processes. We used source data from generic industry samples as well as NASA and European Space Agency (ESA) data from space systems.

  15. Secure and interoperable communication infrastructures for PPDR organisations

    NASA Astrophysics Data System (ADS)

    Müller, Wilmuth; Marques, Hugo; Pereira, Luis; Rodriguez, Jonathan; Brouwer, Frank; Bouwers, Bert; Politis, Ilias; Lykourgiotis, Asimakis; Ladas, Alexandros; Adigun, Olayinka; Jelenc, David

    2016-05-01

The growing number of events affecting public safety and security (PS&S) on a regional scale, with the potential to grow into large-scale cross-border disasters, puts increased pressure on the agencies and organisations responsible for PS&S. In order to respond to such events in a timely and adequate manner, Public Protection and Disaster Relief (PPDR) organisations need to cooperate, align their procedures and activities, share the needed information and be interoperable. Existing PPDR/PMR technologies such as TETRA, TETRAPOL or P25 do not currently provide broadband capability, nor are such technologies expected to be upgraded in the future. This presents a major limitation in supporting new services and information flows. Furthermore, there is no known standard that addresses interoperability of these technologies. In this contribution the design of a next generation communication infrastructure for PPDR organisations which fulfills the requirements of secure and seamless end-to-end communication and interoperable information exchange within the deployed communication networks is presented. Based on the Enterprise Architecture of PPDR organisations, a next generation PPDR network that is backward compatible with legacy communication technologies is designed and implemented, capable of providing security, privacy, seamless mobility, QoS and reliability support for mission-critical Private Mobile Radio (PMR) voice and broadband data services. The designed solution provides a robust, reliable, and secure mobile broadband communications system for a wide variety of PMR applications and services on PPDR broadband networks, including the ability of inter-system, inter-agency and cross-border operations with emphasis on interoperability between users in PMR and LTE.

  16. An Emergent Perspective on Interoperation in Systems of Systems

    DTIC Science & Technology

    2006-03-01

… addressing interoperation in systems of systems. It is our hope that this report will stimulate interest in the development of sound theory and drive … decisions with respect to the system of which it is a part. In its internal structure, an autonomous system can be either monolithic or a system of … structures create unnecessary vulnerabilities; outcomes depend not only on the automated and mechanized aspects of systems but also on the decisions

  17. Information Exchange in Support of C2-Interoperability

    DTIC Science & Technology

    2001-04-01

Freek N. Driesenaar, MSc, Scientist, TNO Physics and Electronics Laboratory, P.O. Box 96864, 2509 JG The Hague, The Netherlands, Driesenaar@fel.tno.nl … specified in different ways; e.g., an armoured infantry unit can be identified by /ARMn/IN … table updates (creates, updates and deletes) are received and applied to the database, and the GIS then translates the

  18. Schoolbook Texts: Behavioral Achievement Priming in Math and Language.

    PubMed

    Engeser, Stefan; Baumann, Nicola; Baum, Ingrid

    2016-01-01

    Prior research found reliable and considerably strong effects of semantic achievement primes on subsequent performance. In order to simulate a more natural priming condition to better understand the practical relevance of semantic achievement priming effects, running texts of schoolbook excerpts with and without achievement primes were used as priming stimuli. Additionally, we manipulated the achievement context; some subjects received no feedback about their achievement and others received feedback according to a social or individual reference norm. As expected, we found a reliable (albeit small) positive behavioral priming effect of semantic achievement primes on achievement in math (Experiment 1) and language tasks (Experiment 2). Feedback moderated the behavioral priming effect less consistently than we expected. The implication that achievement primes in schoolbooks can foster performance is discussed along with general theoretical implications.

  19. Schoolbook Texts: Behavioral Achievement Priming in Math and Language

    PubMed Central

    Engeser, Stefan; Baumann, Nicola; Baum, Ingrid

    2016-01-01

    Prior research found reliable and considerably strong effects of semantic achievement primes on subsequent performance. In order to simulate a more natural priming condition to better understand the practical relevance of semantic achievement priming effects, running texts of schoolbook excerpts with and without achievement primes were used as priming stimuli. Additionally, we manipulated the achievement context; some subjects received no feedback about their achievement and others received feedback according to a social or individual reference norm. As expected, we found a reliable (albeit small) positive behavioral priming effect of semantic achievement primes on achievement in math (Experiment 1) and language tasks (Experiment 2). Feedback moderated the behavioral priming effect less consistently than we expected. The implication that achievement primes in schoolbooks can foster performance is discussed along with general theoretical implications. PMID:26938446

  20. NASA and The Semantic Web

    NASA Technical Reports Server (NTRS)

    Ashish, Naveen

    2005-01-01

    We provide an overview of several ongoing NASA endeavors based on concepts, systems, and technology from the Semantic Web arena. Indeed NASA has been one of the early adopters of Semantic Web Technology and we describe ongoing and completed R&D efforts for several applications ranging from collaborative systems to airspace information management to enterprise search to scientific information gathering and discovery systems at NASA.

  1. Neural substrates of semantic memory.

    PubMed

    Hart, John; Anand, Raksha; Zoccoli, Sandra; Maguire, Mandy; Gamino, Jacque; Tillman, Gail; King, Richard; Kraut, Michael A

    2007-09-01

Semantic memory is described as the storage of knowledge, concepts, and information that is common and relatively consistent across individuals (e.g., memory of what is a cup). These memories are stored in multiple sensorimotor modalities and cognitive systems throughout the brain (e.g., how a cup is held and manipulated, the texture of a cup's surface, its shape, its function, that it is related to beverages such as coffee, and so on). Our ability to engage in purposeful interactions with our environment is dependent on the ability to understand the meaning and significance of the objects and actions around us that are stored in semantic memory. Theories of the neural basis of the semantic memory of objects have produced sophisticated models that have incorporated to varying degrees the results of cognitive and neural investigations. The models are grouped into those that are (1) cognitive models, where the neural data are used to reveal dissociations in semantic memory after a brain lesion occurs; (2) models that incorporate both cognitive and neuroanatomical information; and (3) models that use cognitive, neuroanatomic, and neurophysiological data. This review highlights the advances and issues that have emerged from these models and points to future directions that provide opportunities to extend these models. The models of object memory generally describe how category and/or feature representations encode for object memory, and the semantic operations engaged in object processing. The incorporation of data derived from multiple modalities of investigation can lead to detailed neural specifications of semantic memory organization. The addition of neurophysiological data can potentially provide further elaboration of models to include semantic neural mechanisms. Future directions should incorporate available and newly developed techniques to better inform the neural underpinning of semantic memory models.

  2. On the Feasibility of Interoperable Schemes in Hand Biometrics

    PubMed Central

    Morales, Aythami; González, Ester; Ferrer, Miguel A.

    2012-01-01

    Personal recognition through hand-based biometrics has attracted the interest of many researchers in the last twenty years. A significant number of proposals based on different procedures and acquisition devices have been published in the literature. However, comparisons between devices and their interoperability have not been thoroughly studied. This paper tries to fill this gap by proposing procedures to improve the interoperability among different hand biometric schemes. The experiments were conducted on a database made up of 8,320 hand images acquired from six different hand biometric schemes, including a flat scanner, webcams at different wavelengths, high quality cameras, and contactless devices. Acquisitions on both sides of the hand were included. Our experiment includes four feature extraction methods which determine the best performance among the different scenarios for two of the most popular hand biometrics: hand shape and palm print. We propose smoothing techniques at the image and feature levels to reduce interdevice variability. Results suggest that comparative hand shape offers better performance in terms of interoperability than palm prints, but palm prints can be more effective when using similar sensors. PMID:22438714

  3. On the feasibility of interoperable schemes in hand biometrics.

    PubMed

    Morales, Aythami; González, Ester; Ferrer, Miguel A

    2012-01-01

    Personal recognition through hand-based biometrics has attracted the interest of many researchers in the last twenty years. A significant number of proposals based on different procedures and acquisition devices have been published in the literature. However, comparisons between devices and their interoperability have not been thoroughly studied. This paper tries to fill this gap by proposing procedures to improve the interoperability among different hand biometric schemes. The experiments were conducted on a database made up of 8,320 hand images acquired from six different hand biometric schemes, including a flat scanner, webcams at different wavelengths, high quality cameras, and contactless devices. Acquisitions on both sides of the hand were included. Our experiment includes four feature extraction methods which determine the best performance among the different scenarios for two of the most popular hand biometrics: hand shape and palm print. We propose smoothing techniques at the image and feature levels to reduce interdevice variability. Results suggest that comparative hand shape offers better performance in terms of interoperability than palm prints, but palm prints can be more effective when using similar sensors.
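Feature-level smoothing to reduce interdevice variability can be read, in its simplest form, as a moving average over a feature vector. The paper's exact filters are not specified here, so this is only an illustrative sketch.

```python
def smooth_features(vec, window=3):
    """Centred moving average over a feature vector; the edges use the
    partial window that fits."""
    half = window // 2
    smoothed = []
    for i in range(len(vec)):
        segment = vec[max(0, i - half):i + half + 1]
        smoothed.append(sum(segment) / len(segment))
    return smoothed
```

Averaging damps sensor-specific high-frequency differences while preserving the coarse shape of the feature profile, which is what cross-device matching relies on.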

  4. Semantic preview benefit during reading.

    PubMed

    Hohenstein, Sven; Kliegl, Reinhold

    2014-01-01

    Word features in parafoveal vision influence eye movements during reading. The question of whether readers extract semantic information from parafoveal words was studied in 3 experiments by using a gaze-contingent display change technique. Subjects read German sentences containing 1 of several preview words that were replaced by a target word during the saccade to the preview (boundary paradigm). In the 1st experiment the preview word was semantically related or unrelated to the target. Fixation durations on the target were shorter for semantically related than unrelated previews, consistent with a semantic preview benefit. In the 2nd experiment, half the sentences were presented following the rules of German spelling (i.e., previews and targets were printed with an initial capital letter), and the other half were presented completely in lowercase. A semantic preview benefit was obtained under both conditions. In the 3rd experiment, we introduced 2 further preview conditions, an identical word and a pronounceable nonword, while also manipulating the text contrast. Whereas the contrast had negligible effects, fixation durations on the target were reliably different for all 4 types of preview. Semantic preview benefits were greater for pretarget fixations closer to the boundary (large preview space) and, although not as consistently, for long pretarget fixation durations (long preview time). The results constrain theoretical proposals about eye movement control in reading. (PsycINFO Database Record (c) 2013 APA, all rights reserved).

  5. Direct2Experts: a pilot national network to demonstrate interoperability among research-networking platforms.

    PubMed

    Weber, Griffin M; Barnett, William; Conlon, Mike; Eichmann, David; Kibbe, Warren; Falk-Krzesinski, Holly; Halaas, Michael; Johnson, Layne; Meeks, Eric; Mitchell, Donald; Schleyer, Titus; Stallings, Sarah; Warden, Michael; Kahlon, Maninder

    2011-12-01

    Research-networking tools use data-mining and social networking to enable expertise discovery, matchmaking and collaboration, which are important facets of team science and translational research. Several commercial and academic platforms have been built, and many institutions have deployed these products to help their investigators find local collaborators. Recent studies, though, have shown the growing importance of multiuniversity teams in science. Unfortunately, the lack of a standard data-exchange model and resistance of universities to share information about their faculty have presented barriers to forming an institutionally supported national network. This case report describes an initiative, which, in only 6 months, achieved interoperability among seven major research-networking products at 28 universities by taking an approach that focused on addressing institutional concerns and encouraging their participation. With this necessary groundwork in place, the second phase of this effort can begin, which will expand the network's functionality and focus on the end users.

  6. Progress Toward Standards for the Seamless Interoperability of Broadband Satellite Communication Networks

    NASA Technical Reports Server (NTRS)

    Ivancic, William D.; Glover, Daniel R.; vonDeak, Thomas C.; Bhasin, Kul B.

    1998-01-01

    The realization of the full potential of the National Information Infrastructure (NII) and Global Information Infrastructure (GII) requires seamless interoperability of emerging satellite networks with terrestrial networks. This requires a cooperative effort among industry, academia and government agencies to develop and advocate new, satellite-friendly communication protocols and modifications to existing communication protocol standards. These groups have recently come together to participate actively in a number of standards-making bodies, including the Internet Engineering Task Force (IETF), the Asynchronous Transfer Mode (ATM) Forum, the International Telecommunication Union (ITU) and the Telecommunications Industry Association (TIA), to ensure that issues regarding efficient use of these protocols over satellite links are not overlooked. This paper will summarize the progress made toward standards development to achieve seamless integration and accelerate the deployment of multimedia applications.

  7. Direct2Experts: a pilot national network to demonstrate interoperability among research-networking platforms

    PubMed Central

    Barnett, William; Conlon, Mike; Eichmann, David; Kibbe, Warren; Falk-Krzesinski, Holly; Halaas, Michael; Johnson, Layne; Meeks, Eric; Mitchell, Donald; Schleyer, Titus; Stallings, Sarah; Warden, Michael; Kahlon, Maninder

    2011-01-01

    Research-networking tools use data-mining and social networking to enable expertise discovery, matchmaking and collaboration, which are important facets of team science and translational research. Several commercial and academic platforms have been built, and many institutions have deployed these products to help their investigators find local collaborators. Recent studies, though, have shown the growing importance of multiuniversity teams in science. Unfortunately, the lack of a standard data-exchange model and resistance of universities to share information about their faculty have presented barriers to forming an institutionally supported national network. This case report describes an initiative, which, in only 6 months, achieved interoperability among seven major research-networking products at 28 universities by taking an approach that focused on addressing institutional concerns and encouraging their participation. With this necessary groundwork in place, the second phase of this effort can begin, which will expand the network's functionality and focus on the end users. PMID:22037890

  8. How to ensure sustainable interoperability in heterogeneous distributed systems through architectural approach.

    PubMed

    Pape-Haugaard, Louise; Frank, Lars

    2011-01-01

    A major obstacle to ensuring ubiquitous information is the use of heterogeneous systems in eHealth. The objective of this paper is to illustrate how an architecture for distributed eHealth databases can be designed without lacking the characteristic features of traditional sustainable databases. The approach is first to explain traditional architecture in central and homogeneous distributed database computing, followed by a possible approach to using an architectural framework to obtain sustainability across disparate systems, i.e., heterogeneous databases, concluding with a discussion. Through relaxed ACID properties on a service-oriented architecture, it is possible to achieve the data consistency that is essential for sustainable interoperability.
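    One common way to realize "relaxed ACID" across heterogeneous services is a saga-style pattern: each local update has a compensating action that is run if a later step fails, trading strict atomicity for eventual consistency. The sketch below is illustrative only, under that assumption; none of the names come from the paper.

```python
# Minimal saga-style sketch of relaxed ACID: instead of a global
# transaction across heterogeneous databases, each local update pairs
# with a compensating action executed if a later step fails.

def run_saga(steps):
    """steps: list of (action, compensation) callables.
    Returns True if all actions succeed; otherwise runs the
    compensations of completed steps in reverse order and returns False."""
    done = []
    for action, compensation in steps:
        try:
            action()
            done.append(compensation)
        except Exception:
            for comp in reversed(done):
                comp()
            return False
    return True

# Example with two hypothetical eHealth services, the second failing
log = []

def fail():
    raise RuntimeError("service unavailable")

steps = [
    (lambda: log.append("reserve slot"), lambda: log.append("cancel slot")),
    (fail, lambda: None),
]
result = run_saga(steps)
print(result, log)  # the first step was compensated
```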

  9. SCALEUS: Semantic Web Services Integration for Biomedical Applications.

    PubMed

    Sernadela, Pedro; González-Castro, Lorena; Oliveira, José Luís

    2017-04-01

    In recent years, we have witnessed an explosion of biological data resulting largely from the demands of life science research. The vast majority of these data are freely available via diverse bioinformatics platforms, including relational databases and conventional keyword search applications. This type of approach has achieved great results in the last few years, but proved infeasible when information needs to be combined or shared among different and scattered sources. During recent years, many of these data distribution challenges have been solved with the adoption of the Semantic Web. Despite the evident benefits of this technology, its adoption introduced new challenges related to the migration from existing systems to the semantic level. To facilitate this transition, we have developed Scaleus, a semantic web migration tool that can be deployed on top of traditional systems in order to bring knowledge, inference rules, and query federation to the existing data. Targeted at the biomedical domain, this web-based platform offers, in a single package, straightforward data integration and semantic web services that help developers and researchers in the creation of new semantically enhanced information systems. SCALEUS is available as open source at http://bioinformatics-ua.github.io/scaleus/ .

  10. The impact of electronic health record (EHR) interoperability on immunization information system (IIS) data quality

    PubMed Central

    Woinarowicz, Mary; Howell, Molly

    2016-01-01

    Objectives: To evaluate the impact of electronic health record (EHR) interoperability on the quality of immunization data in the North Dakota Immunization Information System (NDIIS). Methods: NDIIS doses-administered data were evaluated for completeness of the patient and dose-level core data elements for records that belong to interoperable and non-interoperable providers. Data from three months prior to the electronic health record (EHR) interoperability enhancement were compared to data at three, six, nine and twelve months post-enhancement, following the interoperability go-live date. Doses administered per month and by age group, timeliness of vaccine entry and the number of duplicate clients added to the NDIIS were also compared, in addition to immunization rates for children 19 – 35 months of age and adolescents 11 – 18 years of age. Results: Doses administered by both interoperable and non-interoperable providers remained fairly consistent from pre-enhancement through twelve months post-enhancement. Comparing immunization rates for infants and adolescents, interoperable providers had higher rates both pre- and post-enhancement than non-interoperable providers for all vaccines and vaccine series assessed. The overall percentage of doses entered into the NDIIS within one month of administration varied slightly between interoperable and non-interoperable providers; however, there were significant changes in the percentage of doses entered within one day and within one week, with the percentage entered within one day increasing and within one week decreasing with interoperability. The number of duplicate client records created by interoperable providers increased from 94 duplicates pre-enhancement to 10,552 at twelve months post-enhancement, while the duplicates from non-interoperable providers only increased from 300 to 637 over the same period. Of the 40 core data elements in the NDIIS, there was some difference in completeness between the interoperable versus

  11. Use of Annotations for Component and Framework Interoperability

    NASA Astrophysics Data System (ADS)

    David, O.; Lloyd, W.; Carlson, J.; Leavesley, G. H.; Geter, F.

    2009-12-01

    The popular programming languages Java and C# provide annotations, a form of meta-data construct. Software frameworks for web integration, web services, database access, and unit testing now take advantage of annotations to reduce the complexity of APIs and the quantity of integration code between the application and framework infrastructure. Adopting annotation features in frameworks has been observed to lead to cleaner and leaner application code. The USDA Object Modeling System (OMS) version 3.0 fully embraces the annotation approach and additionally defines a meta-data standard for components and models. In version 3.0, framework/model integration previously accomplished using API calls is now achieved using descriptive annotations. This enables the framework to provide additional functionality non-invasively, such as implicit multithreading and auto-documenting capabilities, while achieving a significant reduction in the size of the model source code. Using a non-invasive methodology leads to models and modeling components with only minimal dependencies on the modeling framework. Since models and modeling components are not directly bound to the framework by the use of specific APIs and/or data types, they can more easily be reused both within the framework and outside of it. To study the effectiveness of an annotation-based framework approach relative to other modeling frameworks, a framework-invasiveness study was conducted to evaluate the effects of framework design on model code quality. A monthly water balance model was implemented across several modeling frameworks and several software metrics were collected. The metrics selected were measures of non-invasive design methods for modeling frameworks from a software engineering perspective. It appears that the use of annotations positively impacts several software quality measures. In a next step, the PRMS model was implemented in OMS 3.0 and is currently being implemented for water supply forecasting in the
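    The annotation-driven integration described above can be illustrated in Python, where decorators play the role of Java/C# annotations: the framework discovers a component's inputs and outputs from declared metadata instead of requiring explicit API calls. The names below (`component`, `REGISTRY`, the toy water-balance formula) are invented for illustration and are not the OMS 3.0 API.

```python
# Sketch of annotation-based framework integration: a decorator attaches
# declarative I/O metadata and registers the component, so the framework
# can wire components together without invasive API calls.

REGISTRY = {}

def component(inputs=(), outputs=()):
    """Attach declarative I/O metadata to a class and register it."""
    def wrap(cls):
        cls.inputs = tuple(inputs)
        cls.outputs = tuple(outputs)
        REGISTRY[cls.__name__] = cls
        return cls
    return wrap

@component(inputs=("precip", "temp"), outputs=("runoff",))
class WaterBalance:
    def execute(self, precip, temp):
        # Toy monthly water balance: runoff = precipitation minus a
        # temperature-driven loss term (purely illustrative).
        return max(0.0, precip - 0.1 * temp)

# The framework can now discover and run the component from metadata alone
cls = REGISTRY["WaterBalance"]
print(cls.inputs, cls.outputs)
print(cls().execute(100.0, 20.0))
```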

  12. Standardized Semantic Markup for Reference Terminologies, Thesauri and Coding Systems: Benefits for distributed E-Health Applications.

    PubMed

    Hoelzer, Simon; Schweiger, Ralf K; Liu, Raymond; Rudolf, Dirk; Rieger, Joerg; Dudeck, Joachim

    2005-01-01

    With the introduction of the ICD-10 as the standard for diagnosis, the development of an electronic representation of its complete content, inherent semantics and coding rules is necessary. Our concept refers to current efforts of the CEN/TC 251 to establish a European standard for hierarchical classification systems in healthcare. We have developed an electronic representation of the ICD-10 with the eXtensible Markup Language (XML) that facilitates integration into current information systems or coding software, taking into account different languages and versions. In this context, XML offers a complete framework of related technologies and standard processing tools that helps to develop interoperable applications.
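    An XML encoding of a hierarchical classification can be processed with standard tools, as the abstract notes. The sketch below is illustrative only: the element names and structure are invented, not the actual CEN/TC 251 or ICD-10 schema.

```python
# Sketch: a hierarchical classification encoded in XML and a lookup
# over it using Python's standard library XML parser.
import xml.etree.ElementTree as ET

DOC = """
<classification system="ICD-10-like" version="demo">
  <class code="A00" label="Cholera">
    <class code="A00.0" label="Cholera due to Vibrio cholerae 01, biovar cholerae"/>
    <class code="A00.1" label="Cholera due to Vibrio cholerae 01, biovar eltor"/>
  </class>
</classification>
"""

def lookup(root, code):
    """Return the label for a code, searching the whole hierarchy."""
    for elem in root.iter("class"):
        if elem.get("code") == code:
            return elem.get("label")
    return None

root = ET.fromstring(DOC)
print(lookup(root, "A00.1"))
```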

  13. A public health response to data interoperability to prevent child maltreatment.

    PubMed

    Nguyen, Loc H

    2014-11-01

    The sharing of data, particularly health data, has been an important tool for the public health community, especially in terms of data sharing across systems (i.e., interoperability). Child maltreatment is a serious public health issue that could be better mitigated if there were interoperability. There are challenges to addressing child maltreatment interoperability that include the current lack of data sharing among systems, the lack of laws that promote interoperability to address child maltreatment, and the lack of data sharing at the individual level. There are waivers in federal law that allow for interoperability to prevent communicable diseases at the individual level. Child maltreatment has a greater long-term impact than a number of communicable diseases combined, and interoperability should be leveraged to maximize public health strategies to prevent child maltreatment.

  14. A Public Health Response to Data Interoperability to Prevent Child Maltreatment

    PubMed Central

    2014-01-01

    The sharing of data, particularly health data, has been an important tool for the public health community, especially in terms of data sharing across systems (i.e., interoperability). Child maltreatment is a serious public health issue that could be better mitigated if there were interoperability. There are challenges to addressing child maltreatment interoperability that include the current lack of data sharing among systems, the lack of laws that promote interoperability to address child maltreatment, and the lack of data sharing at the individual level. There are waivers in federal law that allow for interoperability to prevent communicable diseases at the individual level. Child maltreatment has a greater long-term impact than a number of communicable diseases combined, and interoperability should be leveraged to maximize public health strategies to prevent child maltreatment. PMID:25211715

  15. Web Image Re-Ranking Using Query-Specific Semantic Signatures.

    PubMed

    Wang, Xiaogang; Qiu, Shi; Liu, Ke; Tang, Xiaoou

    2014-04-01

    Image re-ranking, as an effective way to improve the results of web-based image search, has been adopted by current commercial search engines such as Bing and Google. Given a query keyword, a pool of images is first retrieved based on textual information. By asking the user to select a query image from the pool, the remaining images are re-ranked based on their visual similarities with the query image. A major challenge is that the similarities of visual features do not correlate well with images' semantic meanings, which reflect users' search intention. Recent work proposed matching images in a semantic space whose basis comprises attributes or reference classes closely related to the semantic meanings of images. However, learning a universal visual semantic space to characterize highly diverse images from the web is difficult and inefficient. In this paper, we propose a novel image re-ranking framework, which automatically learns, offline, different semantic spaces for different query keywords. The visual features of images are projected into their related semantic spaces to get semantic signatures. At the online stage, images are re-ranked by comparing their semantic signatures obtained from the semantic space specified by the query keyword. The proposed query-specific semantic signatures significantly improve both the accuracy and efficiency of image re-ranking. The original visual features of thousands of dimensions can be projected to semantic signatures as short as 25 dimensions. Experimental results show that 25-40 percent relative improvement has been achieved on re-ranking precisions compared with the state-of-the-art methods.
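    The online re-ranking step described above amounts to ranking candidate images by the similarity of short semantic signatures to the query image's signature. The sketch below uses cosine similarity over made-up three-dimensional signatures; it is an illustration of the idea, not the paper's learned semantic spaces.

```python
# Sketch: re-ranking images by cosine similarity between short
# semantic signatures in a query-specific semantic space.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def rerank(query_sig, candidates):
    """candidates: {image_id: signature}; returns ids, best match first."""
    return sorted(candidates,
                  key=lambda k: cosine(query_sig, candidates[k]),
                  reverse=True)

# Hypothetical signatures in a 3-D semantic space for one query keyword
query = [0.9, 0.1, 0.0]
pool = {
    "img_fruit":  [0.8, 0.2, 0.1],
    "img_logo":   [0.1, 0.9, 0.2],
    "img_random": [0.3, 0.3, 0.3],
}
print(rerank(query, pool))
```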

  16. What Does Semantic Tiling of the Cortex Tell us about Semantics?

    PubMed

    Barsalou, Lawrence W

    2017-04-07

    Recent use of voxel-wise modeling in cognitive neuroscience suggests that semantic maps tile the cortex. Although this impressive research establishes distributed cortical areas active during the conceptual processing that underlies semantics, it tells us little about the nature of this processing. While mapping concepts between Marr's computational and implementation levels to support neural encoding and decoding, this approach ignores Marr's algorithmic level, central for understanding the mechanisms that implement cognition, in general, and conceptual processing, in particular. Following decades of research in cognitive science and neuroscience, what do we know so far about the representation and processing mechanisms that implement conceptual abilities? Most basically, much is known about the mechanisms associated with: (1) features and frame representations, (2) grounded, abstract, and linguistic representations, (3) knowledge-based inference, (4) concept composition, and (5) conceptual flexibility. Rather than explaining these fundamental representation and processing mechanisms, semantic tiles simply provide a trace of their activity over a relatively short time period within a specific learning context. Establishing the mechanisms that implement conceptual processing in the brain will require more than mapping it to cortical (and sub-cortical) activity, with process models from cognitive science likely to play central roles in specifying the intervening mechanisms. More generally, neuroscience will not achieve its basic goals until it establishes algorithmic-level mechanisms that contribute essential explanations to how the brain works, going beyond simply establishing the brain areas that respond to various task conditions.

  17. Leveraging electronic healthcare record standards and semantic web technologies for the identification of patient cohorts

    PubMed Central

    Fernández-Breis, Jesualdo Tomás; Maldonado, José Alberto; Marcos, Mar; Legaz-García, María del Carmen; Moner, David; Torres-Sospedra, Joaquín; Esteban-Gil, Angel; Martínez-Salvador, Begoña; Robles, Montserrat

    2013-01-01

    Background The secondary use of electronic healthcare records (EHRs) often requires the identification of patient cohorts. In this context, an important problem is the heterogeneity of clinical data sources, which can be overcome with the combined use of standardized information models, virtual health records, and semantic technologies, since each of them contributes to solving aspects related to the semantic interoperability of EHR data. Objective To develop methods allowing for a direct use of EHR data for the identification of patient cohorts leveraging current EHR standards and semantic web technologies. Materials and methods We propose to take advantage of the best features of working with EHR standards and ontologies. Our proposal is based on our previous results and experience working with both technological infrastructures. Our main principle is to perform each activity at the abstraction level with the most appropriate technology available. This means that part of the processing will be performed using archetypes (ie, data level) and the rest using ontologies (ie, knowledge level). Our approach will start working with EHR data in proprietary format, which will be first normalized and elaborated using EHR standards and then transformed into a semantic representation, which will be exploited by automated reasoning. Results We have applied our approach to protocols for colorectal cancer screening. The results comprise the archetypes, ontologies, and datasets developed for the standardization and semantic analysis of EHR data. Anonymized real data have been used and the patients have been successfully classified by the risk of developing colorectal cancer. Conclusions This work provides new insights in how archetypes and ontologies can be effectively combined for EHR-driven phenotyping. The methodological approach can be applied to other problems provided that suitable archetypes, ontologies, and classification rules can be designed. PMID:23934950

  18. A Semantic Relatedness Approach for Traceability Link Recovery

    SciTech Connect

    Mahmoud, Anas M.; Niu, Nan; Xu, Songhua

    2012-01-01

    Human analysts working with automated tracing tools need to directly vet candidate traceability links in order to determine the true traceability information. Currently, human intervention happens at the end of the traceability process, after candidate traceability links have already been generated. This often leads to a decline in result accuracy. In this paper, we propose an approach, based on semantic relatedness (SR), which brings human judgment to an earlier stage of the tracing process by integrating it into the underlying retrieval mechanism. SR tries to mimic the human mental model of relevance by considering a broad range of semantic relations, hence producing more semantically meaningful results. We evaluated our approach using three datasets from different application domains, and assessed the tracing results via six different performance measures concerning both result quality and browsability. The empirical evaluation results show that our SR approach achieves a significantly better performance in recovering true links than a standard Vector Space Model (VSM) in all datasets. Our approach also achieves a significantly better precision than Latent Semantic Indexing (LSI) in two of our datasets.
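    The Vector Space Model baseline the abstract compares against can be sketched briefly: artifacts become bag-of-words vectors and candidate links are ranked by cosine similarity. The toy requirement and code documents below are invented, not the paper's datasets.

```python
# Sketch of a VSM baseline for traceability link recovery: rank code
# artifacts against a requirement by bag-of-words cosine similarity.
import math
from collections import Counter

def vectorize(text):
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def candidate_links(requirement, code_docs):
    """Rank code artifacts against a requirement, best match first."""
    req = vectorize(requirement)
    scored = [(name, cosine(req, vectorize(doc)))
              for name, doc in code_docs.items()]
    return sorted(scored, key=lambda p: p[1], reverse=True)

req = "the system shall encrypt user passwords before storage"
docs = {
    "auth.py":   "hash and encrypt user passwords storage module",
    "report.py": "generate monthly usage report pdf",
}
print(candidate_links(req, docs))
```

A full implementation would add TF-IDF weighting and stemming, but the ranking mechanics are the same.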

  19. Accelerating cancer systems biology research through Semantic Web technology.

    PubMed

    Wang, Zhihui; Sagotsky, Jonathan; Taylor, Thomas; Shironoshita, Patrick; Deisboeck, Thomas S

    2013-01-01

    Cancer systems biology is an interdisciplinary, rapidly expanding research field in which collaborations are a critical means to advance the field. Yet the prevalent database technologies often isolate data rather than making it easily accessible. The Semantic Web has the potential to help facilitate web-based collaborative cancer research by presenting data in a manner that is self-descriptive, human and machine readable, and easily sharable. We have created a semantically linked online Digital Model Repository (DMR) for storing, managing, executing, annotating, and sharing computational cancer models. Within the DMR, distributed, multidisciplinary, and inter-organizational teams can collaborate on projects, without forfeiting intellectual property. This is achieved by the introduction of a new stakeholder to the collaboration workflow, the institutional licensing officer, part of the Technology Transfer Office. Furthermore, the DMR has achieved silver level compatibility with the National Cancer Institute's caBIG, so users can interact with the DMR not only through a web browser but also through a semantically annotated and secure web service. We also discuss the technology behind the DMR leveraging the Semantic Web, ontologies, and grid computing to provide secure inter-institutional collaboration on cancer modeling projects, online grid-based execution of shared models, and the collaboration workflow protecting researchers' intellectual property.

  20. Convergence of Health Level Seven Version 2 Messages to Semantic Web Technologies for Software-Intensive Systems in Telemedicine Trauma Care

    PubMed Central

    Cook, Timothy Wayne; Cavalini, Luciana Tricai

    2016-01-01

    Objectives To present the technical background and the development of a procedure that enriches the semantics of Health Level Seven version 2 (HL7v2) messages for software-intensive systems in telemedicine trauma care. Methods This study followed a multilevel model-driven approach for the development of semantically interoperable health information systems. The Pre-Hospital Trauma Life Support (PHTLS) ABCDE protocol was adopted as the use case. A prototype application embedded the semantics into an HL7v2 message as an eXtensible Markup Language (XML) file, which was validated against an XML schema that defines constraints on a common reference model. This message was exchanged with a second prototype application, developed on the Mirth middleware, which was also used to parse and validate both the original and the hybrid messages. Results Both versions of the data instance (one pure XML, one embedded in the HL7v2 message) were equally validated, and the RDF-based semantics were recovered by the receiving side of the prototype from the shared XML schema. Conclusions This study demonstrated the semantic enrichment of HL7v2 messages for software-intensive telemedicine systems for trauma care, by validating components of extracts generated in various computing environments. The adoption of the method proposed in this study ensures compliance with the HL7v2 standard while adopting Semantic Web technologies. PMID:26893947
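    The hybrid-message idea can be sketched as an HL7v2 segment carrying an XML payload in one of its fields, which the receiver extracts and parses. The message content, segment layout and field index below are a fabricated illustration (loosely modeled on an OBX observation segment), not the study's PHTLS data.

```python
# Sketch: extracting an XML payload embedded in a field of a
# pipe-delimited HL7v2-style message.
import xml.etree.ElementTree as ET

MESSAGE = "\r".join([
    "MSH|^~\\&|TRAUMA_APP|FIELD|HOSPITAL|ER|202301011200||ORU^R01|1|P|2.5",
    "OBX|1|ED|AIRWAY^Airway assessment||<assessment><step>A</step>"
    "<status>clear</status></assessment>",
])

def extract_xml_payload(message, segment_id="OBX", field_index=5):
    """Return the parsed XML carried in the given field of a segment."""
    for segment in message.split("\r"):
        fields = segment.split("|")
        if fields[0] == segment_id:
            return ET.fromstring(fields[field_index])
    return None

payload = extract_xml_payload(MESSAGE)
print(payload.find("status").text)
```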

  1. Semantics Enabled Queries in EuroGEOSS: a Discovery Augmentation Approach

    NASA Astrophysics Data System (ADS)

    Santoro, M.; Mazzetti, P.; Fugazza, C.; Nativi, S.; Craglia, M.

    2010-12-01

    One of the main challenges in Earth Science Informatics is to build interoperability frameworks which allow users to discover, evaluate, and use information from different scientific domains. This needs to address multidisciplinary interoperability challenges concerning both technological and scientific aspects. From the technological point of view, it is necessary to provide a set of special interoperability arrangements in order to develop flexible frameworks that allow a variety of loosely-coupled services to interact with each other. From a scientific point of view, it is necessary to document clearly the theoretical and methodological assumptions underpinning applications in different scientific domains, and develop cross-domain ontologies to facilitate interdisciplinary dialogue and understanding. In this presentation we discuss a brokering approach that extends the traditional Service Oriented Architecture (SOA) adopted by most Spatial Data Infrastructures (SDIs) to provide the necessary special interoperability arrangements. In the EC-funded EuroGEOSS (A European approach to GEOSS) project, we distinguish among three possible functional brokering components: discovery, access and semantics brokers. This presentation focuses on the semantics broker, the Discovery Augmentation Component (DAC), which was specifically developed to address the three thematic areas covered by the EuroGEOSS project: biodiversity, forestry and drought. The EuroGEOSS DAC federates both semantics (e.g. SKOS repositories) and ISO-compliant geospatial catalog services. The DAC can be queried using common geospatial constraints (i.e. what, where, when, etc.). Two different augmented discovery styles are supported: a) automatic query expansion; b) user assisted query expansion. In the first case, the main discovery steps are: i. the query keywords (the what constraint) are “expanded” with related concepts/terms retrieved from the set of federated semantic services. A default expansion
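    The automatic query-expansion step can be sketched as follows: the "what" keyword is enriched with narrower and related terms from a thesaurus before the catalog search. The tiny in-memory thesaurus below is illustrative only, standing in for the federated SKOS repositories.

```python
# Sketch of automatic query expansion: the "what" keyword is expanded
# with narrower and related terms from a (toy) thesaurus before search.

THESAURUS = {
    "drought": {"narrower": ["soil moisture deficit"],
                "related": ["precipitation", "aridity"]},
}

def expand_query(keyword, thesaurus):
    """Return the keyword plus its narrower and related thesaurus terms."""
    terms = [keyword]
    entry = thesaurus.get(keyword.lower(), {})
    terms += entry.get("narrower", [])
    terms += entry.get("related", [])
    return terms

print(expand_query("drought", THESAURUS))
```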

  2. Action semantics modulate action prediction.

    PubMed

    Springer, Anne; Prinz, Wolfgang

    2010-11-01

    Previous studies have demonstrated that action prediction involves an internal action simulation that runs time-locked to the real action. The present study replicates and extends these findings by indicating a real-time simulation process (Graf et al., 2007), which can be differentiated from a similarity-based evaluation of internal action representations. Moreover, results showed that action semantics modulate action prediction accuracy. The semantic effect was specified by the processing of action verbs and concrete nouns (Experiment 1) and, more specifically, by the dynamics described by action verbs (Experiment 2) and the speed described by the verbs (e.g., "to catch" vs. "to grasp" vs. "to stretch"; Experiment 3). These results propose a linkage between action simulation and action semantics as two yet unrelated domains, a view that coincides with a recent notion of a close link between motor processes and the understanding of action language.

  3. Semantic processing of crowded stimuli?

    PubMed

    Huckauf, Anke; Knops, Andre; Nuerk, Hans-Christoph; Willmes, Klaus

    2008-11-01

    Effects of semantic processing of crowded characters were investigated using numbers as stimuli. In an identification task, typical spacing effects in crowding were replicated. Using the same stimuli in a magnitude comparison task, a smaller effect of spacing was observed as well as an effect of response congruency. These effects were replicated in a second experiment with varying stimulus-onset asynchronies. In addition, decreasing performance with increasing onset-asynchrony (so-called type-B masking) for incongruent flankers indicates semantic processing of target and flankers. The data show that semantic processing takes place even in crowded stimuli. This argues strongly against common accounts of crowding in terms of early stimulus-driven impairments of processing.

  4. Ontology Matching with Semantic Verification

    PubMed Central

    Jean-Mary, Yves R.; Shironoshita, E. Patrick; Kabuka, Mansur R.

    2009-01-01

    ASMOV (Automated Semantic Matching of Ontologies with Verification) is a novel algorithm that uses lexical and structural characteristics of two ontologies to iteratively calculate a similarity measure between them, derives an alignment, and then verifies it to ensure that it does not contain semantic inconsistencies. In this paper, we describe the ASMOV algorithm, and then present experimental results that measure its accuracy using the OAEI 2008 tests, and that evaluate its use with two different thesauri: WordNet, and the Unified Medical Language System (UMLS). These results show the increased accuracy obtained by combining lexical, structural and extensional matchers with semantic verification, and demonstrate the advantage of using a domain-specific thesaurus for the alignment of specialized ontologies. PMID:20186256
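    The lexical component of such a matcher can be sketched as token-level label similarity followed by greedy best-match alignment. This is a deliberately simplified illustration of the general idea, not the ASMOV algorithm itself, and the toy label lists are invented.

```python
# Sketch of lexical ontology matching: compare concept labels with
# Jaccard token similarity and align each concept to its best match.

def jaccard(label_a, label_b):
    a, b = set(label_a.lower().split()), set(label_b.lower().split())
    return len(a & b) / len(a | b) if a | b else 0.0

def align(onto_a, onto_b, threshold=0.5):
    """Greedy best-match alignment between two lists of concept labels."""
    alignment = {}
    for la in onto_a:
        best = max(onto_b, key=lambda lb: jaccard(la, lb))
        if jaccard(la, best) >= threshold:
            alignment[la] = best
    return alignment

a = ["heart disease", "blood pressure measurement"]
b = ["disease of heart", "measurement of blood pressure", "x ray"]
print(align(a, b))
```

A full matcher would add structural and extensional evidence and a semantic-verification pass over the resulting alignment, as the abstract describes.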

  5. International Planetary Science Interoperability: The Venus Express Interface Prototype

    NASA Astrophysics Data System (ADS)

    Sanford Bussard, Stephen; Chanover, N.; Huber, L.; Trejo, I.; Hughes, J. S.; Kelly, S.; Guinness, E.; Heather, D.; Salgado, J.; Osuna, P.

    2009-09-01

    NASA's Planetary Data System (PDS) and ESA's Planetary Science Archive (PSA) have successfully demonstrated interoperability between planetary science data archives with the Venus Express (VEX) Interface prototype. Because VEX is an ESA mission, there is no memorandum of understanding to archive the data in the PDS. However, using a common communications protocol and common data standards, VEX mission science data ingested into the PSA can be accessed from a user interface at the Atmospheres Node of the PDS, making the science data accessible globally through two established planetary science data portals. The PSA makes scientific and engineering data from ESA's planetary missions accessible to the worldwide scientific community. The PSA consists of online services incorporating search, preview, download, notification and delivery basket functionality. Mission data included in the archive, aside from VEX, include data from the Giotto, Mars Express, Smart-1, Huygens, and Rosetta spacecraft and several ground-based cometary observations. All data are compatible with the Planetary Data System data standard. The PDS archives and distributes scientific data from NASA planetary missions, astronomical observations, and laboratory measurements. The PDS is sponsored by NASA's Science Mission Directorate. Its purpose is to ensure the long-term usability of NASA data and to stimulate advanced research. The architecture of the VEX prototype interface leverages components from both the PSA and PDS information system infrastructures, a user interface developed at the New Mexico State University, and the International Planetary Data Alliance (IPDA) Planetary Data Access Protocol (PDAP). The VEX Interoperability Project was a key project of the IPDA, whose objective is to ensure world-wide access to planetary data regardless of which agency collects and archives the data. A follow-on IPDA project will adapt the VEX Interoperability protocol for access in JAXA to the Venus Climate

  6. Progress of Interoperability in Planetary Research for Geospatial Data Analysis

    NASA Astrophysics Data System (ADS)

    Hare, T. M.; Gaddis, L. R.

    2015-12-01

    For nearly a decade there has been a push in the planetary science community to support interoperable methods of accessing and working with geospatial data. Common geospatial data products for planetary research include image mosaics, digital elevation or terrain models, geologic maps, geographic location databases (e.g., craters, volcanoes) or any data that can be tied to the surface of a planetary body (including moons, comets or asteroids). Several U.S. and international cartographic research institutions have converged on mapping standards that embrace standardized image formats that retain geographic information (e.g., GeoTIFF, GeoJPEG2000), digital geologic mapping conventions, planetary extensions for symbols that comply with U.S. Federal Geographic Data Committee cartographic and geospatial metadata standards, and notably on-line mapping services as defined by the Open Geospatial Consortium (OGC). The latter include defined standards such as the OGC Web Map Service (simple image maps), Web Feature Service (feature streaming), Web Coverage Service (rich scientific data streaming), and Catalog Service for the Web (data searching and discoverability). While these standards were developed for application to Earth-based data, they have been modified to support the planetary domain. The motivation to support common, interoperable data format and delivery standards is not only to improve access for higher-level products but also to address the increasingly distributed nature of the rapidly growing volumes of data. The strength of using an OGC approach is that it provides consistent access to data that are distributed across many facilities. While data-streaming standards are well supported by the more sophisticated tools used in the Geographic Information System (GIS) and remote sensing industries, they are also supported by many lightweight browsers, which facilitates large and small focused science applications as well as public use. Here we provide an
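
    The OGC services listed above are, at bottom, parameterized HTTP requests. As a minimal sketch of the pattern (the endpoint and layer name below are hypothetical, not an actual planetary service), a WMS 1.3.0 GetMap request can be assembled as a URL:

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, crs="EPSG:4326",
                   size=(512, 512), fmt="image/png"):
    """Assemble an OGC WMS 1.3.0 GetMap request URL."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": crs,
        # WMS 1.3.0 with EPSG:4326 uses latitude,longitude axis order.
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": size[0],
        "HEIGHT": size[1],
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

# Hypothetical planetary WMS endpoint and layer, for illustration only.
url = wms_getmap_url("https://example.org/mars/wms",
                     layer="global_mosaic",
                     bbox=(-90, -180, 90, 180))
print(url)
```

    Fetching such a URL would return a rendered map image; WFS and WCS follow the same key-value-pair request style with their own parameter sets.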

  7. Policy interoperability in stem cell research: demystifying harmonization.

    PubMed

    Isasi, Rosario M

    2009-06-01

    Scientific developments in the field of stem cell research continue to emerge at incredible speed, and so too has the contentious debate surrounding their broad implications. Though economic, socio-ethical and legal concerns remain at both national and international forums, we are witnessing a departure from an "embryo-centric" approach to one that is focused on the globalization of research and the ensuing need for policy interoperability. The common response to the challenges associated with the meaning, scope, and ethical significance of variance in national policies is a call for the creation of uniform legal and ethical standards. However, this call for policy convergence on the fundamental ethical and governance principles underpinning policy choices has led to confusion and to the mystification of the notion of harmonization. In this article we aim to demystify the notion of policy harmonization in the context of stem cell research. We will do so by surveying the diverse elements to be harmonized. We will then present the problems of policy interoperability in the context of the globalization of SC research, in order to propose that the goal of harmonization in this field lies in the identification of prospective strategies to foster seamless cross-jurisdictional collaboration. Finally, policy interoperability will be analyzed through the lens of a range of policy approaches addressing the cross-jurisdictional transfer of hESC lines, with the aim of demonstrating that the apparent ethical-political-legal divide in some contexts largely vanishes once we grasp the notion of harmonization and identify points of convergence.

  8. Making OGC standards work - interoperability testing between meteorological web services

    NASA Astrophysics Data System (ADS)

    Siemen, Stephan; Little, Chris; Voidrot, Marie-Françoise

    2015-04-01

    The Meteorology and Oceanography Domain Working Group (Met Ocean DWG) is a community-oriented working group of the Open Geospatial Consortium (OGC). The group does not directly revise OGC standards, but rather enables collaboration and communication between groups with meteorological and oceanographic interests. The Met Ocean DWG maintains a list of topics of interest to the meteorological and oceanographic communities for discussion, prioritises activities, defines feedback to the OGC Standards Working Groups (SWGs), and performs interoperability experiments. One of the activities of the Met Ocean DWG is the definition of Best Practices documents for common OGC standards, such as WMS and WCS. This is necessary since meteorological data has additional complexities in time, elevation and multiple model runs, including ensembles. To guarantee interoperability in practice it is important to test each other's systems and ensure standards are implemented correctly, but also to make recommendations to the DWG on the establishment of Best Practices guides. The European Working Group on Operational meteorological Workstations (EGOWS) was founded in 1990 as an informal forum for people working on the development of operational meteorological workstations. The annual EGOWS meeting offers an excellent platform for exchanging information and furthering co-operation among experts from NMSs, ECMWF and other institutes working with OGC standards. The presentation will give an update on the testing that was done during the June 2014 EGOWS meeting in Oslo and what has happened since. The presenter will also give an overview of the online resources for following the tests and how interested parties can contribute to future interoperability tests.

  9. Exploiting and developing interoperability between multidisciplinary environmental research infrastructures in Europe - step toward international collaboration

    NASA Astrophysics Data System (ADS)

    Sorvari, S.; Asmi, A.; Konijn, J.; Pursula, A.; Los, W.; Laj, P.; Kutsch, W. L.

    2014-12-01

    Environmental research infrastructures are long-term facilities, resources, and related services that are used by research communities to conduct environmental research in their respective fields. The focus of the European environmental research infrastructures is on in-situ or short-range remote sensing infrastructures. Each environmental research infrastructure (RI) has its own particular set of science questions and foci that it must address to achieve its objectives; however, every RI also provides its data and services to wider user communities and thus contributes to broader trans- and interdisciplinary science questions and grand environmental challenges. Thus, there are many issues that most of the RIs share, e.g. data collection, preservation, quality control, integration and availability, as well as providing computational capability to researchers. ENVRI - Common Operations of European Research Infrastructures - was a collaborative action of major European environmental RIs working towards increased cooperation and interoperability between the infrastructures (www.envri.eu). From the technological point of view, one of the major results is the common ENVRI Reference Model, a tool to effectively enhance interoperability among RIs. In addition to common technical solutions, cultural and human-related topics need to be tackled in parallel: open access; data policy issues (licenses, citation agreements, IPR agreements); technologies for machine-machine interaction; workflows; metadata; data annotations; and the training of the data scientists and research generalists needed to make it all work. These three interdependent resource capitals (technological, including the ENVRI Reference Model, cultural and human capitals) will be discussed in the presentation.

  10. Semantic Web Research Trends and Directions

    DTIC Science & Technology

    2006-01-01

    social trust on the semantic web that builds upon the previous work to create end user applications that benefit from the semantic foundation. ...security, authentication, and privacy. However, the social component of trust is one that is both important and ideally suited for the Semantic Web. When the...Semantic Web-based social networks are augmented with trust information, it is possible to make computations over the values, and integrate the
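
    As a sketch of what "computations over the values" can mean in this setting, one simple scheme infers a rating for an unknown user as the trust-weighted average of ratings given by direct neighbours. The names, data, and averaging rule below are illustrative assumptions, not the specific algorithm of the report:

```python
# Hedged sketch: inferring alice's trust in dave from her direct neighbours'
# ratings, weighted by how much she trusts each neighbour. Hypothetical data.
trust = {  # direct trust ratings on a 0-1 scale
    ("alice", "bob"): 0.9,
    ("alice", "carol"): 0.4,
    ("bob", "dave"): 0.8,
    ("carol", "dave"): 0.2,
}

def inferred_trust(graph, source, sink):
    """Trust-weighted average of neighbours' ratings of `sink`."""
    num = den = 0.0
    for (a, b), t_ab in graph.items():
        if a == source and (b, sink) in graph:
            num += t_ab * graph[(b, sink)]  # weight neighbour's rating
            den += t_ab
    return num / den if den else None

print(inferred_trust(trust, "alice", "dave"))
```

    Bob's high rating of dave counts more than carol's low one because alice trusts bob more, so the inferred value lands closer to bob's rating.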

  11. Making Interoperability Easier with NASA's Metadata Management Tool (MMT)

    NASA Technical Reports Server (NTRS)

    Shum, Dana; Reese, Mark; Pilone, Dan; Baynes, Katie

    2016-01-01

    While the ISO 19115 collection-level metadata format meets many users' needs for interoperable metadata, it can be cumbersome to create correctly. Through the MMT's simple UI experience, metadata curators can create and edit collections which are compliant with ISO 19115 without full knowledge of the NASA Best Practices implementation of the ISO 19115 format. Users are guided through the metadata creation process by a forms-based editor, complete with field information, validation hints and picklists. Once a record is completed, users can download the metadata in any of the supported formats with just two clicks.
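
    The forms-based validation described here amounts to checking required fields and picklist membership before a record can be saved. A minimal sketch of that check follows; the field names and allowed values are illustrative placeholders, not the actual NASA Best Practices schema:

```python
# Hedged sketch of forms-style metadata validation. REQUIRED and PICKLISTS
# are hypothetical stand-ins for a real collection-metadata schema.
REQUIRED = {"ShortName", "Version", "DataCenter", "ProcessingLevel"}
PICKLISTS = {"ProcessingLevel": {"0", "1A", "1B", "2", "3", "4"}}

def validate(record):
    """Return a list of human-readable validation errors for `record`."""
    errors = []
    for field in sorted(REQUIRED - record.keys()):
        errors.append(f"missing required field: {field}")
    for field, allowed in PICKLISTS.items():
        value = record.get(field)
        if value is not None and value not in allowed:
            errors.append(f"{field}: {value!r} not in picklist")
    return errors

print(validate({"ShortName": "EXAMPLE", "Version": "1",
                "ProcessingLevel": "9"}))
```

    A UI like the MMT's would surface these messages as inline hints next to the offending form fields rather than as a flat list.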

  12. Leveraging Python Interoperability Tools to Improve Sapphire's Usability

    SciTech Connect

    Gezahegne, A; Love, N S

    2007-12-10

    The Sapphire project at the Center for Applied Scientific Computing (CASC) develops and applies an extensive set of data mining algorithms for the analysis of large data sets. Sapphire's algorithms are currently available as a set of C++ libraries. However, many users prefer higher-level scripting languages such as Python for their ease of use and flexibility. In this report, we evaluate four interoperability tools for the purpose of wrapping Sapphire's core functionality with Python. Exposing Sapphire's functionality through a Python interface would increase its usability and connect its algorithms to existing Python tools.
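
    The report's four tools are not named in this abstract. As a minimal sketch of the underlying idea of calling native code from Python, the stdlib `ctypes` module can bind a C function directly. This assumes a Unix-like system where the C math library can be located; a C++ library such as Sapphire's would additionally need an `extern "C"` shim or a wrapper generator such as SWIG, since `ctypes` speaks the C ABI only:

```python
import ctypes
import ctypes.util

# Locate and load the C math library (assumes a Unix-like system; on Linux
# this typically resolves to libm.so.6).
libm = ctypes.CDLL(ctypes.util.find_library("m"))

# Declare the C signature so ctypes marshals doubles correctly instead of
# defaulting to int arguments and return values.
libm.sqrt.argtypes = [ctypes.c_double]
libm.sqrt.restype = ctypes.c_double

print(libm.sqrt(2.0))
```

    Higher-level tools automate exactly this boilerplate (and add C++ support), which is what an evaluation like Sapphire's compares.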

  13. ODIP - Ocean Data Interoperability Platform - developing interoperabilty Pilot project 1

    NASA Astrophysics Data System (ADS)

    Schaap, D.

    2014-12-01

    Europe, the USA, Australia and IOC/IODE are making significant progress in facilitating the discovery and access of marine data through the development, implementation, population and operation of national, regional or international distributed ocean and marine observing and data management infrastructures such as SeaDataNet, Geo-Seas, IOOS, the Australian Ocean Portal and the IODE Ocean Data Portal. All of these developments are resulting in the development and implementation of standards for the formats of metadata, data, data products, quality control methods and flags, and common vocabularies. They are also providing services for data discovery, viewing and downloading, and software tools for editing, conversions, communication, analysis and presentation, all of which are increasingly being adopted and used by their national and regional marine communities. The Ocean Data Interoperability Platform (ODIP) project is supported by the EU FP7 Research Infrastructures programme, the National Science Foundation (USA) and the Australian government, and started on 1 October 2012. ODIP includes all the major organisations engaged in ocean data management in the EU, US, and Australia. ODIP is also supported by the IOC-IODE, closely linking this activity with its Ocean Data Portal (ODP) and Ocean Data Standards (ODS) projects. The ODIP platform aims to ease interoperability between the regional marine data management infrastructures. To this end it facilitates an organised dialogue between the key infrastructure representatives by publishing best practice, organising international workshops and fostering the development of common standards and interoperability solutions. These are evaluated and tested by means of prototype projects. ODIP Prototype Project 1 aims at establishing interoperability between the regional EU, USA and Australia data discovery and access services (SeaDataNet CDI, US NODC, and IMOS MCP) and contributing to the global GEOSS and IODE-ODP portals.
Use is

  14. Aligning tissue banking data models for caBIG interoperability.

    PubMed

    Riben, Michael; Wade, Geraldine; Edgerton, Mary; Kilbourne, John

    2008-11-06

    MD Anderson Cancer Center strives to share data from its central tissue bank with other institutions via the caBIG data sharing framework. To conform to "NCI Best Practices for Biospecimen Resources", a caGRID compatible model (i.e. caTissue Core/Suite) must be adopted or an existing tissue banking application (TissueStation) must be adapted for interoperability. We present a data model assessment and method used in development of an enterprise strategy for ensuring inter-institution data sharing capabilities.

  15. CCSDS Spacecraft Monitor and Control Mission Operations Interoperability Prototype

    NASA Technical Reports Server (NTRS)

    Lucord, Steve; Martinez, Lindolfo

    2009-01-01

    We are entering a new era in space exploration. Reduced operating budgets require innovative solutions that leverage existing systems to implement the capabilities of future missions. Custom solutions to fulfill mission objectives are no longer viable. Can NASA adopt international standards to reduce costs and increase interoperability with other space agencies? Can legacy systems be leveraged in a service-oriented architecture (SOA) to further reduce operations costs? The Operations Technology Facility (OTF) at the Johnson Space Center (JSC) is collaborating with Deutsches Zentrum für Luft- und Raumfahrt (DLR) to answer these very questions. The Mission Operations and Information Management Services Area (MOIMS) Spacecraft Monitor and Control (SM&C) Working Group within the Consultative Committee for Space Data Systems (CCSDS) is developing the Mission Operations standards to address this problem space. The set of proposed standards presents a service-oriented architecture to increase the level of interoperability among space agencies. The OTF and DLR are developing independent implementations of the standards as part of an interoperability prototype. This prototype will address three key components: validation of the SM&C Mission Operations protocol, exploration of the Object Management Group (OMG) Data Distribution Service (DDS), and the incorporation of legacy systems in a SOA. The OTF will implement the service providers described in the SM&C Mission Operations standards to create a portal for interaction with a spacecraft simulator. DLR will implement the service consumers to perform the monitoring and control of the spacecraft. The specifications insulate the applications from the underlying transport layer. We will gain experience with a DDS transport layer as we delegate responsibility to the middleware and explore transport bridges to connect disparate middleware products. A SOA facilitates the reuse of software components. The prototype will leverage the
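
    The decoupling the prototype depends on can be sketched with a toy publish/subscribe bus: provider and consumer code touch only the bus abstraction, so the transport underneath (a dict of callbacks here; DDS in the prototype) can be swapped without touching either side. This illustrates the pattern only; it is not the SM&C or DDS API:

```python
# Minimal in-process publish/subscribe bus. Topic names and the telemetry
# payload are illustrative.
from collections import defaultdict

class Bus:
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        """Register a consumer callback for a topic."""
        self._subs[topic].append(callback)

    def publish(self, topic, message):
        """Deliver a provider's message to all subscribers of the topic."""
        for cb in self._subs[topic]:
            cb(message)

bus = Bus()
received = []

# Consumer side (monitor-and-control role): reacts to telemetry updates.
bus.subscribe("spacecraft/telemetry", received.append)

# Provider side (spacecraft-simulator role): publishes a parameter update.
bus.publish("spacecraft/telemetry", {"battery_v": 28.1})

print(received)
```

    Replacing the `Bus` internals with a networked middleware changes nothing for the provider or consumer, which is the interoperability property the standards aim for.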

  16. Semantic processing in information retrieval.

    PubMed Central

    Rindflesch, T. C.; Aronson, A. R.

    1993-01-01

    Intuition suggests that one way to enhance the information retrieval process would be the use of phrases to characterize the contents of text. A number of researchers, however, have noted that phrases alone do not improve retrieval effectiveness. In this paper we briefly review the use of phrases in information retrieval and then suggest extensions to this paradigm using semantic information. We claim that semantic processing, which can be viewed as expressing relations between the concepts represented by phrases, will in fact enhance retrieval effectiveness. The availability of the UMLS domain model, which we exploit extensively, significantly contributes to the feasibility of this processing. PMID:8130547

  17. Bootstrapping to a Semantic Grid

    SciTech Connect

    Schwidder, Jens; Talbott, Tara; Myers, James D.

    2005-02-28

    The Scientific Annotation Middleware (SAM) is a set of components and services that enable researchers, applications, problem-solving environments (PSEs) and software agents to create metadata and annotations about data objects and to document the semantic relationships between them. Developed starting in 2001, SAM allows applications to encode metadata within files or to manage metadata at the level of individual relationships, as desired. SAM then provides mechanisms to expose metadata and relationships encoded either way as WebDAV properties. In this paper, we report on work to further map this metadata into RDF and discuss the role of middleware such as SAM in bridging between traditional and semantic grid applications.
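
    The property-to-RDF mapping described here can be sketched as emitting one triple per property/value pair attached to a resource. The namespace and property URIs below are illustrative placeholders, not SAM's actual schema, and the sketch handles plain string literals only:

```python
# Hedged sketch: serialize resource -> property -> value metadata as RDF
# N-Triples, in the spirit of mapping WebDAV properties into RDF.
def to_ntriples(resource_uri, properties):
    """Emit one N-Triples line per property; string literals only."""
    lines = []
    for prop_uri, value in sorted(properties.items()):
        # A full implementation would also escape special characters
        # and support typed literals and URI-valued objects.
        lines.append(f'<{resource_uri}> <{prop_uri}> "{value}" .')
    return "\n".join(lines)

triples = to_ntriples(
    "http://example.org/data/run42",  # hypothetical data object
    {"http://example.org/meta#instrument": "spectrometer",
     "http://example.org/meta#operator": "jdoe"},
)
print(triples)
```

    Once metadata is in this form, generic RDF stores and query tools can consume it, which is the bridge to semantic grid applications the paper discusses.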

  18. Language networks in semantic dementia.

    PubMed

    Agosta, Federica; Henry, Roland G; Migliaccio, Raffaella; Neuhaus, John; Miller, Bruce L; Dronkers, Nina F; Brambati, Simona M; Filippi, Massimo; Ogar, Jennifer M; Wilson, Stephen M; Gorno-Tempini, Maria Luisa

    2010-01-01

    Cognitive deficits in semantic dementia have been attributed to anterior temporal lobe grey matter damage; however, key aspects of the syndrome could be due to altered anatomical connectivity between language pathways involving the temporal lobe. The aim of this study was to investigate the left language-related cerebral pathways in semantic dementia using diffusion tensor imaging-based tractography and to combine the findings with cortical anatomical and functional magnetic resonance imaging data obtained during a reading activation task. The left inferior longitudinal fasciculus, arcuate fasciculus and fronto-parietal superior longitudinal fasciculus were tracked in five semantic dementia patients and eight healthy controls. The left uncinate fasciculus and the genu and splenium of the corpus callosum were also obtained for comparison with previous studies. From each tract, mean diffusivity, fractional anisotropy, as well as parallel and transverse diffusivities were obtained. Diffusion tensor imaging results were related to grey and white matter atrophy volume assessed by voxel-based morphometry and functional magnetic resonance imaging activations during a reading task. Semantic dementia patients had significantly higher mean, parallel and transverse diffusivities in the inferior longitudinal fasciculus. The arcuate and uncinate fasciculi demonstrated significantly higher mean, parallel and transverse diffusivities and significantly lower fractional anisotropy. The fronto-parietal superior longitudinal fasciculus was relatively spared, with significant differences observed only for transverse diffusivity and fractional anisotropy. In the corpus callosum, the genu showed lower fractional anisotropy compared with controls, while no difference was found in the splenium. 
The left parietal cortex did not show significant volume changes on voxel-based morphometry and demonstrated normal functional magnetic resonance imaging activation in response to reading items that

  19. Language networks in semantic dementia

    PubMed Central

    Agosta, Federica; Henry, Roland G.; Migliaccio, Raffaella; Neuhaus, John; Miller, Bruce L.; Dronkers, Nina F.; Brambati, Simona M.; Filippi, Massimo; Ogar, Jennifer M.; Wilson, Stephen M.

    2010-01-01

    Cognitive deficits in semantic dementia have been attributed to anterior temporal lobe grey matter damage; however, key aspects of the syndrome could be due to altered anatomical connectivity between language pathways involving the temporal lobe. The aim of this study was to investigate the left language-related cerebral pathways in semantic dementia using diffusion tensor imaging-based tractography and to combine the findings with cortical anatomical and functional magnetic resonance imaging data obtained during a reading activation task. The left inferior longitudinal fasciculus, arcuate fasciculus and fronto-parietal superior longitudinal fasciculus were tracked in five semantic dementia patients and eight healthy controls. The left uncinate fasciculus and the genu and splenium of the corpus callosum were also obtained for comparison with previous studies. From each tract, mean diffusivity, fractional anisotropy, as well as parallel and transverse diffusivities were obtained. Diffusion tensor imaging results were related to grey and white matter atrophy volume assessed by voxel-based morphometry and functional magnetic resonance imaging activations during a reading task. Semantic dementia patients had significantly higher mean, parallel and transverse diffusivities in the inferior longitudinal fasciculus. The arcuate and uncinate fasciculi demonstrated significantly higher mean, parallel and transverse diffusivities and significantly lower fractional anisotropy. The fronto-parietal superior longitudinal fasciculus was relatively spared, with significant differences observed only for transverse diffusivity and fractional anisotropy. In the corpus callosum, the genu showed lower fractional anisotropy compared with controls, while no difference was found in the splenium. 
The left parietal cortex did not show significant volume changes on voxel-based morphometry and demonstrated normal functional magnetic resonance imaging activation in response to reading items that

  20. Abstraction and natural language semantics.

    PubMed Central

    Kayser, Daniel

    2003-01-01

    According to the traditional view, a word prototypically denotes a class of objects sharing similar features, i.e. it results from an abstraction based on the detection of common properties in perceived entities. I explore here another idea: words result from abstraction of common premises in the rules governing our actions. I first argue that taking 'inference', instead of 'reference', as the basic issue in semantics does matter. I then discuss two phenomena that are, in my opinion, particularly difficult to analyse within the scope of traditional semantic theories: systematic polysemy and plurals. I conclude with a discussion of my approach and a summary of its main features. PMID:12903662

  1. Examining Lateralized Semantic Access Using Pictures

    ERIC Educational Resources Information Center

    Lovseth, Kyle; Atchley, Ruth Ann

    2010-01-01

    A divided visual field (DVF) experiment examined the semantic processing strategies employed by the cerebral hemispheres to determine if strategies observed with written word stimuli generalize to other media for communicating semantic information. We employed picture stimuli and varied the degree of semantic relatedness between the picture pairs.…

  2. Semantic Relatedness for Evaluation of Course Equivalencies

    ERIC Educational Resources Information Center

    Yang, Beibei

    2012-01-01

    Semantic relatedness, or its inverse, semantic distance, measures the degree of closeness between two pieces of text determined by their meaning. Related work typically measures semantics based on a sparse knowledge base such as WordNet or Cyc that requires intensive manual efforts to build and maintain. Other work is based on a corpus such as the…

  3. Semantic Weight and Verb Retrieval in Aphasia

    ERIC Educational Resources Information Center

    Barde, Laura H. F.; Schwartz, Myrna F.; Boronat, Consuelo B.

    2006-01-01

    Individuals with agrammatic aphasia may have difficulty with verb production in comparison to nouns. Additionally, they may have greater difficulty producing verbs that have fewer semantic components (i.e., are semantically "light") compared to verbs that have greater semantic weight. A connectionist verb-production model proposed by Gordon and…

  4. Chinese Character Decoding: A Semantic Bias?

    ERIC Educational Resources Information Center

    Williams, Clay; Bever, Thomas

    2010-01-01

    The effects of semantic and phonetic radicals on Chinese character decoding were examined. Our results suggest that semantic and phonetic radicals are each available for access when a corresponding task emphasizes one or the other kind of radical. But in a more neutral lexical recognition task, the semantic radical is more informative. Semantic…

  5. Metasemantics: On the Limits of Semantic Theory

    ERIC Educational Resources Information Center

    Parent, T.

    2009-01-01

    METASEMANTICS is a wake-up call for semantic theory: It reveals that some semantic questions have no adequate answer. (This is meant to be the "epistemic" point that certain semantic questions cannot be "settled"--not a metaphysical point about whether there is a fact-of-the-matter.) METASEMANTICS thus checks our default "optimism" that any…

  6. Building Interoperable Learning Objects Using Reduced Learning Object Metadata

    ERIC Educational Resources Information Center

    Saleh, Mostafa S.

    2005-01-01

    The new e-learning generation depends on Semantic Web technology to produce learning objects. As the production of these components is very costly, they should be produced and registered once, and reused and adapted in the same context or in other contexts as often as possible. To produce those components, developers should use learning standards…

  7. Challenges in Microbial Database Interoperability Interagency Microbe Project Working Group

    SciTech Connect

    Critchlow, T

    2001-11-21

    Currently, data of interest to microbial researchers are spread across hundreds of web-accessible data sources, each with a unique interface and data format. Researchers interact with a few of these sites when they analyze their data, but are not able to utilize the majority of them on a regular basis. There are two significant challenges that must be overcome to integrate this environment and allow researchers to efficiently perform data analysis across the entire set of relevant data, or at least a significant portion of it. The first is to provide consistent access to the large number of heterogeneous data sets that are currently distributed over the web. The second is to define the semantics of the data provided by the individual sites in such a way that semantic conflicts can be identified and, ideally, resolved. The first step in establishing any integrated environment, from a data warehouse to a multi-database system, is to provide consistent access to all of the relevant sources. While the type of access required will vary based on the integration strategy chosen--for example, federated systems use query-based access while warehouses may prefer access to the underlying database--the essence of this challenge remains the same. Thus, without sacrificing generality, the remainder of this discussion focuses on query-based access. Each data source independently determines the queries that it supports, how it will answer them, and the interface that it will use to make them. Even when the same query capability is provided by different sources, the details of the interface are usually different. For example, while many sequence data sources support BLAST searches, they differ in parameter names, available options, script locations, etc. These differences are not restricted solely to input parameters; the query results returned by different sources also vary dramatically, with some sources returning XML, others preformatted text, and still others a
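
    The parameter-name mismatch described above is commonly absorbed by a thin adapter layer that maps one canonical query vocabulary onto each source's spelling. The source names and parameter spellings below are hypothetical, chosen only to show the shape of the technique:

```python
# Hedged sketch: per-source adapters normalize a canonical BLAST-like query
# into each source's own parameter names.
class SourceAdapter:
    # maps canonical parameter names to this source's spelling
    param_map = {}

    def build_query(self, **canonical):
        """Translate canonical parameters into source-specific ones."""
        return {self.param_map[k]: v for k, v in canonical.items()}

class SourceA(SourceAdapter):
    param_map = {"sequence": "seq", "cutoff": "expect"}

class SourceB(SourceAdapter):
    param_map = {"sequence": "query_seq", "cutoff": "e_value"}

query = {"sequence": "ATGGCG", "cutoff": 1e-5}
print(SourceA().build_query(**query))
print(SourceB().build_query(**query))
```

    A real integration layer would also normalize the heterogeneous result formats (XML, preformatted text, etc.) back into one representation, the mirror image of this request-side mapping.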

  8. CINERGI: Community Inventory of EarthCube Resources for Geoscience Interoperability

    NASA Astrophysics Data System (ADS)

    Zaslavsky, Ilya; Bermudez, Luis; Grethe, Jeffrey; Gupta, Amarnath; Hsu, Leslie; Lehnert, Kerstin; Malik, Tanu; Richard, Stephen; Valentine, David; Whitenack, Thomas

    2014-05-01

    Organizing geoscience data resources to support cross-disciplinary data discovery, interpretation, analysis and integration is challenging because of different information models, semantic frameworks, metadata profiles, catalogs, and services used in different geoscience domains, not to mention different research paradigms and methodologies. The central goal of CINERGI, a new project supported by the US National Science Foundation through its EarthCube Building Blocks program, is to create a methodology and assemble a large inventory of high-quality information resources capable of supporting data discovery needs of researchers in a wide range of geoscience domains. The key characteristics of the inventory are: 1) collaboration with and integration of metadata resources from a number of large data facilities; 2) reliance on international metadata and catalog service standards; 3) assessment of resource "interoperability-readiness"; 4) ability to cross-link and navigate data resources, projects, models, researcher directories, publications, usage information, etc.; 5) efficient inclusion of "long-tail" data, which are not appearing in existing domain repositories; 6) data registration at feature level where appropriate, in addition to common dataset-level registration, and 7) integration with parallel EarthCube efforts, in particular focused on EarthCube governance, information brokering, service-oriented architecture design and management of semantic information. We discuss challenges associated with accomplishing CINERGI goals, including defining the inventory scope; managing different granularity levels of resource registration; interaction with search systems of domain repositories; explicating domain semantics; metadata brokering, harvesting and pruning; managing provenance of the harvested metadata; and cross-linking resources based on the linked open data (LOD) approaches. 
At the higher level of the inventory, we register domain-wide resources such as domain

  9. Evaluation of Interoperability Protocols in Repositories of Electronic Theses and Dissertations

    ERIC Educational Resources Information Center

    Hakimjavadi, Hesamedin; Masrek, Mohamad Noorman

    2013-01-01

    Purpose: The purpose of this study is to evaluate the status of eight interoperability protocols within repositories of electronic theses and dissertations (ETDs) as an introduction to further studies on feasibility of deploying these protocols in upcoming areas of interoperability. Design/methodology/approach: Three surveys of 266 ETD…

  10. 49 CFR 232.603 - Design, interoperability, and configuration management requirements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Pneumatic (ECP) Braking Systems § 232.603 Design, interoperability, and configuration management... 49 Transportation 4 2010-10-01 2010-10-01 false Design, interoperability, and configuration management requirements. 232.603 Section 232.603 Transportation Other Regulations Relating to...

  11. 49 CFR 232.603 - Design, interoperability, and configuration management requirements.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Pneumatic (ECP) Braking Systems § 232.603 Design, interoperability, and configuration management... 49 Transportation 4 2013-10-01 2013-10-01 false Design, interoperability, and configuration management requirements. 232.603 Section 232.603 Transportation Other Regulations Relating to...

  12. 49 CFR 232.603 - Design, interoperability, and configuration management requirements.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Pneumatic (ECP) Braking Systems § 232.603 Design, interoperability, and configuration management... 49 Transportation 4 2014-10-01 2014-10-01 false Design, interoperability, and configuration management requirements. 232.603 Section 232.603 Transportation Other Regulations Relating to...

  13. 49 CFR 232.603 - Design, interoperability, and configuration management requirements.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Pneumatic (ECP) Braking Systems § 232.603 Design, interoperability, and configuration management... 49 Transportation 4 2012-10-01 2012-10-01 false Design, interoperability, and configuration management requirements. 232.603 Section 232.603 Transportation Other Regulations Relating to...

  14. 49 CFR 232.603 - Design, interoperability, and configuration management requirements.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Pneumatic (ECP) Braking Systems § 232.603 Design, interoperability, and configuration management... 49 Transportation 4 2011-10-01 2011-10-01 false Design, interoperability, and configuration management requirements. 232.603 Section 232.603 Transportation Other Regulations Relating to...

  15. Examining the Relationship between Electronic Health Record Interoperability and Quality Management

    ERIC Educational Resources Information Center

    Purcell, Bernice M.

    2013-01-01

    A lack of interoperability impairs data quality among health care providers' electronic health record (EHR) systems. The problem addressed is whether International Organization for Standardization (ISO) 9000 principles relate to interoperability in the implementation of EHR systems. The purpose of the nonexperimental quantitative research…

  16. 77 FR 66588 - Development of the Nationwide Interoperable Public Safety Broadband Network

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-06

    ... Interoperable Public Safety Broadband Network AGENCY: National Telecommunications and Information Administration... meeting of the First Responder Network Authority (FirstNet) as well as to invite input on other network.../comments-nationwide-interoperable-public-safety-broadband-network-noi . All personal...

  17. IT Labs Proof-of-Concept Project: Technical Data Interoperability (TDI) Pathfinder Via Emerging Standards

    NASA Technical Reports Server (NTRS)

    Conroy, Mike; Gill, Paul; Ingalls, John; Bengtsson, Kjell

    2014-01-01

    No known system is in place to allow NASA technical data interoperability throughout the whole life cycle. Life Cycle Cost (LCC) will be higher on many developing programs if action isn't taken soon to join disparate systems efficiently. Disparate technical data also increases safety risks from poorly integrated elements. NASA requires interoperability and industry standards, but breaking legacy ways is a challenge.

  18. Benefits of Linked Data for Interoperability during Crisis Management

    NASA Astrophysics Data System (ADS)

    Roller, R.; Roes, J.; Verbree, E.

    2015-08-01

    Flooding represents a permanent risk to the Netherlands in general and to her power supply in particular. Data sharing is essential in this crisis scenario, as a power cut affects a great variety of interdependent sectors. Currently used data sharing systems have been shown to hamper interoperability between stakeholders, since they lack flexibility and there is no consensus on term definitions and interpretations. The study presented in this paper addresses these challenges by proposing a new data sharing solution based on Linked Data, a method of interlinking data points in a structured way on the web. A conceptual model for two data sharing parties in a flood-caused power cut crisis management scenario was developed, to which relevant data were linked. The analysis revealed that the presented data sharing solution burdens its users with extra costs in the short run, but saves resources in the long run by overcoming the interoperability problems of the legacy systems. The more stakeholders adopt Linked Data, the stronger its benefits for data sharing will become.
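    The core Linked Data idea the abstract relies on, i.e. two parties keep their own identifiers and a link statement merges their facts, can be illustrated with a toy triple store. This is not the paper's model; all identifiers and predicates below are hypothetical, and only one-hop owl:sameAs links are resolved.

```python
# Illustrative sketch (not from the paper): once a grid operator's area
# identifier is linked to the water authority's zone identifier, a query
# over the combined triples sees both parties' facts. All URIs are
# hypothetical shorthand.
triples = {
    # grid operator's data
    ("grid:Substation42", "grid:locatedIn", "grid:AreaNorth"),
    ("grid:Substation42", "grid:status", "offline"),
    # water authority's data, under its own identifier for the same area
    ("water:ZoneN", "water:floodRisk", "high"),
    # the link statement that enables interoperability
    ("grid:AreaNorth", "owl:sameAs", "water:ZoneN"),
}

def aliases(entity):
    """The entity plus anything linked to it via owl:sameAs (one hop)."""
    out = {entity}
    for s, p, o in triples:
        if p == "owl:sameAs":
            if s in out:
                out.add(o)
            if o in out:
                out.add(s)
    return out

def facts(entity):
    """All non-link facts asserted about the entity or any of its aliases."""
    names = aliases(entity)
    return sorted((s, p, o) for s, p, o in triples
                  if s in names and p != "owl:sameAs")

for t in facts("grid:AreaNorth"):
    print(t)
```

    A real deployment would use RDF serializations and SPARQL over published vocabularies instead of in-memory tuples, but the linking mechanism is the same.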

  19. ACTS 118x: High Speed TCP Interoperability Testing

    NASA Technical Reports Server (NTRS)

    Brooks, David E.; Buffinton, Craig; Beering, Dave R.; Welch, Arun; Ivancic, William D.; Zernic, Mike; Hoder, Douglas J.

    1999-01-01

    With the recent explosion of the Internet and the enormous business opportunities available to communication system providers, great interest has developed in improving the efficiency of data transfer over satellite links using the Transmission Control Protocol (TCP) of the Internet Protocol (IP) suite. NASA's ACTS experiments program initiated a series of TCP experiments to demonstrate the scalability of TCP/IP and determine to what extent the protocol can be optimized over a 622 Mbps satellite link. Through partnerships with government technology-oriented labs and the computer, telecommunication, and satellite industries, NASA Glenn was able to: (1) promote the development of interoperable, high-performance TCP/IP implementations across multiple computing/operating platforms; (2) work with the satellite industry to answer outstanding questions regarding the use of standard protocols (TCP/IP and ATM) for the delivery of advanced data services, and for use in spacecraft architectures; and (3) conduct a series of TCP/IP interoperability tests over OC-12 ATM over a satellite network in a multi-vendor environment using ACTS. The experiments' various network configurations and the results are presented.
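    A back-of-envelope calculation (not from the experiments themselves) shows why TCP needed optimization for this link: the bandwidth-delay product of a 622 Mbps geostationary path far exceeds the classic 64 KiB TCP window. The ~550 ms round-trip time is a typical GEO figure assumed here, not a measured ACTS value.

```python
# Bandwidth-delay product for a 622 Mbps GEO satellite link, and the
# throughput ceiling imposed by an unscaled 65535-byte TCP window.
link_bps = 622e6        # OC-12 line rate, bits per second
rtt_s = 0.55            # assumed GEO round-trip time, ~550 ms

# Bytes that must be "in flight" to keep the pipe full.
bdp_bytes = link_bps * rtt_s / 8
print(f"BDP ≈ {bdp_bytes / 1e6:.1f} MB")

# Maximum throughput with the classic (unscaled) TCP window.
classic_window = 65535  # bytes
throughput_bps = classic_window * 8 / rtt_s
print(f"Unscaled TCP ≈ {throughput_bps / 1e6:.2f} Mbps")
```

    The roughly 43 MB pipe versus a sub-1-Mbps ceiling is the gap that TCP window scaling (RFC 1323) and the implementation tuning described above had to close.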

  20. Interoperability standards enabling cross-border patient summary exchange.

    PubMed

    Chronaki, Catherine; Estelrich, Ana; Cangioli, Giorgio; Melgara, Marcello; Kalra, Dipak; Gonzaga, Zabrina; Garber, Larry; Blechman, Elaine; Ferguson, Jamie; Kay, Stephen

    2014-01-01

    In an increasingly mobile world, many citizens and professionals are frequent travellers. Access during unplanned care to their patient summary, i.e. their most essential health information in a form physicians in another country can understand, can impact not only their safety, but also the quality and effectiveness of care. International health information technology (HIT) standards such as HL7 CDA have been developed to advance interoperability. Implementation guides (IGs) and IHE profiles constrain standards and make them fit for the purpose of specific use cases. A joint effort between HL7, IHE, and HealthStory created Consolidated CDA (C-CDA), a set of harmonized CDA IGs for the US that is cited in the Meaningful Use II (MU-II) regulation. In the EU, the recently adopted Patient Summary (PS) Guideline cites the epSOS IG, also based on HL7 CDA, to support cross-border care in the EU and inform national eHealth programs. The Trillium Bridge project supports international standards development by extending the EU PS Guideline and MU-II in the transatlantic setting. This paper presents preliminary findings from comparing patient summaries in the EU and US and reflects on the challenge of implementing interoperable eHealth systems in the cross-border or transatlantic setting.
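    To make "constraining a standard via an implementation guide" tangible, the sketch below builds a few header elements of a CDA-style XML document. This is a simplified illustration only, NOT a conformant C-CDA or epSOS patient summary; real IGs mandate templateIds, OIDs, and many further elements. The LOINC document-type code is believed correct but should be checked against the relevant IG.

```python
# Minimal CDA-style fragment: a ClinicalDocument with a coded document
# type and a title, built with the standard library. Illustrative only;
# not a conformant C-CDA/epSOS instance.
import xml.etree.ElementTree as ET

HL7 = "urn:hl7-org:v3"
ET.register_namespace("", HL7)  # serialize HL7 v3 as the default namespace

doc = ET.Element(f"{{{HL7}}}ClinicalDocument")
# Document type: LOINC 60591-5 "Patient summary Document" (assumed code).
ET.SubElement(doc, f"{{{HL7}}}code",
              code="60591-5",
              codeSystem="2.16.840.1.113883.6.1",  # LOINC OID
              displayName="Patient summary Document")
title = ET.SubElement(doc, f"{{{HL7}}}title")
title.text = "Patient Summary"

xml_text = ET.tostring(doc, encoding="unicode")
print(xml_text)
```

    An IG "constrains" the base standard precisely by fixing such codes, cardinalities, and vocabularies so that a receiving system in another country can interpret the document unambiguously.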