Space Network Interoperability Panel (SNIP) study
NASA Technical Reports Server (NTRS)
Ryan, Thomas; Lenhart, Klaus; Hara, Hideo
1991-01-01
The Space Network Interoperability Panel (SNIP) study is a tripartite study involving the National Aeronautics and Space Administration (NASA), the European Space Agency (ESA), and the National Space Development Agency (NASDA) of Japan. SNIP is an ongoing interoperability study of the Data Relay Satellite (DRS) systems of the three organizations. The study is divided into two phases: Phase One addresses S-band (2 GHz) interoperability, and Phase Two addresses Ka-band (20/30 GHz) interoperability in addition to S-band. In 1987 the SNIP formed a Working Group to define and study operations concepts and technical subjects to assure compatibility of the international data relay systems. Since then, a number of Panel and Working Group meetings have been held to continue the study. Interoperability is of interest to the three agencies because it offers a number of potential operational and economic benefits. This paper presents the history and status of the SNIP study.
PACS/information systems interoperability using Enterprise Communication Framework.
alSafadi, Y; Lord, W P; Mankovich, N J
1998-06-01
Interoperability among healthcare applications goes beyond connectivity to allow components to exchange structured information and work together in a predictable, coordinated fashion. To facilitate building an interoperability infrastructure, an Enterprise Communication Framework (ECF) was developed by the members of the Andover Working Group for Healthcare Interoperability (AWG-OHI). The ECF consists of four models: 1) Use Case Model, 2) Domain Information Model (DIM), 3) Interaction Model, and 4) Message Model. To realize this framework, a software component called the Enterprise Communicator (EC) is used. In this paper, we will demonstrate the use of the framework in interoperating a picture archiving and communication system (PACS) with a radiology information system (RIS).
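The abstract's four-model framework lends itself to a compact illustration. Below is a minimal, hypothetical Python sketch of ECF-style mediated messaging between a RIS and a PACS; the class names, event names, and payload fields are invented here, and the real Enterprise Communicator's API is only paraphrased, not reproduced.

```python
# Hypothetical sketch of ECF-style mediated messaging between a RIS and a PACS.
from collections import defaultdict
from dataclasses import dataclass
from typing import Callable

@dataclass
class Message:              # Message Model: the structured payload on the wire
    interaction: str        # Interaction Model: the event/trigger name
    payload: dict           # content drawn from the Domain Information Model

class EnterpriseCommunicator:   # stand-in for the EC software component
    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[Message], None]]] = defaultdict(list)

    def subscribe(self, interaction: str, handler: Callable[[Message], None]) -> None:
        self._subscribers[interaction].append(handler)

    def publish(self, msg: Message) -> None:
        for handler in self._subscribers[msg.interaction]:
            handler(msg)

ec = EnterpriseCommunicator()
# PACS side: react to radiology orders scheduled in the RIS.
ec.subscribe("order.scheduled",
             lambda m: print(f"PACS prefetching priors for {m.payload['patient_id']}"))
# RIS side: announce a newly scheduled order.
ec.publish(Message("order.scheduled", {"patient_id": "P1234", "modality": "CR"}))
```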
NASA Technical Reports Server (NTRS)
Fern, Lisa; Rorie, Conrad; Shively, Jay
2016-01-01
At the May 2015 SC-228 meeting, requirements for TCAS II interoperability became elevated in priority. A TCAS interoperability work group was formed to identify and address key issues and questions. The work group drew up an initial list of questions and a plan to address them. As part of that plan, NASA proposed to run a mini human-in-the-loop (HITL) study to address display, alerting, and guidance issues. A TCAS Interoperability Workshop was held to determine potential display, alerting, and guidance issues that could be explored in future NASA mini HITLs. The workshop produced consensus on the main functionality of DAA guidance when a TCAS II resolution advisory (RA) occurs, a prioritized list of independent variables for the experimental design, and a set of use cases to stress TCAS interoperability.
The Health Service Bus: an architecture and case study in achieving interoperability in healthcare.
Ryan, Amanda; Eklund, Peter
2010-01-01
Interoperability in healthcare is a requirement for effective communication between entities, to ensure timely access to up-to-date patient information and medical knowledge, and thus facilitate consistent patient care. An interoperability framework called the Health Service Bus (HSB), based on the Enterprise Service Bus (ESB) middleware software architecture, is presented here as a solution to all three levels of interoperability as defined by the HL7 EHR Interoperability Work Group in their definitive white paper "Coming to Terms". A prototype HSB system, implemented on the Mule open-source ESB, is outlined and discussed, followed by a clinically based example.
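As an illustration of the bus pattern the HSB builds on, here is a minimal sketch of ESB-style registration and dispatch; the actual HSB prototype runs on the Mule ESB, and the message type and handler below are invented.

```python
# Illustrative-only sketch of ESB-style routing such as an HSB might perform.
from typing import Callable

class ServiceBus:
    def __init__(self) -> None:
        self._routes: dict[str, Callable[[dict], dict]] = {}

    def register(self, msg_type: str, service: Callable[[dict], dict]) -> None:
        self._routes[msg_type] = service

    def dispatch(self, msg_type: str, body: dict) -> dict:
        # A real ESB would also transform between wire formats here
        # (e.g., HL7 v2 to HL7 v3/CDA) before invoking the endpoint.
        return self._routes[msg_type](body)

bus = ServiceBus()
bus.register("patient.lookup", lambda body: {"id": body["id"], "name": "Doe, J"})
print(bus.dispatch("patient.lookup", {"id": "P-42"}))
```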
NASA Technical Reports Server (NTRS)
Bradley, Arthur; Dubowsky, Steven; Quinn, Roger; Marzwell, Neville
2005-01-01
Robots that operate independently of one another will not be adequate to accomplish the future exploration tasks of long-distance autonomous navigation, habitat construction, resource discovery, and material handling. Such activities will require that systems widely share information, plan and divide complex tasks, share common resources, and physically cooperate to manipulate objects. Recognizing the need for interoperable robots to accomplish the new exploration initiative, NASA's Office of Exploration Systems Research & Technology recently funded the development of the Joint Technical Architecture for Robotic Systems (JTARS). The JTARS charter is to identify the interface standards necessary to achieve interoperability among space robots. A JTARS working group (JTARS-WG) has been established, comprising recognized leaders in the field of space robotics, including representatives from seven NASA centers along with academia and private industry. The working group's early accomplishments include addressing key issues required for interoperability, defining which systems are within the project's scope, and framing the JTARS manuals around classes of robotic systems.
Managing Interoperability for GEOSS - A Report from the SIF
NASA Astrophysics Data System (ADS)
Khalsa, S. J.; Actur, D.; Nativi, S.; Browdy, S.; Eglitis, P.
2009-04-01
The Global Earth Observation System of Systems (GEOSS) is a coordinating and integrating framework for Earth observing and information systems, which are contributed on a voluntary basis by Members and Participating Organizations of the intergovernmental Group on Earth Observations (GEO). GEOSS exists to support informed decision making for the benefit of society, including the implementation of international environmental treaty obligations. GEO Members and Participating Organizations use the GEOSS Common Infrastructure (GCI) to register their Earth observation resources, thereby making them discoverable and consumable by both humans and client applications. Essential to meeting GEO user needs is a process for supporting interoperability of observing, processing, modeling, and dissemination capabilities. The GEO Standards and Interoperability Forum (SIF) was created to develop, implement, and oversee this process. The SIF supports GEO organizations contributing resources to the GEOSS by helping them understand and work with the GEOSS interoperability guidelines and encouraging them to register their "interoperability arrangements" (standards or other ad hoc arrangements for interoperability) in the GEOSS standards registry, which is part of the GCI. These registered interoperability arrangements support the actual services used to achieve interoperability of systems. By making information about these interoperability arrangements available to users of the GEOSS, the SIF enhances the understanding and utility of contributed resources. We describe the procedures that the SIF has enacted to carry out its work. To operate effectively, the SIF uses a workflow system and is establishing a set of regional teams and domain experts. In the near term our work has focused on populating and reviewing the GEOSS Standards Registry, but we are also developing approaches to achieving progressive convergence on, and uptake of, an optimal set of interoperability arrangements for all of GEOSS.
NASA Technical Reports Server (NTRS)
Lucord, Steve A.; Gully, Sylvain
2009-01-01
The purpose of the Prototype Interoperability Document is to document the design and interfaces for the service providers and consumers of a Mission Operations prototype between JSC-OTF and DLR-GSOC. The primary goal is to test the interoperability sections of the CCSDS Spacecraft Monitor & Control (SM&C) Mission Operations (MO) specifications between the two control centers. An additional goal is to provide feedback to the SM&C working group through the Review Item Disposition (RID) process. The prototype is considered a proof of concept and should increase the knowledge base of the CCSDS SM&C Mission Operations standards; no operational capabilities will be provided. The CCSDS Mission Operations (MO) initiative was previously called Spacecraft Monitor and Control (SM&C); the specifications have been renamed to better reflect their scope and overall objectives. The working group retains the name Spacecraft Monitor and Control working group and falls under the Mission Operations and Information Management Services (MOIMS) Area of CCSDS. This document refers to the specifications as SM&C Mission Operations, Mission Operations, or simply MO.
Opportunities for Next Generation BML: Semantic C-BML
2014-06-01
Simulation Interoperability Workshop, 05S-SIW-007, San Diego, CA, 2005. [11] Schade, Ulrich, Bastian Haarmann, and Michael R. Hieb. "A Grammar for...Language (C-BML) Product Development Group." Paper 06F-SIW-003. In Proceedings of the Fall Simulation Interoperability Workshop. Simulation...process – Based on a specification provided by the C-BML product development group (Blais, Curtis, et al; SISO Fall 2011 SIW) – My work provides insight
Clinical data interoperability based on archetype transformation.
Costa, Catalina Martínez; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás
2011-10-01
The semantic interoperability between health information systems is a major challenge to improve the quality of clinical practice and patient safety. In recent years many projects have faced this problem and provided solutions based on specific standards and technologies in order to satisfy the needs of a particular scenario. Most of such solutions cannot be easily adapted to new scenarios, thus more global solutions are needed. In this work, we have focused on the semantic interoperability of electronic healthcare records standards based on the dual model architecture and we have developed a solution that has been applied to ISO 13606 and openEHR. The technological infrastructure combines reference models, archetypes and ontologies, with the support of Model-driven Engineering techniques. For this purpose, the interoperability infrastructure developed in previous work by our group has been reused and extended to cover the requirements of data transformation.
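To make the archetype-transformation idea concrete, the following toy sketch rewrites openEHR-style paths to ISO 13606-style paths with a declarative mapping; the paths are invented, and the paper's real transformation operates on archetype models with ontologies and MDE tooling rather than flat dictionaries.

```python
# Toy illustration of archetype-based data transformation via a path mapping.
# All archetype identifiers and paths below are invented for illustration.
PATH_MAP = {
    "openEHR-EHR-OBSERVATION.blood_pressure.v1/systolic":
        "CEN-EN13606-ENTRY.blood_pressure.v1/systolic",
    "openEHR-EHR-OBSERVATION.blood_pressure.v1/diastolic":
        "CEN-EN13606-ENTRY.blood_pressure.v1/diastolic",
}

def transform(record: dict) -> dict:
    """Rewrite every mapped path, leaving values and units untouched."""
    return {PATH_MAP.get(path, path): value for path, value in record.items()}

source = {"openEHR-EHR-OBSERVATION.blood_pressure.v1/systolic": (120, "mm[Hg]"),
          "openEHR-EHR-OBSERVATION.blood_pressure.v1/diastolic": (80, "mm[Hg]")}
print(transform(source))
```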
NASA's Geospatial Interoperability Office (GIO) Program
NASA Technical Reports Server (NTRS)
Weir, Patricia
2004-01-01
NASA produces vast amounts of information about the Earth from satellites, supercomputer models, and other sources. These data are most useful when made easily accessible to NASA researchers and scientists, to NASA's partner Federal agencies, and to society as a whole. A NASA goal is to apply its data for knowledge gain, decision support, and understanding of Earth and other planetary systems. The NASA Earth Science Enterprise (ESE) Geospatial Interoperability Office (GIO) Program leads the development, promotion, and implementation of information technology standards that accelerate and expand the delivery of NASA's Earth system science research through integrated systems solutions. Our overarching goal is to make it easy for decision-makers, scientists, and citizens to use NASA's science information. NASA's Federal partners currently participate with NASA and one another in the development and implementation of geospatial standards to ensure the most efficient and effective access to one another's data. Through the GIO, NASA participates with its Federal partners in implementing interoperability standards in support of E-Gov and the associated President's Management Agenda initiatives by collaborating on standards development. Through partnerships with government, private industry, education, and communities, the GIO works toward enhancing the ESE Applications Division in the area of National Applications and decision support systems. The GIO provides geospatial standards leadership within NASA, represents NASA on the Federal Geographic Data Committee (FGDC) Coordination Working Group, chairs the FGDC's Geospatial Applications and Interoperability Working Group (GAI), and supports development and implementation efforts such as the Earth Science Gateway (ESG), the Space Time Tool Kit, and the Web Map Service (WMS) Global Mosaic. The GIO supports NASA in the collection and dissemination of geospatial interoperability standards needs and progress throughout the agency, including ESE Applications, the SEEDS Working Groups, the Facilities Engineering Division (Code JX), and NASA's Chief Information Office (CIO). With these agency-level requirements, GIO leads, brokers, and facilitates efforts to develop, implement, influence, and fully participate in standards development internationally, federally, and locally. The GIO also represents NASA in the OpenGIS Consortium (OGC) and ISO TC211. The OGC has made considerable progress in its relations with other open standards bodies, namely ISO, W3C, and OASIS. ISO TC211 is the Geographic and Geomatics Information technical committee that works toward standardization in the field of digital geographic information. The GIO focuses on seamless access to data, applications of data, and enabling technologies furthering the interoperability of distributed data. Through teaming within the Applications Directorate and partnerships with government, private industry, education, and communities, GIO works toward the data application goals of NASA, the ESE Applications Directorate, and our Federal partners by managing projects in four categories: Geospatial Standards and Leadership; Geospatial One-Stop; Standards Development and Implementation; and National and NASA Activities.
NASA Technical Reports Server (NTRS)
Fischer, Daniel; Aguilar-Sanchez, Ignacio; Saba, Bruno; Moury, Gilles; Biggerstaff, Craig; Bailey, Brandon; Weiss, Howard; Pilgram, Martin; Richter, Dorothea
2015-01-01
The protection of data transmitted over the space link is an issue of growing importance for civilian space missions as well. Through the Consultative Committee for Space Data Systems (CCSDS), space agencies have responded to this need by specifying the Space Data-Link Layer Security (SDLS) protocol, which provides confidentiality and integrity services for the CCSDS Telemetry (TM), Telecommand (TC), and Advanced Orbiting Services (AOS) space data-link protocols. This paper describes the approach of the CCSDS SDLS working group to specifying and executing the necessary interoperability tests. It first details the individual SDLS implementations produced by ESA, NASA, and CNES, and then the overall architecture that allows interoperability tests between them. The paper reports the results of the interoperability tests and identifies relevant aspects for the evolution of the test environment.
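A rough feel for the two SDLS services (confidentiality and integrity) can be had from an AES-GCM example, AES-GCM being the authenticated-encryption algorithm recommended in the companion CCSDS algorithms work; note that the frame layout below is simplified and is not the actual SDLS header/trailer format.

```python
# Sketch of SDLS-like frame protection: encrypt the frame data field and
# authenticate the (illustrative) primary header as associated data.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # pip install cryptography

key = AESGCM.generate_key(bit_length=256)     # shared mission key (sketch only)
aead = AESGCM(key)

primary_header = b"\x20\x03\x00\x1f\x00\x01"  # invented TC-like header bytes
frame_data = b"MODE CMD: SAFE_HOLD"
nonce = os.urandom(12)                        # stands in for an anti-replay counter/IV

protected = aead.encrypt(nonce, frame_data, primary_header)

# Receiving end: decryption raises InvalidTag if header or data were altered.
assert aead.decrypt(nonce, protected, primary_header) == frame_data
```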
2005-08-01
the Office of the Secretary of Defense chartered the Joint Architecture for Unmanned Ground Systems (JAUGS) Working Group to address these concerns...The JAUGS Working Group was tasked with developing an initial standard for interoperable unmanned ground systems. In 2002, the charter of the...JAUGS Working Group was modified such that their efforts would extend to all unmanned systems, not only ground systems. The standard was
Study and validation of tools interoperability in JPSEC
NASA Astrophysics Data System (ADS)
Conan, V.; Sadourny, Y.; Jean-Marie, K.; Chan, C.; Wee, S.; Apostolopoulos, J.
2005-08-01
Digital imagery is important in many applications today, and its security is likely to grow in importance in the near future. The emerging international standard ISO/IEC JPEG-2000 Security (JPSEC) is designed to provide security for digital imagery, and in particular digital imagery coded with the JPEG-2000 image coding standard. One of the primary goals of a standard is to ensure interoperability between creator and consumer implementations produced by different manufacturers. The JPSEC standard, like the popular JPEG and MPEG families of standards, specifies only the bitstream syntax and the receiver's processing, not how the bitstream is created or the details of how it is consumed. This paper examines interoperability for the JPSEC standard and presents an example JPSEC consumption process that can provide insights into the design of JPSEC consumers. Initial interoperability tests between different groups with independently created implementations of JPSEC creators and consumers have been successful in providing the JPSEC security services of confidentiality (via encryption) and authentication (via message authentication codes, or MACs). Further interoperability work is ongoing.
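The authentication service mentioned above can be sketched with a plain HMAC over a codestream segment; JPSEC's actual signalling syntax (SEC marker segments) is not reproduced here, and the byte range and key are invented.

```python
# Sketch of JPSEC-style authentication of a codestream segment with an HMAC.
import hashlib
import hmac

key = b"creator-and-consumer-shared-key"          # invented shared secret
codestream_segment = b"\xff\x4f\xff\x51...tile data..."  # placeholder bytes

tag = hmac.new(key, codestream_segment, hashlib.sha256).digest()

# A consumer recomputes the MAC and compares in constant time.
ok = hmac.compare_digest(
    tag, hmac.new(key, codestream_segment, hashlib.sha256).digest())
print("authenticated:", ok)
```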
Mookencherry, Shefali
2012-01-01
It makes strategic and business sense for payers and providers to collaborate on how to take substantial cost out of the healthcare delivery system. Acting independently, medical groups, hospitals, and health plans each lack the optimal mix of resources and incentives to significantly reduce costs. Payers have core assets such as marketing, claims data, claims processing, reimbursement systems, and capital. It would be cost prohibitive for all but the largest providers to develop these capabilities in order to compete directly with insurers. Likewise, medical groups and hospitals are positioned to foster financial interdependence among providers and coordinate the continuum of patient illnesses and care settings. Payers and providers should commit to reasonable clinical and cost goals, and share resources to minimize expenses and financial risks. It is in the interest of payers to work closely with providers on risk-management strategies because insurers need synergy with ACOs to remain cost competitive. It is in the interest of ACOs to work collaboratively with payers early on to develop reasonable and effective performance benchmarks. Hence, it is essential to have payer interoperability and data sharing integrated in an ACO model.
NASA Astrophysics Data System (ADS)
McDonald, K. R.; Faundeen, J. L.; Petiteville, I.
2005-12-01
The Committee on Earth Observation Satellites (CEOS) was established in 1984 in response to a recommendation from the Economic Summit of Industrialized Nations Working Group on Growth, Technology, and Employment's Panel of Experts on Satellite Remote Sensing. CEOS participants are Members (national or international governmental organizations that operate civil spaceborne Earth observation satellites) and Associates (governmental organizations with civil space programs in development, or international scientific or governmental bodies that have an interest in and support CEOS objectives). The primary objective of CEOS is to optimize the benefits of satellite Earth observations through cooperation of its participants in mission planning and in the development of compatible data products, formats, services, applications, and policies. To pursue its objectives, CEOS establishes working groups and associated subgroups that focus on relevant areas of interest. While the structure of CEOS has evolved over its lifetime, today there are three permanent working groups. One is the Working Group on Calibration and Validation, which addresses sensor-specific calibration and validation and geophysical parameter validation. A second is the Working Group on Education, Training, and Capacity Building, which facilitates activities that enhance international education and training in Earth observation techniques, data analysis, interpretation, and applications, with a particular focus on developing countries. The third permanent working group is the Working Group on Information Systems and Services (WGISS). The purpose of WGISS is to promote collaboration in the development of systems and services, based on international standards, that manage and supply the Earth observation data and information from participating agencies' missions. WGISS places great emphasis on demonstration projects involving user groups to solve the critical interoperability issues associated with the achievement of global services, and its structure reflects that objective. The Technology and Services Subgroup initiates tasks to explore emerging technologies that can be employed to create data and information systems and to develop interoperable services. The interests of the subgroup span the full range of the information processing chain, from the initial ingestion of satellite data into archives through to the incorporation of derived information into end-user applications. The subgroup has overseen the creation of an Interoperable Directory Network and an Interoperable Catalog System, and has tasks investigating the use of new technologies such as Web Services, Grid, and Open Geographical Information Systems to provide enhanced capabilities. The WGISS Projects and Applications Subgroup works with outside organizations to understand their requirements and then helps them exploit the tools and services available through WGISS and its members and associates. WGISS has instituted the concept of a WGISS Test Facility to test and develop information systems and services prototypes collaboratively with these organizations to meet their specific requirements. This approach has the dual benefit of addressing real information systems and services needs of science and applications projects and helping WGISS expand and improve its capabilities based on the experience and lessons learned from working with the projects.
Moving Beyond the 10,000 Ways That Don't Work
NASA Astrophysics Data System (ADS)
Bermudez, L. E.; Arctur, D. K.; Rueda, C.
2009-12-01
From his research in developing light bulb filaments, Thomas Edison provides us with a good lesson for advancing any venture. He said, "I have not failed, I've just found 10,000 ways that won't work." Advancing data and access interoperability is one such venture, difficult to achieve because of the differences among the participating communities. Even within the marine domain, different communities exist, and with them different technologies (formats and protocols) to publish data and its descriptions, and different vocabularies to name things (e.g., parameters, sensor types). Simplifying this heterogeneity of technologies is accomplished not only by adopting standards, but also by creating profiles and advancing tools that use those standards. In some cases, standards are advanced by building from existing tools. But what is the best strategy? Edison can give us a hint. Prototypes and test beds are essential to achieving interoperability among geospatial communities. The Open Geospatial Consortium (OGC) calls them interoperability experiments. The World Wide Web Consortium (W3C) calls them incubator projects. Prototypes help test and refine specifications. The Marine Metadata Interoperability (MMI) Initiative, which advances marine data integration and re-use by promoting community solutions, understood this strategy and started an interoperability demonstration with the SURA Coastal Ocean Observing and Prediction (SCOOP) program. This demonstration evolved into the OGC Ocean Science Interoperability Experiment (Oceans IE), which brings together the ocean-observing community to advance the interoperability of ocean observing systems using OGC standards. Oceans IE Phase I investigated the use of the OGC Web Feature Service (WFS) and OGC Sensor Observation Service (SOS) standards for representing and exchanging point data records from fixed in-situ marine platforms. Phase I produced an engineering best practices report, advanced reference implementations, and submitted various change requests that are now being considered by the OGC SOS working group. Building on Phase I, and with a focus on semantically enabled services, Oceans IE Phase II will continue the use and improvement of OGC specifications in the marine community. We will present the lessons learned, and in particular the strategy of experimenting with technologies to advance standards for publishing data in marine communities, which could also help advance interoperability in other geospatial communities. We will also discuss the growing collaborations among ocean-observing standards organizations that will bring about the institutional acceptance needed for these technologies and practices to gain traction globally.
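For a flavor of the SOS interface exercised in the Oceans IE, here is a hypothetical SOS 1.0 key-value GetObservation request; the endpoint URL, offering, and observed property are invented.

```python
# Hypothetical SOS 1.0 KVP GetObservation request; endpoint and names invented.
import requests

params = {
    "service": "SOS",
    "version": "1.0.0",
    "request": "GetObservation",
    "offering": "urn:example:offering:ctd-station-46042",
    "observedProperty": "sea_water_temperature",
    "responseFormat": "text/xml;subtype=\"om/1.0.0\"",
}
resp = requests.get("https://sos.example.org/sos", params=params, timeout=30)
resp.raise_for_status()
print(resp.text[:200])  # O&M-encoded observations (XML)
```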
The GEOSS solution for enabling data interoperability and integrative research.
Nativi, Stefano; Mazzetti, Paolo; Craglia, Max; Pirrone, Nicola
2014-03-01
Global sustainability research requires an integrative research effort underpinned by digital infrastructures (systems) able to harness data and heterogeneous information across disciplines. Digital data and information sharing across systems and applications is achieved by implementing interoperability: a property of a product or system to work with other products or systems, present or future. There are at least three main interoperability challenges a digital infrastructure must address: technological, semantic, and organizational. In recent years, important international programs and initiatives are focusing on such an ambitious objective. This manuscript presents and combines the studies and the experiences carried out by three relevant projects, focusing on the heavy metal domain: Global Mercury Observation System, Global Earth Observation System of Systems (GEOSS), and INSPIRE. This research work recognized a valuable interoperability service bus (i.e., a set of standards models, interfaces, and good practices) proposed to characterize the integrative research cyber-infrastructure of the heavy metal research community. In the paper, the GEOSS common infrastructure is discussed implementing a multidisciplinary and participatory research infrastructure, introducing a possible roadmap for the heavy metal pollution research community to join GEOSS as a new Group on Earth Observation community of practice and develop a research infrastructure for carrying out integrative research in its specific domain.
Ensuring Sustainable Data Interoperability Across the Natural and Social Sciences
NASA Astrophysics Data System (ADS)
Downs, R. R.; Chen, R. S.
2015-12-01
Both the natural and social science data communities are attempting to address the long-term sustainability of their data infrastructures in rapidly changing research, technological, and policy environments. Many parts of these communities are also considering how to improve the interoperability and integration of their data and systems across natural, social, health, and other domains. However, these efforts have generally been undertaken in parallel, with little thought about how different sustainability approaches may impact long-term interoperability from scientific, legal, or economic perspectives, or vice versa, i.e., how improved interoperability could enhance—or threaten—infrastructure sustainability. Scientific progress depends substantially on the ability to learn from the legacy of previous work available for current and future scientists to study, often by integrating disparate data not previously assembled. Digital data are less likely than scientific publications to be usable in the future unless they are managed by science-oriented repositories that can support long-term data access with the documentation and services needed for future interoperability. We summarize recent discussions in the social and natural science communities on emerging approaches to sustainability and relevant interoperability activities, including efforts by the Belmont Forum E-Infrastructures project to address global change data infrastructure needs; the Group on Earth Observations to further implement data sharing and improve data management across diverse societal benefit areas; and the Research Data Alliance to develop legal interoperability principles and guidelines and to address challenges faced by domain repositories. We also examine emerging needs for data interoperability in the context of the post-2015 development agenda and the expected set of Sustainable Development Goals (SDGs), which set ambitious targets for sustainable development, poverty reduction, and environmental stewardship by 2030. These efforts suggest the need for a holistic approach towards improving and implementing strategies, policies, and practices that will ensure long-term sustainability and interoperability of scientific data repositories and networks across multiple scientific domains.
NASA Astrophysics Data System (ADS)
Fontaine, K. S.
2015-12-01
The Research Data Alliance (RDA) is an international organization created in 2012 to provide researchers with a forum for identifying and removing barriers to data sharing. Since then, RDA has gained over 3000 individual members, over three dozen organizational members, 47 Interest Groups, and 17 Working Groups, all focused on research data sharing. Interoperability is one instantiation of data sharing, but it is not the only barrier to overcome. Technology limitations, discipline-specific cultures that do not support sharing, lack of best practices, and lack of good definitions are only a few items on a long list of situations preventing researchers from sharing their data. This presentation will cover how RDA has grown, some details on how the first eight solutions contribute to interoperability and sharing, and a sneak peek at what's in the pipeline.
Systems biology driven software design for the research enterprise.
Boyle, John; Cavnor, Christopher; Killcoyne, Sarah; Shmulevich, Ilya
2008-06-25
In systems biology, and many other areas of research, there is a need for the interoperability of tools and data sources that were not originally designed to be integrated. Due to the interdisciplinary nature of systems biology, and its association with high-throughput experimental platforms, there is an additional need to continually integrate new technologies. As scientists work in isolated groups, integration with other groups is rarely a consideration when building the required software tools. We illustrate an approach, through the discussion of a purpose-built software architecture, that allows disparate groups to reuse tools and access data sources in a common manner. The architecture allows for: the rapid development of distributed applications; interoperability, so it can be used by a wide variety of developers and computational biologists; development using standard tools, so that it is easy to maintain and does not require a large development effort; extensibility, so that new technologies and data types can be incorporated; and non-intrusive development, insofar as researchers need not adhere to a pre-existing object model. By using a relatively simple integration strategy, based upon a common identity system and dynamically discovered interoperable services, a lightweight software architecture can become the focal point through which scientists can both access and analyse the plethora of experimentally derived data.
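The integration strategy named in the abstract (a common identity system plus dynamically discovered services) can be sketched as follows; the registry, service identifiers, and capabilities are invented for illustration.

```python
# Minimal sketch of capability-based service discovery over shared identifiers.
class Registry:
    def __init__(self) -> None:
        self._services: dict[str, set[str]] = {}   # service id -> capabilities

    def announce(self, service_id: str, capabilities: set[str]) -> None:
        self._services[service_id] = capabilities

    def discover(self, capability: str) -> list[str]:
        return [sid for sid, caps in self._services.items() if capability in caps]

registry = Registry()
registry.announce("urn:lab1:proteomics-store", {"fetch", "spectrum-search"})
registry.announce("urn:lab2:microarray-store", {"fetch"})

# A client resolves a shared identifier against any service able to serve it.
for sid in registry.discover("fetch"):
    print(f"{sid} can resolve urn:sample:GSM123456")
```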
Maturity model for enterprise interoperability
NASA Astrophysics Data System (ADS)
Guédria, Wided; Naudet, Yannick; Chen, David
2015-01-01
Historically, progress occurs when entities communicate, share information, and together create something that none of them could have achieved alone. Moving beyond people to machines and systems, interoperability is becoming a key factor of success in all domains. In particular, interoperability has become a challenge for enterprises seeking to exploit market opportunities, to meet their own objectives of cooperation, or simply to survive in a growing competitive world where the networked enterprise is becoming a standard. Within this context, many research works have been conducted over the past few years, and enterprise interoperability has become an important area of research, ensuring the competitiveness and growth of European enterprises. Among other things, enterprises have to control their interoperability strategy and enhance their ability to interoperate. This is the purpose of interoperability assessment. Assessing interoperability maturity allows a company to know its strengths and weaknesses in terms of interoperability with its current and potential partners, and to prioritise actions for improvement. The objective of this paper is to define a maturity model for enterprise interoperability that takes into account existing maturity models while extending the coverage of the interoperability domain. The assessment methodology is also presented. Both are demonstrated with a real case study.
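One common convention in maturity assessment is that the weakest assessed area caps the overall level; the toy sketch below assumes that rule, with level names loosely echoing the enterprise-interoperability maturity literature rather than necessarily the paper's exact model.

```python
# Toy maturity scoring: overall maturity is capped by the weakest area.
LEVELS = ["Unprepared", "Defined", "Aligned", "Organized", "Adapted"]

def maturity(scores: dict[str, int]) -> str:
    """Map per-area scores (0-4) to an overall level via the minimum."""
    return LEVELS[min(scores.values())]

assessment = {"business": 3, "process": 2, "service": 2, "data": 1}
print("Overall:", maturity(assessment))   # weakest area dominates -> "Defined"
```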
D-ATM, a working example of health care interoperability: From dirt path to gravel road.
DeClaris, John-William
2009-01-01
For many years, there have been calls for interoperability within health care systems. The technology currently exists and is being used in business areas like banking and commerce, to name a few. Yet the question remains: why has interoperability not been achieved in health care? This paper examines issues encountered and successes achieved with interoperability during the development of the Digital Access To Medication (D-ATM) project, sponsored by the Substance Abuse and Mental Health Services Administration (SAMHSA). D-ATM is the first government-funded interoperable patient management system. The goal of this paper is to provide lessons learned and propose one possible road map for health care interoperability within private industry, and how government can help.
Combining the CIDOC CRM and MPEG-7 to Describe Multimedia in Museums.
ERIC Educational Resources Information Center
Hunter, Jane
This paper describes a proposal for an interoperable metadata model, based on international standards, that has been designed to enable the description, exchange and sharing of multimedia resources both within and between cultural institutions. Domain-specific ontologies have been developed by two different ISO Working Groups to standardize the…
2017-02-22
manages operations through guidance, policies, programs, and organizations. The NSG is designed to be a mutually supportive enterprise that...deliberate technical design and deliberate human actions. Geospatial engineer teams (GETs) within the geospatial intelligence cells are the day-to-day...standards working group and are designated by the AGC Geospatial Acquisition Support Directorate as required for interoperability. Applicable standards
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-25
... Grid Cyber Security AGENCY: National Institute of Standards and Technology (NIST), Department of... and Technology (NIST) seeks comments on draft NISTIR 7628 Rev. 1, Guidelines for Smart Grid Cyber... (formerly the Cyber Security Working Group) of the Smart Grid Interoperability Panel. The document has been...
EUnetHTA information management system: development and lessons learned.
Chalon, Patrice X; Kraemer, Peter
2014-11-01
The aim of this study was to describe the techniques used in achieving consensus on common standards to be implemented in the EUnetHTA Information Management System (IMS), and to describe how interoperability between tools was explored. Three face-to-face meetings were organized to identify and agree on common standards for the development of online tools. Two tools were created to demonstrate the added value of implementing interoperability standards at local levels. Developers of tools outside EUnetHTA were identified and contacted. Four common standards were agreed on by consensus, and consequently all EUnetHTA tools have been modified or designed accordingly. RDF Site Summary (RSS) has demonstrated good potential to support rapid dissemination of HTA information. Contacts outside EUnetHTA resulted in direct collaboration (HTA glossary, HTAi Vortal), evaluation of options for interoperability between tools (CRD HTA database), or a formal framework to prepare cooperation on concrete projects (INAHTA projects database). While nominally a project on IT infrastructure, the work program was also about people. When having to agree on complex topics, fostering a cohesive group dynamic and hosting face-to-face meetings adds value and enhances understanding between partners. The adoption of widespread standards enhanced the homogeneity of the EUnetHTA tools and should thus contribute to their wider use and, therefore, to the general objective of EUnetHTA. The initiatives on interoperability of systems need to be developed further to support a general interoperable information system that could benefit the whole HTA community.
Dependable Emergency-Response Networking Based on Retaskable Network Infrastructures
2008-04-01
a Focus Group for the National Reliability and Interoperability Council (NRIC VII), which has helped to suggest a list of possible types of agents...APR 2008 2. REPORT TYPE N/A 3. DATES COVERED - 4. TITLE AND SUBTITLE Dependable Emergency-Response Networking Based on Retaskable Network...of his network op- timization algorithms. We would like to thank the TCIP Center team for their feed- back on this work. This work was supported in
Inter-agency Working Group for Airborne Data and Telemetry Systems (IWGADTS)
NASA Technical Reports Server (NTRS)
Webster, Chris; Freudinger, Lawrence; Sorenson, Carl; Myers, Jeff; Sullivan, Don; Oolman, Larry
2009-01-01
The Interagency Coordinating Committee for Airborne Geosciences Research and Applications (ICCAGRA) was established to improve cooperation and communication among agencies sponsoring airborne platforms and instruments for research and applications, and to serve as a resource for senior-level management on airborne geosciences issues. The Interagency Working Group for Airborne Data and Telecommunications Systems (IWGADTS) is a subgroup of ICCAGRA whose purpose is to develop recommendations leading to increased interoperability among airborne platforms and instrument payloads, producing increased synergy among research programs with similar goals, and enabling the suborbital layer of the Global Earth Observing System of Systems.
Commanding Heterogeneous Multi-Robot Teams
2014-06-01
Coalition Battle Management Language (C-BML) Study Group Report. 2005 Fall Simulation Interoperability Workshop (05F- SIW - 041), Orlando, FL, September...NMSG-085 CIG Land Operation Demonstration. 2013 Spring Simulation Interoperability Workshop (13S- SIW -031), San Diego, CA. April 2013. [4] K...Simulation Interoperability Workshop (10F- SIW -039), Orlando, FL, September 2010. [5] M. Langerwisch, M. Ax, S. Thamke, T. Remmersmann, A. Tiderko
NASA Astrophysics Data System (ADS)
Arney, David; Goldman, Julian M.; Whitehead, Susan F.; Lee, Insup
When an x-ray image is needed during surgery, clinicians may stop the anesthesia machine ventilator while the exposure is made. If the ventilator is not restarted promptly, the patient may experience severe complications. This paper explores the interconnection of a ventilator and a simulated x-ray machine into a prototype plug-and-play medical device system. This work assists ongoing standards efforts for interoperability frameworks in developing functional and non-functional requirements, and illustrates the potential patient safety benefits of interoperable medical device systems by implementing a solution to a clinical use case requiring interoperability.
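The use case implies a safety interlock: pause the ventilator for the exposure and force a resume if the clinician forgets. A minimal sketch follows, with entirely hypothetical device interfaces; a real ICE implementation would add clinical alarms and standard device protocols.

```python
# Sketch of a ventilator-pause interlock with a watchdog-driven auto-resume.
import threading

class Ventilator:                     # hypothetical device interface
    def __init__(self) -> None:
        self.running = True
    def pause(self) -> None:
        self.running = False
        print("ventilator paused")
    def resume(self) -> None:
        if not self.running:
            self.running = True
            print("ventilator resumed")

def synchronized_exposure(vent: Ventilator, max_pause_s: float) -> None:
    vent.pause()
    # Watchdog: resume automatically if nobody resumes within the safe window.
    watchdog = threading.Timer(max_pause_s, vent.resume)
    watchdog.start()
    print("x-ray exposure window open")
    # ... exposure happens; on completion the supervisor resumes promptly:
    vent.resume()
    watchdog.cancel()

synchronized_exposure(Ventilator(), max_pause_s=20.0)
```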
Look who's talking. A guide to interoperability groups and resources.
2011-06-01
There are huge challenges in getting medical devices to communicate with other devices and to information systems. Fortunately, a number of groups have emerged to help hospitals cope. Here's a description of the most prominent ones, including useful web links for each. We also discuss the latest and most pertinent interoperability standards.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hardin, Dave; Stephan, Eric G.; Wang, Weimin
Through its Building Technologies Office (BTO), the United States Department of Energy's Office of Energy Efficiency and Renewable Energy (DOE-EERE) is sponsoring an effort to advance interoperability for the integration of intelligent buildings equipment and automation systems, understanding the importance of integration frameworks and product ecosystems to this cause. This is important to BTO's mission to enhance energy efficiency and save energy for economic and environmental purposes. For connected buildings ecosystems of products and services from various manufacturers to flourish, the ICT aspects of the equipment need to integrate and operate simply and reliably. Within the concepts of interoperability lie the specification, development, and certification of equipment with standards-based interfaces that connect and work. Beyond this, a healthy community of stakeholders that contribute to and use interoperability work products must be developed. On May 1, 2014, the DOE convened a technical meeting to take stock of the current state of interoperability of connected equipment and systems in buildings. Several insights from that meeting helped facilitate a draft description of the landscape of interoperability for connected buildings, which focuses mainly on small and medium commercial buildings. This document revises the February 2015 landscape document to address reviewer comments, incorporate important insights from the Buildings Interoperability Vision technical meeting, and capture thoughts from that meeting about the topics to be addressed in a buildings interoperability vision. In particular, greater attention is paid to the state of information modeling in buildings and the great potential for near-term benefits in this area from progress and community alignment.
Enabling Medical Device Interoperability for the Integrated Clinical Environment
2016-12-01
else who is eager to work together to mature the healthcare technology ecosystem to enable the next generation of safe and intelligent medical device...Award Number: W81XWH-12-C-0154 TITLE: “Enabling Medical Device Interoperability for the Integrated Clinical Environment ” PRINCIPAL INVESTIGATOR...SUBTITLE 5a. CONTRACT NUMBER W81XWH-12-C-0154 “Enabling Medical Device Interoperability for the Integrated Clinical Environment ” 5b. GRANT NUMBER 5c
2014-01-01
The application of semantic technologies to the integration of biological data and the interoperability of bioinformatics analysis and visualization tools has been the common theme of a series of annual BioHackathons hosted in Japan for the past five years. Here we provide a review of the activities and outcomes from the BioHackathons held in 2011 in Kyoto and 2012 in Toyama. In order to efficiently implement semantic technologies in the life sciences, participants formed various sub-groups and worked on the following topics: Resource Description Framework (RDF) models for specific domains, text mining of the literature, ontology development, essential metadata for biological databases, platforms to enable efficient Semantic Web technology development and interoperability, and the development of applications for Semantic Web data. In this review, we briefly introduce the themes covered by these sub-groups. The observations made, conclusions drawn, and software development projects that emerged from these activities are discussed. PMID:24495517
Seeking the Path to Metadata Nirvana
NASA Astrophysics Data System (ADS)
Graybeal, J.
2008-12-01
Scientists have always found reusing other scientists' data challenging. Computers did not fundamentally change the problem, but enabled more and larger instances of it. In fact, by removing human mediation and time delays from the data sharing process, computers emphasize the contextual information that must be exchanged in order to exchange and reuse data. This requirement for contextual information has two faces: "interoperability" when talking about systems, and "the metadata problem" when talking about data. As much as any single organization, the Marine Metadata Interoperability (MMI) project has been tagged with the mission "Solve the metadata problem." Of course, if that goal is achieved, then sustained, interoperable data systems for interdisciplinary observing networks can be easily built: pesky metadata differences, like which protocol to use for data exchange, or what the data actually measures, will be a thing of the past. Alas, as you might imagine, there will always be complexities and incompatibilities that are not addressed, and data systems that are not interoperable, even within a science discipline. So should we throw up our hands and surrender to the inevitable? Not at all. Rather, we try to minimize metadata problems as much as we can. In this we steadily progress, despite natural forces that pull in the other direction. Computer systems let us work with more complexity, build community knowledge and collaborations, and preserve and publish our progress and (dis-)agreements. Funding organizations, science communities, and technologists see the importance of interoperable systems and metadata, and direct resources toward them. With the new approaches and resources, projects like IPY and MMI can simultaneously define, display, and promote effective strategies for sustainable, interoperable data systems. This presentation will outline the role metadata plays in durable interoperable data systems, for better or worse. It will describe times when "just choosing a standard" can work, and when it probably won't. And it will point out signs that suggest a metadata storm is coming to your community project, and how you might avoid it. From these lessons we will seek a path to producing interoperable, interdisciplinary, metadata-enlightened environment observing systems.
NASA Astrophysics Data System (ADS)
Oggioni, A.; Tagliolato, P.; Schleidt, K.; Carrara, P.; Grellet, S.; Sarretta, A.
2016-02-01
The state of the art in biodiversity data management unfortunately encompasses a plethora of diverse data formats. Compared to other research fields, there is a lack of harmonization and standardization of these data. While data from traditional biodiversity collections (e.g., from museums) can easily be represented by existing standards such as those provided by TDWG, the growing number of field observations stemming both from VGI activities (e.g., iNaturalist) and from automated systems (e.g., animal biotelemetry) would at the very least require upgrades of current formats. Moreover, from an eco-informatics perspective, the integration and use of data from different scientific fields is the norm (abiotic data, geographic information, etc.); the possibility of representing this information and biodiversity data in a homogeneous way would be an advantage for interoperability, allowing easy integration across environmental media. We discuss the possibility of exploiting the Open Geospatial Consortium/ISO standard Observations and Measurements (O&M) [1], a generic conceptual model developed for observation data but with strong analogies to the biodiversity-oriented OBOE ontology [2]. The applicability of OGC O&M for the provision of biodiversity occurrence data has been suggested by the INSPIRE Cross Thematic Working Group on Observations & Measurements [3], the INSPIRE Environmental Monitoring Facilities Thematic Working Group [4], and the New Zealand Environmental Information Interoperability Framework [5]. This approach, in our opinion, could be an advantage for the biodiversity community. We provide some examples of encoding biodiversity occurrence data using the O&M standard, in addition to highlighting the advantages offered by O&M in comparison to other representation formats. [1] Cox, S. (2013). Geographic information - Observations and measurements - OGC and ISO 19156. [2] Madin, J., Bowers, S., Schildhauer, M., Krivov, S., Pennington, D., & Villa, F. (2007). An ontology for describing and synthesizing ecological observation data. Ecological Informatics, 2(3), 279-296. [3] INSPIRE_D2.9_O&M_Guidelines_v2.0rc3.pdf [4] INSPIRE_DataSpecification_EF_v3.0.pdf [5] Watkins, A. (2012). Biodiversity Interoperability through Open Geospatial Standards
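To suggest what an O&M-shaped occurrence record might look like, here is a small sketch casting a species observation into the core ISO 19156 pattern (procedure, observed property, feature of interest, time, result); all field values are invented.

```python
# Sketch of a species occurrence in the O&M / ISO 19156 observation pattern.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Observation:                      # after the O&M conceptual model
    procedure: str                      # sensor, human observer, or protocol
    observed_property: str              # what was observed
    feature_of_interest: str            # the sampled feature (e.g., a site)
    phenomenon_time: datetime
    result: dict                        # O&M allows any result type

occ = Observation(
    procedure="urn:example:protocol:visual-census",
    observed_property="http://example.org/def/species-occurrence",
    feature_of_interest="urn:example:site:lagoon-12",
    phenomenon_time=datetime(2015, 6, 3, 9, 30, tzinfo=timezone.utc),
    result={"taxon": "Pinna nobilis", "count": 4},
)
print(occ)
```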
Achieving Interoperability in GEOSS - How Close Are We?
NASA Astrophysics Data System (ADS)
Arctur, D. K.; Khalsa, S. S.; Browdy, S. F.
2010-12-01
A primary goal of the Global Earth Observing System of Systems (GEOSS) is improving the interoperability between the observational, modelling, data assimilation, and prediction systems contributed by member countries. The GEOSS Common Infrastructure (GCI) comprises the elements designed to enable discovery of and access to these diverse data and information sources. But to what degree can the mechanisms for accessing these data, and the data themselves, be considered interoperable? Will the separate efforts by Communities of Practice within GEO to build their own portals, such as for Energy, Biodiversity, and Air Quality, lead to fragmentation or synergy? What communication and leadership do we need with these communities to improve interoperability both within and across them? The Standards and Interoperability Forum (SIF) of GEO's Architecture and Data Committee has assessed progress towards achieving the goal of global interoperability and made recommendations regarding the evolution of the architecture and overall data strategy to ensure fulfillment of the GEOSS vision. This presentation will highlight the results of this study and directions for further work.
Myneni, Sahiti; Patel, Vimla L.
2009-01-01
Biomedical researchers often have to work on massive, detailed, and heterogeneous datasets that raise new challenges of information management. This study reports an investigation into the nature of the problems faced by the researchers in two bioscience test laboratories when dealing with their data management applications. Data were collected using ethnographic observations, questionnaires, and semi-structured interviews. The major problems identified in working with these systems were related to data organization, publications, and collaboration. The interoperability standards were analyzed using a C4I framework at the level of connection, communication, consolidation, and collaboration. Such an analysis was found to be useful in judging the capabilities of data management systems at different levels of technological competency. While collaboration and system interoperability are the “must have” attributes of these biomedical scientific laboratory information management applications, usability and human interoperability are the other design concerns that must also be addressed for easy use and implementation. PMID:20351900
76 FR 71048 - Assistance to Firefighters Grant Program
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-16
... the highest-scoring applications for awards. Applications that involve interoperable communications... categories: (1) Activities designed to reach high-risk target groups and mitigate incidences of death and... communications project is consistent with the Statewide Communications Interoperability Plan (SCIP). If the State...
CCSDS Spacecraft Monitor and Control Mission Operations Interoperability Prototype
NASA Technical Reports Server (NTRS)
Lucord, Steve; Martinez, Lindolfo
2009-01-01
We are entering a new era in space exploration. Reduced operating budgets require innovative solutions to leverage existing systems to implement the capabilities of future missions. Custom solutions to fulfill mission objectives are no longer viable. Can NASA adopt international standards to reduce costs and increase interoperability with other space agencies? Can legacy systems be leveraged in a service oriented architecture (SOA) to further reduce operations costs? The Operations Technology Facility (OTF) at the Johnson Space Center (JSC) is collaborating with Deutsches Zentrum für Luft- und Raumfahrt (DLR) to answer these very questions. The Mission Operations and Information Management Services Area (MOIMS) Spacecraft Monitor and Control (SM&C) Working Group within the Consultative Committee for Space Data Systems (CCSDS) is developing the Mission Operations standards to address this problem space. The set of proposed standards presents a service oriented architecture to increase the level of interoperability among space agencies. The OTF and DLR are developing independent implementations of the standards as part of an interoperability prototype. This prototype will address three key components: validation of the SM&C Mission Operations protocol, exploration of the Object Management Group (OMG) Data Distribution Service (DDS), and the incorporation of legacy systems in a SOA. The OTF will implement the service providers described in the SM&C Mission Operations standards to create a portal for interaction with a spacecraft simulator. DLR will implement the service consumers to perform the monitor and control of the spacecraft. The specifications insulate the applications from the underlying transport layer. We will gain experience with a DDS transport layer as we delegate responsibility to the middleware and explore transport bridges to connect disparate middleware products. A SOA facilitates the reuse of software components. The prototype will leverage the capabilities of existing legacy systems. Various custom applications and middleware solutions will be combined into one system providing the illusion of a set of homogeneous services. This paper will document our journey as we implement the interoperability prototype. The team consists of software engineers with experience on the current command, telemetry and messaging systems that support the International Space Station (ISS) and Space Shuttle programs. Emphasis will be on the objectives, results and potential cost saving benefits.
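The transport insulation described above can be sketched as an abstract transport behind an MO-like service facade, so that a DDS binding could replace a test binding without touching application logic; all names below are illustrative and do not reproduce the SM&C MAL API.

```python
# Sketch of transport-insulated services: the service codes against an
# abstract Transport, so a DDS binding could be swapped in for the loopback.
from abc import ABC, abstractmethod

class Transport(ABC):
    @abstractmethod
    def send(self, topic: str, body: dict) -> None: ...

class LoopbackTransport(Transport):            # test stand-in for a DDS binding
    def __init__(self) -> None:
        self.delivered: list[tuple[str, dict]] = []
    def send(self, topic: str, body: dict) -> None:
        self.delivered.append((topic, body))

class ParameterService:                        # MO-like provider facade
    def __init__(self, transport: Transport) -> None:
        self._transport = transport
    def publish_value(self, name: str, value: float) -> None:
        self._transport.send("mo/parameter", {"name": name, "value": value})

svc = ParameterService(LoopbackTransport())
svc.publish_value("BATT_VOLT", 28.4)
```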
Maturity Model for Advancing Smart Grid Interoperability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Knight, Mark; Widergren, Steven E.; Mater, J.
2013-10-28
Interoperability is about the properties of devices and systems to connect and work properly. Advancing interoperability eases integration and maintenance of the resulting interconnection. This leads to faster integration, lower labor and component costs, predictability of projects and the resulting performance, and evolutionary paths for upgrade. When specifications are shared and standardized, competition and novel solutions can bring new value streams to the community of stakeholders involved. Advancing interoperability involves reaching agreement for how things join at their interfaces. The quality of the agreements and the alignment of parties involved in the agreement present challenges that are best met with process improvement techniques. The GridWise® Architecture Council (GWAC), sponsored by the United States Department of Energy, is supporting an effort to use concepts from capability maturity models used in the software industry to advance the interoperability of smart grid technology. An interoperability maturity model has been drafted and experience is being gained through trials on various types of projects and community efforts. This paper describes the value and objectives of maturity models, the nature of the interoperability maturity model and how it compares with other maturity models, and experiences gained with its use.
Metadata, Identifiers, and Physical Samples
NASA Astrophysics Data System (ADS)
Arctur, D. K.; Lenhardt, W. C.; Hills, D. J.; Jenkyns, R.; Stroker, K. J.; Todd, N. S.; Dassie, E. P.; Bowring, J. F.
2016-12-01
Physical samples are integral to much of the research conducted by geoscientists. The samples used in this research are often obtained at significant cost and represent an important investment for future research. However, making information about samples - whether considered data or metadata - available for researchers to enable discovery is difficult: a number of key elements related to samples - such as classification, location, sample type, sampling method, repository information, subsample distribution, and instrumentation - are difficult to characterize in common ways because they differ from one domain to the next. Unifying these elements or developing metadata crosswalks is needed. The iSamples (Internet of Samples) NSF-funded Research Coordination Network (RCN) is investigating ways to develop these types of interoperability and crosswalks. Within the iSamples RCN, one of its working groups, WG1, has focused on the metadata related to physical samples. This includes identifying existing metadata standards and systems, and how they might interoperate with the International Geo Sample Number (IGSN) schema (schema.igsn.org) in order to help inform leading practices for metadata. For example, we are examining lifecycle metadata beyond the IGSN 'birth certificate'. As a first step, this working group is developing a list of relevant standards and comparing their various attributes. In addition, the working group is looking toward technical solutions to facilitate developing a linked set of registries to build the web of samples. Finally, the group is also developing a comparison of sample identifiers and locators. This paper will provide an overview and comparison of the standards identified thus far, as well as an update on the technical solutions examined for integration. We will discuss how various sample identifiers might work in complementary fashion with the IGSN to more completely describe samples, facilitate retrieval of contextual information, and access research work on related samples. Finally, we welcome suggestions and community input to move physical sample unique identifiers forward.
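A hedged illustration of what a metadata crosswalk amounts to in practice: repository-specific field names are mapped onto common IGSN-style keys. The field names below are invented for the example and the IGSN schema is simplified to a flat mapping.

```python
# Hypothetical crosswalk: local repository field -> IGSN-style field.
IGSN_CROSSWALK = {
    "specimen_class": "sampleType",
    "lat": "latitude",
    "lon": "longitude",
    "collected_by": "collector",
    "method": "collectionMethod",
}

def to_igsn_record(local_record: dict) -> dict:
    """Map a repository-specific sample record onto common IGSN-style keys."""
    return {IGSN_CROSSWALK.get(key, key): value for key, value in local_record.items()}

print(to_igsn_record({"specimen_class": "core", "lat": 30.42, "lon": -87.2}))
```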
NASA Astrophysics Data System (ADS)
Graves, S. J.; Keiser, K.; Law, E.; Yang, C. P.; Djorgovski, S. G.
2016-12-01
ECITE (EarthCube Integration and Testing Environment) is providing both cloud-based computational testing resources and an Assessment Framework for Technology Interoperability and Integration. NSF's EarthCube program is funding the development of cyberinfrastructure building block components as technologies to address Earth science research problems. These EarthCube building blocks need to support integration and interoperability objectives to work towards a coherent cyberinfrastructure architecture for the program. ECITE is being developed to provide capabilities to test and assess the interoperability and integration across funded EarthCube technology projects. EarthCube-defined criteria for interoperability and integration are applied to use cases coordinating science problems with technology solutions. The Assessment Framework facilitates planning, execution and documentation of the technology assessments for review by the EarthCube community. This presentation will describe the components of ECITE and examine the methodology of crosswalking between science and technology use cases.
CCSDS SOIS Subnetwork Services: A First Reference Implementation
NASA Astrophysics Data System (ADS)
Gunes-Lasnet, S.; Notebaert, O.; Farges, P.-Y.; Fowell, S.
2008-08-01
The CCSDS SOIS working groups are developing a range of standards for spacecraft onboard interfaces with the intention of promoting reuse of hardware and software designs across a range of missions while enabling interoperability of onboard systems from diverse sources. The CCSDS SOIS working groups released their Red Books for both the Subnetwork and Application Support layers in June 2007. In order to allow the verification of these recommended standards and to pave the way for future implementation on board spacecraft, it is essential for these standards to be prototyped on a representative spacecraft platform, to provide valuable feedback to the SOIS working group. A first reference implementation of both Subnetwork and Application Support SOIS services over SpaceWire and the Mil-Std-1553 bus is thus being realised by SciSys Ltd and Astrium under an ESA contract.
System of Systems Technology Readiness Assessment
2007-09-01
...they are key convergence points for SoS and there may be no opportunity for changes to the interfaces without major impact to the entire SoS. ...(...Architecture Working Group, 1997): • Procedures: guidance that impacts system interoperability, including doctrine, mission, architectures, and standards...
Interoperability Is the Foundation for Successful Internet Telephony.
ERIC Educational Resources Information Center
Fromm, Larry
1997-01-01
More than 40 leading computer and telephony companies have united to lead the charge toward open standards and universal interoperability for Internet telephony products. The Voice over IP (VoIP) Forum is working to define technical guidelines for two-party, real-time communications over IP networks, including provisions for compatibility with…
EPA Scientific Knowledge Management Assessment and Needs
A series of activities have been conducted by a core group of EPA scientists from across the Agency. The activities were initiated in 2012 and the focus was to increase the reuse and interoperability of science software at EPA. The need for increased reuse and interoperability ...
Smart Grid Interoperability Maturity Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Widergren, Steven E.; Levinson, Alex; Mater, J.
2010-04-28
The integration of automation associated with electricity resources (including transmission and distribution automation and demand-side resources operated by end-users) is key to supporting greater efficiencies and incorporating variable renewable resources and electric vehicles into the power system. The integration problems faced by this community are analogous to those faced in the health industry, emergency services, and other complex communities with many stakeholders. To highlight this issue and encourage communication and the development of a smart grid interoperability community, the GridWise Architecture Council (GWAC) created an Interoperability Context-Setting Framework. This "conceptual model" has been helpful to explain the importance of organizational alignment in addition to technical and informational interface specifications for "smart grid" devices and systems. As a next step to building a community sensitive to interoperability, the GWAC is investigating an interoperability maturity model (IMM) based on work done by others to address similar circumstances. The objective is to create a tool or set of tools that encourages a culture of interoperability in this emerging community. The tools would measure status and progress, analyze gaps, and prioritize efforts to improve the situation.
Enabling model checking for collaborative process analysis: from BPMN to `Network of Timed Automata'
NASA Astrophysics Data System (ADS)
Mallek, Sihem; Daclin, Nicolas; Chapurlat, Vincent; Vallespir, Bruno
2015-04-01
Interoperability is a prerequisite for partners involved in performing collaboration. As a consequence, the lack of interoperability is now considered a major obstacle. The research work presented in this paper aims to develop an approach for specifying and verifying a set of interoperability requirements to be satisfied by each partner in the collaborative process prior to process implementation. To enable the verification of these interoperability requirements, it is necessary first and foremost to generate a model of the targeted collaborative process; for this research effort, the standardised language BPMN 2.0 is used. Afterwards, a verification technique must be introduced, and model checking is the preferred option herein. This paper focuses on the application of the model checker UPPAAL to verify interoperability requirements for the given collaborative process model. First, this entails translating the collaborative process model from BPMN into a UPPAAL modelling language called 'Network of Timed Automata'. Second, the interoperability requirements are formalised into properties in the dedicated UPPAAL language, i.e. the temporal logic TCTL.
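For readers unfamiliar with UPPAAL, the sketch below shows what formalised requirements of this kind look like: TCTL queries written to a query file and checked against the translated model with UPPAAL's command-line verifier. The automaton and location names (`PartnerA`, `PartnerB`, the clock `clk`) are hypothetical, not taken from the paper.

```python
# Illustrative only: interoperability requirements expressed as UPPAAL TCTL queries.
queries = [
    "A[] not deadlock",                       # the collaborative process never deadlocks
    "E<> PartnerB.Received",                  # a message sent by A can reach B
    "A[] (PartnerA.Sent imply (clk <= 10))",  # sends are handled within 10 time units
]
with open("interop_requirements.q", "w") as f:
    f.write("\n".join(queries) + "\n")

# Checked against the translated model with, e.g.:
#   verifyta collaborative_process.xml interop_requirements.q
```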
ERIC Educational Resources Information Center
Bourdon, Francoise
2005-01-01
The translation into French of the Encoded Archival Context (EAC) DTD tag library has been in progress for a few months. It is being carried out by a group of experts gathered by AFNOR, the French national standards agency. The main goal of this group is to foster the interoperability of authority data between archives, libraries and museums, and…
Putting the School Interoperability Framework to the Test
ERIC Educational Resources Information Center
Mercurius, Neil; Burton, Glenn; Hopkins, Bill; Larsen, Hans
2004-01-01
The Jurupa Unified School District in Southern California recently partnered with Microsoft, Dell and the Zone Integration Group for the implementation of a School Interoperability Framework (SIF) database repository model throughout the district (Magner 2002). A two-week project--the Integrated District Education Applications System, better known…
The Next Generation of Interoperability Agents in Healthcare
Cardoso, Luciana; Marins, Fernando; Portela, Filipe; Santos, Manuel; Abelha, António; Machado, José
2014-01-01
Interoperability in health information systems is increasingly a requirement rather than an option. Standards and technologies, such as multi-agent systems, have proven to be powerful tools for addressing interoperability issues. In the last few years, the authors have worked on developing the Agency for Integration, Diffusion and Archive of Medical Information (AIDA), which is an intelligent, agent-based platform to ensure interoperability in healthcare units. It is increasingly important to ensure the high availability and reliability of systems: the functions provided by the systems that handle interoperability cannot fail. This paper shows the importance of monitoring and controlling intelligent agents as a tool to anticipate problems in health information systems. The interaction between humans and agents through an interface that allows the user to create new agents easily and to monitor their activities in real time is also an important feature, as health systems evolve by adopting more features and solving new problems. A module was installed in Centro Hospitalar do Porto, increasing the functionality and the overall usability of AIDA. PMID:24840351
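A minimal sketch, not AIDA's actual implementation, of the monitoring idea the paper argues for: each agent reports a heartbeat, and a monitor flags agents that have gone quiet so problems can be anticipated rather than discovered after a failure. The agent name is invented.

```python
import time

class AgentMonitor:
    def __init__(self, timeout_s: float = 5.0):
        self.timeout_s = timeout_s
        self.last_seen: dict[str, float] = {}

    def heartbeat(self, agent_id: str) -> None:
        """Record that an agent is alive right now."""
        self.last_seen[agent_id] = time.monotonic()

    def silent_agents(self) -> list[str]:
        """Agents likely stalled: no heartbeat within the timeout window."""
        now = time.monotonic()
        return [a for a, t in self.last_seen.items() if now - t > self.timeout_s]

monitor = AgentMonitor(timeout_s=1.0)
monitor.heartbeat("hl7-interface-agent")   # hypothetical agent name
time.sleep(1.2)
print(monitor.silent_agents())             # ['hl7-interface-agent']
```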
The role of markup for enabling interoperability in health informatics.
McKeever, Steve; Johnson, David
2015-01-01
Interoperability is the faculty of making information systems work together. In this paper we will distinguish a number of different forms that interoperability can take and show how they are realized on a variety of physiological and health care use cases. The last 15 years have seen the rise of very cheap digital storage both on and off site. With the advent of the Internet of Things, people's expectations are for greater interconnectivity and seamless interoperability. The potential impact these technologies have on healthcare is dramatic: from improved diagnoses through immediate access to a patient's electronic health record, to in silico modeling of organs and early stage drug trials, to predictive medicine based on top-down modeling of disease progression and treatment. We will begin by looking at the underlying technology, classify the various kinds of interoperability that exist in the field, and discuss how they are realized. We conclude with a discussion on future possibilities that big data and further standardization will enable.
The HDF Product Designer - Interoperability in the First Mile
NASA Astrophysics Data System (ADS)
Lee, H.; Jelenak, A.; Habermann, T.
2014-12-01
Interoperable data have been a long-time goal in many scientific communities. The recent growth in analysis, visualization and mash-up applications that expect data stored in a standardized manner has brought the interoperability issue to the fore. On the other hand, producing interoperable data is often regarded as a sideline task in a typical research team, for which resources are not readily available. The HDF Group is developing a software tool aimed at lessening the burden of creating data in standards-compliant, interoperable HDF5 files. The tool, named HDF Product Designer, lowers the threshold needed to design such files by providing a user interface that combines the rich HDF5 feature set with applicable metadata conventions. Users can quickly devise new HDF5 files while at the same time seamlessly incorporating the latest best practices and conventions from their community. That is what the term interoperability in the first mile means: enabling generation of interoperable data in HDF5 files from the onset of their production. The tool also incorporates collaborative features, allowing a team approach to file design, as well as easy transfer of best practices as they are being developed. The current state of the tool and the plans for future development will be presented. Constructive input from interested parties is always welcome.
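This is not the HDF Product Designer itself, just a hedged sketch of the kind of output "interoperability in the first mile" implies: an HDF5 file whose datasets carry convention-style metadata from the moment of creation. The dataset name and CF-like attributes are illustrative.

```python
import h5py
import numpy as np

with h5py.File("example_product.h5", "w") as f:
    f.attrs["Conventions"] = "CF-1.6"   # convention declared up front, not bolted on later
    dset = f.create_dataset("sea_surface_temperature", data=np.random.rand(180, 360))
    dset.attrs["units"] = "kelvin"
    dset.attrs["long_name"] = "Sea Surface Temperature"
    dset.attrs["_FillValue"] = np.float64(-9999.0)
```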
Implementing Interoperability in the Seafood Industry: Learning from Experiences in Other Sectors.
Bhatt, Tejas; Gooch, Martin; Dent, Benjamin; Sylvia, Gilbert
2017-08-01
Interoperability of communication and information technologies within and between businesses operating along supply chains is being pursued and implemented in numerous industries worldwide to increase the efficiency and effectiveness of operations. The desire for greater interoperability is also driven by the need to reduce business risk through more informed management decisions. Interoperability is achieved by the development of a technology architecture that guides the design and implementation of communication systems existing within individual businesses and between businesses comprising the supply chain. Technology architectures are developed through a purposeful dialogue about why the architecture is required, the benefits and opportunities that the architecture offers the industry, and how the architecture will translate into practical results. An assessment of how the finance, travel, and health industries and a sector of the food industry (fresh produce) have implemented interoperability was conducted to identify lessons learned that can aid the development of interoperability in the seafood industry. The findings include identification of the need for strong, effective governance during the establishment and operation of an interoperability initiative to ensure the existence of common protocols and standards. The resulting insights were distilled into a series of principles for enabling syntactic and semantic interoperability in any industry, which we summarize in this article. Categorized as "structural," "operational," and "integrative," the principles describe requirements and solutions that are pivotal to enabling businesses to create and capture value from full chain interoperability. The principles are also fundamental to allowing governments and advocacy groups to use traceability for public good. © 2017 Institute of Food Technologists®.
Current Barriers to Large-scale Interoperability of Traceability Technology in the Seafood Sector.
Hardt, Marah J; Flett, Keith; Howell, Colleen J
2017-08-01
Interoperability is a critical component of full-chain digital traceability, but is almost nonexistent in the seafood industry. Using both quantitative and qualitative methodology, this study explores the barriers impeding progress toward large-scale interoperability among digital traceability systems in the seafood sector from the perspectives of seafood companies, technology vendors, and supply chains as a whole. We highlight lessons from recent research and field work focused on implementing traceability across full supply chains and make some recommendations for next steps in terms of overcoming challenges and scaling current efforts. © 2017 Institute of Food Technologists®.
The GIIDA (Management of the CNR Environmental Data for Interoperability) project
NASA Astrophysics Data System (ADS)
Nativi, S.
2009-04-01
This work presents GIIDA (Gestione Integrata e Interoperativa dei Dati Ambientali del CNR), an inter-departmental project of the Italian National Research Council (CNR). The project is an initiative of the Earth and Environment Department (Dipartimento Terra e Ambiente) of the CNR. The GIIDA mission is "To implement the Spatial Information Infrastructure (SII) of CNR for Environmental and Earth Observation data". The project aims to design and develop a multidisciplinary cyber-infrastructure for the management, processing and evaluation of Earth and environmental data. This infrastructure will contribute to the Italian presence in international projects and initiatives such as INSPIRE, GMES, GEOSS and SEIS. The main GIIDA goals are: • Networking: to create a network of CNR Institutes for implementing a common information space and sharing spatial resources. • Observation: re-engineering the environmental observation system of the CNR. • Modeling: re-engineering the environmental modeling system of the CNR. • Processing: re-engineering the environmental processing system of the CNR. • Mediation: to define mediation methods and instruments for implementing the international interoperability standards. The project started in July 2008, releasing a specification document of the GIIDA architecture for interoperability and security. Based on these documents, a Call for Proposals was issued in September 2008. GIIDA received 23 proposed pilots from 16 different Institutes belonging to five CNR Departments and from 15 non-CNR institutions (e.g. three Italian regional administrations, three national research centers, four universities, and some SMEs). These pilots were divided into seven main thematic areas/domains: • Biodiversity • Climate Change • Air Quality • Soil and Water Quality • Risks • Infrastructures for Research and Public Administrations • Sea and Marine Resources. Each of these thematic areas is covered by a Working Group which coordinates the activities and the achievements of the respective pilots. For each area, the Working Groups are called to develop: 1) a specific Web Portal; 2) a thematic catalog service; 3) a thematic thesaurus service; 4) a thematic Wiki; 5) standard access and view services for thematic resources, such as datasets, models, and processing services; and 6) a couple of significant use scenarios to be demonstrated.
Wireless Network Communications Overview for Space Mission Operations
NASA Technical Reports Server (NTRS)
Fink, Patrick W.
2009-01-01
The mission of the On-Board Wireless Working Group (WWG) is to serve as a general CCSDS focus group for intra-vehicle wireless technologies. The WWG investigates and makes recommendations pursuant to standardization of applicable wireless network protocols, ensuring the interoperability of independently developed wireless communication assets. This document presents technical background information concerning uses and applicability of wireless networking technologies for space missions. Agency-relevant driving scenarios, for which wireless network communications will provide a significant return-on-investment benefiting the participating international agencies, are used to focus the scope of the enclosed technical information.
Plugfest 2009: Global Interoperability in Telerobotics and Telemedicine
King, H. Hawkeye; Hannaford, Blake; Kwok, Ka-Wai; Yang, Guang-Zhong; Griffiths, Paul; Okamura, Allison; Farkhatdinov, Ildar; Ryu, Jee-Hwan; Sankaranarayanan, Ganesh; Arikatla, Venkata; Tadano, Kotaro; Kawashima, Kenji; Peer, Angelika; Schauß, Thomas; Buss, Martin; Miller, Levi; Glozman, Daniel; Rosen, Jacob; Low, Thomas
2014-01-01
Despite the great diversity of teleoperator designs and applications, their underlying control systems have many similarities. These similarities can be exploited to enable interoperability between heterogeneous systems. We have developed a network data specification, the Interoperable Telerobotics Protocol, that can be used for Internet-based control of a wide range of teleoperators. In this work we test interoperable telerobotics on the global Internet, focusing on the telesurgery application domain. Fourteen globally dispersed telerobotic master and slave systems were connected in thirty trials in one twenty-four-hour period. Users performed common manipulation tasks to demonstrate effective master-slave operation. With twenty-eight (93%) successful, unique connections, the results show a high potential for standardizing telerobotic operation. Furthermore, new paradigms for telesurgical operation and training are presented, including a networked surgery trainer and upper-limb exoskeleton control of micro-manipulators. PMID:24748993
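The actual Interoperable Telerobotics Protocol is not reproduced here; this is a hedged sketch of the underlying idea: heterogeneous master and slave systems can interoperate if they agree on a common wire format for teleoperation commands. All field names, the port, and the encoding are invented for the example.

```python
import json, socket

def make_command(seq: int, position, orientation, gripper: float) -> bytes:
    """Encode one master-side command packet as JSON over UDP (fields illustrative)."""
    return json.dumps({
        "seq": seq,          # sequence number for loss/reorder detection
        "pos": position,     # end-effector position [x, y, z] in metres
        "quat": orientation, # orientation quaternion [w, x, y, z]
        "grip": gripper,     # gripper command in [0, 1]
    }).encode("utf-8")

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(make_command(1, [0.10, 0.02, 0.30], [1, 0, 0, 0], 0.5),
            ("127.0.0.1", 9500))
```

Any slave that parses the same fields can execute the command, regardless of its internal controller design, which is what made the cross-vendor plugfest connections possible.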
Semantically Interoperable XML Data
Vergara-Niedermayr, Cristobal; Wang, Fusheng; Pan, Tony; Kurc, Tahsin; Saltz, Joel
2013-01-01
XML is ubiquitously used as an information exchange platform for web-based applications in healthcare, life sciences, and many other domains. Proliferating XML data are now managed through the latest native XML database technologies. XML data sources conforming to common XML schemas could be shared and integrated with syntactic interoperability. Semantic interoperability can be achieved through semantic annotations of data models using common data elements linked to concepts from ontologies. In this paper, we present a framework and software system to support the development of semantically interoperable XML-based data sources that can be shared through a Grid infrastructure. We also present our work on supporting semantically validated XML data through semantic annotations for XML Schema, semantic validation and semantic authoring of XML data. We demonstrate the use of the system for a biomedical database of medical image annotations and markups. PMID:25298789
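A hedged illustration of the annotation approach: an XML element is tied to an ontology concept via an annotation attribute, so consumers agree on meaning as well as syntax. The namespace URI and concept identifier below are made up for the example, not taken from the paper's system.

```python
import xml.etree.ElementTree as ET

ANN = "http://example.org/semantic-annotation"   # hypothetical annotation namespace
ET.register_namespace("ann", ANN)

obs = ET.Element("observation")
value = ET.SubElement(obs, "value", {
    # link the element to an ontology concept so consumers agree on its meaning
    f"{{{ANN}}}concept": "EX:0001234",           # illustrative concept identifier
    "unit": "mmHg",
})
value.text = "120"
print(ET.tostring(obs, encoding="unicode"))
```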
Architecture for interoperable software in biology.
Bare, James Christopher; Baliga, Nitin S
2014-07-01
Understanding biological complexity demands a combination of high-throughput data and interdisciplinary skills. One way to bring to bear the necessary combination of data types and expertise is by encapsulating domain knowledge in software and composing that software to create a customized data analysis environment. To this end, simple flexible strategies are needed for interconnecting heterogeneous software tools and enabling data exchange between them. Drawing on our own work and that of others, we present several strategies for interoperability and their consequences, in particular, a set of simple data structures (list, matrix, network, table, and tuple) that have proven sufficient to achieve a high degree of interoperability. We provide a few guidelines for the development of future software that will function as part of an interoperable community of software tools for biological data analysis and visualization. © The Author 2012. Published by Oxford University Press.
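A small sketch of the paper's point that a handful of plain data structures go a long way: here a "table" (a list of records) of interactions becomes a "network" (an adjacency mapping) that a different tool could consume directly. The gene names and weights are invented.

```python
from collections import defaultdict

# "Table": one interaction per row, as another tool might export it.
table = [
    {"source": "geneA", "target": "geneB", "weight": 0.9},
    {"source": "geneA", "target": "geneC", "weight": 0.4},
]

# "Network": weighted adjacency structure built from the table.
network = defaultdict(dict)
for row in table:
    network[row["source"]][row["target"]] = row["weight"]

print(dict(network))  # {'geneA': {'geneB': 0.9, 'geneC': 0.4}}
```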
EVA safety: Space suit system interoperability
NASA Technical Reports Server (NTRS)
Skoog, A. I.; McBarron, J. W.; Abramov, L. P.; Zvezda, A. O.
1995-01-01
The results and recommendations of the International Academy of Astronautics extravehicular activities (IAA EVA) Committee's work are presented. IAA EVA protocols and operations were analyzed with a view to harmonization procedures and the standardization of safety-critical and operationally important interfaces. The key role of EVA was considered, along with ways to improve the situation based on the identified EVA space suit system interoperability deficiencies.
Saleh, Kutaiba; Stucke, Stephan; Uciteli, Alexandr; Faulbrück-Röhr, Sebastian; Neumann, Juliane; Tahar, Kais; Ammon, Danny; Schmidt, Tobias; Neumuth, Thomas; Besting, Andreas; Portheine, Frank; Herre, Heinrich; Kaeding, André; Specht, Martin
2017-01-01
With the growing strain on medical staff and the increasing complexity of patient care, the risk of medical errors increases. In this work we present the use of Fast Healthcare Interoperability Resources (FHIR) as the communication standard for the integration of an ontology- and agent-based system to identify risks across medical processes in a clinical environment.
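For context, a minimal FHIR-style Observation resource of the kind such an integration would exchange; the values and patient reference are illustrative and are not taken from the paper.

```python
import json

observation = {
    "resourceType": "Observation",
    "status": "final",
    "code": {"coding": [{
        "system": "http://loinc.org",
        "code": "8867-4",                # LOINC code for heart rate
        "display": "Heart rate",
    }]},
    "subject": {"reference": "Patient/example"},
    "valueQuantity": {"value": 44, "unit": "beats/minute"},
}
print(json.dumps(observation, indent=2))
```

Because every system on the bus parses the same resource structure, a risk-identification agent can inspect observations from any source without source-specific adapters.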
NASA Astrophysics Data System (ADS)
Yang, Gongping; Zhou, Guang-Tong; Yin, Yilong; Yang, Xiukun
2010-12-01
A critical step in an automatic fingerprint recognition system is the segmentation of fingerprint images. Existing methods are usually designed to segment fingerprint images originating from a certain sensor. Thus their performance is significantly affected when dealing with fingerprints collected by different sensors. This work studies the sensor interoperability of fingerprint segmentation algorithms, which refers to an algorithm's ability to adapt to the raw fingerprints obtained from different sensors. We empirically analyze the sensor interoperability problem, and effectively address the issue by proposing a k-means based segmentation method called SKI. SKI clusters foreground and background blocks of a fingerprint image based on the k-means algorithm, where a fingerprint block is represented by a 3-dimensional feature vector consisting of block-wise coherence, mean, and variance (abbreviated as CMV). SKI also employs morphological postprocessing to achieve favorable segmentation results. We perform SKI on each fingerprint to ensure sensor interoperability. The interoperability and robustness of our method are validated by experiments performed on a number of fingerprint databases which were obtained from various sensors.
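A hedged sketch of the SKI idea, not the authors' code: describe each image block by (coherence, mean, variance) and cluster blocks into foreground and background with k-means. The coherence computation here is a simplified gradient-based proxy, and the block size is an assumption.

```python
import numpy as np
from sklearn.cluster import KMeans

def block_features(img: np.ndarray, b: int = 16) -> np.ndarray:
    """Per-block CMV features; coherence is the usual gradient-anisotropy measure."""
    feats = []
    for i in range(0, img.shape[0] - b + 1, b):
        for j in range(0, img.shape[1] - b + 1, b):
            blk = img[i:i + b, j:j + b].astype(float)
            gy, gx = np.gradient(blk)
            gxx, gyy, gxy = (gx * gx).sum(), (gy * gy).sum(), (gx * gy).sum()
            coh = np.sqrt((gxx - gyy) ** 2 + 4 * gxy ** 2) / (gxx + gyy + 1e-9)
            feats.append([coh, blk.mean(), blk.var()])
    return np.array(feats)

img = np.random.rand(128, 128)   # stand-in for a real fingerprint image
labels = KMeans(n_clusters=2, n_init=10).fit_predict(block_features(img))
print(labels[:10])               # 0/1 per block: foreground vs. background
```

Because the clustering is performed per image, the foreground/background decision adapts to each sensor's intensity statistics, which is the interoperability argument.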
IHE based interoperability - benefits and challenges.
Wozak, Florian; Ammenwerth, Elske; Hörbst, Alexander; Sögner, Peter; Mair, Richard; Schabetsberger, Thomas
2008-01-01
Optimized workflows and communication between institutions involved in a patient's treatment process can lead to improved quality and efficiency in the healthcare sector. Electronic Health Records (EHRs) provide patient-centered access to clinical data across institutional boundaries, supporting the above-mentioned aspects. Interoperability is regarded as a vital success factor; however, a clear definition of interoperability does not exist. The aim of this work is to define and to assess interoperability criteria as required for EHRs. The definition and assessment of interoperability criteria is supported by the analysis of existing literature and personal experience as well as by discussions with several domain experts. Criteria for interoperability address the following aspects: interfaces, semantics, legal and organizational aspects, and security. The Integrating the Healthcare Enterprise (IHE) initiative's profiles make a major contribution to these aspects, but they also raise new problems. Flexibility for adaptation to different organizational, regional or other specific conditions is missing. Regional or national initiatives should have the possibility of realizing their specific needs within the boundaries of IHE profiles. Security is so far an optional element, which is one of IHE's greatest omissions; an integrated security approach seems preferable. Irrespective of the practical significance of the IHE profiles so far, it appears to be of great importance that the profiles are constantly checked against practical experience and continuously adapted.
An Interoperability Framework and Capability Profiling for Manufacturing Software
NASA Astrophysics Data System (ADS)
Matsuda, M.; Arai, E.; Nakano, N.; Wakai, H.; Takeda, H.; Takata, M.; Sasaki, H.
ISO/TC184/SC5/WG4 is working on ISO 16100: Manufacturing software capability profiling for interoperability. This paper reports on a manufacturing software interoperability framework and a capability profiling methodology which were proposed and developed through this international standardization activity. Within the context of a manufacturing application, a manufacturing software unit is considered to be capable of performing a specific set of functions defined by a manufacturing software system architecture. A manufacturing software interoperability framework consists of a set of elements and rules for describing the capability of software units to support the requirements of a manufacturing application. The capability profiling methodology makes use of the domain-specific attributes and methods associated with each specific software unit to describe capability profiles in terms of unit name, manufacturing functions, and other needed class properties. In this methodology, manufacturing software requirements are expressed in terms of software unit capability profiles.
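A hypothetical rendering of a capability profile as a data structure, mirroring only the fields the abstract mentions (unit name, manufacturing functions, class properties); the real standard defines an XML-based template, so this is an illustration of the matching idea, not of ISO 16100 itself.

```python
from dataclasses import dataclass, field

@dataclass
class CapabilityProfile:
    unit_name: str
    manufacturing_functions: list[str]
    class_properties: dict[str, str] = field(default_factory=dict)

    def supports(self, required: set[str]) -> bool:
        """A requirement matches if every required function is offered by the unit."""
        return required <= set(self.manufacturing_functions)

scheduler = CapabilityProfile(
    unit_name="CellScheduler",                        # hypothetical software unit
    manufacturing_functions=["dispatching", "sequencing"],
    class_properties={"vendor": "ExampleCorp", "version": "2.1"},
)
print(scheduler.supports({"sequencing"}))  # True: the unit satisfies the requirement
```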
An Architecture for Semantically Interoperable Electronic Health Records.
Toffanello, André; Gonçalves, Ricardo; Kitajima, Adriana; Puttini, Ricardo; Aguiar, Atualpa
2017-01-01
Despite the increasing adoption of electronic health records, the challenge of semantic interoperability remains unsolved. The fact that different parties can exchange messages does not mean they can understand the underlying clinical meaning; semantic interoperability therefore cannot simply be assumed. This work introduces an architecture designed to achieve semantic interoperability, in which organizations that follow different policies may still share medical information through a common infrastructure comparable to an ecosystem, whose organisms are exemplified within the Brazilian scenario. The proposed approach describes a service-oriented design with modules adaptable to different contexts. We also discuss the establishment of an enterprise service bus to mediate a health infrastructure defined on top of international standards, such as openEHR and IHE. Moreover, we argue that, in order to achieve truly semantic interoperability in a wide sense, a proper profile must be published and maintained.
Medical Device Plug-and-Play Interoperability Standards & Technology Leadership
2011-10-01
...biomedical engineering students completed their senior design project on the X-Ray / Ventilator Use Case. We worked closely with the students to... "Supporting Medical Device Adverse Event Analysis in an Interoperable Clinical Environment: Design of a Data Logging and Playback System," publication in...
A distributed component framework for science data product interoperability
NASA Technical Reports Server (NTRS)
Crichton, D.; Hughes, S.; Kelly, S.; Hardman, S.
2000-01-01
Correlation of science results from multi-disciplinary communities is a difficult task. Traditionally, data from science missions are archived in proprietary data systems that are not interoperable. The Object Oriented Data Technology (OODT) task at the Jet Propulsion Laboratory is working on building a distributed product server, as part of a distributed component framework, to allow heterogeneous data systems to communicate and share scientific results.
Evaluating Sustainability Models for Interoperability through Brokering Software
NASA Astrophysics Data System (ADS)
Pearlman, Jay; Benedict, Karl; Best, Mairi; Fyfe, Sue; Jacobs, Cliff; Michener, William; Nativi, Stefano; Powers, Lindsay; Turner, Andrew
2016-04-01
Sustainability of software and research support systems is an element of innovation that is not often discussed. Yet, sustainment is essential if we expect research communities to make the time investment to learn and adopt new technologies. As the Research Data Alliance (RDA) is developing new approaches to interoperability, the question of uptake and sustainability is important. Brokering software sustainability is one of the areas that is being addressed in RDA. The Business Models Team of the Research Data Alliance Brokering Governance Working Group examined several support models proposed to promote the long-term sustainability of brokering middleware. The business model analysis includes examination of funding source, implementation frameworks and challenges, and policy and legal considerations. Results of this comprehensive analysis highlight advantages and disadvantages of the various models with respect to the specific requirements for brokering services. We offer recommendations based on the outcomes of this analysis that suggest that hybrid funding models present the most likely avenue to long term sustainability.
NASA Technical Reports Server (NTRS)
Bradford, Robert N.; Nichols, Kelvin F.
2006-01-01
To date, very little effort has been made to provide interoperability between various space agency projects. To effectively get to the Moon and beyond, systems must interoperate. To provide interoperability, standardization and registries of various technologies will be required. These registries will be created as they relate to space flight. With the new NASA Moon/Mars initiative, a requirement to standardize and control the naming conventions of very disparate systems and technologies is emerging. The need to provide numbering to the many processes, schemas, vehicles, robots, space suits and technologies (e.g. versions), to name a few, in the highly complex Constellation Initiative is imperative. The number of corporations, developer personnel, system interfaces, and people interfaces will require standardization and registries on a scale not currently envisioned. It would only take one exception (stove-piped system development) to weaken, if not destroy, interoperability. To start, a standardized registry process must be defined that allows many differing engineers, organizations and operators the ability to easily access disparate registry information across numerous technological and scientific disciplines. Once registries are standardized, the need to provide registry support in terms of setup and operations, resolution of conflicts between registries and other issues will need to be addressed. Registries should not be confused with repositories: no end-user data is "stored" in a registry, nor is it a configuration control system. Once a registry standard is created and approved, the technologies that should be registered must be identified and prioritized. In this paper, we will identify and define a registry process that is compatible with the Constellation Initiative and other unrelated space activities and organizations. We will then identify and define the various technologies that should use a registry to provide interoperability. The first set of technologies will be those that are currently in need of expansion, namely the assignment of satellite designations and the process which controls assignments. Second, we will analyze the technologies currently standardized under the Consultative Committee for Space Data Systems (CCSDS) banner. Third, we will analyze the current CCSDS working group and birds-of-a-feather activities to ascertain registry requirements. Lastly, we will identify technologies that are either currently under the auspices of another
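A purely illustrative sketch of the registry concept argued for above: namespaced names resolve to authoritative metadata, and collisions are rejected at registration time. The namespace and entry names are invented.

```python
class Registry:
    def __init__(self, namespace: str):
        self.namespace = namespace
        self._entries: dict[str, dict] = {}

    def register(self, name: str, metadata: dict) -> str:
        """Reject duplicates so every registered name stays authoritative."""
        if name in self._entries:
            raise ValueError(f"{name} already registered in {self.namespace}")
        self._entries[name] = metadata
        return f"{self.namespace}:{name}"   # fully qualified, globally unique

    def resolve(self, name: str) -> dict:
        return self._entries[name]

suits = Registry("constellation.eva.suit")  # hypothetical namespace
print(suits.register("SuitMk2", {"version": "3", "owner": "NASA"}))
```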
Sharing and interoperation of Digital Dongying geospatial data
NASA Astrophysics Data System (ADS)
Zhao, Jun; Liu, Gaohuan; Han, Lit-tao; Zhang, Rui-ju; Wang, Zhi-an
2006-10-01
The Digital Dongying project was put forward by Dongying city, Shandong province, and authenticated by the Ministry of Information Industry, the Ministry of Science and Technology and the Ministry of Construction of P.R. China in 2002. After five years of building, the informationization level of Dongying has reached an advanced degree. In order to advance the building of Digital Dongying and to realize geospatial data sharing, geographic information sharing standards were first drawn up and applied. Secondly, the Digital Dongying Geographic Information Sharing Platform was constructed and developed: a highly integrated platform of WebGIS, 3S (GIS, GPS, RS), object-oriented RDBMS, Internet, DCOM, etc. It provides an indispensable platform for the sharing and interoperation of Digital Dongying geospatial data. According to the standards, and based on the platform, sharing and interoperation of Digital Dongying geospatial data have come into practice and good results have been obtained. However, a strong leadership group is necessary for data sharing and interoperation.
USGEO Common Framework For Earth Observation Data
NASA Astrophysics Data System (ADS)
Walter, J.; de la Beaujardiere, J.; Bristol, S.
2015-12-01
The United States Group on Earth Observations (USGEO) Data Management Working Group (DMWG) is an interagency body established by the White House Office of Science and Technology Policy (OSTP). The primary purpose of this group is to foster interagency cooperation and collaboration for improving the life cycle data management practices and interoperability of federally held earth observation data consistent with White House documents including the National Strategy for Civil Earth Observations, the National Plan for Civil Earth Observations, and the May 2013 Executive Order on Open Data (M-13-13). The members of the USGEO DMWG are working on developing a Common Framework for Earth Observation Data that consists of recommended standards and approaches for realizing these goals as well as improving the discoverability, accessibility, and usability of federally held earth observation data. These recommendations will also guide work being performed under the Big Earth Data Initiative (BEDI). This talk will summarize the Common Framework, the philosophy behind it, and next steps forward.
An HL7/CDA Framework for the Design and Deployment of Telemedicine Services
2001-10-25
...schemes and prescription databases. Furthermore, interoperability with the Electronic Health Record (EHR) facilitates automatic retrieval of relevant... local EHR system or the integrated electronic health record (I-EHR) [9], which indexes all medical contacts of a patient in the regional network... suspected medical problem. Interoperability with middleware services of the HII and other data sources such as the local EHR system affects...
Solving Identity Management and Interoperability Problems at Pan-European Level
NASA Astrophysics Data System (ADS)
Sánchez García, Sergio; Gómez Oliva, Ana
In a globalized digital world, it is essential for persons and entities to have a recognized and unambiguous electronic identity that allows them to communicate with one another. The management of this identity by public administrations is an important challenge that becomes even more crucial when interoperability among public administrations of different countries becomes necessary, as persons and entities have different credentials depending on their own national legal frameworks. More specifically, different credentials and legal frameworks cause interoperability problems that prevent reliable access to public services in cross-border scenarios like today's European Union. The work in this doctoral thesis analyzes the problem in detail by studying existing proposals (basically in Europe), proposing improvements to the defined architectures, and performing practical work to test the viability of solutions. Moreover, this thesis also addresses the long-standing security problem of identity delegation, which is especially important in complex and heterogeneous service delivery environments like those mentioned above. This is a position paper.
Archive interoperability in the Virtual Observatory
NASA Astrophysics Data System (ADS)
Genova, Françoise
2003-02-01
The main goals of Virtual Observatory projects are to build interoperability between astronomical on-line services, observatory archives, databases and results published in journals, and to develop tools permitting the best scientific usage of the very large data sets stored in observatory archives and produced by large surveys. The different Virtual Observatory projects collaborate to define common exchange standards, which are the key for a truly International Virtual Observatory: for instance, their first common milestone has been a standard allowing the exchange of tabular data, called VOTable. The Interoperability Work Area of the European Astrophysical Virtual Observatory project aims at networking European archives, by building a prototype using the CDS VizieR and Aladin tools, and at defining basic rules to help archive providers with interoperability implementation. The prototype is accessible for scientific usage, to get user feedback (and science results!) at an early stage of the project. The ISO archive participates very actively in this endeavour, and more generally in information networking. The on-going inclusion of the ISO log in SIMBAD will allow higher-level links for users.
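For illustration, a minimal VOTable document of the kind exchanged between VO services, built here with the standard library rather than a dedicated VO package; the table contents (one sky position) are invented.

```python
import xml.etree.ElementTree as ET

votable = ET.Element("VOTABLE", version="1.3")
table = ET.SubElement(ET.SubElement(votable, "RESOURCE"), "TABLE", name="results")
ET.SubElement(table, "FIELD", name="ra", datatype="double", unit="deg")
ET.SubElement(table, "FIELD", name="dec", datatype="double", unit="deg")
tabledata = ET.SubElement(ET.SubElement(table, "DATA"), "TABLEDATA")
tr = ET.SubElement(tabledata, "TR")
for value in ("83.633", "22.014"):   # one illustrative sky position per row
    ET.SubElement(tr, "TD").text = value
print(ET.tostring(votable, encoding="unicode"))
```

Because every VO service emits and parses this same structure, a table produced by one archive can be loaded directly by another project's tools, which is exactly the interoperability the standard buys.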
A SOA-Based Platform to Support Clinical Data Sharing.
Gazzarata, R; Giannini, B; Giacomini, M
2017-01-01
The eSource Data Interchange Group, part of the Clinical Data Interchange Standards Consortium, proposed five scenarios to guide stakeholders in the development of solutions for the capture of eSource data. The fifth scenario was subdivided into four tiers to adapt the functionality of electronic health records to support clinical research. In order to develop a system belonging to the "Interoperable" tier, the authors decided to adopt the service-oriented architecture paradigm to support technical interoperability, Health Level Seven Version 3 messages combined with the LOINC (Logical Observation Identifiers Names and Codes) vocabulary to ensure semantic interoperability, and Healthcare Services Specification Project standards to provide process interoperability. The developed architecture enhances the integration between patient-care practice and medical research, allowing clinical data sharing between two hospital information systems and four clinical data management systems/clinical registries. The core is formed by a set of standardized cloud services connected through standardized interfaces, involving client applications. The system was approved by the medical staff, since it reduces the workload for the management of clinical trials. Although this architecture can realize the "Interoperable" tier, the current solution actually covers the "Connected" tier, due to local hospital policy restrictions.
Regional interoperability: making systems connect in complex disasters.
Briggs, Susan Miller
2009-08-01
Effective use of the Incident Command System (ICS) is the key to regional interoperability. Many different organizations with different command structures and missions respond to a disaster. The ICS allows different kinds of agencies (fire, police, and medical) to work together effectively in response to a disaster. Functional requirements, not titles, determine the organizational hierarchy of the ICS structure. The ICS is a modular/adaptable system for all disasters regardless of etiology and for all organizations regardless of size.
NASA's Earth Science Data Systems Standards Process Experiences
NASA Technical Reports Server (NTRS)
Ullman, Richard E.; Enloe, Yonsook
2007-01-01
NASA has impaneled several internal working groups to provide recommendations to NASA management on ways to evolve and improve Earth Science Data Systems. One of these working groups is the Standards Process Group (SPG). The SPG is drawn from NASA-funded Earth Science Data Systems stakeholders, and it directs a process of community review and evaluation of proposed NASA standards. The working group's goal is to promote interoperability and interuse of NASA Earth Science data through broader use of standards that have proven implementation and operational benefit to NASA Earth science, by facilitating NASA management endorsement of proposed standards. The SPG now has two years of experience with this approach to the identification of standards. We will discuss real examples of the different types of candidate standards that have been proposed to NASA's Standards Process Group, such as OPeNDAP's Data Access Protocol, the Hierarchical Data Format, and the Open Geospatial Consortium's Web Map Server. Each of the three types of proposals requires a different sort of criteria for understanding the broad concepts of "proven implementation" and "operational benefit" in the context of NASA Earth Science data systems. We will discuss how our Standards Process has evolved through our experiences with the three candidate standards.
NASA Astrophysics Data System (ADS)
Tomas, Robert; Harrison, Matthew; Barredo, José I.; Thomas, Florian; Llorente Isidro, Miguel; Cerba, Otakar; Pfeiffer, Manuela
2014-05-01
The vast amount of information and data necessary for comprehensive hazard and risk assessment presents many challenges regarding the lack of accessibility, comparability, quality, organisation and dissemination of natural hazards spatial data. In order to mitigate these limitations, an interoperable framework has been developed as part of the legally binding Implementing Rules of the EU INSPIRE Directive1*, which aims at the establishment of the European Spatial Data Infrastructure. The interoperability framework is described in the Data Specification on Natural Risk Zones - Technical Guidelines (DS) document2*, finalized and published on 10 December 2013. This framework provides means for facilitating access, integration, harmonisation and dissemination of natural hazard data from different domains and sources. The objective of this paper is twofold. First, the paper demonstrates the applicability of the interoperable framework developed in the DS and highlights the key aspects of interoperability for the various natural hazards communities. Second, the paper "translates" into common language the main features and potential of the interoperable framework of the DS for a wider audience of scientists and practitioners in the natural hazards domain. The paper then presents the five main aspects of the interoperable framework. First, the issue of a common terminology for the natural hazards domain is addressed. Second, a common data model to facilitate cross-domain data integration is described. Third, the common methodology developed to provide qualitative or quantitative assessments of natural hazards is presented. Fourth, the extensible classification schema for natural hazards, developed from a literature review and key reference documents from the contributing community of practice, is shown. Finally, the applicability of the interoperable framework for the various stakeholder groups is presented. The paper closes by discussing open issues and next steps regarding the sustainability and evolution of the interoperable framework, and missing aspects such as multi-hazard and multi-risk. --------------- 1*INSPIRE - Infrastructure for spatial information in Europe, http://inspire.ec.europa.eu 2*http://inspire.jrc.ec.europa.eu/documents/Data_Specifications/INSPIRE_DataSpecification_NZ_v3.0.pdf
MENTOR: an enabler for interoperable intelligent systems
NASA Astrophysics Data System (ADS)
Sarraipa, João; Jardim-Goncalves, Ricardo; Steiger-Garcao, Adolfo
2010-07-01
A community whose knowledge organisation is based on ontologies will enable an increase in the computational intelligence of its information systems. However, due to the worldwide diversity of communities, a high number of knowledge representation elements which are not semantically coincident have appeared to represent the same segment of reality, and these have become a barrier to business communications. Even if a domain community uses the same kind of technologies in its information systems, such as ontologies, this does not resolve its semantic differences. In order to solve this interoperability problem, one solution is to use a reference ontology as an intermediary in the communications between the community enterprises and the outside, while allowing the enterprises to keep their own ontology and semantics unchanged internally. This work proposes MENTOR, a methodology to support the development of a common reference ontology for a group of organisations sharing the same business domain. This methodology is based on the mediator ontology (MO) concept, which assists the semantic transformations between each enterprise's ontology and the reference one. The MO enables each organisation to keep its own terminology, glossary and ontological structures, while providing seamless communication and interaction with the others.
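A sketch of the mediator-ontology idea under invented names: each enterprise keeps its own terms, and the mediator translates between them through the shared reference term, so neither side has to change its internal vocabulary.

```python
# Hypothetical term mappings: local term -> reference-ontology term.
TO_REFERENCE = {
    "enterpriseA": {"client": "customer", "ware": "product"},
    "enterpriseB": {"buyer": "customer", "item": "product"},
}
# Inverted mappings: reference term -> local term, per enterprise.
FROM_REFERENCE = {
    ent: {ref: local for local, ref in mapping.items()}
    for ent, mapping in TO_REFERENCE.items()
}

def translate(term: str, source: str, target: str) -> str:
    """Translate a term between two enterprises via the reference ontology."""
    reference = TO_REFERENCE[source][term]
    return FROM_REFERENCE[target][reference]

print(translate("client", source="enterpriseA", target="enterpriseB"))  # buyer
```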
The 2006 NESCent Phyloinformatics Hackathon: A Field Report
Lapp, Hilmar; Bala, Sendu; Balhoff, James P.; Bouck, Amy; Goto, Naohisa; Holder, Mark; Holland, Richard; Holloway, Alisha; Katayama, Toshiaki; Lewis, Paul O.; Mackey, Aaron J.; Osborne, Brian I.; Piel, William H.; Kosakovsky Pond, Sergei L.; Poon, Art F.Y.; Qiu, Wei-Gang; Stajich, Jason E.; Stoltzfus, Arlin; Thierer, Tobias; Vilella, Albert J.; Vos, Rutger A.; Zmasek, Christian M.; Zwickl, Derrick J.; Vision, Todd J.
2007-01-01
In December, 2006, a group of 26 software developers from some of the most widely used life science programming toolkits and phylogenetic software projects converged on Durham, North Carolina, for a Phyloinformatics Hackathon, an intense five-day collaborative software coding event sponsored by the National Evolutionary Synthesis Center (NESCent). The goal was to help researchers to integrate multiple phylogenetic software tools into automated workflows. Participants addressed deficiencies in interoperability between programs by implementing “glue code” and improving support for phylogenetic data exchange standards (particularly NEXUS) across the toolkits. The work was guided by use-cases compiled in advance by both developers and users, and the code was documented as it was developed. The resulting software is freely available for both users and developers through incorporation into the distributions of several widely-used open-source toolkits. We explain the motivation for the hackathon, how it was organized, and discuss some of the outcomes and lessons learned. We conclude that hackathons are an effective mode of solving problems in software interoperability and usability, and are underutilized in scientific software development.
2012-08-01
Electronics and Architecture (VEA) Mini-Symposium, August 14-16, Troy, Michigan: "Performance of an Embedded Platform Aggregating and Executing..." ... The Vehicular Integration for C4ISR/EW Interoperability (VICTORY) Standard adopts many...
Secure, Autonomous, Intelligent Controller for Integrating Distributed Sensor Webs
NASA Technical Reports Server (NTRS)
Ivancic, William D.
2007-01-01
This paper describes the infrastructure and protocols necessary to enable near-real-time commanding, access to space-based assets, and secure interoperation between sensor webs owned and controlled by various entities. Select terrestrial and aeronautics-based sensor webs will be used to demonstrate time-critical interoperability between integrated, intelligent sensor webs, both among terrestrial assets and between terrestrial and space-based assets. For this work, a Secure, Autonomous, Intelligent Controller and knowledge generation unit is implemented using Virtual Mission Operation Center technology.
Community for Data Integration 2014 annual report
Langseth, Madison L.; Chang, Michelle Y.; Carlino, Jennifer; Birch, Daniella D.; Bradley, Joshua; Bristol, R. Sky; Conzelmann, Craig; Diehl, Robert H.; Earle, Paul S.; Ellison, Laura E.; Everette, Anthony L.; Fuller, Pamela L.; Gordon, Janice M.; Govoni, David L.; Guy, Michelle R.; Henkel, Heather S.; Hutchison, Vivian B.; Kern, Tim; Lightsom, Frances L.; Long, Joseph W.; Longhenry, Ryan; Preston, Todd M.; Smith, Stan W.; Viger, Roland J.; Wesenberg, Katherine; Wood, Eric C.
2015-10-02
To achieve these goals, the CDI operates within four applied areas: monthly forums, annual workshop/webinar series, working groups, and projects. The monthly forums, also known as the Opportunity/Challenge of the Month, provide an open dialogue to share and learn about data integration efforts or to present problems that invite the community to offer solutions, advice, and support. Since 2010, the CDI has also sponsored annual workshops/webinar series to encourage the exchange of ideas, sharing of activities, presentations of current projects, and networking among members. Stemming from common interests, the working groups are focused on efforts to address data management and technical challenges including the development of standards and tools, improving interoperability and information infrastructure, and data preservation within USGS and its partners. The growing support for the activities of the working groups led to the CDI’s first formal request for proposals (RFP) process in 2013 to fund projects that produced tangible products. As of 2014, the CDI continues to hold an annual RFP that creates data management tools and practices, collaboration tools, and training in support of data integration and delivery.
Interoperability of Electronic Health Records: A Physician-Driven Redesign.
Miller, Holly; Johns, Lucy
2018-01-01
PURPOSE: Electronic health records (EHRs), now used by hundreds of thousands of providers and encouraged by federal policy, have the potential to improve quality and decrease costs in health care. But interoperability, although technically feasible among different EHR systems, is the weak link in the chain of logic. Interoperability is inhibited by poor understanding, by suboptimal implementation, and at times by a disinclination to dilute market share or patient base on the part of vendors or providers, respectively. The intent of this project has been to develop a series of practicable recommendations that, if followed by EHR vendors and users, can promote and enhance interoperability, helping EHRs reach their potential. METHODOLOGY: A group of 11 physicians, one nurse, and one health policy consultant, practicing from California to Massachusetts, has developed a document titled "Feature and Function Recommendations To Optimize Clinician Usability of Direct Interoperability To Enhance Patient Care" that offers recommendations from the clinician point of view. This report introduces some of these recommendations and suggests their implications for policy and the "virtualization" of EHRs. CONCLUSION: Widespread adoption of even a few of these recommendations by designers and vendors would enable a major advance toward the "Triple Aim" of improving the patient experience, improving the health of populations, and reducing per capita costs.
Extravehicular activity space suit interoperability.
Skoog, A I; McBarron JW 2nd; Severin, G I
1995-10-01
The European Space Agency (ESA) and the Russian Space Agency (RKA) are jointly developing a new space suit system for improved extravehicular activity (EVA) capabilities in support of the MIR Space Station Programme, the EVA Suit 2000. Recent national policy agreements between the U.S. and Russia on planned cooperation in manned space also include joint extravehicular activity (EVA). With an increased number of space suit systems and a higher operational frequency towards the end of this century, improved interoperability for both routine and emergency operations is of eminent importance. It is thus timely to report the current status of ongoing work on international EVA interoperability being conducted by the Committee on EVA Protocols and Operations of the International Academy of Astronautics, initiated in 1991. This paper summarises the current EVA interoperability issues to be harmonised and presents quantified vehicle interface requirements for the current U.S. Shuttle EMU, the Russian MIR Orlan DMA, and the new European/Russian EVA Suit 2000 extravehicular systems. Major critical/incompatible interfaces for different suit/mother-craft combinations are discussed, and recommendations for standardisation are given.
OneGeology - improving access to geoscience globally
NASA Astrophysics Data System (ADS)
Jackson, Ian; Asch, Kristine; Tellez-Arenas, Agnès.; Komac, Marko; Demicheli, Luca
2010-05-01
The OneGeology concept originated in early 2006. With the potential stimulus of the International Year of Planet Earth (IYPE) very much in mind, the challenge was: could we use IYPE to begin the creation of an interoperable digital geological dataset of the planet? Fourteen months later the concept was unanimously endorsed by 83 representatives of the international geoscience community at a meeting in Brighton, UK, and goals were set for a global launch at the 33rd IGC in Oslo in August 2008. The goals that the Brighton meeting agreed for OneGeology were deceptively simple. They were to: • improve the accessibility of geological map data • exchange know-how and skills so that all nations could participate • accelerate interoperability in the geosciences and the take-up of a new "standard" (GeoSciML). At the time of writing (January 2010) there are 113 countries participating in OneGeology, more than 40 of which are serving data using a web map portal and the protocols, registries and technology to "harvest" and serve data from around the world. An essential part of the development of OneGeology has been the exchange of know-how and the provision of guidance and support so that any geological survey can participate and serve its data. The team has also moved forward and raised the profile of a crucial data model and interoperability standard, GeoSciML, which will allow geoscience data to be shared across the globe. OneGeology is coordinated through a two-part "hub": a Secretariat based at the British Geological Survey (BGS), and the portal technology and servers provided by the French geological survey (BRGM). The "hub" is guided and supported by two international groups, the Operational Management Group (OMG) and the Technical Working Group (TWG). A Steering Group to provide strategic guidance for OneGeology, comprising geological survey directors representing the six continents, was formed at the end of 2008. The Steering Group is now looking at options to incorporate OneGeology and consolidate its governance and sustainability. Two regional initiatives have been spawned which are strongly linked to OneGeology: OneGeology-Europe and the US project Geoscience Information Network (GIN) are progressing OneGeology goals in Europe and the USA. Additionally, in south-east Asia, CCOP members are making sure that OneGeology goals are progressed in their region. Each of these initiatives reinforces the others. A set of Success Criteria for the next 3 years, up to the 34th IGC in Brisbane, is providing new goals for the OneGeology work programme. Among these major aims are increasing the number of participants, increasing the number of those participants serving data, and increasing the number of participants moving from a web map service to a web feature service, which will offer significantly improved functionality. Communication and outreach have always been a priority for OneGeology; nonetheless, the volume of global media coverage the project has received has been astounding. A dynamic website with rich and regularly updated content is a strong factor in that outreach. The audiences for these presentations range from geoscientists, to informatics and spatial data specialists, to environmental scientists, politicians and not least the general public. OneGeology has proved to be a project with much broader appeal (and thus more opportunity to communicate the relevance of geology to society) than was ever envisaged.
This external appeal has served to strengthen geoscience interest in the project, which has in turn given a higher profile and impetus to the interoperability standards OneGeology uses.
NASA Astrophysics Data System (ADS)
Glaves, Helen
2015-04-01
Marine research is rapidly moving away from traditional discipline specific science to a wider ecosystem level approach. This more multidisciplinary approach to ocean science requires large amounts of good quality, interoperable data to be readily available for use in an increasing range of new and complex applications. Significant amounts of marine data and information are already available throughout the world as a result of e-infrastructures being established at a regional level to manage and deliver marine data to the end user. However, each of these initiatives has been developed to address specific regional requirements and independently of those in other regions. Establishing a common framework for marine data management on a global scale necessitates that there is interoperability across these existing data infrastructures and active collaboration between the organisations responsible for their management. The Ocean Data Interoperability Platform (ODIP) project is promoting co-ordination between a number of these existing regional e-infrastructures including SeaDataNet and Geo-Seas in Europe, the Integrated Marine Observing System (IMOS) in Australia, the Rolling Deck to Repository (R2R) in the USA and the international IODE initiative. To demonstrate this co-ordinated approach the ODIP project partners are currently working together to develop several prototypes to test and evaluate potential interoperability solutions for solving the incompatibilities between the individual regional marine data infrastructures. However, many of the issues being addressed by the Ocean Data Interoperability Platform are not specific to marine science. For this reason many of the outcomes of this international collaborative effort are equally relevant and transferable to other domains.
Community for Data Integration 2013 Annual Report
Chang, Michelle Y.; Carlino, Jennifer; Barnes, Christopher; Blodgett, David L.; Bock, Andrew R.; Everette, Anthony L.; Fernette, Gregory L.; Flint, Lorraine E.; Gordon, Janice M.; Govoni, David L.; Hay, Lauren E.; Henkel, Heather S.; Hines, Megan K.; Holl, Sally L.; Homer, Collin G.; Hutchison, Vivian B.; Ignizio, Drew A.; Kern, Tim J.; Lightsom, Frances L.; Markstrom, Steven L.; O'Donnell, Michael S.; Schei, Jacquelyn L.; Schmid, Lorna A.; Schoephoester, Kathryn M.; Schweitzer, Peter N.; Skagen, Susan K.; Sullivan, Daniel J.; Talbert, Colin; Warren, Meredith Pavlick
2015-01-01
grow overall USGS capabilities with data and information by increasing visibility of the work of many people throughout the USGS and the CDI community. To achieve these goals, the CDI operates within four applied areas: monthly forums, annual workshop/webinar series, working groups, and projects. The monthly forums, also known as the Opportunity/Challenge of the Month, provide an open dialogue to share and learn about data integration efforts or to present problems that invite the community to offer solutions, advice, and support. Since 2010, the CDI has also sponsored annual workshops/webinar series to encourage the exchange of ideas, sharing of activities, presentations of current projects, and networking among members. Stemming from common interests, the working groups are focused on efforts to address data management and technical challenges, including the development of standards and tools, improving interoperability and information infrastructure, and data preservation within USGS and its partners. The growing support for the activities of the working groups led to the CDI's first formal request for proposals (RFP) process in 2013 to fund projects that produced tangible products. Today the CDI continues to hold an annual RFP that creates data management tools and practices, collaboration tools, and training in support of data integration and delivery.
Zhou, Yuan; Ancker, Jessica S; Upadhye, Mandar; McGeorge, Nicolette M; Guarrera, Theresa K; Hegde, Sudeep; Crane, Peter W; Fairbanks, Rollin J; Bisantz, Ann M; Kaushal, Rainu; Lin, Li
2013-01-01
The effect of health information technology (HIT) on efficiency and workload among clinical and nonclinical staff has been debated, with conflicting evidence about whether electronic health records (EHRs) increase or decrease effort. None of this research to date, however, examines the effect of interoperability quantitatively using discrete event simulation techniques. To estimate the impact of EHR systems with various levels of interoperability on day-to-day tasks and operations of ambulatory physician offices, interviews and observations were used to collect workflow data from 12 adult primary and specialty practices. A discrete event simulation model was constructed to represent patient flows and clinical and administrative tasks of physicians and staff members. High levels of EHR interoperability were associated with reduced time spent by providers on four tasks: preparing lab reports, requesting lab orders, prescribing medications, and writing referrals. The implementation of an EHR was associated with less time spent by administrators but more time spent by physicians, compared with time spent at paper-based practices. In addition, the presence of EHRs and of interoperability did not significantly affect the time usage of registered nurses or the total visit time and waiting time of patients. This study suggests that the impact of using HIT on clinical and nonclinical staff work efficiency varies; overall, however, it appears to improve time efficiency more for administrators than for physicians and nurses.
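The simulation logic described above lends itself to a compact illustration. Below is a minimal sketch, not the authors' model, of a discrete event simulation comparing physician documentation time under two hypothetical interoperability levels; it assumes the simpy package, and the arrival rate and task durations are invented for illustration.

```python
import random
import simpy

# Hypothetical mean documentation minutes per visit; not values from the study.
SERVICE_MIN = {"paper": 6.0, "interoperable": 3.5}

def document(env, level, totals):
    # One patient's documentation work for the physician.
    start = env.now
    yield env.timeout(random.expovariate(1.0 / SERVICE_MIN[level]))
    totals[level] += env.now - start

def clinic(env, level, totals):
    # Patients arrive roughly every 10 minutes on average.
    while True:
        yield env.timeout(random.expovariate(1.0 / 10.0))
        env.process(document(env, level, totals))

totals = {"paper": 0.0, "interoperable": 0.0}
for level in totals:
    random.seed(1)  # reuse the random stream so scenarios are comparable
    env = simpy.Environment()
    env.process(clinic(env, level, totals))
    env.run(until=8 * 60)  # one 8-hour clinic day, in minutes
    print(level, round(totals[level], 1), "physician-minutes of documentation")
```

Reseeding before each scenario keeps the comparison controlled, so the difference in accumulated minutes is attributable to the assumed task times alone, which is the kind of what-if comparison discrete event simulation affords.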
2011-01-01
Background The practice and research of medicine generates considerable quantities of data and model resources (DMRs). Although in principle biomedical resources are re-usable, in practice few can currently be shared. In particular, the clinical communities in physiology and pharmacology research, as well as medical education, (i.e. PPME communities) are facing considerable operational and technical obstacles in sharing data and models. Findings We outline the efforts of the PPME communities to achieve automated semantic interoperability for clinical resource documentation in collaboration with the RICORDO project. Current community practices in resource documentation and knowledge management are overviewed. Furthermore, requirements and improvements sought by the PPME communities to current documentation practices are discussed. The RICORDO plan and effort in creating a representational framework and associated open software toolkit for the automated management of PPME metadata resources is also described. Conclusions RICORDO is providing the PPME community with tools to effect, share and reason over clinical resource annotations. This work is contributing to the semantic interoperability of DMRs through ontology-based annotation by (i) supporting more effective navigation and re-use of clinical DMRs, as well as (ii) sustaining interoperability operations based on the criterion of biological similarity. Operations facilitated by RICORDO will range from automated dataset matching to model merging and managing complex simulation workflows. In effect, RICORDO is contributing to community standards for resource sharing and interoperability. PMID:21878109
Weininger, Sandy; Jaffe, Michael B; Goldman, Julian M
2017-01-01
Medical device and health information technology systems are increasingly interdependent with users demanding increased interoperability. Related safety standards must be developed taking into account these systems' perspective. In this article, we describe the current development of medical device standards and the need for these standards to address medical device informatics. Medical device information should be gathered from a broad range of clinical scenarios to lay the foundation for safe medical device interoperability. Five clinical examples show how medical device informatics principles, if applied in the development of medical device standards, could help facilitate the development of safe interoperable medical device systems. These examples illustrate the clinical implications of the failure to capture important signals and device attributes. We provide recommendations relating to the coordination between historically separate standards development groups, some of which focus on safety and effectiveness and others focus on health informatics. We identify the need for a shared understanding among stakeholders and describe organizational structures to promote cooperation such that device-to-device interactions and related safety information are considered during standards development.
Weininger, Sandy; Jaffe, Michael B.; Goldman, Julian M
2016-01-01
Medical device and health information technology systems are increasingly interdependent with users demanding increased interoperability. Related safety standards must be developed taking into account these systems' perspective. In this article we describe the current development of medical device standards and the need for these standards to address medical device informatics. Medical device information should be gathered from a broad range of clinical scenarios to lay the foundation for safe medical device interoperability. Five clinical examples show how medical device informatics principles, if applied in the development of medical device standards, could help facilitate the development of safe interoperable medical device systems. These examples illustrate the clinical implications of the failure to capture important signals and device attributes. We provide recommendations relating to the coordination between historically separate standards development groups, some of which focus on safety and effectiveness and others of which focus on health informatics. We identify the need for a shared understanding among stakeholders and describe organizational structures to promote cooperation such that device-to-device interactions and related safety information are considered during standards development. PMID:27584685
tmBioC: improving interoperability of text-mining tools with BioC.
Khare, Ritu; Wei, Chih-Hsuan; Mao, Yuqing; Leaman, Robert; Lu, Zhiyong
2014-01-01
The lack of interoperability among biomedical text-mining tools is a major bottleneck in creating more complex applications. Despite the availability of numerous methods and techniques for various text-mining tasks, combining different tools requires substantial effort and time owing to heterogeneity and variety in data formats. In response, BioC is a recent proposal that offers a minimalistic approach to tool interoperability by stipulating minimal changes to existing tools and applications. BioC is a family of XML formats that define how to present text documents and annotations, and also provides easy-to-use functions to read/write documents in the BioC format. In this study, we introduce our text-mining toolkit, which is designed to perform several challenging and significant tasks in the biomedical domain, and repackage the toolkit into BioC to enhance its interoperability. Our toolkit consists of six state-of-the-art tools for named-entity recognition, normalization and annotation (PubTator) of genes (GenNorm), diseases (DNorm), mutations (tmVar), species (SR4GN) and chemicals (tmChem). Although developed within the same group, each tool is designed to process input articles and output annotations in a different format. We modify these tools and enable them to read/write data in the proposed BioC format. We find that, using the BioC family of formats and functions, only minimal changes were required to build the newer versions of the tools. The resulting BioC-wrapped toolkit, which we have named tmBioC, consists of our tools in BioC, an annotated full-text corpus in BioC, and a format detection and conversion tool. Furthermore, through participation in the 2013 BioCreative IV Interoperability Track, we empirically demonstrate that the tools in tmBioC can be more efficiently integrated with each other as well as with external tools: our experimental results show that using BioC reduces the lines of code required for text-mining tool integration by more than 60%. The tmBioC toolkit is publicly available at http://www.ncbi.nlm.nih.gov/CBBresearch/Lu/Demo/tmTools/. Database URL: http://www.ncbi.nlm.nih.gov/CBBresearch/Lu/Demo/tmTools/. Published by Oxford University Press 2014. This work is written by US Government employees and is in the public domain in the US.
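To make the minimalistic XML approach concrete, here is a hedged sketch of a small BioC-style collection assembled with the Python standard library; the element names follow the BioC family of formats described above, while the document text and the gene annotation are invented for illustration.

```python
import xml.etree.ElementTree as ET

# A one-document BioC-style collection.
collection = ET.Element("collection")
ET.SubElement(collection, "source").text = "PubMed"
doc = ET.SubElement(collection, "document")
ET.SubElement(doc, "id").text = "12345"

# Text lives in passages, addressed by character offset.
passage = ET.SubElement(doc, "passage")
ET.SubElement(passage, "offset").text = "0"
ET.SubElement(passage, "text").text = "BRCA1 mutations are associated with disease."

# An annotation as a tool such as GenNorm or tmVar might emit it.
ann = ET.SubElement(passage, "annotation", id="T1")
ET.SubElement(ann, "infon", key="type").text = "Gene"
ET.SubElement(ann, "location", offset="0", length="5")
ET.SubElement(ann, "text").text = "BRCA1"

print(ET.tostring(collection, encoding="unicode"))
```

Because every tool reads and writes this one shape (collection, document, passage, annotation), pipelines can be composed without per-tool format adapters, which is where the reported reduction in integration code comes from.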
Advances Made in the Next Generation of Satellite Networks
NASA Technical Reports Server (NTRS)
Bhasin, Kul B.
1999-01-01
Because of the unique networking characteristics of communications satellites, global satellite networks are moving to the forefront in enhancing national and global information infrastructures. Simultaneously, broadband data services, which are emerging as the major market driver for future satellite and terrestrial networks, are being widely acknowledged as the foundation for an efficient global information infrastructure. In the past 2 years, various task forces and working groups around the globe have identified pivotal topics and key issues to address if we are to realize such networks in a timely fashion. In response, industry, government, and academia undertook efforts to address these topics and issues. A workshop was organized to provide a forum to assess the current state of the art, identify key issues, and highlight the emerging trends in next-generation architectures, data protocol development, communication interoperability, and applications. The Satellite Networks: Architectures, Applications, and Technologies Workshop was hosted by the Space Communication Program at the NASA Lewis Research Center in Cleveland, Ohio. Nearly 300 executives and technical experts from academia, industry, and government, representing the United States and eight other countries, attended the event (June 2 to 4, 1998). The program included seven panels and invited sessions and nine breakout sessions in which 42 speakers presented on technical topics. The proceedings cover a wide range of topics: access technology and protocols, architectures and network simulations, asynchronous transfer mode (ATM) over satellite networks, Internet over satellite networks, interoperability experiments and applications, multicasting, NASA interoperability experiment programs, NASA mission applications, and Transmission Control Protocol/Internet Protocol (TCP/IP) over satellite: issues, relevance, and experience.
Using ontologies to improve semantic interoperability in health data.
Liyanage, Harshana; Krause, Paul; De Lusignan, Simon
2015-07-10
The present-day health data ecosystem comprises a wide array of complex heterogeneous data sources. A wide range of clinical, health care, social and other clinically relevant information is stored in these data sources. These data exist either as structured data or as free text. The data are generally individual person-based records, but social care data are generally case-based, and less formal data sources may be shared by groups. The structured data may be organised in a proprietary way or be coded using one of many coding, classification or terminology systems that have often evolved in isolation and were designed to meet the needs of the context in which they were developed. This has resulted in a wide range of semantic interoperability issues that make the integration of data held on these different systems challenging. We present semantic interoperability challenges and describe a classification of these. We propose a four-step process and a toolkit for those wishing to work more ontologically, progressing from the identification and specification of concepts to validating a final ontology. The four steps are: (1) the identification and specification of data sources; (2) the conceptualisation of semantic meaning; (3) defining to what extent routine data can be used as a measure of the process or outcome of care required in a particular study or audit and (4) the formalisation and validation of the final ontology. The toolkit is an extension of a previous schema created to formalise the development of ontologies related to chronic disease management. The extensions are focused on facilitating rapid building of ontologies for time-critical research studies.
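One way to picture step (2), the conceptualisation of semantic meaning, is a tiny graph in which codes from two sources are declared to denote the same concept, so a single ontology-level query spans both systems. The sketch below assumes the rdflib package; the namespace, codes and property names are invented for illustration.

```python
from rdflib import Graph, Namespace
from rdflib.namespace import SKOS

EX = Namespace("http://example.org/onto#")
g = Graph()

# Two local codings, from different systems, both meaning "type 2 diabetes".
g.add((EX.SourceA_C10F, SKOS.exactMatch, EX.Type2Diabetes))
g.add((EX.SourceB_E11, SKOS.exactMatch, EX.Type2Diabetes))
g.add((EX.record1, EX.hasCode, EX.SourceA_C10F))
g.add((EX.record2, EX.hasCode, EX.SourceB_E11))

# A query phrased against the shared concept retrieves records from both sources.
q = """SELECT ?rec WHERE {
  ?rec <http://example.org/onto#hasCode> ?code .
  ?code <http://www.w3.org/2004/02/skos/core#exactMatch>
        <http://example.org/onto#Type2Diabetes> .
}"""
for row in g.query(q):
    print(row.rec)  # both record1 and record2 are found
```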
Wiegers, Thomas C; Davis, Allan Peter; Mattingly, Carolyn J
2014-01-01
The Critical Assessment of Information Extraction systems in Biology (BioCreAtIvE) challenge evaluation tasks collectively represent a community-wide effort to evaluate a variety of text-mining and information extraction systems applied to the biological domain. The BioCreative IV Workshop included five independent subject areas, including Track 3, which focused on named-entity recognition (NER) for the Comparative Toxicogenomics Database (CTD; http://ctdbase.org). Previously, CTD had organized document ranking and NER-related tasks for the BioCreative Workshop 2012; a key finding of that effort was that interoperability and integration complexity were major impediments to the direct application of the systems to CTD's text-mining pipeline. This underscored a prevailing problem with software integration efforts. Major interoperability-related issues included lack of process modularity, operating system incompatibility, tool configuration complexity and lack of standardization of high-level inter-process communications. One approach to potentially mitigate interoperability and general integration issues is the use of Web services to abstract implementation details; rather than integrating NER tools directly, HTTP-based calls from CTD's asynchronous, batch-oriented text-mining pipeline could be made to remote NER Web services for recognition of specific biological terms using BioC (an emerging family of XML formats) for inter-process communications. To test this concept, participating groups developed Representational State Transfer /BioC-compliant Web services tailored to CTD's NER requirements. Participants were provided with a comprehensive set of training materials. CTD evaluated results obtained from the remote Web service-based URLs against a test data set of 510 manually curated scientific articles. Twelve groups participated in the challenge. Recall, precision, balanced F-scores and response times were calculated. Top balanced F-scores for gene, chemical and disease NER were 61, 74 and 51%, respectively. Response times ranged from fractions-of-a-second to over a minute per article. We present a description of the challenge and summary of results, demonstrating how curation groups can effectively use interoperable NER technologies to simplify text-mining pipeline implementation. Database URL: http://ctdbase.org/ © The Author(s) 2014. Published by Oxford University Press.
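For reference, the balanced F-score quoted above is the harmonic mean of precision and recall. A small helper makes the arithmetic concrete; the counts in the example are invented, not taken from the challenge results.

```python
def balanced_f(tp: int, fp: int, fn: int) -> float:
    """Harmonic mean of precision and recall, from raw counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# e.g. 740 true positives, 260 false positives, 260 false negatives -> 0.74
print(round(balanced_f(740, 260, 260), 2))
```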
Wiegers, Thomas C.; Davis, Allan Peter; Mattingly, Carolyn J.
2014-01-01
The Critical Assessment of Information Extraction systems in Biology (BioCreAtIvE) challenge evaluation tasks collectively represent a community-wide effort to evaluate a variety of text-mining and information extraction systems applied to the biological domain. The BioCreative IV Workshop included five independent subject areas, including Track 3, which focused on named-entity recognition (NER) for the Comparative Toxicogenomics Database (CTD; http://ctdbase.org). Previously, CTD had organized document ranking and NER-related tasks for the BioCreative Workshop 2012; a key finding of that effort was that interoperability and integration complexity were major impediments to the direct application of the systems to CTD's text-mining pipeline. This underscored a prevailing problem with software integration efforts. Major interoperability-related issues included lack of process modularity, operating system incompatibility, tool configuration complexity and lack of standardization of high-level inter-process communications. One approach to potentially mitigate interoperability and general integration issues is the use of Web services to abstract implementation details; rather than integrating NER tools directly, HTTP-based calls from CTD's asynchronous, batch-oriented text-mining pipeline could be made to remote NER Web services for recognition of specific biological terms using BioC (an emerging family of XML formats) for inter-process communications. To test this concept, participating groups developed Representational State Transfer /BioC-compliant Web services tailored to CTD's NER requirements. Participants were provided with a comprehensive set of training materials. CTD evaluated results obtained from the remote Web service-based URLs against a test data set of 510 manually curated scientific articles. Twelve groups participated in the challenge. Recall, precision, balanced F-scores and response times were calculated. Top balanced F-scores for gene, chemical and disease NER were 61, 74 and 51%, respectively. Response times ranged from fractions-of-a-second to over a minute per article. We present a description of the challenge and summary of results, demonstrating how curation groups can effectively use interoperable NER technologies to simplify text-mining pipeline implementation. Database URL: http://ctdbase.org/ PMID:24919658
EPA Scientific Knowledge Management Assessment and ...
A series of activities has been conducted by a core group of EPA scientists from across the Agency. The activities were initiated in 2012, and the focus was to increase the reuse and interoperability of science software at EPA. The need for increased reuse and interoperability is linked to the increased complexity of environmental assessments in the 21st century. This complexity is manifest in the form of problems that require integrated multi-disciplinary solutions. To enable the means to develop these solutions (i.e., science software systems), it is necessary to integrate software developed by disparate groups representing a variety of science domains. Thus, reuse and interoperability become imperative. This report briefly describes the chronology of activities conducted by the group of scientists to provide context for the primary purpose of this report, that is, to describe the proceedings and outcomes of the latest activity, a workshop entitled “Workshop on Advancing US EPA integration of environmental and information sciences”. The EPA has been lagging in digital maturity relative to the private sector and even other government agencies. This report helps begin the process of improving the agency’s use of digital technologies, especially in the areas of efficiency and transparency. This report contributes to SHC 1.61.2.
Chao, Tian-Jy; Kim, Younghun
2015-02-10
An end-to-end interoperability and workflows from building architecture design to one or more simulations, in one aspect, may comprise establishing a BIM enablement platform architecture. A data model defines data entities and entity relationships for enabling the interoperability and workflows. A data definition language may be implemented that defines and creates a table schema of a database associated with the data model. Data management services and/or application programming interfaces may be implemented for interacting with the data model. Web services may also be provided for interacting with the data model via the Web. A user interface may be implemented that communicates with users and uses the BIM enablement platform architecture, the data model, the data definition language, data management services and application programming interfaces to provide functions to the users to perform work related to building information management.
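As a rough illustration of the pattern the abstract describes, the sketch below turns a toy two-entity data model into a table schema via data definition language statements; the entity names and columns are invented, not the patent's data model.

```python
import sqlite3

# DDL derived from a toy data model: two entities and one relationship.
DDL = """
CREATE TABLE building (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE simulation (
    id INTEGER PRIMARY KEY,
    building_id INTEGER REFERENCES building(id),  -- entity relationship
    tool TEXT,    -- e.g. an energy or airflow simulator (hypothetical)
    status TEXT
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(DDL)
conn.execute("INSERT INTO building VALUES (1, 'Office A')")
conn.execute("INSERT INTO simulation VALUES (1, 1, 'energy', 'queued')")
# A data management service would wrap queries like this behind an API.
for row in conn.execute(
        "SELECT b.name, s.tool FROM simulation s "
        "JOIN building b ON s.building_id = b.id"):
    print(row)
```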
NASA Astrophysics Data System (ADS)
Nieland, Simon; Kleinschmit, Birgit; Förster, Michael
2015-05-01
Ontology-based applications hold promise in improving spatial data interoperability. In this work we use remote sensing-based biodiversity information and apply semantic formalisation and ontological inference to show improvements in data interoperability/comparability. The proposed methodology includes an observation-based, "bottom-up" engineering approach for remote sensing applications and gives a practical example of semantic mediation of geospatial products. We apply the methodology to three different nomenclatures used for remote sensing-based classification of two heathland nature conservation areas in Belgium and Germany. We analysed sensor nomenclatures with respect to their semantic formalisation and their bio-geographical differences. The results indicate that a hierarchical and transparent nomenclature is far more important for transferability than the sensor or study area. The inclusion of additional information, not necessarily belonging to a vegetation class description, is a key factor for the future success of using semantics for interoperability in remote sensing.
Economic impact of a nationwide interoperable e-Health system using the PENG evaluation tool.
Parv, L; Saluse, J; Aaviksoo, A; Tiik, M; Sepper, R; Ross, P
2012-01-01
The aim of this paper is to evaluate the costs and benefits of the Estonian interoperable health information exchange system. In addition, a framework will be built for follow-up monitoring and analysis of a nationwide HIE system. The PENG evaluation tool was used to map and quantify the costs and benefits arising from type II diabetic patient management for patients, providers and society. The analysis concludes with a quantification based on real costs and potential benefits identified by a panel of experts. Setting up a countrywide interoperable eHealth system incurs a large initial investment. However, if the system is working seamlessly, benefits will surpass costs within three years. The results show that while society stands to benefit the most, the costs will be mainly borne by the healthcare providers. Therefore, new government policies should be devised to encourage providers to invest to ensure society-wide benefits.
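The break-even claim is a matter of cumulative accounting: a one-off investment followed by recurring benefits that exceed recurring costs. The sketch below uses invented figures, not the study's data, chosen only to reproduce the within-three-years pattern.

```python
# Back-of-envelope break-even calculation; all figures are hypothetical (EUR).
initial_investment = 9_000_000   # one-off setup cost
annual_cost = 1_000_000          # operation and maintenance
annual_benefit = 4_500_000       # avoided duplicate tests, time saved, etc.

cumulative = -initial_investment
for year in range(1, 11):
    cumulative += annual_benefit - annual_cost
    print(f"year {year}: cumulative net {cumulative:+,}")
    if cumulative >= 0:
        print(f"break-even within {year} years")  # year 3 with these figures
        break
```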
Improving Interoperability between Registries and EHRs
Blumenthal, Seth
2018-01-01
National performance measurement needs clinical data that track the performance of multidisciplinary teams across episodes of care. Clinical registries are ideal platforms for this work due to their capture of structured, specific data across specialties. Because registries collect data at a national level, and registry data are captured in a consistent structure and format within each registry, registry data are useful for measurement and analysis “out of the box”. Registry business models are hampered by the cost of collecting data from EHRs and other source systems and abstracting or mapping them to fit registry data models. The National Quality Registry Network (NQRN) has launched Registries on FHIR, an initiative to lower barriers to achieving semantic interoperability between registries and source data systems. In 2017 Registries on FHIR conducted an information-gathering campaign to learn where registries want better interoperability, and how to go about improving it. PMID:29888033
NASA Technical Reports Server (NTRS)
Bradford, Robert N.; Nichols, Kelvin F.; Witherspoon, Keith R.
2006-01-01
To date very little effort has been made to provide interoperability between various space agency projects. To effectively get to the Moon and beyond, systems must interoperate. To provide interoperability, standardization and registries of various technologies will be required. These registries will be created as they relate to space flight. With the new NASA Moon/Mars initiative, a requirement to standardize and control the naming conventions of very disparate systems and technologies is emerging. The need to provide numbering to the many processes, schemas, vehicles, robots, space suits and technologies (e.g. versions), to name a few, in the highly complex Constellation initiative is imperative. The number of corporations, developer personnel, system interfaces and people interfaces will require standardization and registries on a scale not currently envisioned. It would only take one exception (stove-piped system development) to weaken, if not destroy, interoperability. To start, a standardized registry process must be defined that allows many differing engineers, organizations and operators the ability to easily access disparate registry information across numerous technological and scientific disciplines. Once registries are standardized, registry support in terms of setup and operations, resolution of conflicts between registries and other issues will need to be addressed. Registries should not be confused with repositories. No end-user data is "stored" in a registry, nor is it a configuration control system. Once a registry standard is created and approved, the technologies that should be registered must be identified and prioritized. In this paper, we will identify and define a registry process that is compatible with the Constellation initiative and other, non-related space activities and organizations. We will then identify and define the various technologies that should use a registry to provide interoperability. The first set of technologies will be those that are currently in need of expansion, namely the assignment of satellite designations and the process which controls assignments. Second, we will analyze the technologies currently standardized under the Consultative Committee for Space Data Systems (CCSDS) banner. Third, we will analyze the current CCSDS working group and Birds of a Feather (BoF) activities to ascertain registry requirements. Lastly, we will identify technologies that are either currently under the auspices of another standards body or currently not standardized. For activities one through three, we will provide the analysis by either discipline or technology, with rationale, identification and a brief description of requirements and precedence. For activity four, we will provide a list of current standards bodies, e.g. the IETF, and a list of potential candidates.
Multi-disciplinary interoperability challenges (Ian McHarg Medal Lecture)
NASA Astrophysics Data System (ADS)
Annoni, Alessandro
2013-04-01
Global sustainability research requires multi-disciplinary efforts to address the key research challenges and to increase our understanding of the complex relationships between environment and society. For this reason, dependence on ICT systems interoperability is growing rapidly, but, although some relevant technological improvements can be observed, operational interoperable solutions are still lacking in practice. Among the causes is the absence of a generally accepted definition of "interoperability" in all its broader aspects. In fact, interoperability is just a concept, and the more popular definitions do not address all the challenges of realizing operational interoperable solutions. The problem becomes even more complex when multi-disciplinary interoperability is required, because in that case solutions for the interoperability of different interoperable solutions must be envisaged. In this lecture the following definition will be used: "interoperability is the ability to exchange information and to use it". The main challenges in addressing multi-disciplinary interoperability will be presented, and a set of proposed approaches/solutions shortly introduced.
The Future of Geospatial Standards
NASA Astrophysics Data System (ADS)
Bermudez, L. E.; Simonis, I.
2016-12-01
The OGC is an international not-for-profit standards development organization (SDO) committed to making quality standards for the geospatial community. A community of more than 500 member organizations, with more than 6,000 people registered on the OGC communication platform, drives the development of standards that are freely available for anyone to use and that improve sharing of the world's geospatial data. OGC standards are applied in a variety of application domains including Environment, Defense and Intelligence, Smart Cities, Aviation, Disaster Management, Agriculture, Business Development and Decision Support, and Meteorology. Profiles help to apply information models to different communities, adapting to the particular needs of a community while ensuring interoperability through common base models and appropriate support services. Other standards address orthogonal aspects such as handling of Big Data, crowd-sourced information, geosemantics, or containers for offline data usage. Like most SDOs, the OGC develops and maintains standards through a formal consensus process under the OGC Standards Program (OGC-SP), wherein requirements and use cases are discussed in forums generally open to the public (Domain Working Groups, or DWGs), and Standards Working Groups (SWGs) are established to create standards. However, OGC is unique among SDOs in that it also operates the OGC Interoperability Program (OGC-IP) to provide real-world testing of existing and proposed standards. The OGC-IP is considered the experimental playground, where new technologies are researched and developed in a user-driven process. Its goal is to prototype, test, demonstrate, and promote OGC Standards in a structured environment. Results from the OGC-IP often become requirements for new OGC standards or identify deficiencies in existing OGC standards that can then be addressed. This presentation will provide an analysis of the work advanced in the OGC consortium, including standards and testbeds, from which a trend for the future of geospatial standards can be extracted. We see a number of key elements in focus, but simultaneously a broadening of standards to address particular communities' needs.
Tapuria, Archana; Kalra, Dipak; Kobayashi, Shinji
2013-12-01
The objective is to introduce the 'clinical archetype', which is a formal and agreed way of representing clinical information to ensure interoperability across and within Electronic Health Records (EHRs). The paper also aims at presenting the challenges of building quality-labeled clinical archetypes and the challenges of achieving semantic interoperability between EHRs. Twenty years of international research, various European healthcare informatics projects and the pioneering work of the openEHR Foundation have led to the following results. The requirements for EHR information architectures have been consolidated within ISO 18308 and adopted within the ISO 13606 EHR interoperability standard. However, a generic EHR architecture cannot ensure that the clinical meaning of information from heterogeneous sources can be reliably interpreted by receiving systems and services. Therefore, clinical models called 'clinical archetypes' are required to formalize the representation of clinical information within the EHR. Part 2 of ISO 13606 defines how archetypes should be formally represented. The current challenge is to grow clinical communities to build a library of clinical archetypes, and to identify how evidence of best practice and multi-professional clinical consensus should best be combined to define archetypes at the optimal level of granularity and specificity and to quality-label them for wide adoption. Standardizing clinical terms within EHRs using a clinical terminology such as the Systematized Nomenclature of Medicine Clinical Terms is also a challenge. Clinical archetypes would play an important role in achieving semantic interoperability within EHRs. Attempts are being made to explore the design and adoption challenges for clinical archetypes.
NASA Astrophysics Data System (ADS)
Jiang, W.; Wang, F.; Meng, Q.; Li, Z.; Liu, B.; Zheng, X.
2018-04-01
This paper presents a new standardized data format named Fire Markup Language (FireML), which extends the Geography Markup Language (GML) of the OGC, to describe fire hazard models. The proposed FireML standardizes the input and output documents of a fire model so that it can communicate effectively with different disaster management systems and ensure good interoperability. To demonstrate the usage of FireML and verify its feasibility, a forest fire spread model adapted to be compatible with FireML is described. A 3D GIS disaster management system was developed to simulate the dynamic procedure of forest fire spread using the defined FireML documents. The proposed approach will inform those working on the standardization of other disaster models.
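The abstract does not reproduce the FireML schema itself, so the following sketch is speculative: it shows how a GML-based fire-spread document of the kind described might be assembled, with all FireML element names invented for illustration.

```python
import xml.etree.ElementTree as ET

GML = "http://www.opengis.net/gml"
ET.register_namespace("gml", GML)

# Hypothetical FireML-style elements wrapping standard GML geometry.
fire = ET.Element("FireSpreadState")
ET.SubElement(fire, "ignitionTime").text = "2018-04-01T12:00:00Z"
front = ET.SubElement(fire, "fireFront")
# The burning front as a GML linear ring (lon lat pairs).
ring = ET.SubElement(front, f"{{{GML}}}LinearRing")
ET.SubElement(ring, f"{{{GML}}}posList").text = (
    "116.30 39.90 116.31 39.90 116.31 39.91 116.30 39.90")
ET.SubElement(fire, "rateOfSpread", units="m/min").text = "12.5"

print(ET.tostring(fire, encoding="unicode"))
```

Reusing GML for the geometry is the design point: any GML-aware disaster management system can render the fire front without understanding the fire model itself.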
Tool and data interoperability in the SSE system
NASA Technical Reports Server (NTRS)
Shotton, Chuck
1988-01-01
Information is given in viewgraph form on tool and data interoperability in the Software Support Environment (SSE). Information is given on industry problems, SSE system interoperability issues, SSE solutions to tool and data interoperability, and attainment of heterogeneous tool/data interoperability.
An overview of the model integration process: From pre ...
Integration of models requires linking models which can be developed using different tools, methodologies, and assumptions. We performed a literature review with the aim of improving our understanding of the model integration process and of presenting better strategies for building integrated modeling systems. We identified five phases that characterize the integration process: pre-integration assessment, preparation of models for integration, orchestration of models during simulation, data interoperability, and testing. Commonly, there is little reuse of existing frameworks beyond the development teams and not much sharing of science components across frameworks. We believe this must change to enable researchers and assessors to form complex workflows that leverage the current environmental science available. In this paper, we characterize the model integration process and compare the integration practices of different groups. We highlight key strategies, features, standards, and practices that can be employed by developers to increase reuse and interoperability of science software components and systems. The paper provides a review of the literature regarding techniques and methods employed by various modeling system developers to facilitate science software interoperability. The intent of the paper is to illustrate the wide variation in methods and the limiting effect the variation has on inter-framework reuse and interoperability. A series of recommendation
WellnessRules: A Web 3.0 Case Study in RuleML-Based Prolog-N3 Profile Interoperation
NASA Astrophysics Data System (ADS)
Boley, Harold; Osmun, Taylor Michael; Craig, Benjamin Larry
An interoperation study, WellnessRules, is described, where rules about wellness opportunities are created by participants in rule languages such as Prolog and N3, and translated within a wellness community using RuleML/XML. The wellness rules are centered around participants, as profiles, encoding knowledge about their activities conditional on the season, the time-of-day, the weather, etc. This distributed knowledge base extends FOAF profiles with a vocabulary and rules about wellness group networking. The communication between participants is organized through Rule Responder, permitting wellness-profile translation and distributed querying across engines. WellnessRules interoperates between rules and queries in the relational (Datalog) paradigm of the pure-Prolog subset of POSL and in the frame (F-logic) paradigm of N3. An evaluation of Rule Responder instantiated for WellnessRules revealed acceptable Web response times.
UHF (Ultra High Frequency) Military Satellite Communications Ground Equipment Interoperability.
1986-10-06
crisis management requires interoperability between various services. These short-term crises often arise from unforeseen circumstances in which...Scheduler Qualcomm has prepared an interoperability study for the JTC3A (Reference 15) as a TA/CE for USCINCLANT ROC 5-84 requirements. It has defined a...interoperability is fundamental. A number of operational crises have occurred where interoperable communications or the lack of interoperable
Towards technical interoperability in telemedicine.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Craft, Richard Layne, II
2004-05-01
For telemedicine to realize the vision of anywhere, anytime access to care, the question of how to create a fully interoperable technical infrastructure must be addressed. After briefly discussing how 'technical interoperability' compares with other types of interoperability being addressed in the telemedicine community today, this paper describes reasons for pursuing technical interoperability, presents a proposed framework for realizing technical interoperability, identifies key issues that will need to be addressed if technical interoperability is to be achieved, and suggests a course of action that the telemedicine community might follow to accomplish this goal.
Progress Report on the Airborne Metadata and Time Series Working Groups of the 2016 ESDSWG
NASA Astrophysics Data System (ADS)
Evans, K. D.; Northup, E. A.; Chen, G.; Conover, H.; Ames, D. P.; Teng, W. L.; Olding, S. W.; Krotkov, N. A.
2016-12-01
NASA's Earth Science Data Systems Working Groups (ESDSWG) was created over 10 years ago. The role of the ESDSWG is to make recommendations relevant to NASA's Earth science data systems from users' experiences. Each group works independently focusing on a unique topic. Participation in ESDSWG groups comes from a variety of NASA-funded science and technology projects, including MEaSUREs and ROSS. Participants include NASA information technology experts, affiliated contractor staff and other interested community members from academia and industry. Recommendations from the ESDSWG groups will enhance NASA's efforts to develop long term data products. The Airborne Metadata Working Group is evaluating the suitability of the current Common Metadata Repository (CMR) and Unified Metadata Model (UMM) for airborne data sets and to develop new recommendations as necessary. The overarching goal is to enhance the usability, interoperability, discovery and distribution of airborne observational data sets. This will be done by assessing the suitability (gaps) of the current UMM model for airborne data using lessons learned from current and past field campaigns, listening to user needs and community recommendations and assessing the suitability of ISO metadata and other standards to fill the gaps. The Time Series Working Group (TSWG) is a continuation of the 2015 Time Series/WaterML2 Working Group. The TSWG is using a case study-driven approach to test the new Open Geospatial Consortium (OGC) TimeseriesML standard to determine any deficiencies with respect to its ability to fully describe and encode NASA earth observation-derived time series data. To do this, the time series working group is engaging with the OGC TimeseriesML Standards Working Group (SWG) regarding unsatisfied needs and possible solutions. The effort will end with the drafting of an OGC Engineering Report based on the use cases and interactions with the OGC TimeseriesML SWG. Progress towards finalizing recommendations will be presented at the meeting.
The interoperability force in the ERP field
NASA Astrophysics Data System (ADS)
Boza, Andrés; Cuenca, Llanos; Poler, Raúl; Michaelides, Zenon
2015-04-01
Enterprise resource planning (ERP) systems participate in interoperability projects, and this participation sometimes leads to new proposals for the ERP field. The aim of this paper is to identify the role that interoperability plays in the evolution of ERP systems. To this end, ERP systems were first identified within interoperability frameworks. Second, the initiatives in the ERP field driven by interoperability requirements were identified from two perspectives: technological and business. The ERP field is evolving from classical ERP as information system integrators to a new generation of fully interoperable ERP. Interoperability is changing the way business is run, and ERP systems are changing to adapt to the current stream of interoperability.
To ontologise or not to ontologise: An information model for a geospatial knowledge infrastructure
NASA Astrophysics Data System (ADS)
Stock, Kristin; Stojanovic, Tim; Reitsma, Femke; Ou, Yang; Bishr, Mohamed; Ortmann, Jens; Robertson, Anne
2012-08-01
A geospatial knowledge infrastructure consists of a set of interoperable components, including software, information, hardware, procedures and standards, that work together to support advanced discovery and creation of geoscientific resources, including publications, data sets and web services. The focus of the work presented is the development of such an infrastructure for resource discovery. Advanced resource discovery is intended to support scientists in finding resources that meet their needs, and focuses on representing the semantic details of the scientific resources, including the detailed aspects of the science that led to the resource being created. This paper describes an information model for a geospatial knowledge infrastructure that uses ontologies to represent these semantic details, including knowledge about domain concepts, the scientific elements of the resource (analysis methods, theories and scientific processes) and web services. This semantic information can be used to enable more intelligent search over scientific resources, and to support new ways to infer and visualise scientific knowledge. The work describes the requirements for semantic support of a knowledge infrastructure, and analyses the different options for information storage based on the twin goals of semantic richness and syntactic interoperability to allow communication between different infrastructures. Such interoperability is achieved by the use of open standards, and the architecture of the knowledge infrastructure adopts such standards, particularly from the geospatial community. The paper then describes an information model that uses a range of different types of ontologies, explaining those ontologies and their content. The information model was successfully implemented in a working geospatial knowledge infrastructure, but the evaluation identified some issues in creating the ontologies.
NASA Astrophysics Data System (ADS)
Loescher, H.; Fundamental Instrument Unit
2013-05-01
Ecological research addresses challenges relating to the dynamics of the planet, such as changes in climate, biodiversity, ecosystem functioning and services, carbon and energy cycles, natural and human-induced hazards, and adaptation and mitigation strategies, that involve many science and engineering disciplines and cross national boundaries. Because of the global nature of these challenges, greater international collaboration is required for knowledge sharing and technology deployment to advance earth science investigations and enhance societal benefits. For example, the Working Group on Biodiversity Preservation and Ecosystem Services (PCAST 2011) noted the scale and complexity of the physical and human resources needed to address these challenges. Many of the most pressing ecological research questions require global-scale data and global-scale solutions (Suresh 2012), e.g., interdisciplinary data access from data centers managing ecological resources and hazards, drought, heat islands, or the carbon cycle, or data used to forecast the rate of spread of invasive species or zoonotic diseases. Variability and change at one location or in one region may well result from the superposition of global processes coupled together with regional and local modes of variability. For example, we know that the El Niño-Southern Oscillation, a large-scale mode of variability in the coupled terrestrial-aquatic-atmospheric system, correlates with variability in regional rainfall and ecosystem functions. It is therefore a high priority of government and non-government organizations to develop the necessary large-scale, world-class research infrastructures for environmental research, and the framework by which these data can be shared, discovered, and utilized by a broad user community of scientists and policymakers alike. Given that there are many, albeit nascent, efforts to build new environmental observatories/networks globally (e.g., EU-ICOS, EU-Lifewatch, AU-TERN, China-CERN, GEOSS, GEO-BON, NutNet, etc.) and domestically (e.g., NSF-CZO, USDA-LTAR, DOE-NGEE, Soil Carbon Network, etc.), there is a strong and mutual desire to assure interoperability of data. The degree of interoperability is defined by how well each of the following is mapped between observatories (entities): i) science requirements linked with science questions, ii) traceability of measurements to nationally and internationally accepted standards, iii) how data products are derived, i.e., algorithms, procedures, and methods, and iv) the bioinformatics, which broadly include data formats, metadata, controlled vocabularies, and semantics. Here, we explore the rationale and focus areas for interoperability, the governance and work structures, example projects (NSF-NEON, EU-ICOS, and AU-TERN), and the emergent roles of scientists in these endeavors.
Modelling and approaching pragmatic interoperability of distributed geoscience data
NASA Astrophysics Data System (ADS)
Ma, Xiaogang
2010-05-01
Interoperability of geodata, which is essential for sharing information and discovering insights within a cyberinfrastructure, is receiving increasing attention. A key requirement of interoperability in the context of geodata sharing is that data provided by local sources can be accessed, decoded, understood and appropriately used by external users. Various researchers have identified four levels of data interoperability issues: system, syntax, schematics and semantics, which respectively relate to the platform, encoding, structure and meaning of geodata. Ontology-driven approaches addressing the schematic and semantic interoperability of geodata have been studied extensively in the last decade. Ontologies come in different types (e.g. top-level ontologies, domain ontologies and application ontologies) and display forms (e.g. glossaries, thesauri, conceptual schemas and logical theories). Many geodata providers maintain their own local application ontologies in order to drive standardization in local databases. However, semantic heterogeneities often exist between these local ontologies, even though they are derived from equivalent disciplines. In contrast, common ontologies are being studied in different geoscience disciplines (e.g., NAMD, SWEET, etc.) as a standardization procedure to coordinate diverse local ontologies. Semantic mediation, e.g. mapping between local ontologies, or mapping local ontologies to common ontologies, has been studied as an effective way of achieving semantic interoperability between local ontologies and thus reconciling semantic heterogeneities in multi-source geodata. Nevertheless, confusion still exists in the research field of semantic interoperability. One problem is caused by eliminating elements of local pragmatic contexts in semantic mediation. In contrast to the context-independent nature of a common domain ontology, local application ontologies are closely related to elements (e.g., people, time, location, intention, procedure, consequence, etc.) of local pragmatic contexts and are thus context-dependent. Elimination of these elements will inevitably lead to information loss in semantic mediation between local ontologies. Correspondingly, the understanding and effect of exchanged data in a new context may differ from that in its original context. Another problem is the dilemma of how to find a balance between flexibility and standardization of local ontologies, because ontologies are not fixed, but continuously evolving. It is commonly realized that we cannot use a unified ontology to replace all local ontologies, because they are context-dependent and need flexibility. However, without the coordination of standards, freely developed local ontologies and databases will entail enormous mediation work between them. Finding a balance between standardization and flexibility for evolving ontologies, in a practical sense, requires negotiations (i.e. conversations, agreements and collaborations) between different local pragmatic contexts. The purpose of this work is to set up a computer-friendly model representing local pragmatic contexts (i.e. geodata sources) and to propose a practical semantic negotiation procedure for approaching pragmatic interoperability between local pragmatic contexts. Information agents, objective facts and subjective dimensions are reviewed as elements of a conceptual model for representing pragmatic contexts.
The author uses these elements to outline a practical semantic negotiation procedure for approaching pragmatic interoperability of distributed geodata. The proposed conceptual model and semantic negotiation procedure were encoded in Description Logic, and then applied to analyze and manipulate semantic negotiations between different local ontologies within the National Mineral Resources Assessment (NMRA) project of China, which involves multi-source and multi-subject geodata sharing.
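As a rough illustration of the conceptual model sketched above, the following renders information agents, objective facts and subjective dimensions as plain data structures. All class and field names here are invented for this example; they do not reproduce the paper's actual Description Logic encoding.

```python
from dataclasses import dataclass, field
from typing import List

# Minimal sketch of a local pragmatic context: agents, objective facts
# and subjective dimensions. Names are illustrative assumptions only.

@dataclass
class InformationAgent:
    name: str   # e.g. a data provider or curator
    role: str   # e.g. "provider", "consumer"

@dataclass
class ObjectiveFact:
    subject: str
    predicate: str
    value: str

@dataclass
class SubjectiveDimension:
    aspect: str        # e.g. "intention", "procedure", "consequence"
    description: str

@dataclass
class PragmaticContext:
    source: str                                   # geodata source identifier
    agents: List[InformationAgent] = field(default_factory=list)
    facts: List[ObjectiveFact] = field(default_factory=list)
    dimensions: List[SubjectiveDimension] = field(default_factory=list)

    def shared_facts(self, other: "PragmaticContext") -> List[ObjectiveFact]:
        """Facts common to two contexts: a naive starting point for negotiation."""
        keys = {(f.subject, f.predicate, f.value) for f in other.facts}
        return [f for f in self.facts
                if (f.subject, f.predicate, f.value) in keys]
```

A negotiation procedure would then reconcile the non-shared facts while preserving, rather than discarding, the subjective dimensions attached to each context.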
Semantics-Based Interoperability Framework for the Geosciences
NASA Astrophysics Data System (ADS)
Sinha, A.; Malik, Z.; Raskin, R.; Barnes, C.; Fox, P.; McGuinness, D.; Lin, K.
2008-12-01
Interoperability between heterogeneous data, tools and services is required to transform data into knowledge. To meet geoscience-oriented societal challenges such as the forcing of climate change induced by volcanic eruptions, we suggest the need to develop semantic interoperability for data, services, and processes. Because such scientific endeavors require integration of multiple databases associated with global enterprises, implicit semantics-based integration is impossible. Instead, explicit semantics are needed to facilitate interoperability and integration. Although different types of integration models are available (syntactic or semantic), we suggest that semantic interoperability is likely to be the most successful pathway. Clearly, the geoscience community would benefit from utilization of existing XML-based data models, such as GeoSciML, WaterML, etc., to rapidly advance semantic interoperability and integration. We recognize that such integration will require a "meanings-based search, reasoning and information brokering", which will be facilitated through inter-ontology relationships (ontologies defined for each discipline). We suggest that markup languages (MLs) and ontologies can be seen as "data integration facilitators", working at different abstraction levels. Therefore, we propose to use an ontology-based data registration and discovery approach to complement markup languages through semantic data enrichment. Ontologies allow the use of formal and descriptive logic statements, which permit expressive query capabilities for data integration through reasoning. We have developed domain ontologies (EPONT) to capture the concepts behind data. EPONT ontologies are associated with existing ontologies such as SUMO, DOLCE and SWEET. Although significant efforts have gone into developing data (object) ontologies, we advance the idea of developing semantic frameworks for additional ontologies that deal with processes and services. This evolutionary step will facilitate the integrative capabilities of scientists as we examine the relationships between data and external factors such as processes that may influence our understanding of "why" certain events happen. We emphasize the need to go from analysis of data to concepts related to scientific principles of thermodynamics, kinetics, heat flow, mass transfer, etc. Towards meeting these objectives, we report on a pair of related service engines: DIA (Discovery, Integration and Analysis) and SEDRE (Semantically-Enabled Data Registration Engine), which utilize ontologies for semantic interoperability and integration.
Ethier, J-F; Curcin, V; Barton, A; McGilchrist, M M; Bastiaens, H; Andreasson, A; Rossiter, J; Zhao, L; Arvanitis, T N; Taweel, A; Delaney, B C; Burgun, A
2015-01-01
This article is part of the Focus Theme of Methods of Information in Medicine on "Managing Interoperability and Complexity in Health Systems". Primary care data is the single richest source of routine health care data. However, its use, both in research and clinical work, often requires data from multiple clinical sites, clinical trials databases and registries. Data integration and interoperability are therefore of utmost importance. TRANSFoRm's general approach relies on a unified interoperability framework, described in a previous paper. We developed a core ontology for an interoperability framework based on data mediation. This article presents how such an ontology, the Clinical Data Integration Model (CDIM), can be designed to support, in conjunction with appropriate terminologies, biomedical data federation within TRANSFoRm, an EU FP7 project that aims to develop the digital infrastructure for a learning healthcare system in European Primary Care. TRANSFoRm utilizes a unified structural/terminological interoperability framework, based on the local-as-view mediation paradigm. Such an approach mandates that the global information model describe the domain of interest independently of the data sources to be explored. Following a requirement analysis process, no ontology focusing on primary care research was identified, and thus we designed a realist ontology based on Basic Formal Ontology to support our framework in collaboration with various terminologies used in primary care. The resulting ontology has 549 classes and 82 object properties and is used to support data integration for TRANSFoRm's use cases. Concepts identified by researchers were successfully expressed in queries using CDIM and pertinent terminologies. As an example, we illustrate how, in TRANSFoRm, the Query Formulation Workbench can capture eligibility criteria in a computable representation based on CDIM. A unified mediation approach to semantic interoperability provides a flexible and extensible framework for all types of interaction between health record systems and research systems. CDIM, as the core ontology of such an approach, enables simplicity and consistency of design across the heterogeneous software landscape and can support the specific needs of EHR-driven phenotyping research using primary care data.
2011-12-01
"Task Based Approach to Planning." Paper 08F-SIW-033. In Proceedings of the Fall Simulation Interoperability Workshop. Simulation Interoperability Standards Organization. [...] Paper 06F-SIW-003. In Proceedings of the Fall Simulation Interoperability Workshop. Simulation Interoperability Standards Organization. [...] "(MSDL)." Paper 10S-SIW-003. In Proceedings of the Spring Simulation Interoperability Workshop. Simulation Interoperability Standards Organization.
An interoperability experiment for sharing hydrological rating tables
NASA Astrophysics Data System (ADS)
Lemon, D.; Taylor, P.; Sheahan, P.
2013-12-01
The increasing demand on freshwater resources is requiring authorities to produce more accurate and timely estimates of their available water. Calculation of continuous time-series of river discharge and storage volumes generally requires rating tables. These approximate the relationship between two phenomena, such as river level and discharge, and allow us to produce continuous estimates of a phenomenon that may be impractical or impossible to measure directly. Standardised information models and access mechanisms for rating tables are required to support sharing and exchange of water flow data. An Interoperability Experiment (IE) is underway to test an information model that describes rating tables, the observations made to build these ratings, and river cross-section data. The IE is an initiative of the joint World Meteorological Organisation/Open Geospatial Consortium Hydrology Domain Working Group (HydroDWG), and the model will be published as WaterML2.0 part 2. Interoperability Experiments (IEs) are low-overhead, multiple-member projects that are run under the OGC's interoperability program to test existing and emerging standards. The HydroDWG has previously run IEs to test early versions of OGC WaterML2.0 part 1 (time series). This IE focuses on two key exchange scenarios. The first is sharing rating tables and gauging observations between water agencies: through the use of standard OGC web services, rating tables and associated data will be made available from water agencies, and the (Australian) Bureau of Meteorology will retrieve rating tables on demand from water authorities, allowing the Bureau to run conversions of data within their own systems. The second is exposing rating tables and gaugings for online analysis and educational purposes: a web client will be developed to enable exploration and visualization of rating tables, gaugings and related metadata for monitoring points. The client gives a quick view into available rating tables, their periods of applicability and the standard deviation of observations against the relationship. An example of this client running can be seen at the link provided. The results of the IE will form the basis for the standardisation of WaterML2.0 part 2. The use of the standard will lead to increased transparency and accessibility of rating tables, while also improving general understanding of this important hydrological concept.
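To make the role of a rating table concrete, here is a minimal sketch, with made-up numbers, of converting observed river levels into discharge estimates by interpolating between rating points. Real ratings also carry applicability periods and uncertainty, which the WaterML2.0 part 2 model captures and this sketch does not.

```python
import numpy as np

# A rating table as stage-discharge pairs (values are illustrative only).
stage = np.array([0.5, 1.0, 1.5, 2.0, 3.0])           # river level (m)
discharge = np.array([2.0, 10.0, 30.0, 65.0, 180.0])  # discharge (m^3/s)

# Continuous level observations are converted to discharge estimates
# by piecewise-linear lookup between the rating points.
observed_levels = np.array([0.8, 1.2, 2.4])
estimated_flow = np.interp(observed_levels, stage, discharge)
print(estimated_flow)
```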
Smart Home and Building Systems | Energy Systems Integration Facility
Exploring how intelligent building systems and the dynamic grid of the future can work together, evaluating them as a neighborhood, and exploring functionality, applications, interoperability, and cybersecurity.
DOT National Transportation Integrated Search
2011-01-01
Connected vehicles have the potential to transform the way Americans travel through the creation of a safe, interoperable wireless communications network, a system that includes cars, buses, trucks, trains, traffic signals, cell phones, and other de...
Reunification: keeping families together in crisis.
Blake, Nancy; Stevenson, Kathleen
2009-08-01
In reviewing the literature, there has not been a family reunification plan that has worked consistently during disasters. During Hurricane Katrina, there were children who were sent to a shelter in a different state than their parents. When children are involved, the issues become even more difficult, because some children who are preverbal cannot tell their names or their parents' names. Tracking systems have been developed but are not interoperable. No central repository has been developed. There are also issues related to transporting patients, psychosocial issues, as well as safety issues that are different when children are unaccompanied by an adult. Two national meetings were held with experts from all over the country who have expertise in the care of children. Six focused groups were identified: patient movement/transportation; technology/tracking; clinical issues; nonmedical issues; communication/regulatory issues; and pediatric psychosocial support. The second meeting was a consensus conference. Recommendations from each subgroup were presented and voted on. All recommendations were accepted. The issue of reunification of families in disasters is still a problem. Further work needs to be done on tracking systems that are interoperable before another large disaster strikes, on pediatric psychological issues after a disaster, on transporting patients, and on care of the pediatric patient who is not accompanied by an adult. Once a system has been developed, it needs to be tested by large-scale drills that practice moving children across state lines and from one area to another.
Bridging Hydroinformatics Services Between HydroShare and SWATShare
NASA Astrophysics Data System (ADS)
Merwade, V.; Zhao, L.; Song, C. X.; Tarboton, D. G.; Goodall, J. L.; Stealey, M.; Rajib, A.; Morsy, M. M.; Dash, P. K.; Miles, B.; Kim, I. L.
2016-12-01
Many cyberinfrastructure systems in the hydrologic and related domains have emerged in the past decade, with more being developed to address various data management and modeling needs. Although clearly beneficial to the broad user community, building interoperability across these systems is a challenging task due to various obstacles, including technological, organizational, semantic, and social issues. This work presents our experience in developing interoperability between two hydrologic cyberinfrastructure systems: SWATShare and HydroShare. HydroShare is a large-scale online system aimed at enabling the hydrologic user community to share their data, models, and analyses online for solving complex hydrologic research questions. SWATShare, on the other hand, is a focused effort to allow SWAT (Soil and Water Assessment Tool) modelers to share, execute and analyze SWAT models using high performance computing resources. Making these two systems interoperable required common sign-in through OAuth, sharing of models through common metadata standards, and use of standard web services to implement key import/export functionalities. As a result, users from either community can leverage the resources and services across these systems without having to manually import, export, or process their models. Overall, this use case can serve as a model for interoperability among other systems, as no one system can provide all the functionality needed to address large interdisciplinary problems.
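A hedged sketch of what cross-system access after a common OAuth sign-in can look like: a client presents a bearer token to a REST endpoint and receives standard metadata. The resource id and the exact endpoint path below are assumptions for illustration, not the documented HydroShare/SWATShare API contract.

```python
import requests

TOKEN = "user-oauth2-access-token"   # obtained via the shared OAuth flow
RESOURCE_ID = "abc123"               # hypothetical model resource id

# Fetch system metadata for a shared model resource (endpoint path assumed).
resp = requests.get(
    f"https://www.hydroshare.org/hsapi/resource/{RESOURCE_ID}/sysmeta/",
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=30,
)
resp.raise_for_status()
metadata = resp.json()   # common metadata both systems can interpret
print(metadata.get("resource_type"))
```

Because both systems agree on the token format and the metadata schema, neither needs to know how the other stores models internally.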
NASA Astrophysics Data System (ADS)
Glavev, Victor
2016-12-01
The types of software applications used by public administrations can be divided into three main groups: document management systems, record management systems and business process systems. Each of them generates outputs that can be used as input data to the others. This is the main reason why exchange of data between these three groups, following well-defined models, is required; many other reasons are discussed in the paper. Interoperability is a key aspect when those models are implemented, especially when there are different manufacturers of systems in the area of software applications used by public authorities. The report includes examples of implementation of models for exchange of data between software systems deployed in one of the biggest administrations in Bulgaria.
Kalra, Dipak; Kobayashi, Shinji
2013-01-01
Objectives: The objective is to introduce the 'clinical archetype', which is a formal and agreed way of representing clinical information to ensure interoperability across and within Electronic Health Records (EHRs). The paper also aims to present the challenges of building quality-labeled clinical archetypes and the challenges towards achieving semantic interoperability between EHRs. Methods: Twenty years of international research, various European healthcare informatics projects and the pioneering work of the openEHR Foundation have led to the following results. Results: The requirements for EHR information architectures have been consolidated within ISO 18308 and adopted within the ISO 13606 EHR interoperability standard. However, a generic EHR architecture cannot ensure that the clinical meaning of information from heterogeneous sources can be reliably interpreted by receiving systems and services. Therefore, clinical models called 'clinical archetypes' are required to formalize the representation of clinical information within the EHR. Part 2 of ISO 13606 defines how archetypes should be formally represented. The current challenge is to grow clinical communities to build a library of clinical archetypes and to identify how evidence of best practice and multi-professional clinical consensus should best be combined to define archetypes at the optimal level of granularity and specificity, and to quality-label them for wide adoption. Standardizing clinical terms within EHRs using a clinical terminology such as Systematized Nomenclature of Medicine Clinical Terms is also a challenge. Conclusions: Clinical archetypes will play an important role in achieving semantic interoperability within EHRs. Attempts are being made to explore the design and adoption challenges for clinical archetypes. PMID:24523993
A step-by-step methodology for enterprise interoperability projects
NASA Astrophysics Data System (ADS)
Chalmeta, Ricardo; Pazos, Verónica
2015-05-01
Enterprise interoperability is one of the key factors for enhancing enterprise competitiveness. Achieving enterprise interoperability is an extremely complex process which involves different technological, human and organisational elements. In this paper we present a framework to support enterprise interoperability. The framework has been developed taking into account the three domains of interoperability: Enterprise Modelling, Architecture and Platform, and Ontologies. The main novelty of the framework in comparison to existing ones is that it includes a step-by-step methodology that explains how to carry out an enterprise interoperability project taking into account different interoperability views, such as business, process, human resources, technology, knowledge and semantics.
Evaluation results of xTCA equipment for HEP experiments at CERN
NASA Astrophysics Data System (ADS)
Di Cosmo, M.; Bobillier, V.; Haas, S.; Joos, M.; Mico, S.; Vasey, F.; Vichoudis, P.
2013-12-01
The MicroTCA and AdvancedTCA industry standards are candidate modular electronic platforms for the upgrade of the current generation of high energy physics experiments. The PH-ESE group at CERN launched the xTCA evaluation project in 2011 with the aim of performing technical evaluations and eventually providing support for commercially available components. Different devices from different vendors have been acquired and evaluated, and interoperability tests have been performed. This paper presents the test procedures and facilities that have been developed and focuses on the evaluation results, including electrical, thermal and interoperability aspects.
Lin, M.C.; Vreeman, D.J.; Huff, S.M.
2012-01-01
Objectives: We wanted to develop a method for evaluating the consistency and usefulness of LOINC code use across different institutions, and to evaluate the degree of interoperability that can be attained when using LOINC codes for laboratory data exchange. Our specific goals were to: 1) determine if any contradictory knowledge exists in LOINC; 2) determine how many LOINC codes were used in a truly interoperable fashion between systems; and 3) provide suggestions for improving the semantic interoperability of LOINC. Methods: We collected Extensional Definitions (EDs) of LOINC usage from three institutions. The version space approach was used to divide LOINC codes into small sets, which made auditing of LOINC use across the institutions feasible. We then compared pairings of LOINC codes from the three institutions for consistency and usefulness. Results: The numbers of LOINC codes evaluated were 1,917, 1,267 and 1,693 as obtained from ARUP, Intermountain and Regenstrief, respectively. There were 2,022, 2,030, and 2,301 version spaces among ARUP & Intermountain, Intermountain & Regenstrief and ARUP & Regenstrief, respectively. Using the EDs as the gold standard, there were 104, 109 and 112 pairs containing contradictory knowledge, and there were 1,165, 765 and 1,121 semantically interoperable pairs. The interoperable pairs were classified into three levels: 1) Level I – no loss of meaning, complete information was exchanged by identical codes; 2) Level II – no loss of meaning, but processing of data was needed to make the data completely comparable; 3) Level III – some loss of meaning (for example, tests with a specific 'method' could be rolled up with tests that were 'methodless'). Conclusions: There are variations in the way LOINC is used for data exchange that result in some data not being truly interoperable across different enterprises. To improve its semantic interoperability, we need to detect and correct any contradictory knowledge within LOINC and add computable relationships that can be used for making reliable inferences about the data. The LOINC committee should also provide detailed guidance on best practices for mapping from local codes to LOINC codes and for using LOINC codes in data exchange. PMID:22306382
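The three-level classification can be illustrated with a small sketch. The records and decision rules below are simplified stand-ins for the Extensional Definitions used in the study, invented here only to show the shape of the logic.

```python
# Classify an exchanged pair of lab-test descriptions into the paper's
# interoperability levels (rules are illustrative, not the study's audit).

def classify_pair(local: dict, remote: dict) -> str:
    if local == remote:
        return "Level I"    # identical codes, no loss of meaning
    same_meaning = (local["component"] == remote["component"]
                    and local["system"] == remote["system"])
    if same_meaning and local.get("units") != remote.get("units"):
        return "Level II"   # processing needed, e.g. unit conversion
    if same_meaning and "method" in (local.keys() ^ remote.keys()):
        return "Level III"  # some loss: method-specific vs. methodless test
    return "not interoperable"

a = {"component": "Glucose", "system": "Serum", "units": "mg/dL"}
b = {"component": "Glucose", "system": "Serum", "units": "mmol/L"}
print(classify_pair(a, b))  # -> Level II
```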
NASA and Industry Benefits of ACTS High Speed Network Interoperability Experiments
NASA Technical Reports Server (NTRS)
Zernic, M. J.; Beering, D. R.; Brooks, D. E.
2000-01-01
This paper provides synopses of the design, implementation, and results of key high data rate communications experiments utilizing the technologies of NASA's Advanced Communications Technology Satellite (ACTS). Specifically, the network protocol and interoperability performance aspects will be highlighted. The objectives of these key experiments will be discussed in their relevant context to NASA missions, as well as to the wider communications industry. Discussion of the experiment implementation will highlight the technical aspects of hybrid network connectivity, a variety of high-speed interoperability architectures, a variety of network node platforms, protocol layers, internet-based applications, and new work focused on distinguishing between link errors and congestion. In addition, this paper describes the impact of leveraging government-industry partnerships to achieve technical progress and forge synergistic relationships. These relationships will be the key to success as NASA seeks to combine commercially available technology with its own internal technology developments to realize more robust and cost effective communications for space operations.
Facilitating Semantic Interoperability Among Ocean Data Systems: ODIP-R2R Student Outcomes
NASA Astrophysics Data System (ADS)
Stocks, K. I.; Chen, Y.; Shepherd, A.; Chandler, C. L.; Dockery, N.; Elya, J. L.; Smith, S. R.; Ferreira, R.; Fu, L.; Arko, R. A.
2014-12-01
With informatics providing an increasingly important set of tools for geoscientists, it is critical to train the next generation of scientists in information and data techniques. The NSF-supported Rolling Deck to Repository (R2R) Program works with the academic fleet community to routinely document, assess, and preserve the underway sensor data from U.S. research vessels. The Ocean Data Interoperability Platform (ODIP) is an EU-US-Australian collaboration fostering interoperability among regional e-infrastructures through workshops and joint prototype development. The need to align terminology between systems is a common challenge across all of the ODIP prototypes. Five R2R students were supported to address aspects of semantic interoperability within ODIP: (1) developing a vocabulary matching service that links terms from different vocabularies with similar concepts; the service implements the Google Refine reconciliation service interface, so users can leverage the Google Refine application as a friendly user interface while linking different vocabulary terms; (2) developing Resource Description Framework (RDF) resources that map Shipboard Automated Meteorological and Oceanographic System (SAMOS) vocabularies to internationally served vocabularies; each SAMOS vocabulary term (data parameter and quality control flag) is described as an RDF resource page, and these RDF resources allow for enhanced discoverability and retrieval of SAMOS data by enabling data searches based on parameter; (3) improving data retrieval and interoperability by exposing data and mapped vocabularies using Semantic Web technologies; we have collaborated with ODIP participating organizations to build a generalized data model that will be used to populate a SPARQL endpoint providing expressive querying over our data files; (4) mapping local and regional vocabularies used by R2R to those used by ODIP partners, described more fully in a companion poster; and (5) making published Linked Data web developer-friendly with a RESTful service, achieved by defining a proxy layer on top of the existing SPARQL endpoint that translates HTTP requests into SPARQL queries and renders the returned results as required by the request sender, using content negotiation, suffixes and parameters.
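The following is a hedged sketch of the kind of query such a RESTful proxy layer might translate an HTTP request into. The endpoint URL and the mapping predicate are placeholders, not the project's actual service; only the general pattern (SPARQL over mapped vocabularies) is taken from the abstract.

```python
from SPARQLWrapper import SPARQLWrapper, JSON

# Hypothetical endpoint serving vocabulary mappings as Linked Data.
endpoint = SPARQLWrapper("https://example.org/r2r/sparql")
endpoint.setQuery("""
    PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
    SELECT ?local ?standard WHERE {
        ?local skos:exactMatch ?standard .   # local term mapped to served vocab
    } LIMIT 10
""")
endpoint.setReturnFormat(JSON)

results = endpoint.query().convert()
for row in results["results"]["bindings"]:
    print(row["local"]["value"], "->", row["standard"]["value"])
```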
Positive train control test bed interoperability upgrades.
DOT National Transportation Integrated Search
2013-02-01
Transportation Technology Center, Inc. (TTCI) upgraded the Positive Train Control (PTC) Test Bed to support additional PTC testing configurations under Federal Railroad Administration (FRA) Task Order 270. The scope of work provided additional PTC Co...
Interoperable PKI Data Distribution in Computational Grids
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pala, Massimiliano; Cholia, Shreyas; Rea, Scott A.
One of the most successful working examples of virtual organizations, computational grids need authentication mechanisms that inter-operate across domain boundaries. Public Key Infrastructures (PKIs) provide sufficient flexibility to allow resource managers to securely grant access to their systems in such distributed environments. However, as PKIs grow and services are added to enhance both security and usability, users and applications must struggle to discover available resources, particularly when the Certification Authority (CA) is alien to the relying party. This article presents how to overcome these limitations of the current grid authentication model by integrating the PKI Resource Query Protocol (PRQP) into the Grid Security Infrastructure (GSI).
An open platform for promoting interoperability in solar system sciences
NASA Astrophysics Data System (ADS)
Csillaghy, André; Aboudarham, Jean; Berghmans, David; Jacquey, Christian
2013-04-01
The European coordination project CASSIS is promoting the creation of an integrated data space that will facilitate science across community boundaries in solar system sciences. Many disciplines may need to use the same data set to support scientific research, although the way the data are used may depend on the project and on the particular piece of science. Often, access is hindered because of differences in the way the different communities describe and store their data, as well as how they make them accessible. Working towards this goal, we have set up an open collaboration platform, www.explorespace.eu, that can serve as a hub for discovering and developing interoperability resources in the communities involved. The platform is independent of the project and will be maintained well after the end of the funding. As a first step, we have captured the description of services already provided by the community. The openness of the collaboration platform should allow discussion with all stakeholders of ways to make key types of metadata and derived products more complete and coherent, and thus more usable across domain boundaries. Furthermore, software resources and discussions should help facilitate the development of interoperable services. The platform, along with the database of services, addresses the following questions, which we consider crucial for promoting interoperability: • Current extent of the data space coverage: What part of the common data space is already covered by the existing interoperable services in terms of data access? In other words, what data, from catalogues as well as from raw data, can be reached by an application through standard protocols today? • Needed extension of the data space coverage: What would be needed to extend the data space coverage? In other words, how can the currently accessible data space be extended by adding services? • Missing services: What applications/services are still missing and need to be developed? This is not a trivial question, as the generation of the common data space in itself creates new requirements on overarching applications that might be necessary to provide unified access to all the services. As an example, one particular aspect discussed on the platform is the design of web services. Applications today are mainly human-centred, while interoperability must happen one level below, and the back ends (databases) must be generic, i.e. independent from the applications. We intend our effort to provide developers with resources that disentangle user interfaces from data services. Many activities are challenging and we hope they will be discussed on our platform. In particular, the quality of the services, the data space and the needs of interdisciplinary approaches are serious concerns for instruments such as ATST and EST, or the ones onboard SDO and, in the future, Solar Orbiter. We believe that our platform might be useful as a kind of guide that would allow groups to avoid having to reinvent the wheel for each new instrument.
Designing learning management system interoperability in semantic web
NASA Astrophysics Data System (ADS)
Anistyasari, Y.; Sarno, R.; Rochmawati, N.
2018-01-01
The extensive adoption of learning management systems (LMS) has put the focus on interoperability requirements. Interoperability is the ability of different computer systems, applications or services to communicate, share and exchange data, information and knowledge in a precise, effective and consistent way. Semantic web technology and the use of ontologies are able to provide the required computational semantics and interoperability for the automation of tasks in an LMS. The purpose of this study is to design learning management system interoperability in the semantic web, which has not yet been investigated deeply. Moodle is utilized to design the interoperability. Several database tables of Moodle are enhanced and some features are added. Semantic web interoperability is provided by an ontology exploited in the content materials. The ontology is further utilized as a searching tool to match users' queries with available courses. It is concluded that LMS interoperability in the Semantic Web is possible to achieve.
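A minimal sketch of ontology-backed course matching of the kind described: course descriptions in a Turtle file are queried so that a user's search term also matches courses annotated with related, broader concepts. The filename, namespace and predicates below are assumptions for illustration, not the study's actual ontology.

```python
from rdflib import Graph

g = Graph()
g.parse("course_ontology.ttl", format="turtle")  # hypothetical ontology file

# Match courses whose topics are (transitively) related to "recursion".
query = """
    PREFIX ex: <http://example.org/lms#>
    SELECT ?course WHERE {
        ?course ex:coversTopic ?topic .
        ?topic  ex:broaderThan* ?match .   # walk the concept hierarchy
        ?match  ex:label "recursion" .
    }
"""
for row in g.query(query):
    print(row.course)
```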
Medical Device Plug-and-Play Interoperability Standards and Technology Leadership
2017-10-01
Award Number: W81XWH-09-1-0705. Title: "Medical Device Plug-and-Play Interoperability Standards and Technology Leadership." Reporting period: Sept 2016 – 20 Sept 2017. The report describes work to promote safety and efficiency through interoperable medical technologies; we played a leadership role on interoperability safety standards (AAMI, AAMI/UL Joint ...).
Leverage and Delegation in Developing an Information Model for Geology
NASA Astrophysics Data System (ADS)
Cox, S. J.
2007-12-01
GeoSciML is an information model and XML encoding developed by a group of primarily geologic survey organizations under the auspices of the IUGS CGI. The scope of the core model broadly corresponds with information traditionally portrayed on a geologic map, viz. interpreted geology, some observations, the map legend and accompanying memoir. The development of GeoSciML has followed the methodology specified for an Application Schema defined by OGC and ISO 19100 series standards. This requires agreement within a community concerning their domain model, its formal representation using UML, and documentation as a Feature Type Catalogue, with an XML Schema implementation generated from the model by applying a rule-based transformation. The framework and technology support a modular governance process. Standard datatypes and GI components (geometry, the feature and coverage metamodels, metadata) are imported from the ISO framework. The observation and sampling model (including boreholes) is imported from OGC. The scale used for most scalar literal values (terms, codes, measures) allows for localization where necessary. Wildcards and abstract base-classes provide explicit extensibility points. Link attributes appear in a regular way in the encodings, allowing reference to external resources using URIs. The encoding is compatible with generic GI data-service interfaces (WFS, WMS, SOS). For maximum interoperability within a community, the interfaces may be specialised through domain-specified constraints (e.g. feature-types, scale and vocabulary bindings, query-models). Formalization using UML and XML allows use of standard validation and processing tools. Use of upper-level elements defined for generic GI application reduces the development effort and governance responsibility, while maximising cross-domain interoperability. On the other hand, enabling specialization to be delegated in a controlled manner is essential to adoption across a range of subdisciplines and jurisdictions. The GeoSciML design team is responsible only for the part of the model that is unique to geology but for which general agreement can be reached within the domain. This paper is presented on behalf of the Interoperability Working Group of the IUGS Commission for Geoscience Information (CGI) - follow the web link for details of the membership.
Marcelo, A; Adejumo, A; Luna, D
2011-01-01
Describe the issues surrounding health informatics in developing countries and the challenges faced by practitioners in building internal capacity. From these issues, the authors propose cost-effective strategies that can fast-track health informatics development in these low to medium income countries (LMICs). The authors conducted a review of the literature and consulted key opinion leaders who have experience with health informatics implementations around the world. Despite geographic and cultural differences, many LMICs share similar challenges and opportunities in developing health informatics. Partnerships, standards, and interoperability are well known components of successful informatics programs. Establishing partnerships can include formal inter-institutional collaborations on training and research, collaborative open source software development, and effective use of social networking. Lacking legacy systems, LMICs can discuss standards and interoperability more openly and have greater potential for success. Lastly, since cellphones are pervasive in developing countries, they can be leveraged as access points for delivering and documenting health services in remote, under-served areas. Mobile health, or mHealth, gives LMICs a unique opportunity to leapfrog over most issues that have plagued health informatics in developed countries. By employing this proposed roadmap, LMICs can develop capacity for health informatics using appropriate and cost-effective technologies.
Kamimura, Emi; Tanaka, Shinpei; Takaba, Masayuki; Tachi, Keita; Baba, Kazuyoshi
2017-01-01
The aim of this study was to evaluate and compare the inter-operator reproducibility of three-dimensional (3D) images of teeth captured by a digital impression technique with that of a conventional impression technique in vivo. Twelve participants with complete natural dentition were included in this study. A digital impression of the mandibular molars of these participants was made by two operators with different levels of clinical experience, 3 or 16 years, using an intra-oral scanner (Lava COS, 3M ESPE). A silicone impression also was made by the same operators using the double mix impression technique (Imprint3, 3M ESPE). Stereolithography (STL) data were directly exported from the Lava COS system, while STL data of a plaster model made from the silicone impression were captured by a three-dimensional (3D) laboratory scanner (D810, 3shape). The STL datasets recorded by the two different operators were compared using 3D evaluation software and superimposed using the best-fit-algorithm method (least-squares method, PolyWorks, InnovMetric Software) for each impression technique. Inter-operator reproducibility, as evaluated by average discrepancies of corresponding 3D data, was compared between the two techniques (Wilcoxon signed-rank test). Visual inspection of the superimposed datasets revealed that discrepancies between repeated digital impressions were smaller than those observed with silicone impressions. This was confirmed by statistical analysis, which revealed significantly smaller average inter-operator discrepancies for the digital impression technique (0.014 ± 0.02 mm) than for the conventional impression technique (0.023 ± 0.01 mm). The results of this in vivo study suggest that inter-operator reproducibility with a digital impression technique may be better than that of a conventional impression technique and is independent of the clinical experience of the operator.
A future-proof architecture for telemedicine using loose-coupled modules and HL7 FHIR.
Gøeg, Kirstine Rosenbeck; Rasmussen, Rune Kongsgaard; Jensen, Lasse; Wollesen, Christian Møller; Larsen, Søren; Pape-Haugaard, Louise Bilenberg
2018-07-01
Most telemedicine solutions are proprietary and disease-specific, which causes a heterogeneous, silo-oriented system landscape with limited interoperability. Solving the interoperability problem requires a strong focus on data integration and standardization in telemedicine infrastructures. Our objective was to suggest a future-proof architecture consisting of small loose-coupled modules, to allow flexible integration with new and existing services, and using international standards, to allow high re-usability of modules and interoperability in the health IT landscape. We identified the core features of our future-proof architecture as follows: (1) To provide extended functionality, the system should be designed as a core with modules; database handling and implementation of security protocols are modules, to improve flexibility compared to other frameworks. (2) To ensure loosely coupled modules, the system should implement an inversion of control mechanism. (3) For ease of implementation, the system should use HL7 FHIR (Fast Healthcare Interoperability Resources) as the primary standard because it is based on web technologies. We evaluated the feasibility of our architecture by developing an open source implementation of the system called ORDS. ORDS is written in TypeScript and makes use of the Express framework and HL7 FHIR DSTU2. The code is distributed on GitHub. All modules have been tested unit-wise, but end-to-end testing awaits our first clinical example implementations. Our study showed that highly adaptable and yet interoperable core frameworks for telemedicine can be designed and implemented. Future work includes implementation of a clinical use case and evaluation. Copyright © 2018 Elsevier B.V. All rights reserved.
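To illustrate the core-with-modules and inversion-of-control features named above, here is a minimal sketch of the pattern. It is written in Python for brevity (ORDS itself is TypeScript), and all class names are invented; it shows the design idea, not the actual ORDS code.

```python
class Core:
    """A thin kernel: modules are registered against interface names and
    resolved at call time, so storage or security can be swapped without
    touching the core (inversion of control: the core never constructs
    its modules)."""

    def __init__(self):
        self._registry = {}

    def register(self, interface: str, module) -> None:
        self._registry[interface] = module

    def resolve(self, interface: str):
        return self._registry[interface]


class InMemoryStore:
    """Example module: FHIR-style resources keyed by relative URL."""
    def __init__(self):
        self._data = {}
    def save(self, key, fhir_resource):
        self._data[key] = fhir_resource
    def load(self, key):
        return self._data[key]


core = Core()
core.register("storage", InMemoryStore())   # a module, not core behaviour

store = core.resolve("storage")
store.save("Observation/1", {"resourceType": "Observation", "status": "final"})
print(store.load("Observation/1")["resourceType"])
```

Swapping `InMemoryStore` for a database-backed module requires only a different `register` call, which is the flexibility the architecture aims for.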
Personalized-detailed clinical model for data interoperability among clinical standards.
Khan, Wajahat Ali; Hussain, Maqbool; Afzal, Muhammad; Amin, Muhammad Bilal; Saleem, Muhammad Aamir; Lee, Sungyoung
2013-08-01
Data interoperability among health information exchange (HIE) systems is a major concern for healthcare practitioners to enable provisioning of telemedicine-related services. Heterogeneity exists in these systems not only at the data level but also among different heterogeneous healthcare standards with which these are compliant. The relationship between healthcare organization data and different heterogeneous standards is necessary to achieve the goal of data level interoperability. We propose a personalized-detailed clinical model (P-DCM) approach for the generation of customized mappings that creates the necessary linkage between organization-conformed healthcare standards concepts and clinical model concepts to ensure data interoperability among HIE systems. We consider electronic health record (EHR) standards, openEHR, and HL7 CDA instances transformation using P-DCM. P-DCM concepts associated with openEHR and HL7 CDA help in transformation of instances among these standards. We investigated two datasets: (1) data of 100 diabetic patients, including 50 each of type 1 and type 2, from a local hospital in Korea and (2) data of a single Alzheimer's disease patient. P-DCMs were created for both scenarios, which provided the basis for deriving instances for HL7 CDA and openEHR standards. For proof of concept, we present case studies of encounter information for type 2 diabetes mellitus patients and monitoring of daily routine activities of an Alzheimer's disease patient. These reflect P-DCM-based customized mappings generation with openEHR and HL7 CDA standards. Customized mappings are generated based on the relationship of P-DCM concepts with CDA and openEHR concepts. The objective of this work is to achieve semantic data interoperability among heterogeneous standards. This would lead to effective utilization of resources and allow timely information exchange among healthcare systems.
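As a rough illustration of the customized-mappings idea in the preceding abstract, the sketch below links hypothetical P-DCM concepts to openEHR and HL7 CDA paths and re-keys an instance from one standard to the other. All concept names and paths are invented and greatly simplified relative to real archetypes and CDA templates.

```python
# Hypothetical P-DCM concepts linked to paths in two standards.
PDCM_MAPPINGS = {
    "blood_glucose": {"openEHR": "items[at0004]/value/magnitude",
                      "CDA": "entry[glucose]/observation/value"},
    "hba1c":         {"openEHR": "items[at0007]/value/magnitude",
                      "CDA": "entry[hba1c]/observation/value"},
}

def transform(instance: dict, source: str, target: str) -> dict:
    """Re-key an instance from one standard's paths to the other's,
    via the shared P-DCM concept."""
    out = {}
    for concept, paths in PDCM_MAPPINGS.items():
        if paths[source] in instance:
            out[paths[target]] = instance[paths[source]]
    return out

cda_doc = {"entry[glucose]/observation/value": 7.2}
print(transform(cda_doc, source="CDA", target="openEHR"))
# -> {'items[at0004]/value/magnitude': 7.2}
```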
A Tale of Two Observing Systems: Interoperability in the World of Microsoft Windows
NASA Astrophysics Data System (ADS)
Babin, B. L.; Hu, L.
2008-12-01
Louisiana Universities Marine Consortium's (LUMCON) and Dauphin Island Sea Lab's (DISL) environmental monitoring systems provide a unified coastal ocean observing system. These two systems are mirrored to maintain autonomy while offering an integrated data sharing environment. Both systems collect data via Campbell Scientific data loggers, store the data in Microsoft SQL servers, and disseminate the data in real time on the World Wide Web via Microsoft Internet Information Servers and Active Server Pages (ASP). The utilization of Microsoft Windows technologies has presented many challenges to these observing systems as open source tools for interoperability grow, since the current open source tools often require the installation of additional software. In order to make data available in common standard formats, "home grown" software has been developed; one example is software to generate XML files for transmission to the National Data Buoy Center (NDBC). OOSTethys partners develop, test and implement easy-to-use, open-source, OGC-compliant software, and have created a working prototype of networked, semantically interoperable, real-time data systems. Partnering with OOSTethys, we are developing a cookbook to implement OGC web services. The implementation will be written in ASP, will run in a Microsoft operating system environment, and will serve data via Sensor Observation Services (SOS). This cookbook will give observing systems running Microsoft Windows the tools to easily participate in the Open Geospatial Consortium (OGC) Oceans Interoperability Experiment (OCEANS IE).
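A sketch of the "home grown" XML-generation approach mentioned above, using only the Python standard library. The element names are invented for illustration and are not the NDBC transmission schema.

```python
import xml.etree.ElementTree as ET

def observation_xml(station_id: str, readings: dict) -> bytes:
    """Serialize station readings as a simple XML message
    (illustrative structure, not a standard schema)."""
    root = ET.Element("message")
    station = ET.SubElement(root, "station", id=station_id)
    for name, (value, units) in readings.items():
        obs = ET.SubElement(station, "observation", name=name, units=units)
        obs.text = str(value)
    return ET.tostring(root, encoding="utf-8", xml_declaration=True)

print(observation_xml("LUMCON-01", {
    "water_temperature": (22.4, "degC"),
    "salinity": (30.1, "psu"),
}).decode())
```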
The National Capital Region closed circuit television video interoperability project.
Contestabile, John; Patrone, David; Babin, Steven
2016-01-01
The National Capital Region (NCR) includes many government jurisdictions and agencies using different closed circuit TV (CCTV) cameras and video management software. Because these agencies often must work together to respond to emergencies and events, a means of providing interoperability for CCTV video is critically needed. Video data from different CCTV systems that are not inherently interoperable is represented in the "data layer." An "integration layer" ingests the data layer source video and normalizes the different video formats. It then aggregates and distributes this video to a "presentation layer" where it can be viewed by almost any application used by other agencies and without any proprietary software. A native mobile video viewing application was also developed that uses the presentation layer to provide video to different kinds of smartphones. The NCR includes Washington, DC, and surrounding counties in Maryland and Virginia. The video sharing architecture allows one agency to see another agency's video in its native viewing application without the need to purchase new CCTV software or systems. A native smartphone application was also developed to enable agencies to share video via mobile devices even when they use different video management systems. A video sharing architecture has been developed for the NCR that creates an interoperable environment for sharing CCTV video in an efficient and cost effective manner. In addition, it provides the desired capability of sharing video via a native mobile application.
The value of health care information exchange and interoperability.
Walker, Jan; Pan, Eric; Johnston, Douglas; Adler-Milstein, Julia; Bates, David W; Middleton, Blackford
2005-01-01
In this paper we assess the value of electronic health care information exchange and interoperability (HIEI) between providers (hospitals and medical group practices) and independent laboratories, radiology centers, pharmacies, payers, public health departments, and other providers. We have created an HIEI taxonomy and combined published evidence with expert opinion in a cost-benefit model. Fully standardized HIEI could yield a net value of $77.8 billion per year once fully implemented. Nonstandardized HIEI offers smaller positive financial returns. The clinical impact of HIEI, for which quantitative estimates cannot yet be made, would likely add further value. A compelling business case exists for national implementation of fully standardized HIEI.
TRIAD: The Translational Research Informatics and Data Management Grid.
Payne, P; Ervin, D; Dhaval, R; Borlawsky, T; Lai, A
2011-01-01
Multi-disciplinary and multi-site biomedical research programs frequently require infrastructures capable of enabling the collection, management, analysis, and dissemination of heterogeneous, multi-dimensional, and distributed data and knowledge collections spanning organizational boundaries. We report on the design and initial deployment of an extensible biomedical informatics platform that is intended to address such requirements. A common approach to distributed data, information, and knowledge management needs in the healthcare and life science settings is the deployment and use of a service-oriented architecture (SOA). Such SOA technologies provide for strongly-typed, semantically annotated, and stateful data and analytical services that can be combined into data and knowledge integration and analysis "pipelines." Using this overall design pattern, we have implemented and evaluated an extensible SOA platform for clinical and translational science applications known as the Translational Research Informatics and Data-management grid (TRIAD). TRIAD is a derivative and extension of the caGrid middleware and has an emphasis on supporting agile "working interoperability" between data, information, and knowledge resources. Based upon initial verification and validation studies conducted in the context of a collection of driving clinical and translational research problems, we have been able to demonstrate that TRIAD achieves agile "working interoperability" between distributed data and knowledge sources. Informed by our initial verification and validation studies, we believe TRIAD provides an example instance of a lightweight and readily adoptable approach to the use of SOA technologies in the clinical and translational research setting. Furthermore, our initial use cases illustrate the importance and efficacy of enabling "working interoperability" in heterogeneous biomedical environments.
MMI: Increasing Community Collaboration
NASA Astrophysics Data System (ADS)
Galbraith, N. R.; Stocks, K.; Neiswender, C.; Maffei, A.; Bermudez, L.
2007-12-01
Building community requires a collaborative environment and guidance to help move members towards a common goal. An effective environment for community collaboration is a workspace that fosters participation and cooperation; effective guidance furthers common understanding and promotes best practices. The Marine Metadata Interoperability (MMI) project has developed a community web site to provide a collaborative environment for scientists, technologists, and data managers from around the world to learn about metadata and exchange ideas. Workshops, demonstration projects, and presentations also provide community-building opportunities for MMI. MMI has developed comprehensive online guides to help users understand and work with metadata standards, ontologies, and other controlled vocabularies. Documents such as "The Importance of Metadata Standards", "Usage vs. Discovery Vocabularies" and "Developing Controlled Vocabularies" guide scientists and data managers through a variety of metadata-related concepts. Members from eight organizations involved in marine science and informatics collaborated on this effort. The MMI web site has moved from Plone to Drupal, two content management systems which provide different opportunities for community-based work. Drupal's "organic groups" feature will be used to provide workspace for future teams tasked with content development, outreach, and other MMI mission-critical work. The new site is designed to enable members to easily create working areas, to build communities dedicated to developing consensus on metadata and other interoperability issues. Controlled-vocabulary-driven menus, integrated mailing-lists, member-based content creation and review tools are facets of the new web site architecture. This move provided the challenge of developing a hierarchical vocabulary to describe the resources presented on the site; consistent and logical tagging of web pages is the basis of Drupal site navigation. The new MMI web site presents enhanced opportunities for electronic discussions, focused collaborative work, and even greater community participation. The MMI project is beginning a new initiative to comprehensively catalog and document tools for marine metadata. The new MMI community-based web site will be used to support this work and to support the work of other ad-hoc teams in the future. We are seeking broad input from the community on this effort.
Supply Chain Interoperability Measurement
2015-06-19
Dissertation, June 2015. Christos E. Chalyvidis, Major, Hellenic Air Force. Report ENS-DS-15-J-001. Presented to the Faculty, Department of Operational Sciences. Committee chair: Dr. A.W. Johnson.
Interoperability in planetary research for geospatial data analysis
NASA Astrophysics Data System (ADS)
Hare, Trent M.; Rossi, Angelo P.; Frigeri, Alessandro; Marmo, Chiara
2018-01-01
For more than a decade there has been a push in the planetary science community to support interoperable methods for accessing and working with geospatial data. Common geospatial data products for planetary research include image mosaics, digital elevation or terrain models, geologic maps, geographic location databases (e.g., craters, volcanoes) or any data that can be tied to the surface of a planetary body (including moons, comets or asteroids). Several U.S. and international cartographic research institutions have converged on mapping standards that embrace standardized geospatial image formats, geologic mapping conventions, U.S. Federal Geographic Data Committee (FGDC) cartographic and metadata standards, and notably on-line mapping services as defined by the Open Geospatial Consortium (OGC). The latter includes defined standards such as the OGC Web Map Service (simple image maps), Web Map Tile Service (cached image tiles), Web Feature Service (feature streaming), Web Coverage Service (rich scientific data streaming), and Catalog Service for the Web (data searching and discoverability). While these standards were developed for application to Earth-based data, they can be just as valuable for the planetary domain. Another initiative, called VESPA (Virtual European Solar and Planetary Access), will marry several of the above geoscience standards with astronomy-based standards as defined by the International Virtual Observatory Alliance (IVOA). This work outlines the current state of interoperability initiatives in use or in the process of being researched within the planetary geospatial community.
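A hedged sketch of consuming one of the OGC services named above, the Web Map Service, using the OWSLib library. The service URL and layer name are hypothetical stand-ins; planetary agencies publish comparable endpoints.

```python
from owslib.wms import WebMapService

# Hypothetical planetary WMS endpoint and layer name.
wms = WebMapService("https://example.org/mars/wms", version="1.3.0")
print(list(wms.contents))            # layers advertised by the server

img = wms.getmap(
    layers=["global_mosaic"],        # assumed layer name
    srs="EPSG:4326",
    bbox=(-180, -90, 180, 90),
    size=(1024, 512),
    format="image/png",
)
with open("mars_mosaic.png", "wb") as f:
    f.write(img.read())              # rendered map tile as a PNG image
```

The same client code works against any compliant WMS endpoint, which is precisely the interoperability benefit the abstract describes.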
Interoperability and information discovery
Christian, E.
2001-01-01
In the context of information systems, there is interoperability when the distinctions between separate information systems are not a barrier to accomplishing a task that spans those systems. Interoperability so defined implies that there are commonalities among the systems involved and that one can exploit such commonalities to achieve interoperability. The challenge of a particular interoperability task is to identify relevant commonalities among the systems involved and to devise mechanisms that exploit those commonalities. The present paper focuses on the particular interoperability task of information discovery. The Global Information Locator Service (GILS) is described as a policy, standards, and technology framework for addressing interoperable information discovery on a global and long-term basis. While there are many mechanisms for people to discover and use all manner of data and information resources, GILS initiatives exploit certain key commonalities that seem to be sufficient to realize useful information discovery interoperability at a global, long-term scale. This paper describes ten of the specific commonalities that are key to GILS initiatives. It presents some of the practical implications for organizations in various roles: content provider, system engineer, intermediary, and searcher. The paper also provides examples of interoperable information discovery as deployed using GILS in four types of information communities: bibliographic, geographic, environmental, and government.
Federal Register 2010, 2011, 2012, 2013, 2014
2015-08-03
...] Promoting Semantic Interoperability of Laboratory Data; Public Workshop; Request for Comments AGENCY: Food... workshop entitled ``FDA/CDC/NLM Workshop on Promoting Semantic Interoperability of Laboratory Data.'' The... to promoting the semantic interoperability of laboratory data between in vitro diagnostic devices and...
2014-06-01
...functionalities and capabilities of DDS. We present a middleware design based on the DDS (Data Distribution Service) specifications as developed by the Object Management Group.
75 FR 63462 - Smart Grid Interoperability Standards; Notice of Docket Designation for Smart Grid...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-15
... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. RM11-2-000] Smart Grid Interoperability Standards; Notice of Docket Designation for Smart Grid Interoperability Standards October 7, 2010... directs the development of a framework to achieve interoperability of smart grid devices and systems...
Federal Register 2010, 2011, 2012, 2013, 2014
2016-10-04
...] Workshop on Promoting Semantic Interoperability of Laboratory Data; Public Workshop; Request for Comments... Semantic Interoperability of Laboratory Data.'' The purpose of this public workshop is to receive and... Semantic Interoperability of Laboratory Data.'' Received comments will be placed in the docket and, except...
Building the Synergy between Public Sector and Research Data Infrastructures
NASA Astrophysics Data System (ADS)
Craglia, Massimo; Friis-Christensen, Anders; Ostländer, Nicole; Perego, Andrea
2014-05-01
INSPIRE is a European Directive aiming to establish an EU-wide spatial data infrastructure to give cross-border access to information that can be used to support EU environmental policies, as well as other policies and activities having an impact on the environment. In order to ensure cross-border interoperability of data infrastructures operated by EU Member States, INSPIRE sets out a framework based on common specifications for metadata, data, network services, data and service sharing, and monitoring and reporting. The implementation of INSPIRE has reached important milestones: the INSPIRE Geoportal was launched in 2011, providing a single access point for the discovery of INSPIRE data and services across EU Member States (currently, about 300K), while all the technical specifications for the interoperability of data across the 34 INSPIRE themes were adopted at the end of 2013. During this period a number of EU and international initiatives have been launched concerning cross-domain interoperability and (Linked) Open Data. In particular, the EU Open Data Portal, launched in December 2012, provides access to government and scientific data from EU institutions and bodies, and the EU ISA Programme (Interoperability Solutions for European Public Administrations) promotes cross-sector interoperability by sharing and re-using EU-wide and national standards and components. Moreover, the Research Data Alliance (RDA), an initiative jointly funded by the European Commission, the US National Science Foundation and the Australian Research Council, was launched in March 2013 to promote scientific data sharing and interoperability. The Joint Research Centre of the European Commission (JRC), besides being the technical coordinator of the implementation of INSPIRE, is actively involved in the initiatives promoting cross-sector re-use in INSPIRE and in sustainable approaches to address the evolution of technologies - in particular, how to support Linked Data in INSPIRE and the use of global persistent identifiers. It is evident that government and scientific data infrastructures are currently facing a number of issues that have already been addressed in INSPIRE. Sharing experiences and competencies will avoid re-inventing the wheel and help promote the cross-domain adoption of consistent solutions. Indeed, one of the lessons learnt from INSPIRE and the initiatives in which JRC is involved is that government and research data are not two separate worlds: government data are commonly used as a basis to create scientific data, and vice versa. Consequently, it is fundamental to adopt a consistent approach to address interoperability and data management issues shared by both government and scientific data. The presentation illustrates some of the lessons learnt during the implementation of INSPIRE and in work on data and service interoperability coordinated with European and international initiatives. We describe a number of critical interoperability issues and barriers affecting both scientific and government data, concerning, e.g., data terminologies, quality and licensing, and propose how these problems could be effectively addressed by a closer collaboration of the government and scientific communities and the sharing of experiences and practices.
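Catalogue discovery of the kind the INSPIRE Geoportal provides is typically exposed through OGC CSW endpoints. A minimal sketch with the OWSLib package follows; the endpoint URL and search term are assumptions, not resources named in the abstract.

```python
# Query a hypothetical INSPIRE-style CSW catalogue for records matching a
# free-text term, then list the matching record identifiers and titles.
from owslib.csw import CatalogueServiceWeb
from owslib.fes import PropertyIsLike

csw = CatalogueServiceWeb("https://example.org/inspire/csw")  # placeholder URL
query = PropertyIsLike("csw:AnyText", "%hydrography%")        # placeholder term

csw.getrecords2(constraints=[query], maxrecords=10)
for ident, record in csw.records.items():
    print(ident, "-", record.title)
```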
Robotics Systems Joint Project Office (RSJPO) Interoperability Profiles (IOPS) 101
2012-07-01
Some capabilities are not themselves subject to interoperability, although they are supported by some interoperability attributes. For example, stair climbing is not something that IOPs need to specify; however, the mobility- and actuation-related interoperable messages can be used to provide stair climbing, and interoperability can enable management of different poses or modes, one of which may be stair climbing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brackney, L.
Broadly accessible, low cost, accurate, and easy-to-use energy auditing tools remain out of reach for managers of the aging U.S. building population (over 80% of U.S. commercial buildings are more than 10 years old). concept3D and NREL's commercial buildings group will work to translate and extend NREL's existing spreadsheet-based energy auditing tool for a browser-friendly and mobile-computing platform. NREL will also work with concept3D to further develop a prototype geometry capture and materials inference tool operable on a smart phone/pad platform. These tools will be developed to interoperate with NREL's Building Component Library and OpenStudio energy modeling platforms, and will be marketed by concept3D to commercial developers, academic institutions and governmental agencies. concept3D is NREL's lead developer of, and subcontractor for, the Building Component Library.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liegey, Lauren Rene; Wilcox, Trevor; Mckinney, Gregg Walter
2015-08-07
My internship program was the Domestic Nuclear Detection Office Summer Internship Program. I worked at Los Alamos National Laboratory with Trevor A. Wilcox and Gregg W. McKinney in the NEN-5 group. My project title was "MCNP Physical Model Interoperability & Validation". The goal of my project was to write a program to predict the solar modulation parameter for dates in the future and then implement it into MCNP6. This update to MCNP6 can be used to calculate the background more precisely, which is an important factor in being able to detect Special Nuclear Material. We will share our work in a published American Nuclear Society (ANS) paper, an ANS presentation, and a LANL student poster session. Through this project, I gained skills in programming, computing, and using MCNP. I also gained experience that will help me decide on a career or perhaps obtain employment in the future.
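The abstract does not say how the prediction program works, so the sketch below shows one plausible approach under stated assumptions: fit a sinusoid locked to the roughly 11-year solar cycle to historical values of the solar modulation parameter, then evaluate the fit at a future date. Both the data points and the fitting form are invented for illustration, not taken from the project.

```python
# Illustrative extrapolation of the solar modulation parameter (phi, in MV).
import numpy as np
from scipy.optimize import curve_fit

def solar_cycle(t, mean, amp, phase):
    # Sinusoid with an assumed ~11-year solar cycle period, years since 2008.
    return mean + amp * np.sin(2 * np.pi * (t - 2008.0) / 11.0 + phase)

years = np.array([2008.0, 2010.0, 2012.0, 2014.0, 2015.5])  # invented record
phi   = np.array([420.0, 550.0, 680.0, 720.0, 650.0])       # invented values

params, _ = curve_fit(solar_cycle, years, phi, p0=(550.0, 150.0, 0.0))
print(f"predicted phi(2017.0) = {solar_cycle(2017.0, *params):.0f} MV")
```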
ACTS 118x: High Speed TCP Interoperability Testing
NASA Technical Reports Server (NTRS)
Brooks, David E.; Buffinton, Craig; Beering, Dave R.; Welch, Arun; Ivancic, William D.; Zernic, Mike; Hoder, Douglas J.
1999-01-01
With the recent explosion of the Internet and the enormous business opportunities available to communication system providers, great interest has developed in improving the efficiency of data transfer over satellite links using the Transmission Control Protocol (TCP) of the Internet Protocol (IP) suite. NASA's ACTS experiments program initiated a series of TCP experiments to demonstrate the scalability of TCP/IP and determine to what extent the protocol can be optimized over a 622 Mbps satellite link. Through partnerships with government technology-oriented labs and the computer, telecommunication, and satellite industries, NASA Glenn was able to: (1) promote the development of interoperable, high-performance TCP/IP implementations across multiple computing/operating platforms; (2) work with the satellite industry to answer outstanding questions regarding the use of standard protocols (TCP/IP and ATM) for the delivery of advanced data services, and for use in spacecraft architectures; and (3) conduct a series of TCP/IP interoperability tests over OC-12 ATM over a satellite network in a multi-vendor environment using ACTS. The experiments' various network configurations and the results are presented.
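The core tuning problem in such experiments is that the link's bandwidth-delay product dwarfs TCP's classic 64 KB window. A back-of-the-envelope sketch follows, assuming a typical geostationary round-trip time of about 550 ms (a value not given in the abstract):

```python
# At 622 Mbps over a GEO hop, the bandwidth-delay product (BDP) far exceeds
# TCP's default window, so window scaling and large socket buffers are needed.
import socket

LINK_RATE_BPS = 622e6    # OC-12 rate used in the ACTS experiments
RTT_SECONDS = 0.55       # assumed GEO round-trip time; not from the abstract

bdp_bytes = int(LINK_RATE_BPS / 8 * RTT_SECONDS)
print(f"bandwidth-delay product: {bdp_bytes / 1e6:.1f} MB")   # ~42.8 MB

# Request matching buffers; the OS may clamp these to its configured limits.
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_SNDBUF, bdp_bytes)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF, bdp_bytes)
```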
NASA Astrophysics Data System (ADS)
Tisdale, M.
2017-12-01
NASA's Atmospheric Science Data Center (ASDC) is operationally using the Esri ArcGIS Platform to improve data discoverability, accessibility and interoperability to meet the diversifying user requirements from government, private, public and academic communities. The ASDC is actively working to provide its mission-essential datasets as ArcGIS Image Services, Open Geospatial Consortium (OGC) Web Mapping Services (WMS), and OGC Web Coverage Services (WCS) while leveraging the ArcGIS multidimensional mosaic dataset structure. Science teams at ASDC are utilizing these services through the development of applications using the Web AppBuilder for ArcGIS and the ArcGIS API for JavaScript. These services provide greater exposure of ASDC data holdings to the GIS community and allow for broader sharing and distribution to various end users. These capabilities provide interactive visualization tools and improved geospatial analytical tools for mission-critical understanding in the areas of the Earth's radiation budget, clouds, aerosols, and tropospheric chemistry. The presentation will cover how the ASDC is developing geospatial web services and applications to improve data discoverability, accessibility, and interoperability.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-25
NIST Framework and Roadmap for Smart Grid Interoperability Standards, Release 2.0 (Draft). The draft version of the NIST Framework and Roadmap for Smart Grid Interoperability Standards, Release 2.0 (Release 2.0), is available for public review and comment.
Warfighter IT Interoperability Standards Study
2012-07-22
Survey questions included how programs exchange data (e.g., messages) between systems and what process was used to validate and certify semantic interoperability with other systems; one response noted there was no requirement to validate and certify semantic interoperability, and that the DLS program exchanges data with other systems. Related topics include testing for system compliance with data models and verifying and certifying interoperability using data.
Enabling interoperability in planetary sciences and heliophysics: The case for an information model
NASA Astrophysics Data System (ADS)
Hughes, J. Steven; Crichton, Daniel J.; Raugh, Anne C.; Cecconi, Baptiste; Guinness, Edward A.; Isbell, Christopher E.; Mafi, Joseph N.; Gordon, Mitchell K.; Hardman, Sean H.; Joyner, Ronald S.
2018-01-01
The Planetary Data System has developed the PDS4 Information Model to enable interoperability across diverse science disciplines. The Information Model is based on an integration of International Organization for Standardization (ISO) level standards for trusted digital archives, information model development, and metadata registries. Where controlled vocabularies provide a basic level of interoperability by supplying a common set of terms for communication between both machines and humans, the Information Model improves interoperability by means of an ontology that provides semantic information, or additional related context, for the terms. The Information Model was defined by a team of computer scientists and science experts from each of the diverse disciplines in the Planetary Science community, including Atmospheres, Geosciences, Cartography and Imaging Sciences, Navigational and Ancillary Information, Planetary Plasma Interactions, Ring-Moon Systems, and Small Bodies. The model was designed to be extensible beyond the Planetary Science community; for example, there are overlaps between certain PDS disciplines and the Heliophysics and Astrophysics disciplines. "Interoperability" can apply to many aspects of both the developer and the end-user experience, for example agency-to-agency, semantic-level, and application-level interoperability. We define these types of interoperability and focus on semantic-level interoperability, the type of interoperability most directly enabled by an information model.
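The distinction the abstract draws between a controlled vocabulary and an ontology can be made concrete with a few RDF triples. The sketch below uses the rdflib package; the namespace and term names are invented for illustration, not actual PDS4 identifiers.

```python
# A shared term (vocabulary level) plus machine-readable context (ontology
# level): units, documentation, and placement in a class hierarchy.
from rdflib import Graph, Namespace, Literal, RDF, RDFS

EX = Namespace("https://example.org/pds4-like/")   # hypothetical namespace
g = Graph()

# Controlled-vocabulary level: a common term both humans and machines share.
g.add((EX.exposure_duration, RDF.type, RDFS.Class))
g.add((EX.exposure_duration, RDFS.label, Literal("exposure_duration")))

# Ontology level: related context that enables semantic interoperability.
g.add((EX.exposure_duration, RDFS.comment, Literal("Time the detector integrated signal")))
g.add((EX.exposure_duration, EX.hasUnit, EX.second))
g.add((EX.exposure_duration, RDFS.subClassOf, EX.observation_parameter))

for s, p, o in g.triples((EX.exposure_duration, None, None)):
    print(p, "->", o)
```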
Partnerships: One Strategy for Meeting Big Data Challenges
NASA Astrophysics Data System (ADS)
Chandler, C. L.; Groman, R. C.; Kinkade, D.; Shepherd, A.; Allison, M. D.; Rauch, S.; Wiebe, P. H.; Glover, D. M.
2014-12-01
In late 2006 staff members from the previously independent US Joint Global Ocean Flux Study (US JGOFS) and US GLOBal Ocean ECosystems Dynamics (US GLOBEC) data management offices joined forces and received funding from the US National Science Foundation to provide data management support to ocean science researchers. The transition from providing dedicated, project-specific data management services to supporting a broader research community data facility has necessitated understanding of and adaptation to evolving needs. One of the strategies that has proven to be very effective is the formation of partnerships with other groups doing complementary work. Staff members at BCO-DMO have formed collaborative partnerships with others to support our primary research community efficiently and in a way that covers the full research data life cycle. Examples will be provided that highlight ways in which such partnerships have enhanced the work done by BCO-DMO, and also ways in which BCO-DMO activities have contributed to broader national and global initiatives. One of the clear benefits of collaboration with other groups is the opportunity for identification of shared challenges, strategies and solutions and the increased likelihood of developing interoperable systems.
Electronic Health Records Data and Metadata: Challenges for Big Data in the United States.
Sweet, Lauren E; Moulaison, Heather Lea
2013-12-01
This article, written by researchers studying metadata and standards, represents a fresh perspective on the challenges of electronic health records (EHRs) and serves as a primer for big data researchers new to health-related issues. Primarily, we argue for the importance of the systematic adoption of standards in EHR data and metadata as a way of promoting big data research and benefiting patients. EHRs have the potential to include a vast amount of longitudinal health data, and metadata provides the formal structures to govern that data. In the United States, electronic medical records (EMRs) are part of the larger EHR. EHR data is submitted by a variety of clinical data providers and potentially by the patients themselves. Because data input practices are not necessarily standardized, and because of the multiplicity of current standards, basic interoperability in EHRs is hindered. Some of the issues with EHR interoperability stem from the complexities of the data they include, which can be both structured and unstructured. A number of controlled vocabularies are available to data providers. The continuity of care document standard will provide interoperability in the United States between the EMR and the larger EHR, potentially making data input by providers directly available to other providers. The data involved is nonetheless messy. In particular, the use of competing vocabularies such as the Systematized Nomenclature of Medicine-Clinical Terms, MEDCIN, and locally created vocabularies inhibits large-scale interoperability for structured portions of the records, and unstructured portions, although potentially not machine readable, remain essential. Once EMRs for patients are brought together as EHRs, the EHRs must be managed and stored. Adequate documentation should be created and maintained to assure the secure and accurate use of EHR data. There are currently a few notable international standards initiatives for EHRs. Organizations such as Health Level Seven International and Clinical Data Interchange Standards Consortium are developing and overseeing implementation of interoperability standards. Denmark and Singapore are two countries that have successfully implemented national EHR systems. Future work in electronic health information initiatives should underscore the importance of standards and reinforce interoperability of EHRs for big data research and for the sake of patients.
Motion Imagery and Robotics Application (MIRA)
NASA Technical Reports Server (NTRS)
Martinez, Lindolfo; Rich, Thomas
2011-01-01
Objectives include: I. Prototype a camera service leveraging the CCSDS Integrated protocol stack (MIRA/SM&C/AMS/DTN): a) CCSDS MIRA Service (New). b) Spacecraft Monitor and Control (SM&C). c) Asynchronous Messaging Service (AMS). d) Delay/Disruption Tolerant Networking (DTN). II. Additional MIRA Objectives: a) Demo of Camera Control through ISS using CCSDS protocol stack (Berlin, May 2011). b) Verify that the CCSDS standards stack can provide end-to-end space camera services across ground and space environments. c) Test interoperability of various CCSDS protocol standards. d) Identify overlaps in the design and implementations of the CCSDS protocol standards. e) Identify software incompatibilities in the CCSDS stack interfaces. f) Provide redlines to the SM&C, AMS, and DTN working groups. g) Enable the CCSDS MIRA service for potential use in ISS Kibo camera commanding. h) Assist in long-term evolution of this entire group of CCSDS standards to TRL 6 or greater.
NASA Astrophysics Data System (ADS)
Wright, D. J.; Lassoued, Y.; Dwyer, N.; Haddad, T.; Bermudez, L. E.; Dunne, D.
2009-12-01
Coastal mapping plays an important role in informing marine spatial planning, resource management, maritime safety, hazard assessment and even national sovereignty. As such, there is now a plethora of data/metadata catalogs, pre-made maps, tabular and text information on resource availability and exploitation, and decision-making tools. A recent trend has been to encapsulate these in a special class of web-enabled geographic information systems called a coastal web atlas (CWA). While multiple benefits are derived from tailor-made atlases, there is great value added from the integration of disparate CWAs. CWAs linked to one another can be queried more successfully to optimize planning and decision-making. If a dataset is missing in one atlas, it may be immediately located in another. Similar datasets in two atlases may be combined to enhance study in either region. But how best to achieve semantic interoperability to mitigate vague data queries, concepts or natural language semantics when retrieving and integrating data and information? We report on the development of a new prototype seeking to interoperate between two initial CWAs: the Marine Irish Digital Atlas (MIDA) and the Oregon Coastal Atlas (OCA). These two mature atlases are used as a testbed for more regional connections, with the intent for the OCA to use lessons learned to develop a regional network of CWAs along the west coast, and for MIDA to do the same in building and strengthening atlas networks with the UK, Belgium, and other parts of Europe. Our prototype uses semantic interoperability via services harmonization and ontology mediation, allowing local atlases to use their own data structures and vocabularies (ontologies). We use standard technologies such as OGC Web Map Services (WMS) for delivering maps, and the OGC Catalogue Service for the Web (CSW) for delivering and querying ISO-19139 metadata. The metadata records of a given CWA use a given ontology of terms called the local ontology. Human or machine users formulate their requests using a common ontology of metadata terms, called the global ontology. A CSW mediator rewrites the user's request into CSW requests over local CSWs using their own (local) ontologies, collects the results and sends them back to the user. To extend the system, we have recently added global maritime boundaries and are also considering nearshore ocean observing system data. Ongoing work includes adding WFS, error management, and exception handling, enabling Smart Searches, and writing full documentation. This prototype is a central research project of the new International Coastal Atlas Network (ICAN), a group of 30+ organizations from 14 nations (and growing) dedicated to seeking interoperability approaches to CWAs in support of coastal zone management and the translation of coastal science to coastal decision-making.
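A toy version of the mediation step the abstract describes follows, assuming invented term mappings between the global ontology and each atlas's local ontology (the MIDA and OCA names come from the abstract; the vocabulary entries do not).

```python
# Rewrite a query phrased in the global ontology into each atlas's local
# vocabulary before dispatching it to that atlas's CSW.
GLOBAL_TO_LOCAL = {
    "MIDA": {"shoreline": "coastline", "bathymetry": "seabed_depth"},
    "OCA":  {"shoreline": "coastal_boundary", "bathymetry": "bathymetry"},
}

def rewrite_query(atlas, global_terms):
    """Translate global-ontology terms into an atlas's local vocabulary."""
    mapping = GLOBAL_TO_LOCAL[atlas]
    return [mapping.get(term, term) for term in global_terms]

user_query = ["shoreline", "bathymetry"]    # phrased in the global ontology
for atlas in GLOBAL_TO_LOCAL:
    print(atlas, "->", rewrite_query(atlas, user_query))
```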
Improving agricultural knowledge management: The AgTrials experience
Hyman, Glenn; Espinosa, Herlin; Camargo, Paola; Abreu, David; Devare, Medha; Arnaud, Elizabeth; Porter, Cheryl; Mwanzia, Leroy; Sonder, Kai; Traore, Sibiry
2017-01-01
Background: Opportunities to use data and information to address challenges in international agricultural research and development are expanding rapidly. The use of agricultural trial and evaluation data has enormous potential to improve crops and management practices. However, for a number of reasons, this potential has yet to be realized. This paper reports on the experience of the AgTrials initiative, an effort to build an online database of agricultural trials applying principles of interoperability and open access. Methods: Our analysis evaluates what worked and what did not work in the development of the AgTrials information resource. We analyzed data on our users and their interaction with the platform. We also surveyed our users to gauge their perceptions of the utility of the online database. Results: The study revealed barriers to participation and impediments to interaction, opportunities for improving agricultural knowledge management and a large potential for the use of trial and evaluation data. Conclusions: Technical and logistical mechanisms for developing interoperable online databases are well advanced. More effort will be needed to advance organizational and institutional work for these types of databases to realize their potential. PMID:28580127
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-18
[Docket 07-100; FCC 11-6] Implementing a Nationwide, Broadband, Interoperable Public Safety Network in the 700 MHz Band. The establishment of a common air interface for 700 MHz public safety broadband networks will create a foundation for interoperability and provide a clear path for the deployment of an interoperable public safety broadband network.
Juzwishin, Donald W M
2009-01-01
Achieving effective health informatics interoperability in a fragmented and uncoordinated health system is by definition not possible. Interoperability requires the simultaneous integration of health care processes and information across different types and levels of care (systems thinking). The fundamental argument of this paper is that information system interoperability will remain an unfulfilled hope until health reforms effectively address the governance (accountability), structural and process barriers to interoperability of health care delivery. The ascendancy of Web 2.0 and 3.0, although still unproven, signals the opportunity to accelerate patients' access to health information and their health record. Policy suggestions for simultaneously advancing health system delivery and information system interoperability are posited.
NASA UAS Integration into the NAS Project: Human Systems Integration
NASA Technical Reports Server (NTRS)
Shively, Jay
2016-01-01
This presentation provides an overview of the work the Human Systems Integration (HSI) sub-project has done on detect and avoid (DAA) displays while working on the UAS (Unmanned Aircraft System) Integration into the NAS project. The most recent simulation on DAA interoperability with Traffic Collision Avoidance System (TCAS) is discussed in the most detail. The relationship of the work to the larger UAS community and next steps are also detailed.
Interoperability challenges in river discharge modelling: A cross domain application scenario
NASA Astrophysics Data System (ADS)
Santoro, Mattia; Andres, Volker; Jirka, Simon; Koike, Toshio; Looser, Ulrich; Nativi, Stefano; Pappenberger, Florian; Schlummer, Manuela; Strauch, Adrian; Utech, Michael; Zsoter, Ervin
2018-06-01
River discharge is a critical water cycle variable, as it integrates all the processes (e.g. runoff and evapotranspiration) occurring within a river basin and provides a hydrological output variable that can be readily measured. Its prediction is of invaluable help for many water-related tasks including water resources assessment and management, flood protection, and disaster mitigation. Observations of river discharge are important to calibrate and validate hydrological or coupled land, atmosphere and ocean models. This requires using datasets from different scientific domains (Water, Weather, etc.). Typically, such datasets are provided using different technological solutions. This complicates the integration of new hydrological data sources into application systems. Therefore, a considerable effort is often spent on data access issues instead of the actual scientific question. This paper describes the work performed to address multidisciplinary interoperability challenges related to river discharge modeling and validation. This includes definition and standardization of domain specific interoperability standards for hydrological data sharing and their support in global frameworks such as the Global Earth Observation System of Systems (GEOSS). The research was developed in the context of the EU FP7-funded project GEOWOW (GEOSS Interoperability for Weather, Ocean and Water), which implemented a "River Discharge" application scenario. This scenario demonstrates the combination of river discharge observations data from the Global Runoff Data Centre (GRDC) database and model outputs produced by the European Centre for Medium-Range Weather Forecasts (ECMWF) predicting river discharge based on weather forecast information in the context of the GEOSS.
Galarraga, M; Serrano, L; Martinez, I; de Toledo, P; Reynolds, Melvin
2007-01-01
Advances in Information and Communication Technologies (ICT) are bringing new opportunities and use cases in the field of systems and Personal Health Devices used for the telemonitoring of citizens in home or mobile scenarios. At a time of such challenges, this review arises from the need to identify robust technical telemonitoring solutions that are both open and interoperable. These systems demand standardized solutions to be cost effective and to take advantage of standardized operation and interoperability. Thus, the fundamental challenge is to design plug-&-play devices that, either as individual elements or as components, can be incorporated in a simple way into different Telecare systems, perhaps configuring a personal user network. Moreover, there is increasing market pressure from companies not traditionally involved in medical markets asking for a standard for Personal Health Devices, as they foresee a vast demand for telemonitoring, wellness, Ambient Assisted Living (AAL) and e-health applications. However, the newly emerging situations imply very strict requirements for the protocols involved in the communication. The ISO/IEEE 11073 family of standards is adapting in order to face this challenge and appears to be the best-positioned set of international standards to reach this goal. This work presents an updated survey of these standards, tracking the changes being made, and aims to serve as a starting point for those who want to familiarize themselves with them.
On the Execution Control of HLA Federations using the SISO Space Reference FOM
NASA Technical Reports Server (NTRS)
Moller, Bjorn; Garro, Alfredo; Falcone, Alberto; Crues, Edwin Z.; Dexter, Daniel E.
2017-01-01
In the Space domain the High Level Architecture (HLA) is one of the reference standards for Distributed Simulation. However, for the different organizations involved in the Space domain (e.g. NASA, ESA, Roscosmos, and JAXA) and their industrial partners, it is difficult to implement HLA simulators (called Federates) able to interact and interoperate in the context of a distributed HLA simulation (called a Federation). The lack of a common FOM (Federation Object Model) for the Space domain is one of the main reasons that precludes a-priori interoperability between heterogeneous federates. To fill this gap, a Product Development Group (PDG) has recently been activated in the Simulation Interoperability Standards Organization (SISO) with the aim of providing a Space Reference FOM (SRFOM) for international collaboration on Space systems simulations. Members of the PDG come from several countries and contribute experiences from projects within NASA, ESA and other organizations. Participants represent government, academia and industry. The paper presents an overview of the ongoing Space Reference FOM standardization initiative, focusing on the solution provided for managing the execution of an SRFOM-based Federation.
Interoperability of Information Systems Managed and Used by the Local Health Departments.
Shah, Gulzar H; Leider, Jonathon P; Luo, Huabin; Kaur, Ravneet
2016-01-01
In the post-Affordable Care Act era, marked by interorganizational collaborations and the availability of large amounts of electronic data from other community partners, it is imperative to assess the interoperability of information systems used by local health departments (LHDs). This study describes the level of interoperability of LHD information systems and identifies factors associated with lack of interoperability. The mixed-methods research uses data from the 2015 Informatics Capacity and Needs Assessment Survey, with a target population of all LHDs in the United States. A representative sample of 650 LHDs was drawn using a stratified random sampling design; a total of 324 completed responses were received (50% response rate). Qualitative data were used from a key informant interview study of LHD informatics staff from across the United States, independently coded by 2 researchers and analyzed thematically. Survey data were cleaned, bivariate comparisons were conducted, and a multivariable logistic regression was run to characterize factors associated with interoperability. For 30% of LHDs, no systems were interoperable, and 38% of LHD respondents indicated some of the systems were interoperable. Significant determinants of interoperability included LHDs having leadership support (adjusted odds ratio [AOR] = 3.54), control of information technology budget allocation (AOR = 2.48), control of data systems (AOR = 2.31), having a strategic plan for information systems (AOR = 1.92), and existence of business process analysis and redesign (AOR = 1.49). Interoperability of all systems may be an informatics goal, but only a small proportion of LHDs reported having interoperable systems, pointing to a substantial need among LHDs nationwide.
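For readers unfamiliar with how adjusted odds ratios such as AOR = 3.54 arise, the sketch below fits a multivariable logistic regression with statsmodels and exponentiates the coefficients. The data are synthetic stand-ins, not the survey responses.

```python
# Synthetic illustration of deriving adjusted odds ratios (AORs) from a
# multivariable logistic regression, as in the study's analysis.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 324                                   # completed responses in the study
X = rng.integers(0, 2, size=(n, 2)).astype(float)  # e.g., leadership support, budget control
log_odds = -0.5 + 1.26 * X[:, 0] + 0.91 * X[:, 1]  # exp(1.26)~3.5, exp(0.91)~2.5
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-log_odds))).astype(float)

fit = sm.Logit(y, sm.add_constant(X)).fit(disp=False)
print(np.exp(fit.params[1:]))             # AORs for the two predictors
```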
Guided mass spectrum labelling in atom probe tomography.
Haley, D; Choi, P; Raabe, D
2015-12-01
Atom probe tomography (APT) is a valuable near-atomic scale imaging technique, which yields mass spectrographic data. Experimental correctness can often pivot on the identification of peaks within a dataset; this is a manual process where subjectivity and errors can arise. The limitations of manual procedures complicate APT experiments for the operator and are furthermore a barrier to technique standardisation. In this work we explore the capabilities of computer-guided ranging to aid identification and analysis of mass spectra. We propose a fully robust algorithm for enumeration of the possible identities of detected peak positions, which assists labelling. Furthermore, a simple ranking scheme is developed to allow for evaluation of the likelihood of each possible identity being the likely assignment from the enumerated set. We demonstrate a simple, yet complete work-chain that allows for the conversion of mass spectra to fully identified APT spectra, with the goal of minimising identification errors and the inter-operator variance within APT experiments. This work-chain is compared to current procedures via experimental trials with different APT operators, to determine the relative effectiveness and precision of the two approaches. It is found that there is little loss of precision (and occasionally gain) when participants are given computer assistance. We find that in either case, inter-operator precision for ranging varies between 0 and 2 "significant figures" (2σ confidence in the first n digits of the reported value) when reporting compositions. Intra-operator precision is weakly tested and found to vary between 1 and 3 significant figures, depending upon species composition levels. Finally, it is suggested that inconsistencies in inter-operator peak labelling may be the largest source of scatter when reporting composition data in APT.
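The enumeration step the paper describes can be sketched directly: list every isotope/charge-state combination whose mass-to-charge ratio lands within a tolerance of a detected peak, and rank candidates by closeness. The isotope table below is a tiny invented subset, and a real ranking scheme would also weigh isotopic abundance.

```python
# Enumerate candidate identities for a detected APT mass-spectrum peak.
ISOTOPE_MASSES = {            # amu; a few common APT species for illustration
    "Al-27": 26.9815, "Fe-56": 55.9349, "Cr-52": 51.9405, "Ni-58": 57.9353,
}
CHARGE_STATES = (1, 2, 3)

def candidate_identities(peak_mz, tol=0.05):
    """List (error, isotope, charge) tuples consistent with a peak position."""
    hits = []
    for name, mass in ISOTOPE_MASSES.items():
        for z in CHARGE_STATES:
            error = abs(mass / z - peak_mz)
            if error <= tol:
                hits.append((error, name, z))
    return sorted(hits)       # closest candidate first

print(candidate_identities(27.97))   # -> Fe-56 at 2+ (m/z ~ 27.967)
```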
National electronic health record interoperability chronology.
Hufnagel, Stephen P
2009-05-01
The federal initiative for electronic health record (EHR) interoperability began in 2000 and set the stage for the establishment of the 2004 Executive Order for EHR interoperability by 2014. This article discusses the chronology from the 2001 e-Government Consolidated Health Informatics (CHI) initiative through the current congressional mandates for an aligned, interoperable, and agile DoD AHLTA and VA VistA.
On the formal definition of the systems' interoperability capability: an anthropomorphic approach
NASA Astrophysics Data System (ADS)
Zdravković, Milan; Luis-Ferreira, Fernando; Jardim-Goncalves, Ricardo; Trajanović, Miroslav
2017-03-01
The extended view of enterprise information systems in the Internet of Things (IoT) introduces additional complexity to the interoperability problems. In response to this, the problem of systems' interoperability is revisited by taking into account the different aspects of philosophy, psychology, linguistics and artificial intelligence, namely by analysing the potential analogies between the processes of human and system communication. Then, the capability to interoperate, as a property of the system, is defined as a complex ability to seamlessly sense and perceive a stimulus from its environment (presumably, a message from any other system), make an informed decision about this perception and, consequently, articulate a meaningful and useful action or response based on this decision. Although this capability is defined on the basis of the existing interoperability theories, the proposed approach to its definition excludes the assumption of the awareness of co-existence of two interoperating systems. Thus, it establishes the links between the research of interoperability of systems and that of intelligent software agents, as one of the systems' digital identities.
Employing Semantic Technologies for the Orchestration of Government Services
NASA Astrophysics Data System (ADS)
Sabol, Tomáš; Furdík, Karol; Mach, Marián
The main aim of eGovernment is to provide efficient, secure, inclusive services for citizens and businesses. The necessity to integrate services and information resources, to increase accessibility, and to reduce the administrative burden on citizens and enterprises - these are only a few reasons why the eGovernment paradigm has shifted from the supply-driven approach toward connected governance, emphasizing the concept of interoperability (Archmann and Nielsen 2008). On the EU level, interoperability is explicitly addressed as one of the four main challenges, including in the i2010 strategy (i2010 2005). The Commission's Communication (Interoperability for Pan-European eGovernment Services 2006) strongly emphasizes the necessity of interoperable eGovernment services, based on standards, open specifications, and open interfaces. The Pan-European interoperability initiatives, such as the European Interoperability Framework (2004) and IDABC, as well as many projects supported by the European Commission within the IST Program and the Competitiveness and Innovation Program (CIP), illustrate the importance of interoperability on the EU level.
Empowering open systems through cross-platform interoperability
NASA Astrophysics Data System (ADS)
Lyke, James C.
2014-06-01
Most of the motivations for open systems lie in the expectation of interoperability, sometimes referred to as "plug-and-play". Nothing in the notion of "open-ness", however, guarantees this outcome, which makes the increased interest in open architecture more perplexing. In this paper, we explore certain themes of open architecture. We introduce the concept of "windows of interoperability", which can be used to align disparate portions of architecture. Such "windows of interoperability", which concentrate on a reduced set of protocol and interface features, might achieve many of the broader purposes assigned as benefits in open architecture. Since it is possible to engineer proprietary systems that interoperate effectively, this nuanced definition of interoperability may in fact be a more important concept to understand and nurture for effective systems engineering and maintenance.
NASA Astrophysics Data System (ADS)
Stone, N.; Lafuente, B.; Bristow, T.; Keller, R.; Downs, R. T.; Blake, D. F.; Fonda, M.; Pires, A.
2016-12-01
Working primarily with astrobiology researchers at NASA Ames, the Open Data Repository (ODR) has been conducting a software pilot to meet the varying needs of this multidisciplinary community. Astrobiology researchers often have small communities or operate individually with unique data sets that don't easily fit into existing database structures. The ODR constructed its Data Publisher software to allow researchers to create databases with common metadata structures and subsequently extend them to meet their individual needs and data requirements. The software accomplishes these tasks through a web-based interface that allows collaborative creation and revision of common metadata templates and individual extensions to these templates for custom data sets. This allows researchers to search disparate datasets based on common metadata established through the metadata tools, but still facilitates distinct analyses and data that may be stored alongside the required common metadata. The software produces web pages that can be made publicly available at the researcher's discretion so that users may search and browse the data in an effort to make interoperability and data discovery a human-friendly task while also providing semantic data for machine-based discovery. Once relevant data has been identified, researchers can utilize the built-in application programming interface (API) that exposes the data for machine-based consumption and integration with existing data analysis tools (e.g. R, MATLAB, Project Jupyter - http://jupyter.org). The current evolution of the project has created the Astrobiology Habitable Environments Database (AHED)[1] which provides an interface to databases connected through a common metadata core. In the next project phase, the goal is for small research teams and groups to be self-sufficient in publishing their research data to meet funding mandates and academic requirements as well as fostering increased data discovery and interoperability through human-readable and machine-readable interfaces. This project is supported by the Science-Enabling Research Activity (SERA) and NASA NNX11AP82A, MSL. [1] B. Lafuente et al. (2016) AGU, submitted.
Security and Privacy in a DACS.
Delgado, Jaime; Llorente, Silvia; Pàmies, Martí; Vilalta, Josep
2016-01-01
The management of electronic health records (EHR), in general, and clinical documents, in particular, is becoming a key issue in the daily work of Healthcare Organizations (HO). The need for providing secure and private access to, and storage for, clinical documents together with the need for HO to interoperate, raises a number of issues difficult to solve. Many systems are in place to manage EHR and documents. Some of these Healthcare Information Systems (HIS) follow standards in their document structure and communications protocols, but many do not. In fact, they are mostly proprietary and do not interoperate. Our proposal to solve the current situation is the use of a DACS (Document Archiving and Communication System) for providing security, privacy and standardized access to clinical documents.
DIMP: an interoperable solution for software integration and product data exchange
NASA Astrophysics Data System (ADS)
Wang, Xi Vincent; Xu, Xun William
2012-08-01
Today, globalisation has become one of the main trends of manufacturing business that has led to a world-wide decentralisation of resources amongst not only individual departments within one company but also business partners. However, despite the development and improvement in the last few decades, difficulties in information exchange and sharing still exist in heterogeneous applications environments. This article is divided into two parts. In the first part, related research work and integrating solutions are reviewed and discussed. The second part introduces a collaborative environment called distributed interoperable manufacturing platform, which is based on a module-based, service-oriented architecture (SOA). In the platform, the STEP-NC data model is used to facilitate data-exchange among heterogeneous CAD/CAM/CNC systems.
NASA Astrophysics Data System (ADS)
Evans, K. D.; Early, A. B.; Northup, E. A.; Ames, D. P.; Teng, W. L.; Olding, S. W.; Krotkov, N. A.; Arctur, D. K.; Beach, A. L., III; Silverman, M. L.
2017-12-01
The role of NASA's Earth Science Data Systems Working Groups (ESDSWG) is to make recommendations relevant to NASA's Earth science data systems from users' experiences and community insight. Each group works independently, focusing on a unique topic. Progress of two of the 2017 Working Groups will be presented. In a single airborne field campaign, there can be several different instruments and techniques that measure the same parameter on one or more aircraft platforms. Many of these same parameters are measured during different airborne campaigns using similar or different instruments and techniques. The Airborne Composition Standard Variable Name Working Group is working to create a list of variable standard names that can be used across all airborne field campaigns in order to assist in the transition to the ICARTT Version 2.0 file format. The overall goal is to enhance the usability of ICARTT files and the searchability of airborne field campaign data. The Time Series Working Group (TSWG) is a continuation of the 2015 and 2016 Time Series Working Groups. In 2015, we started TSWG with the intention of exploring the new OGC (Open Geospatial Consortium) WaterML 2 standards as a means for encoding point-based time series data from NASA satellites. In this working group, we realized that WaterML 2 might not be the best solution for this type of data, for a number of reasons. Our discussion with experts from other agencies, who have worked on similar issues, identified several challenges that we would need to address. As a result, we made the recommendation to study the new TimeseriesML 1.0 standard of OGC as a potential NASA time series standard. The 2016 TSWG examined TimeseriesML 1.0 closely and, in coordination with the OGC TimeseriesML Standards Working Group, identified certain gaps in TimeseriesML 1.0 that would need to be addressed for the standard to be applicable to NASA time series data. An engineering report was drafted based on the OGC Engineering Report template, describing recommended changes to TimeseriesML 1.0, in the form of use cases. In 2017, we are conducting interoperability experiments to implement the use cases and demonstrate the feasibility and suitability of these modifications for NASA and related user communities. The results will be incorporated into the existing draft engineering report.
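As a small illustration of the file format the variable-name work targets, the sketch below reads an ICARTT (.ict) file under the convention that the first header line begins with the total header-line count. Only the simple single-table layout is handled, and the file name is hypothetical.

```python
# Minimal ICARTT reader: split off the self-describing header, then parse the
# comma-separated data rows that follow it.
import csv

def read_icartt(path):
    with open(path) as f:
        first = f.readline().split(",")
        nheader = int(first[0])                      # includes the first line
        header = [f.readline() for _ in range(nheader - 1)]
        data = [row for row in csv.reader(f) if row]
    return header, data

# header, rows = read_icartt("campaign_flight01.ict")   # hypothetical file
```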
A Bridge to the Future: Observations on Building a Digital Library.
ERIC Educational Resources Information Center
Gaunt, Marianne I.
2002-01-01
The experience of Rutgers University Libraries illustrates the extensive planning, work effort, possibilities, and investment required to develop the digital library. Examines these key areas: organizational structure; staff development needs; facilities and the new digital infrastructure; metadata standards/interoperability; digital collection…
The Future of Library Automation in Schools.
ERIC Educational Resources Information Center
Anderson, Elaine
2000-01-01
Addresses the future of library automation programs for schools. Discusses requirements of emerging OPACs and circulation systems; the Schools Interoperability Framework (SIF), an industry initiative to develop an open specification for ensuring that K-12 instructional and administrative software applications work together more effectively; home…
Science friction: data, metadata, and collaboration.
Edwards, Paul N; Mayernik, Matthew S; Batcheller, Archer L; Bowker, Geoffrey C; Borgman, Christine L
2011-10-01
When scientists from two or more disciplines work together on related problems, they often face what we call 'science friction'. As science becomes more data-driven, collaborative, and interdisciplinary, demand increases for interoperability among data, tools, and services. Metadata--usually viewed simply as 'data about data', describing objects such as books, journal articles, or datasets--serve key roles in interoperability. Yet we find that metadata may be a source of friction between scientific collaborators, impeding data sharing. We propose an alternative view of metadata, focusing on its role in an ephemeral process of scientific communication, rather than as an enduring outcome or product. We report examples of highly useful, yet ad hoc, incomplete, loosely structured, and mutable, descriptions of data found in our ethnographic studies of several large projects in the environmental sciences. Based on this evidence, we argue that while metadata products can be powerful resources, usually they must be supplemented with metadata processes. Metadata-as-process suggests the very large role of the ad hoc, the incomplete, and the unfinished in everyday scientific work.
Group Membership Based Authorization to CADC Resources
NASA Astrophysics Data System (ADS)
Damian, A.; Dowler, P.; Gaudet, S.; Hill, N.
2012-09-01
The Group Membership Service (GMS), implemented at the Canadian Astronomy Data Centre (CADC), is a prototype of what could eventually be an IVOA standard for a distributed and interoperable group membership protocol. Group membership is the core authorization concept that enables teamwork and collaboration amongst astronomers accessing distributed resources and services. The service integrates and complements other access control related IVOA standards such as single-sign-on (SSO) using X.509 proxy certificates and the Credential Delegation Protocol (CDP). The GMS has been used at CADC for several years now, initially as a subsystem and then as a stand-alone Web service. It is part of the authorization mechanism for controlling the access to restricted Web resources as well as the VOSpace service hosted by the CADC. We present the role that GMS plays within the access control system at the CADC, including the functionality of the service and how the different CADC services make use of it to assert user authorization to resources. We also describe the main advantages and challenges of using the service as well as future work to increase its robustness and functionality.
Approaching semantic interoperability in Health Level Seven
Alschuler, Liora
2010-01-01
‘Semantic Interoperability’ is a driving objective behind many of Health Level Seven's standards. The objective in this paper is to take a step back, and consider what semantic interoperability means, assess whether or not it has been achieved, and, if not, determine what concrete next steps can be taken to get closer. A framework for measuring semantic interoperability is proposed, using a technique called the ‘Single Logical Information Model’ framework, which relies on an operational definition of semantic interoperability and an understanding that interoperability improves incrementally. Whether semantic interoperability tomorrow will enable one computer to talk to another, much as one person can talk to another person, is a matter for speculation. It is assumed, however, that what gets measured gets improved, and in that spirit this framework is offered as a means to improvement. PMID:21106995
NASA Astrophysics Data System (ADS)
Asmi, Ari; Powers, Lindsay
2015-04-01
Research Infrastructures (RIs) are major long-term investments supporting innovative, bottom-up research activities. In environmental research, they range from high-atmosphere radars to field observation networks and coordinated laboratory facilities. The Earth system is highly interactive, and each part of the system is interconnected across spatial and disciplinary borders. However, for practical and historical reasons, RIs are built from disciplinary points of view and separately in different parts of the world, with differing standards, policies, methods and research cultures. This heterogeneity provides the diversity necessary to study the complex Earth system, but makes cross-disciplinary and/or global interoperability a challenge. Global actions towards better interoperability are surfacing, especially in the EU and US. For example, recent mandates within the US government prioritize open data for federal agencies and federally funded science, and encourage collaboration among agencies to reduce duplication of efforts and increase efficient use of resources. There are several existing initiatives working toward these goals (e.g., COOPEUS, EarthCube, RDA, ICSU-WDS, DataOne, ESIP, USGEO, GEO). However, there is no cohesive framework to coordinate efforts among these, and other, entities. COOPEUS and EarthCube have now begun to map the landscape of interoperability efforts across earth science domains. The COOPEUS mapping effort describes the EU and US landscape of environmental research infrastructures to accomplish the following: identify gaps in services (data provision) necessary to address societal priorities; provide guidance for development of future research infrastructures; and identify opportunities for Research Infrastructures (RIs) to collaborate on issues of common interest. The EarthCube mapping effort identifies opportunities to engage a broader community by identifying scientific domain organizations and entities. We present the current status of the landscape analysis, which aims to create a sustainable effort towards removing barriers to interoperability on a global scale.
Kamimura, Emi; Tanaka, Shinpei; Takaba, Masayuki; Tachi, Keita; Baba, Kazuyoshi
2017-01-01
Purpose: The aim of this study was to evaluate and compare the inter-operator reproducibility of three-dimensional (3D) images of teeth captured by a digital impression technique to that of a conventional impression technique in vivo. Materials and methods: Twelve participants with complete natural dentition were included in this study. A digital impression of the mandibular molars of these participants was made by two operators with different levels of clinical experience, 3 or 16 years, using an intra-oral scanner (Lava COS, 3M ESPE). A silicone impression also was made by the same operators using the double mix impression technique (Imprint3, 3M ESPE). Stereolithography (STL) data were directly exported from the Lava COS system, while STL data of a plaster model made from the silicone impression were captured by a three-dimensional (3D) laboratory scanner (D810, 3shape). The STL datasets recorded by the two operators were compared using 3D evaluation software and superimposed using the best-fit-algorithm method (least-squares method, PolyWorks, InnovMetric Software) for each impression technique. Inter-operator reproducibility, as evaluated by average discrepancies of corresponding 3D data, was compared between the two techniques (Wilcoxon signed-rank test). Results: Visual inspection of the superimposed datasets revealed that discrepancies between repeated digital impressions were smaller than those observed with silicone impressions. Statistical analysis confirmed this, revealing significantly smaller average inter-operator discrepancies with the digital impression technique (0.014 ± 0.02 mm) than with the conventional impression technique (0.023 ± 0.01 mm). Conclusion: The results of this in vivo study suggest that inter-operator reproducibility with a digital impression technique may be better than that of a conventional impression technique and is independent of the clinical experience of the operator. PMID:28636642
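The superimposition step the study describes (a least-squares best fit between two scans) can be sketched with a Kabsch/SVD rigid alignment. This sketch assumes known point correspondences, whereas real surface-comparison software matches dense meshes with iterative-closest-point methods.

```python
# Rigid least-squares (Kabsch/SVD) alignment of two corresponding point sets,
# then the mean residual distance as the discrepancy metric.
import numpy as np

def best_fit_discrepancy(P, Q):
    """Mean distance between point sets P and Q after optimal rigid alignment."""
    Pc, Qc = P - P.mean(axis=0), Q - Q.mean(axis=0)   # remove translation
    U, _, Vt = np.linalg.svd(Pc.T @ Qc)
    d = np.sign(np.linalg.det(Vt.T @ U.T))            # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T           # optimal rotation
    return np.linalg.norm(Pc @ R.T - Qc, axis=1).mean()

# Sanity check: a purely rotated copy should align to ~zero discrepancy.
P = np.random.default_rng(1).random((100, 3))
theta = 0.1
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
print(best_fit_discrepancy(P, P @ Rz.T))
```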
The MMI Device Ontology: Enabling Sensor Integration
NASA Astrophysics Data System (ADS)
Rueda, C.; Galbraith, N.; Morris, R. A.; Bermudez, L. E.; Graybeal, J.; Arko, R. A.; Mmi Device Ontology Working Group
2010-12-01
The Marine Metadata Interoperability (MMI) project has developed an ontology for devices to describe sensors and sensor networks. This ontology is implemented in the W3C Web Ontology Language (OWL) and provides an extensible conceptual model and controlled vocabularies for describing heterogeneous instrument types, with different data characteristics, and their attributes. It can help users populate metadata records for sensors; associate devices with their platforms, deployments, measurement capabilities and restrictions; aid in discovery of sensor data, both historic and real-time; and improve the interoperability of observational oceanographic data sets. We developed the MMI Device Ontology following a community-based approach. By building on and integrating other models and ontologies from related disciplines, we sought to facilitate semantic interoperability while avoiding duplication. Key concepts and insights from various communities, including the Open Geospatial Consortium (e.g., SensorML and Observations and Measurements specifications), Semantic Web for Earth and Environmental Terminology (SWEET), and the W3C Semantic Sensor Network Incubator Group, have significantly enriched the development of the ontology. Individuals ranging from instrument designers, science data producers and consumers to ontology specialists and other technologists contributed to the work. Applications of the MMI Device Ontology are underway for several community use cases. These include vessel-mounted multibeam mapping sonars for the Rolling Deck to Repository (R2R) program and description of diverse instruments on deepwater Ocean Reference Stations for the OceanSITES program. These trials involve creation of records completely describing instruments, either by individual instances or by manufacturer and model. Individual terms in the MMI Device Ontology can be referenced with their corresponding Uniform Resource Identifiers (URIs) in sensor-related metadata specifications (e.g., SensorML, NetCDF). These identifiers can be resolved through a web browser, or by other client applications via HTTP against the MMI Ontology Registry and Repository (ORR), where the ontology is maintained. SPARQL-based query capabilities, which are enhanced with reasoning, along with several supported output formats, allow the effective interaction of diverse client applications with the semantic information associated with the device ontology. In this presentation we describe the process for the development of the MMI Device Ontology and illustrate extensions and applications that demonstrate the benefits of adopting this semantic approach, including example queries involving inference. We also highlight the issues encountered and future work.
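A query of the kind the abstract mentions can be sketched with the SPARQLWrapper package. The endpoint URL and term URIs below are placeholders, not the actual ORR addresses or MMI identifiers; only the SPARQL and rdfs vocabulary are standard.

```python
# Find all device classes under a (hypothetical) Sonar term, with labels.
from SPARQLWrapper import SPARQLWrapper, JSON

sparql = SPARQLWrapper("https://example.org/orr/sparql")   # placeholder URL
sparql.setQuery("""
    PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
    SELECT ?device ?label WHERE {
        ?device rdfs:subClassOf* <https://example.org/device/Sonar> ;
                rdfs:label ?label .
    }
""")
sparql.setReturnFormat(JSON)

for row in sparql.query().convert()["results"]["bindings"]:
    print(row["device"]["value"], "-", row["label"]["value"])
```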
Towards semantic interoperability for electronic health records.
Garde, Sebastian; Knaup, Petra; Hovenga, Evelyn; Heard, Sam
2007-01-01
In the field of open electronic health records (EHRs), openEHR, as an archetype-based approach, is being increasingly recognised. The objective of this paper is to briefly describe this approach and to analyse how openEHR archetypes affect health professionals and semantic interoperability. We analysed current approaches to EHR systems, terminology, and standards developments. In addition to literature reviews, we organised face-to-face interviews, telephone interviews, and teleconferences with members of relevant organisations and committees. The openEHR archetypes approach enables syntactic interoperability and semantic interpretability -- both important prerequisites for semantic interoperability. Archetypes enable the formal definition of clinical content by clinicians. To enable comprehensive semantic interoperability, the development and maintenance of archetypes needs to be coordinated internationally and across health professions. Domain knowledge governance comprises a set of processes that enable the creation, development, organisation, sharing, dissemination, use and continuous maintenance of archetypes. It needs to be supported by information technology. To enable EHRs, semantic interoperability is essential. The openEHR archetypes approach enables syntactic interoperability and semantic interpretability. However, without coordinated archetype development and maintenance, 'rank growth' of archetypes would jeopardise semantic interoperability. We therefore believe that openEHR archetypes and domain knowledge governance together create the knowledge environment required to adopt EHRs.
The role of architecture and ontology for interoperability.
Blobel, Bernd; González, Carolina; Oemig, Frank; Lopéz, Diego; Nykänen, Pirkko; Ruotsalainen, Pekka
2010-01-01
Turning from organization-centric to process-controlled or even personalized approaches, advanced healthcare settings have to meet special interoperability challenges. eHealth and pHealth solutions must assure interoperability between actors cooperating to achieve common business objectives. The interoperability chain includes not only individually tailored technical systems, but also sensors and actuators. To enable the corresponding pervasive computing, and even autonomic computing, individualized systems have to be based on an architecture framework covering many domains, scientifically managed by specialized disciplines using their specific ontologies in a formalized way. Therefore, interoperability has to advance from a communication-protocol to an architecture-centric approach, mastering ontology coordination challenges.
Recommendations for a service framework to access astronomical archives
NASA Technical Reports Server (NTRS)
Travisano, J. J.; Pollizzi, J.
1992-01-01
There are a large number of astronomical archives and catalogs on-line for network access, with many different user interfaces and features. Some systems are moving towards distributed access, supplying users with client software for their home sites which connects to servers at the archive site. Many of the issues involved in defining a standard framework of services that archive/catalog suppliers can use to achieve a basic level of interoperability are described. Such a framework would simplify the development of client and server programs to access the wide variety of astronomical archive systems. The primary services that are supplied by current systems include: catalog browsing, dataset retrieval, name resolution, and data analysis. The following issues (and probably more) need to be considered in establishing a standard set of client/server interfaces and protocols: Archive Access - dataset retrieval, delivery, file formats, data browsing, analysis, etc.; Catalog Access - database management systems, query languages, data formats, synchronous/asynchronous mode of operation, etc.; Interoperability - transaction/message protocols, distributed processing mechanisms (DCE, ONC/SunRPC, etc), networking protocols, etc.; Security - user registration, authorization/authentication mechanisms, etc.; Service Directory - service registration, lookup, port/task mapping, parameters, etc.; Software - public vs proprietary, client/server software, standard interfaces to client/server functions, software distribution, operating system portability, data portability, etc. Several archive/catalog groups, notably the Astrophysics Data System (ADS), are already working in many of these areas. In the process of developing StarView, which is the user interface to the Space Telescope Data Archive and Distribution Service (ST-DADS), these issues and the work of others were analyzed. A framework of standard interfaces for accessing services on any archive system which would benefit archive user and supplier alike is proposed.
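The service list above suggests what a minimal standard client interface might look like. The sketch below is one possible shape under that reading; the method names are assumptions drawn from the services listed (name resolution, catalog browsing, dataset retrieval), not an actual ADS or ST-DADS API.

```python
# Sketch of a standard archive-client interface; names are illustrative.
from abc import ABC, abstractmethod

class ArchiveClient(ABC):
    @abstractmethod
    def resolve_name(self, object_name: str) -> tuple[float, float]:
        """Return (ra, dec) in degrees for a named astronomical object."""

    @abstractmethod
    def query_catalog(self, query: str) -> list[dict]:
        """Run a catalog query and return matching records."""

    @abstractmethod
    def retrieve_dataset(self, dataset_id: str, dest_path: str) -> None:
        """Download a dataset from the archive to a local file."""
```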
GeoSciML and EarthResourceML Update, 2012
NASA Astrophysics Data System (ADS)
Richard, S. M.; Commission for the Management and Application of Geoscience Information (CGI) Interoperability Working Group
2012-12-01
CGI Interoperability Working Group activities during 2012 include deployment of services using the GeoSciML-Portrayal schema, addition of new vocabularies to support properties added in version 3.0, improvements to server software for deploying services, introduction of EarthResourceML v.2 for mineral resources, and collaboration with the IUSS on a markup language for soils information. GeoSciML and EarthResourceML have been used as the basis for the INSPIRE Geology and Mineral Resources specifications respectively. GeoSciML-Portrayal is an OGC GML simple-feature application schema for presentation of geologic map unit, contact, and shear displacement structure (fault and ductile shear zone) descriptions in web map services. Use of standard vocabularies for geologic age and lithology enables map services using shared legends to achieve visual harmonization of maps provided by different services. New vocabularies have been added to the collection of CGI vocabularies provided to support interoperable GeoSciML services, and can be accessed through http://resource.geosciml.org. Concept URIs can be dereferenced to obtain SKOS rdf or html representations using the SISSVoc vocabulary service. New releases of the FOSS GeoServer application greatly improve support for complex XML feature schemas like GeoSciML, and the ArcGIS for INSPIRE extension implements similar complex feature support for ArcGIS Server. These improved server implementations greatly facilitate deploying GeoSciML services. EarthResourceML v2 adds features for information related to mining activities. SoilML provides an interchange format for soil material, soil profile, and terrain information. Work is underway to add GeoSciML to the portfolio of Open Geospatial Consortium (OGC) specifications.
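As a small illustration of the vocabulary dereferencing described above, the sketch below requests an RDF representation of a concept URI via HTTP content negotiation; the specific URI is an illustrative assumption patterned on http://resource.geosciml.org, not a verified identifier.

```python
# Sketch: dereference a SKOS concept URI, asking for RDF/XML via the
# Accept header. The URI below is an assumed example, not verified.
import requests

uri = "http://resource.geosciml.org/classifier/cgi/lithology/basalt"
resp = requests.get(uri, headers={"Accept": "application/rdf+xml"}, timeout=30)
print(resp.status_code, resp.headers.get("Content-Type"))
print(resp.text[:300])  # first few hundred bytes of the returned RDF
```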
Jian, Wen-Shan; Hsu, Chien-Yeh; Hao, Te-Hui; Wen, Hsyien-Chia; Hsu, Min-Huei; Lee, Yen-Liang; Li, Yu-Chuan; Chang, Polun
2007-11-01
Traditional electronic health record (EHR) data are produced by various hospital information systems and could not exist independently of those systems until the advent of XML technology. The interoperability of a healthcare system can be divided into two dimensions: functional interoperability and semantic interoperability. Currently, no single EHR standard exists that provides complete EHR interoperability. In order to establish a national EHR standard, we developed a set of local EHR templates. The Taiwan Electronic Medical Record Template (TMT) is a standard that aims to achieve semantic interoperability in EHR exchanges nationally. The TMT architecture is basically composed of forms, components, sections, and elements. Data are stored in the elements, which can be referenced by code set, data type, and narrative block. The TMT was established with the following requirements in mind: (1) transformable to international standards; (2) having a minimal impact on the existing healthcare system; (3) easy to implement and deploy; and (4) compliant with Taiwan's current laws and regulations. The TMT provides a basis for building a portable, interoperable information infrastructure for EHR exchange in Taiwan.
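To visualize the form/component/section/element nesting described above, here is a minimal Python sketch that builds such a hierarchy as XML; the tag and attribute names are illustrative assumptions, not the normative TMT schema.

```python
# Sketch: a TMT-style nesting built with ElementTree. All names are
# illustrative assumptions, not the actual TMT schema.
import xml.etree.ElementTree as ET

form = ET.Element("form", id="discharge-summary")
component = ET.SubElement(form, "component", name="diagnosis")
section = ET.SubElement(component, "section", name="primary")
element = ET.SubElement(section, "element", code="ICD-10:I21.9", dataType="CD")
element.text = "Acute myocardial infarction"

print(ET.tostring(form, encoding="unicode"))
```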
OPSAID Initial Design and Testing Report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hurd, Steven A.; Stamp, Jason Edwin; Chavez, Adrian R.
2007-11-01
Process Control System (PCS) security is critical to our national security, yet there are a number of technological, economic, and educational impediments to PCS owners implementing effective security on their systems. OPSAID (Open PCS Security Architecture for Interoperable Design), a project sponsored by the US Department of Energy's Office of Electricity Delivery and Energy Reliability, aims to address this issue by developing and testing an open source architecture for PCS security. Sandia National Laboratories, along with a team of PCS vendors and owners, has developed and tested this PCS security architecture; this report describes their progress to date. The work was funded by DOE/OE as part of the National SCADA Test Bed (NSTB) Program, with contributions from colleagues at Sandia, Teumim Technical, LLC, and the OPSAID Core Team (Schweitzer Engineering Laboratories, TelTone, and Entergy). Generally speaking, the OPSAID project is focused on providing comprehensive security functionality to PCS that communicate using IP. This is done by creating an interoperable PCS security architecture and developing a reference implementation, which is tested extensively for performance and reliability. The report first provides background on the PCS security problem and OPSAID, followed by the goals and objectives of the project. It also includes an overview of the results, covering the OPSAID architecture, testing activities, and industry outreach, followed by conclusions, recommendations, and appendices with more detailed information on the architecture and testing.
Summarizing the project results, the OPSAID architecture was defined, including modular security functionality and corresponding component modules. The reference implementation, a collection of these component modules, was tested extensively and delivered more than acceptable performance in a variety of test scenarios; the primary challenge in implementation and testing was correcting initial configuration errors. Industry outreach efforts were very successful: a small group of industry partners was extensively involved in both the design and testing of OPSAID, and conference presentations attracted a larger group of potential partners. Based on the implementation and testing experience and on industry feedback, recommendations for future work include further development of advanced functionality, refinement of interoperability guidance, additional laboratory and field testing, and industry outreach that includes PCS owner education.
Development of high performance scientific components for interoperability of computing packages
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gulabani, Teena Pratap
2008-01-01
Three major high performance quantum chemistry computational packages, NWChem, GAMESS, and MPQC, have been developed by different research efforts following different design patterns. The goal is to achieve interoperability among these packages by overcoming the challenges caused by the different communication patterns and software designs of each package. Chemistry algorithms are hard and time-consuming to develop; integration of large quantum chemistry packages allows resource sharing and thus avoids reinvention of the wheel. Creating connections between these incompatible packages is the major motivation of the proposed work. This interoperability is achieved by bringing the benefits of Component Based Software Engineering through a plug-and-play component framework called the Common Component Architecture (CCA). In this thesis, I present a strategy and process used for interfacing two widely used and important computational chemistry methodologies: Quantum Mechanics and Molecular Mechanics. To show the feasibility of the proposed approach, the Tuning and Analysis Utility (TAU) has been coupled with the NWChem code and its CCA components. Results show that the overhead is negligible when compared to the ease and potential of organizing and coping with large-scale software applications.
Integrating manufacturing softwares for intelligent planning execution: a CIIMPLEX perspective
NASA Astrophysics Data System (ADS)
Chu, Bei Tseng B.; Tolone, William J.; Wilhelm, Robert G.; Hegedus, M.; Fesko, J.; Finin, T.; Peng, Yun; Jones, Chris H.; Long, Junshen; Matthews, Mike; Mayfield, J.; Shimp, J.; Su, S.
1997-01-01
Recent developments have made it possible to interoperate complex business applications at much lower cost. Application interoperation, along with business process re-engineering, can result in significant savings by eliminating work created by disconnected business processes due to isolated business applications. However, we believe much greater productivity benefits can be achieved by facilitating timely decision-making that utilizes information from multiple enterprise perspectives. The CIIMPLEX enterprise integration architecture is designed to enable such productivity gains by helping people to carry out integrated enterprise scenarios. An enterprise scenario is typically triggered by some external event. The goal of an enterprise scenario is to make the right decisions considering the full context of the problem. Enterprise scenarios are difficult for people to carry out because of the interdependencies among various actions; one can easily be overwhelmed by the large amount of information. We propose the use of software agents to help gather relevant information and present it in the appropriate context of an enterprise scenario. The CIIMPLEX enterprise integration architecture is based on the FAIME methodology for application interoperation and plug-and-play. It also explores the use of software agents in application plug-and-play.
NASA Astrophysics Data System (ADS)
Orellana, Diego A.; Salas, Alberto A.; Solarz, Pablo F.; Medina Ruiz, Luis; Rotger, Viviana I.
2016-04-01
The production of clinical information about each patient is constantly increasing, and this information is created in different formats and at diverse points of care, resulting in fragmented, incomplete, inaccurate, and isolated health information. The use of health information technology has been promoted as having a decisive impact on improving the efficiency, cost-effectiveness, quality, and safety of medical care delivery. However, in developing countries the use of health information technology is insufficient and, among other problems, lacking in standards. In the present work we evaluate the EHRGen framework, based on the openEHR standard, as a means to achieve the generation and availability of patient-centered information. The framework was evaluated through the tools it provides for end users, that is, without the intervention of computer experts. It makes the openEHR ideas easier to adopt and provides an open source basis with a set of services, although some limitations in its current state work against interoperability and usability. Nevertheless, despite the described limitations with respect to usability and semantic interoperability, EHRGen is, at least regionally, a considerable step toward EHR adoption and interoperability, and it should be supported by academic and administrative institutions.
Interoperability prototype between hospitals and general practitioners in Switzerland.
Alves, Bruno; Müller, Henning; Schumacher, Michael; Godel, David; Abu Khaled, Omar
2010-01-01
Interoperability in data exchange has the potential to improve care processes and decrease costs of the health care system. Many countries have related eHealth initiatives in preparation or already implemented. In this area, Switzerland has yet to catch up. Its health system is fragmented because of the federated nature of the cantons, which makes it more difficult to coordinate efforts between the existing healthcare actors. In the Medicoordination project, a pragmatic approach was selected: integrating several healthcare partners on a regional scale in French-speaking Switzerland. In parallel with the Swiss eHealth strategy, currently being elaborated by the Swiss confederation, Medicoordination targeted medium-sized hospitals and general practitioners in particular, to implement concrete scenarios of information exchange between hospitals and general practitioners with high added value. In this paper we focus our attention on a prototype implementation of one chosen scenario: the discharge summary. Although simple in concept, exchanging discharge letters reveals small, hidden difficulties due to the multi-partner nature of the project. The added value of such a prototype is potentially high, and it is now important to show that interoperability can work in practice.
Food product tracing technology capabilities and interoperability.
Bhatt, Tejas; Zhang, Jianrong Janet
2013-12-01
Despite the best efforts of food safety and food defense professionals, contaminated food continues to enter the food supply. It is imperative that contaminated food be removed from the supply chain as quickly as possible to protect public health and stabilize markets. To solve this problem, scores of technology companies purport to have the most effective, economical product tracing system. This study sought to compare and contrast the effectiveness of these systems at analyzing product tracing information to identify the contaminated ingredient and likely source, as well as distribution of the product. It also determined if these systems can work together to better secure the food supply (their interoperability). Institute of Food Technologists (IFT) hypothesized that when technology providers are given a full set of supply-chain data, even for a multi-ingredient product, their systems will generally be able to trace a contaminated product forward and backward through the supply chain. However, when provided with only a portion of supply-chain data, even for a product with a straightforward supply chain, it was expected that interoperability of the systems will be lacking and that there will be difficulty collaborating to identify sources and/or recipients of potentially contaminated product. IFT provided supply-chain data for one complex product to 9 product tracing technology providers, and then compared and contrasted their effectiveness at analyzing product tracing information to identify the contaminated ingredient and likely source, as well as distribution of the product. A vertically integrated foodservice restaurant agreed to work with IFT to secure data from its supply chain for both a multi-ingredient and a simpler product. Potential multi-ingredient products considered included canned tuna, supreme pizza, and beef tacos. IFT ensured that all supply-chain data collected did not include any proprietary information or information that would otherwise identify the supply-chain partner who provided the information prior to sharing this information with product tracing technology providers. The 9 traceability solution providers who agreed to participate in this project have their systems deployed in a wide range of sectors within the food industry including, but not limited to, livestock, dairy, produce, fruits, seafood, meat, and pork; as well as in pharmaceutical, automotive, retail, and other industries. Some have also been implemented across the globe including Canada, China, USA, Norway, and the EU, among others. This broad commercial use ensures that the findings of this work are applicable to a broad spectrum of the food system. Six of the 9 participants successfully completed the data entry phase of this test. To verify successful data entry for these 6, a demo or screenshots of the data set from each system's user interface was requested. Only 4 of the 6 were able to provide us with this evidence for verification. Of the 6 that completed data entry and moved on to the scenarios phase of the test, 5 were able to provide us with the responses to the scenarios. Time metrics were useful for evaluating the scalability and usability of each technology. Scalability was derived from the time it took to enter the nonstandardized data set into the system (ranges from 7 to 11 d). Usability was derived from the time it took to query the scenarios and provide the results (from a few hours to a week). 
Response time was measured as the number of days it took participants to respond after we supplied them with all the information needed to execute each test/scenario. Two of the technology solution providers successfully implemented and participated in a proof-of-concept interoperable framework during Year 2 of this study. While not required, they also demonstrated this interoperability capability on the FSMA-mandated food product tracing pilots for the U.S. FDA. This has significant real-world impact, since the demonstration of interoperability enables the U.S. FDA to obtain evidence on the importance and impact of data-sharing moving forward. Another real-world accomplishment is the modification or upgrade of commercial technology solutions to enhance or implement interoperability. As these systems are deployed by clients in the food industry, interoperability will no longer be an afterthought but will be built into their traceability systems. In turn, industry and regulators will better understand the capabilities of the currently available technologies, and the technology provider community will identify ways in which their systems may be further developed to increase interoperability and utility.
NASA Astrophysics Data System (ADS)
Williams, J. W.; Ashworth, A. C.; Betancourt, J. L.; Bills, B.; Blois, J.; Booth, R.; Buckland, P.; Charles, D.; Curry, B. B.; Goring, S. J.; Davis, E.; Grimm, E. C.; Graham, R. W.; Smith, A. J.
2015-12-01
Community-supported data repositories (CSDRs) in paleoecology and paleoclimatology have a decades-long tradition and serve multiple critical scientific needs. CSDRs facilitate synthetic large-scale scientific research by providing open-access and curated data that employ community-supported metadata and data standards. CSDRs serve as a 'middle tail' or boundary organization between information scientists and the long-tail community of individual geoscientists collecting and analyzing paleoecological data. Over the past decades, a distributed network of CSDRs has emerged, each serving a particular suite of data and research communities, e.g. the Neotoma Paleoecology Database, Paleobiology Database, International Tree Ring Database, NOAA NCEI for Paleoclimatology, MorphoBank, iDigPaleo, and Integrated Earth Data Alliance. Recently, these groups have organized into a common Paleobiology Data Consortium dedicated to improving interoperability and sharing best practices and protocols. The Neotoma Paleoecology Database offers one example of an active and growing CSDR, designed to facilitate research into ecological and evolutionary dynamics during recent past global change. Neotoma combines a centralized database structure with distributed scientific governance via multiple virtual constituent data working groups. The Neotoma data model is flexible and can accommodate a variety of paleoecological proxies from many depositional contexts. Data input into Neotoma is done by trained Data Stewards drawn from their communities. Neotoma data can be searched, viewed, and returned to users through multiple interfaces, including the interactive Neotoma Explorer map interface, RESTful Application Programming Interfaces (APIs), the neotoma R package, and the Tilia stratigraphic software. Neotoma is governed by geoscientists and provides community engagement through training workshops for data contributors, stewards, and users. Neotoma is engaged in the Paleobiology Data Consortium and other efforts to improve interoperability among cyberinfrastructure in the paleogeosciences.
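Since the abstract mentions RESTful APIs and the neotoma R package, a minimal Python sketch of an API call is given below; the endpoint and parameter names are assumptions patterned on the Neotoma v2.0 API for illustration, not a verified contract.

```python
# Sketch: query a Neotoma-style REST API for site records.
# Endpoint and parameters are assumed for illustration.
import requests

resp = requests.get(
    "https://api.neotomadb.org/v2.0/data/sites",  # assumed endpoint
    params={"sitename": "%lake%", "limit": 5},
    timeout=30,
)
resp.raise_for_status()
for site in resp.json().get("data", []):
    print(site.get("siteid"), site.get("sitename"))
```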
Interoperability of Heliophysics Virtual Observatories
NASA Technical Reports Server (NTRS)
Thieman, J.; Roberts, A.; King, T.; King, J.; Harvey, C.
2008-01-01
If you'd like to find interrelated heliophysics (also known as space and solar physics) data for a research project that spans, for example, magnetic field data and charged particle data from multiple satellites located near a given place and at approximately the same time, how easy is this to do? There are probably hundreds of data sets scattered in archives around the world that might be relevant. Is there an optimal way to search these archives and find what you want? There are a number of virtual observatories (VOs) now in existence that maintain knowledge of the data available in subdisciplines of heliophysics. The data may be widely scattered among various data centers, but the VOs have knowledge of what is available and how to get to it. The problem is that research projects might require data from a number of subdisciplines. Is there a way to search multiple VOs at once and obtain what is needed quickly? To do this requires a common way of describing the data such that a search using a common term will find all data that relate to the common term. This common language is contained within a data model developed for all of heliophysics and known as the SPASE (Space Physics Archive Search and Extract) Data Model. NASA has funded the main part of the development of SPASE but other groups have put resources into it as well. How well is this working? We will review the use of SPASE and how well the goal of locating and retrieving data within the heliophysics community is being achieved. Can the VOs truly be made interoperable despite being developed by so many diverse groups?
System and methods of resource usage using an interoperable management framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heileman, Gregory L.; Jamkhedkar, Pramod A.; Lamb, Christopher C.
A generic rights expression language allows interoperability across different computing environments, including resource usage by different applications. A formal framework for usage management provides scaffolding upon which interoperable usage management systems can be built. Certain features of the framework are standardized, such as the operational semantics, while areas left free of standards necessitate choice and innovation, achieving a balance of flexibility and usability for interoperability in usage management systems.
Information Management Challenges in Achieving Coalition Interoperability
2001-12-01
Contents include 'Planning for Interoperability' by W.M. Gentleman and a contribution by J. Dyer in Session I, 'Architectures and Standards: Fundamental Issues' (Chairman: Dr I. White, UK). The session addresses an architecture framework as a crucial step toward achieving coalition C4I interoperability. Topics to be covered: (1) maintaining secure interoperability; (2) command system interfaces.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Widergren, Steven E.; Knight, Mark R.; Melton, Ronald B.
The Interoperability Strategic Vision whitepaper aims to promote a common understanding of the meaning and characteristics of interoperability and to provide a strategy to advance the state of interoperability as applied to integration challenges facing grid modernization. This includes addressing the quality of integrating devices and systems and the discipline to improve the process of successfully integrating these components as business models and information technology improve over time. The strategic vision for interoperability described in this document applies throughout the electric energy generation, delivery, and end-use supply chain. Its scope includes interactive technologies and business processes from bulk energy levels to lower voltage level equipment and the millions of appliances that are becoming equipped with processing power and communication interfaces. A transformational aspect of a vision for interoperability in the future electric system is the coordinated operation of intelligent devices and systems at the edges of grid infrastructure. This challenge offers an example for addressing interoperability concerns throughout the electric system.
2016-09-01
…training in the decisive action training environment, with rotations routinely featuring several thousand participants from many nations and operating in… Teams work with exercise participants before they arrive at the training center; the goal is to ensure all formations understand, and are able to… Given its capabilities, location, and extensive experience working with NATO and partner countries, the JMTC is uniquely positioned to implement NATO training.
Bonaretti, Serena; Vilayphiou, Nicolas; Chan, Caroline Mai; Yu, Andrew; Nishiyama, Kyle; Liu, Danmei; Boutroy, Stephanie; Ghasem-Zadeh, Ali; Boyd, Steven K.; Chapurlat, Roland; McKay, Heather; Shane, Elizabeth; Bouxsein, Mary L.; Black, Dennis M.; Majumdar, Sharmila; Orwoll, Eric S.; Lang, Thomas F.; Khosla, Sundeep; Burghardt, Andrew J.
2017-01-01
Introduction HR-pQCT is increasingly used to assess bone quality, fracture risk, and anti-fracture interventions. The contribution of the operator has not been adequately accounted for in measurement precision. Operators acquire a 2D projection (“scout view image”) and define the region to be scanned by positioning a “reference line” on a standard anatomical landmark. In this study, we (i) evaluated the contribution of positioning variability to in vivo measurement precision, (ii) measured intra- and inter-operator positioning variability, and (iii) tested whether custom training software led to superior reproducibility in new operators compared to experienced operators. Methods To evaluate the operator contribution to in vivo measurement precision, we compared precision errors calculated in 64 co-registered and non-co-registered scan-rescan images. To quantify operator variability, we developed software that simulates the positioning process of the scanner's software. Eight experienced operators positioned reference lines on scout view images designed to test intra- and inter-operator reproducibility. Finally, we developed modules for training and evaluation of reference line positioning, and enrolled 6 new operators to participate in a common training program, followed by the same reproducibility experiments performed by the experienced group. Results In vivo precision errors were up to three-fold greater (Tt.BMD and Ct.Th) when variability in scan positioning was included. Inter-operator precision errors were significantly greater than short-term intra-operator precision errors (p<0.001). Newly trained operators achieved intra-operator reproducibility comparable to experienced operators, and lower inter-operator variability (p<0.001). Precision errors were significantly greater for the radius than for the tibia. Conclusion Operator reference line positioning contributes significantly to in vivo measurement precision, and this contribution is significantly greater for multi-operator datasets. Inter-operator variability can be significantly reduced using a systematic training platform, now available online (http://webapps.radiology.ucsf.edu/refline/). PMID:27475931
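For readers unfamiliar with the precision errors discussed above, the sketch below computes a root-mean-square coefficient of variation (RMS-CV%) over repeated scans, a standard way such precision is expressed in densitometry; the numbers are hypothetical, not the study's data.

```python
# Sketch: short-term precision error as RMS coefficient of variation (%).
# Per-subject repeated measurements are hypothetical (e.g. Tt.BMD, mg/cm^3).
import math

def rms_cv_percent(repeats):
    """repeats: list of per-subject measurement lists, e.g. [[m1, m2], ...]."""
    cvs = []
    for vals in repeats:
        mean = sum(vals) / len(vals)
        var = sum((v - mean) ** 2 for v in vals) / (len(vals) - 1)
        cvs.append(math.sqrt(var) / mean)
    return 100 * math.sqrt(sum(cv ** 2 for cv in cvs) / len(cvs))

scan_rescan = [[310.2, 312.8], [295.1, 293.9], [301.4, 305.0]]
print(f"RMS-CV = {rms_cv_percent(scan_rescan):.2f}%")
```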
Building a Global Earth Observation System of Systems (GEOSS) and Its Interoperability Challenges
NASA Astrophysics Data System (ADS)
Ryan, B. J.
2015-12-01
Launched in 2005 by industrialized nations, the Group on Earth Observations (GEO) began building the Global Earth Observation System of Systems (GEOSS). Consisting of both a policy framework and an information infrastructure, GEOSS was intended to link and/or integrate the multitude of Earth observation systems, primarily operated by its Member Countries and Participating Organizations, so that users could more readily benefit from global information assets for a number of society's key environmental issues. It was recognized that having ready access to observations from multiple systems was a prerequisite for both environmental decision-making and economic development. From the very start, it was also recognized that the sheer complexity of the Earth's system cannot be captured by any single observation system, and that a federated, interoperable approach was necessary. While this international effort has met with much success, primarily in advancing broad, open data policies and practices, challenges remain. In 2014 (Geneva, Switzerland) and 2015 (Mexico City, Mexico), Ministers from GEO's Member Countries, including the European Commission, came together to assess progress made during the first decade (2005 to 2015) and to approve implementation strategies and mechanisms for the second decade (2016 to 2025). The approved implementation strategies and mechanisms are intended to advance GEOSS development, thereby facilitating the increased uptake of Earth observations for informed decision-making. Clearly there are interoperability challenges that are technological in nature, and several will be discussed in this presentation. There are, however, interoperability challenges that can be better characterized as economic, governmental, and/or political in nature, and these will be discussed as well. With the emergence of the Sustainable Development Goals (SDGs), the World Conference on Disaster Risk Reduction (WCDRR), and the United Nations Framework Convention on Climate Change (UNFCCC) having occurred this year, it will be essential that the interoperability challenges described herein, regardless of their nature, be expeditiously addressed so that Earth observations can indeed inform societal decision-making.
Implementing PAT with Standards
NASA Astrophysics Data System (ADS)
Chandramohan, Laakshmana Sabari; Doolla, Suryanarayana; Khaparde, S. A.
2016-02-01
Perform Achieve Trade (PAT) is a market-based incentive mechanism to promote energy efficiency. The purpose of this work is to address the challenges inherent in inconsistent representation of business processes and the interoperability issues in PAT-like cap-and-trade mechanisms, especially when scaled. Studies by various agencies have highlighted that as the mechanism evolves, including more industrial sectors and industries in its ambit, implementation will become more challenging. This paper analyses the major needs of PAT (namely tracking, monitoring, auditing and verifying energy-saving reports, and providing technical support and guidance to stakeholders) and how the aforesaid challenges affect them. Though current technologies can handle these challenges to an extent, standardization activities for PAT implementation have been scanty, and this work attempts to evolve them. The inconsistent modification of business processes, rules, and procedures across stakeholders, and interoperability among heterogeneous systems, are addressed. This paper proposes the adoption of two standards into PAT: Business Process Model and Notation, for maintaining consistency in business process modelling, and the Common Information Model (IEC 61970, 61968, and 62325 combined), for information exchange. The detailed architecture and organization of these adoptions are reported. The work can be used by PAT implementing agencies, stakeholders, and standardization bodies.
Wang, Ningli; Wang, Bingsong; Zhai, Gaoshou; Lei, Kun; Wang, Lan; Congdon, Nathan
2007-05-01
To describe and evaluate a new method for measuring anterior chamber volume (ACV). Observational case series. The authors measured ACV using the anterior chamber (AC) optical coherence tomographer (OCT) and image-processing software that they developed. Repeatability was evaluated. ACV was measured in patient groups with normal, shallow, and deep ACs. The volume difference before and after laser peripheral iridotomy (LPI) was analyzed for the shallow and deep groups. Coefficients of repeatability for intraoperator, interoperator, and interimage measurements were 0.406%, 0.958%, and 0.851%, respectively. The limits of agreement for intraoperator and interoperator measurement were -0.911 microl to 1.343 microl and -7.875 microl to -2.463 microl, respectively. There were significant ACV differences among normal, shallow, and deep AC eyes (P < .001), and before and after LPI in both shallow AC (P < .001) and deep AC (P = .008) eyes. The ACV values obtained by this method were repeatable and in accord with clinical observation.
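The limits of agreement quoted above follow the usual Bland-Altman construction (mean difference plus or minus 1.96 SD of the differences); the sketch below computes them for hypothetical paired operator measurements, not the study's data.

```python
# Sketch: Bland-Altman limits of agreement for paired ACV measurements.
# The paired volumes (microliters) are hypothetical.
import statistics

op1 = [152.1, 148.3, 160.7, 155.2, 149.8]  # operator 1
op2 = [157.0, 153.9, 165.5, 160.1, 155.3]  # operator 2, same eyes

diffs = [a - b for a, b in zip(op1, op2)]
mean_d = statistics.mean(diffs)
sd_d = statistics.stdev(diffs)
low, high = mean_d - 1.96 * sd_d, mean_d + 1.96 * sd_d
print(f"limits of agreement: {low:.3f} to {high:.3f} microliters")
```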
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-27
... works such as video games and slide presentations). B. Computer programs that enable wireless telephone... enabling interoperability of such applications, when they have been lawfully obtained, with computer... new printer driver to a computer constitutes a `modification' of the operating system already...
Latest developments for the IAGOS database: Interoperability and metadata
NASA Astrophysics Data System (ADS)
Boulanger, Damien; Gautron, Benoit; Thouret, Valérie; Schultz, Martin; van Velthoven, Peter; Broetz, Bjoern; Rauthe-Schöch, Armin; Brissebrat, Guillaume
2014-05-01
In-service Aircraft for a Global Observing System (IAGOS, http://www.iagos.org) aims at the provision of long-term, frequent, regular, accurate, and spatially resolved in situ observations of the atmospheric composition. IAGOS observation systems are deployed on a fleet of commercial aircraft. The IAGOS database is an essential part of the global atmospheric monitoring network. Data access is handled by an open access policy based on the submission of research requests, which are reviewed by the PIs. Users can access the data through the following web sites: http://www.iagos.fr or http://www.pole-ether.fr, as the IAGOS database is part of the French atmospheric chemistry data centre ETHER (CNES and CNRS). The database is in continuous development and improvement. In the framework of the IGAS project (IAGOS for GMES/COPERNICUS Atmospheric Service), major achievements will be reached, such as metadata and format standardisation in order to interoperate with international portals and other databases, QA/QC procedures and traceability, CARIBIC (Civil Aircraft for the Regular Investigation of the Atmosphere Based on an Instrument Container) data integration within the central database, and real-time data transmission. IGAS work package 2 aims at providing the IAGOS data to users in a standardized format including the necessary metadata and information on data processing, data quality, and uncertainties. We are currently redefining and standardizing the IAGOS metadata for interoperable use within GMES/Copernicus. The metadata are compliant with the ISO 19115, INSPIRE, and NetCDF-CF conventions. IAGOS data will be provided to users in NetCDF or NASA Ames format. We are also implementing interoperability between all the involved IAGOS data services, including the central IAGOS database, the former MOZAIC and CARIBIC databases, the Aircraft Research DLR database, and the Jülich WCS web application JOIN (Jülich OWS Interface), which combines model outputs with in situ data for intercomparison. The optimal data transfer protocol is being investigated to ensure this interoperability. To facilitate satellite and model validation, tools will be made available for co-location and comparison with IAGOS. We will enhance the JOIN application in order to properly display aircraft data as vertical profiles and along individual flight tracks, and to allow for graphical comparison to model results that are accessible through interoperable web services, such as the daily products from the GMES/Copernicus atmospheric service.
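As a concrete illustration of the NetCDF-CF standardisation mentioned above, the sketch below writes a small CF-convention file with the netCDF4 library; the variable names and attribute values are illustrative, not IAGOS product definitions.

```python
# Sketch: a minimal CF-convention NetCDF file. Names and values are
# illustrative, not an actual IAGOS product definition.
from netCDF4 import Dataset

ds = Dataset("iagos_sample.nc", "w")
ds.Conventions = "CF-1.6"
ds.createDimension("time", None)  # unlimited record dimension

t = ds.createVariable("time", "f8", ("time",))
t.units = "seconds since 2014-01-01 00:00:00"
t.standard_name = "time"

o3 = ds.createVariable("ozone", "f4", ("time",))
o3.units = "1e-9"  # mole fraction expressed as ppb
o3.standard_name = "mole_fraction_of_ozone_in_air"

ds.close()
```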
Impact of coalition interoperability on PKI
NASA Astrophysics Data System (ADS)
Krall, Edward J.
2003-07-01
This paper examines methods for providing PKI interoperability among units of a coalition of armed forces drawn from different nations. The area in question is tactical identity management, for the purposes of confidentiality, integrity and non-repudiation in such a dynamic coalition. The interoperating applications under consideration range from email and other forms of store-and-forward messaging to TLS and IPSEC-protected real-time communications. Six interoperability architectures are examined with advantages and disadvantages of each described in the paper.
Telemedicine system interoperability architecture: concept description and architecture overview.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Craft, Richard Layne, II
2004-05-01
In order for telemedicine to realize the vision of anywhere, anytime access to care, it must address the question of how to create a fully interoperable infrastructure. This paper describes the reasons for pursuing interoperability, outlines operational requirements that any interoperability approach needs to consider, proposes an abstract architecture for meeting these needs, identifies candidate technologies that might be used for rendering this architecture, and suggests a path forward that the telemedicine community might follow.
Intelligent Mortality Reporting with FHIR
Hoffman, Ryan A.; Wu, Hang; Venugopalan, Janani; Braun, Paula; Wang, May D.
2017-01-01
One pressing need in the area of public health is timely, accurate, and complete reporting of deaths and the conditions leading up to them. Fast Healthcare Interoperability Resources (FHIR) is a new HL7 interoperability standard for electronic health records (EHRs), while Substitutable Medical Applications, Reusable Technologies (SMART)-on-FHIR enables third-party app development that can work “out of the box”. This research demonstrates the feasibility of developing SMART-on-FHIR applications that enable medical professionals to perform timely and accurate death reporting within multiple jurisdictions of the US. We explored how the information on a standard certificate of death can be mapped to resources defined in the FHIR standard (DSTU2). We also demonstrated analytics for potentially improving the accuracy and completeness of mortality reporting data. PMID:28804791
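To make the certificate-to-FHIR mapping above concrete, here is a minimal sketch of a cause-of-death item as a DSTU2-style Observation in plain JSON; the LOINC and ICD-10 codes and the patient reference are illustrative assumptions, not the jurisdictional profile used by the authors.

```python
# Sketch: one cause-of-death entry as a FHIR (DSTU2-style) Observation.
# Codes and references are illustrative assumptions.
import json

observation = {
    "resourceType": "Observation",
    "status": "final",
    "code": {
        "coding": [{
            "system": "http://loinc.org",
            "code": "69453-9",  # assumed LOINC code for cause of death
        }]
    },
    "subject": {"reference": "Patient/example-decedent"},  # hypothetical
    "valueCodeableConcept": {
        "coding": [{
            "system": "http://hl7.org/fhir/sid/icd-10",
            "code": "I21.9",  # acute myocardial infarction, unspecified
        }]
    },
}
print(json.dumps(observation, indent=2))
```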
Advancement of the Artificial Pancreas through the Development of Interoperability Standards
Picton, Peter E.; Yeung, Melanie; Hamming, Nathaniel; Desborough, Lane; Dassau, Eyal; Cafazzo, Joseph A.
2013-01-01
Despite advancements in the development of the artificial pancreas, barriers in the form of proprietary data and communication protocols of diabetes devices have made the integration of these components challenging. The Artificial Pancreas Standards and Technical Platform Project is an initiative funded by the JDRF Canadian Clinical Trial Network with the goal of developing device communication standards for the interoperability of diabetes devices. Stakeholders from academia, industry, regulatory agencies, and medical and patient communities have been engaged in advancing this effort. In this article, we describe this initiative along with the process involved in working with the standards organizations and stakeholders that are key to ensuring effective standards are developed and adopted. Discussion from a special session of the 12th Annual Diabetes Technology Meeting is also provided. PMID:23911190
47 CFR 0.192 - Emergency Response Interoperability Center.
Code of Federal Regulations, 2014 CFR
2014-10-01
Title 47 (Telecommunication), Volume 1 (2014-10-01). Federal Communications Commission, Commission Organization, Public Safety and Homeland Security Bureau. § 0.192 Emergency Response Interoperability Center. (a)…
47 CFR 0.192 - Emergency Response Interoperability Center.
Code of Federal Regulations, 2013 CFR
2013-10-01
Title 47 (Telecommunication), Volume 1 (2013-10-01). Federal Communications Commission, Commission Organization, Public Safety and Homeland Security Bureau. § 0.192 Emergency Response Interoperability Center. (a)…
47 CFR 0.192 - Emergency Response Interoperability Center.
Code of Federal Regulations, 2011 CFR
2011-10-01
Title 47 (Telecommunication), Volume 1 (2011-10-01). Federal Communications Commission, Commission Organization, Public Safety and Homeland Security Bureau. § 0.192 Emergency Response Interoperability Center. (a)…
Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...
Big issues, small systems: managing with information in medical research.
Jones, J; Preston, H
2000-08-01
The subject of this article is the design of a database system for handling files related to the work of the Molecular Genetics Department of the International Blood Group Reference Laboratory. It examines specialist information needs identified within this organization and indicates how the design of the Rhesus Information Tracking System was able to meet current needs. Rapid Application Development prototyping forms the basis of the investigation, linked to interview, questionnaire, and observation techniques in order to establish requirements for interoperability. In particular, the place of this specialist database within the much broader information strategy of the National Blood Service is examined. This situation is analogous to management activities in broader environments, and a number of generic issues are highlighted by the research.
NASA Astrophysics Data System (ADS)
Cole, M.; Alameh, N.; Bambacus, M.
2006-05-01
The Applied Sciences Program at NASA focuses on extending the results of NASA's Earth-Sun system science research beyond the science and research communities to contribute to national priority applications with societal benefits. By employing a systems engineering approach, supporting interoperable data discovery and access, and developing partnerships with federal agencies and national organizations, the Applied Sciences Program facilitates the transition from research to operations in national applications. In particular, the Applied Sciences Program identifies twelve national applications, listed at http://science.hq.nasa.gov/earth-sun/applications/, which can be best served by the results of NASA aerospace research and development of science and technologies. The ability to use and integrate NASA data and science results into these national applications results in enhanced decision support and significant socio-economic benefits for each of the applications. This paper focuses on leveraging the power of interoperability and specifically open standard interfaces in providing efficient discovery, retrieval, and integration of NASA's science research results. Interoperability (the ability to access multiple, heterogeneous geoprocessing environments, either local or remote by means of open and standard software interfaces) can significantly increase the value of NASA-related data by increasing the opportunities to discover, access and integrate that data in the twelve identified national applications (particularly in non-traditional settings). Furthermore, access to data, observations, and analytical models from diverse sources can facilitate interdisciplinary and exploratory research and analysis. To streamline this process, the NASA GeoSciences Interoperability Office (GIO) is developing the NASA Earth-Sun System Gateway (ESG) to enable access to remote geospatial data, imagery, models, and visualizations through open, standard web protocols. The gateway (online at http://esg.gsfc.nasa.gov) acts as a flexible and searchable registry of NASA-related resources (files, services, models, etc) and allows scientists, decision makers and others to discover and retrieve a wide variety of observations and predictions of natural and human phenomena related to Earth Science from NASA and other sources. To support the goals of the Applied Sciences national applications, GIO staff is also working with the national applications communities to identify opportunities where open standards-based discovery and access to NASA data can enhance the decision support process of the national applications. This paper describes the work performed to-date on that front, and summarizes key findings in terms of identified data sources and benefiting national applications. The paper also highlights the challenges encountered in making NASA-related data accessible in a cross-cutting fashion and identifies areas where interoperable approaches can be leveraged.
NASA Astrophysics Data System (ADS)
Bambacus, M.; Alameh, N.; Cole, M.
2006-12-01
The Applied Sciences Program at NASA focuses on extending the results of NASA's Earth-Sun system science research beyond the science and research communities to contribute to national priority applications with societal benefits. By employing a systems engineering approach, supporting interoperable data discovery and access, and developing partnerships with federal agencies and national organizations, the Applied Sciences Program facilitates the transition from research to operations in national applications. In particular, the Applied Sciences Program identifies twelve national applications, listed at http://science.hq.nasa.gov/earth-sun/applications/, which can be best served by the results of NASA aerospace research and development of science and technologies. The ability to use and integrate NASA data and science results into these national applications results in enhanced decision support and significant socio-economic benefits for each of the applications. This paper focuses on leveraging the power of interoperability and specifically open standard interfaces in providing efficient discovery, retrieval, and integration of NASA's science research results. Interoperability (the ability to access multiple, heterogeneous geoprocessing environments, either local or remote by means of open and standard software interfaces) can significantly increase the value of NASA-related data by increasing the opportunities to discover, access and integrate that data in the twelve identified national applications (particularly in non-traditional settings). Furthermore, access to data, observations, and analytical models from diverse sources can facilitate interdisciplinary and exploratory research and analysis. To streamline this process, the NASA GeoSciences Interoperability Office (GIO) is developing the NASA Earth-Sun System Gateway (ESG) to enable access to remote geospatial data, imagery, models, and visualizations through open, standard web protocols. The gateway (online at http://esg.gsfc.nasa.gov) acts as a flexible and searchable registry of NASA-related resources (files, services, models, etc) and allows scientists, decision makers and others to discover and retrieve a wide variety of observations and predictions of natural and human phenomena related to Earth Science from NASA and other sources. To support the goals of the Applied Sciences national applications, GIO staff is also working with the national applications communities to identify opportunities where open standards-based discovery and access to NASA data can enhance the decision support process of the national applications. This paper describes the work performed to-date on that front, and summarizes key findings in terms of identified data sources and benefiting national applications. The paper also highlights the challenges encountered in making NASA-related data accessible in a cross-cutting fashion and identifies areas where interoperable approaches can be leveraged.
Komatsoulis, George A; Warzel, Denise B; Hartel, Francis W; Shanbhag, Krishnakant; Chilukuri, Ram; Fragoso, Gilberto; Coronado, Sherri de; Reeves, Dianne M; Hadfield, Jillaine B; Ludet, Christophe; Covitz, Peter A
2008-02-01
One of the requirements for a federated information system is interoperability, the ability of one computer system to access and use the resources of another system. This feature is particularly important in biomedical research systems, which need to coordinate a variety of disparate types of data. In order to meet this need, the National Cancer Institute Center for Bioinformatics (NCICB) has created the cancer Common Ontologic Representation Environment (caCORE), an interoperability infrastructure based on Model Driven Architecture. The caCORE infrastructure provides a mechanism to create interoperable biomedical information systems. Systems built using the caCORE paradigm address both aspects of interoperability: the ability to access data (syntactic interoperability) and understand the data once retrieved (semantic interoperability). This infrastructure consists of an integrated set of three major components: a controlled terminology service (Enterprise Vocabulary Services), a standards-based metadata repository (the cancer Data Standards Repository) and an information system with an Application Programming Interface (API) based on Domain Model Driven Architecture. This infrastructure is being leveraged to create a Semantic Service-Oriented Architecture (SSOA) for cancer research by the National Cancer Institute's cancer Biomedical Informatics Grid (caBIG).
Komatsoulis, George A.; Warzel, Denise B.; Hartel, Frank W.; Shanbhag, Krishnakant; Chilukuri, Ram; Fragoso, Gilberto; de Coronado, Sherri; Reeves, Dianne M.; Hadfield, Jillaine B.; Ludet, Christophe; Covitz, Peter A.
2008-01-01
One of the requirements for a federated information system is interoperability, the ability of one computer system to access and use the resources of another system. This feature is particularly important in biomedical research systems, which need to coordinate a variety of disparate types of data. In order to meet this need, the National Cancer Institute Center for Bioinformatics (NCICB) has created the cancer Common Ontologic Representation Environment (caCORE), an interoperability infrastructure based on Model Driven Architecture. The caCORE infrastructure provides a mechanism to create interoperable biomedical information systems. Systems built using the caCORE paradigm address both aspects of interoperability: the ability to access data (syntactic interoperability) and understand the data once retrieved (semantic interoperability). This infrastructure consists of an integrated set of three major components: a controlled terminology service (Enterprise Vocabulary Services), a standards-based metadata repository (the cancer Data Standards Repository) and an information system with an Application Programming Interface (API) based on Domain Model Driven Architecture. This infrastructure is being leveraged to create a Semantic Service Oriented Architecture (SSOA) for cancer research by the National Cancer Institute’s cancer Biomedical Informatics Grid (caBIG™). PMID:17512259
Laplante-Lévesque, Ariane; Abrams, Harvey; Bülow, Maja; Lunner, Thomas; Nelson, John; Riis, Søren Kamaric; Vanpoucke, Filiep
2016-10-01
This article describes the perspectives of hearing device manufacturers regarding the exciting developments that the Internet makes possible. Specifically, it proposes to join forces toward interoperability and standardization of Internet and audiology. A summary of why such a collaborative effort is required is provided from historical and scientific perspectives. A roadmap toward interoperability and standardization is proposed. Information and communication technologies improve the flow of health care data and pave the way to better health care. However, hearing-related products, features, and services are notoriously heterogeneous and incompatible with other health care systems (no interoperability). Standardization is the process of developing and implementing technical standards (e.g., Noah hearing database). All parties involved in interoperability and standardization realize mutual gains by making mutually consistent decisions. De jure (officially endorsed) standards can be developed in collaboration with large national health care systems as well as spokespeople for hearing care professionals and hearing device users. The roadmap covers mutual collaboration; data privacy, security, and ownership; compliance with current regulations; scalability and modularity; and the scope of interoperability and standards. We propose to join forces to pave the way to the interoperable Internet and audiology products, features, and services that the world needs.
Reflections on the role of open source in health information system interoperability.
Sfakianakis, S; Chronaki, C E; Chiarugi, F; Conforti, F; Katehakis, D G
2007-01-01
This paper reflects on the role of open source in health information system interoperability. Open source is a driving force in computer science research and the development of information systems. It facilitates the sharing of information and ideas, enables evolutionary development and open collaborative testing of code, and broadens the adoption of interoperability standards. In health care, information systems have been developed largely ad hoc following proprietary specifications and customized design. However, the wide deployment of integrated services such as Electronic Health Records (EHRs) over regional health information networks (RHINs) relies on interoperability of the underlying information systems and medical devices. This reflection is built on the experiences of the PICNIC project that developed shared software infrastructure components in open source for RHINs and the OpenECG network that offers open source components to lower the implementation cost of interoperability standards such as SCP-ECG, in electrocardiography. Open source components implementing standards and a community providing feedback from real-world use are key enablers of health care information system interoperability. Investing in open source is investing in interoperability and a vital aspect of a long term strategy towards comprehensive health services and clinical research.
A SOA-Based Platform to Support Clinical Data Sharing
Gazzarata, R; Giannini, B; Giacomini, M
2017-01-01
The eSource Data Interchange Group, part of the Clinical Data Interchange Standards Consortium, proposed five scenarios to guide stakeholders in the development of solutions for the capture of eSource data. The fifth scenario was subdivided into four tiers to adapt the functionality of electronic health records to support clinical research. In order to develop a system belonging to the “Interoperable” Tier, the authors decided to adopt the service-oriented architecture paradigm to support technical interoperability, Health Level Seven Version 3 messages combined with LOINC (Logical Observation Identifiers Names and Codes) vocabulary to ensure semantic interoperability, and Healthcare Services Specification Project standards to provide process interoperability. The developed architecture enhances the integration between patient-care practice and medical research, allowing clinical data sharing between two hospital information systems and four clinical data management systems/clinical registries. The core is formed by a set of standardized cloud services connected through standardized interfaces to client applications. The system was approved by the medical staff, since it reduces the workload for the management of clinical trials. Although this architecture can realize the “Interoperable” Tier, the current solution actually covers the “Connected” Tier, due to local hospital policy restrictions. © 2017 R. Gazzarata et al.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-14
..., Reliability, and Interoperability Council AGENCY: Federal Communications Commission. ACTION: Notice of public... persons that the Federal Communications Commission's (FCC) Communications Security, Reliability, and... the security, reliability, and interoperability of communications systems. On March 19, 2011, the FCC...
Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...
UAS Integration in the NAS Project: DAA-TCAS Interoperability "mini" HITL Primary Results
NASA Technical Reports Server (NTRS)
Rorie, Conrad; Fern, Lisa; Shively, Jay; Santiago, Confesor
2016-01-01
At the May 2015 SC-228 meeting, requirements for TCAS II interoperability became elevated in priority. A TCAS interoperability workgroup was formed to identify and address key issues/questions. The TCAS workgroup came up with an initial list of questions and a plan to address those questions. As part of that plan, NASA proposed to run a mini HITL to address display, alerting and guidance issues. A TCAS Interoperability Workshop was held to determine potential display/alerting/guidance issues that could be explored in future NASA mini HITLs. The workshop produced consensus on the main functionality of DAA guidance when a TCAS II RA occurs, a prioritized list of independent variables for experimental design, and a set of use cases to stress TCAS interoperability.
An Ontological Solution to Support Interoperability in the Textile Industry
NASA Astrophysics Data System (ADS)
Duque, Arantxa; Campos, Cristina; Jiménez-Ruiz, Ernesto; Chalmeta, Ricardo
Significant developments in information and communication technologies and challenging market conditions have forced enterprises to adapt their way of doing business. In this context, providing mechanisms to guarantee interoperability among heterogeneous organisations has become a critical issue. Even though prolific research has already been conducted in the area of enterprise interoperability, we have found that enterprises still struggle to introduce fully interoperable solutions, especially, in terms of the development and application of ontologies. Thus, the aim of this paper is to introduce basic ontology concepts in a simple manner and to explain the advantages of the use of ontologies to improve interoperability. We will also present a case study showing the implementation of an application ontology for an enterprise in the textile/clothing sector.
77 FR 37001 - Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-20
... of the Interoperability Services Layer, Attn: Ron Chen, 400 Gigling Road, Seaside, CA 93955. Title; Associated Form; and OMB Number: Interoperability Services Layer; OMB Control Number 0704-TBD. Needs and Uses... INFORMATION: Summary of Information Collection IoLS (Interoperability Layer Services) is an application in a...
He, Yongqun; Xiang, Zuoshuang; Zheng, Jie; Lin, Yu; Overton, James A; Ong, Edison
2018-01-12
Ontologies are critical to data/metadata and knowledge standardization, sharing, and analysis. With hundreds of biological and biomedical ontologies developed, it has become critical to ensure ontology interoperability and the usage of interoperable ontologies for standardized data representation and integration. The suite of web-based Ontoanimal tools (e.g., Ontofox, Ontorat, and Ontobee) supports different aspects of extensible ontology development. By summarizing the common features of Ontoanimal and other similar tools, we identified and proposed an "eXtensible Ontology Development" (XOD) strategy and its four associated principles. These XOD principles call for reusing existing terms and semantic relations from reliable ontologies, developing and applying well-established ontology design patterns (ODPs), and involving community efforts to support new ontology development, promoting standardized and interoperable data and knowledge representation and integration. The adoption of the XOD strategy, together with robust XOD tool development, will greatly support ontology interoperability and robust ontology applications to support data to be Findable, Accessible, Interoperable and Reusable (i.e., FAIR).
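The term-reuse principle can be shown in miniature: a new ontology class points at an existing term's IRI instead of minting a duplicate. The sketch below assumes the rdflib package is available; the new-ontology namespace and labels are illustrative, and the OBO IRI is used only as an example of referencing an external term.

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import OWL, RDF, RDFS

EX = Namespace("http://example.org/myonto/")         # hypothetical new ontology
OBO = Namespace("http://purl.obolibrary.org/obo/")   # standard OBO IRI prefix

g = Graph()
# Reuse: subclass an existing term by IRI rather than redefining it locally.
g.add((EX.MyAssayResult, RDF.type, OWL.Class))
g.add((EX.MyAssayResult, RDFS.subClassOf, OBO.OBI_0000070))  # external term, cited by IRI
g.add((EX.MyAssayResult, RDFS.label, Literal("assay result (local extension)")))

print(g.serialize(format="turtle"))
```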
Personal Health Records: Is Rapid Adoption Hindering Interoperability?
Studeny, Jana; Coustasse, Alberto
2014-01-01
The establishment of the Meaningful Use criteria has created a critical need for robust interoperability of health records. A universal definition of a personal health record (PHR) has not been agreed upon. Standardized code sets have been built for specific entities, but integration between them has not been supported. The purpose of this research study was to explore the hindrance and promotion of interoperability standards in relationship to PHRs to describe interoperability progress in this area. The study was conducted following the basic principles of a systematic review, with 61 articles used in the study. Lagging interoperability has stemmed from slow adoption by patients, creation of disparate systems due to rapid development to meet requirements for the Meaningful Use stages, and rapid early development of PHRs prior to the mandate for integration among multiple systems. Findings of this study suggest that deadlines for implementation to capture Meaningful Use incentive payments are supporting the creation of PHR data silos, thereby hindering the goal of high-level interoperability. PMID:25214822
Test Protocols for Advanced Inverter Interoperability Functions – Main Document
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Jay Dean; Gonzalez, Sigifredo; Ralph, Mark E.
2013-11-01
Distributed energy resources (DER) such as photovoltaic (PV) systems, when deployed at large scale, are capable of significantly influencing the operation of power systems. Looking to the future, stakeholders are working on standards to make it possible to manage the potentially complex interactions between DER and the power system. In 2009, the Electric Power Research Institute (EPRI), Sandia National Laboratories (SNL) with the U.S. Department of Energy (DOE), and the Solar Electric Power Association (SEPA) initiated a large industry collaborative to identify and standardize definitions for a set of DER grid support functions. While the initial effort concentrated on grid-tied PV inverters and energy storage systems, the concepts have applicability to all DER. A partial product of this on-going effort is a reference definitions document (IEC TR 61850-90-7, Object models for power converters in distributed energy resources (DER) systems) that has become a basis for expansion of related International Electrotechnical Commission (IEC) standards, and is supported by the US National Institute of Standards and Technology (NIST) Smart Grid Interoperability Panel (SGIP). Some industry-led organizations advancing communications protocols have also embraced this work. As standards continue to evolve, it is necessary to develop test protocols to independently verify that the inverters are properly executing the advanced functions. Interoperability is assured by establishing common definitions for the functions and a method to test compliance with operational requirements. This document describes test protocols developed by SNL to evaluate the electrical performance and operational capabilities of PV inverters and energy storage, as described in IEC TR 61850-90-7. While many of these functions are not currently required by existing grid codes or may not be widely available commercially, the industry is rapidly moving in that direction. Interoperability issues are already apparent as some of these inverter capabilities are being incorporated in large demonstration and commercial projects. The test protocols are intended to be used to verify acceptable performance of inverters within the standard framework described in IEC TR 61850-90-7. These test protocols, as they are refined and validated over time, can become precursors for future certification test procedures for DER advanced grid support functions.
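To make the flavor of such a test protocol concrete, here is a minimal sketch of one conceivable compliance check for a volt-var function: commanded curve points are interpolated, and measured reactive-power responses are compared against a tolerance band. The curve points, tolerance, and samples are invented for illustration and are not taken from IEC TR 61850-90-7 or the SNL protocols.

```python
# Hypothetical volt-var curve: (voltage in per-unit, reactive power in % of rated).
CURVE = [(0.95, 44.0), (0.98, 0.0), (1.02, 0.0), (1.05, -44.0)]
TOLERANCE = 2.0  # allowed deviation in % of rated; invented for this sketch

def expected_q(v_pu):
    """Linearly interpolate the commanded curve (flat beyond the endpoints)."""
    if v_pu <= CURVE[0][0]:
        return CURVE[0][1]
    for (v1, q1), (v2, q2) in zip(CURVE, CURVE[1:]):
        if v_pu <= v2:
            return q1 + (q2 - q1) * (v_pu - v1) / (v2 - v1)
    return CURVE[-1][1]

def check_samples(samples):
    """samples: list of (voltage_pu, measured_q_percent); returns failing points."""
    return [(v, q, expected_q(v)) for v, q in samples
            if abs(q - expected_q(v)) > TOLERANCE]

# Invented measurements, as might be logged during a lab test sweep.
print(check_samples([(0.96, 29.0), (1.00, 0.5), (1.04, -29.5)]))  # -> [] (all pass)
```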
Test Protocols for Advanced Inverter Interoperability Functions - Appendices
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Jay Dean; Gonzalez, Sigifredo; Ralph, Mark E.
2013-11-01
Distributed energy resources (DER) such as photovoltaic (PV) systems, when deployed at large scale, are capable of significantly influencing the operation of power systems. Looking to the future, stakeholders are working on standards to make it possible to manage the potentially complex interactions between DER and the power system. In 2009, the Electric Power Research Institute (EPRI), Sandia National Laboratories (SNL) with the U.S. Department of Energy (DOE), and the Solar Electric Power Association (SEPA) initiated a large industry collaborative to identify and standardize definitions for a set of DER grid support functions. While the initial effort concentrated on grid-tied PV inverters and energy storage systems, the concepts have applicability to all DER. A partial product of this on-going effort is a reference definitions document (IEC TR 61850-90-7, Object models for power converters in distributed energy resources (DER) systems) that has become a basis for expansion of related International Electrotechnical Commission (IEC) standards, and is supported by the US National Institute of Standards and Technology (NIST) Smart Grid Interoperability Panel (SGIP). Some industry-led organizations advancing communications protocols have also embraced this work. As standards continue to evolve, it is necessary to develop test protocols to independently verify that the inverters are properly executing the advanced functions. Interoperability is assured by establishing common definitions for the functions and a method to test compliance with operational requirements. This document describes test protocols developed by SNL to evaluate the electrical performance and operational capabilities of PV inverters and energy storage, as described in IEC TR 61850-90-7. While many of these functions are not currently required by existing grid codes or may not be widely available commercially, the industry is rapidly moving in that direction. Interoperability issues are already apparent as some of these inverter capabilities are being incorporated in large demonstration and commercial projects. The test protocols are intended to be used to verify acceptable performance of inverters within the standard framework described in IEC TR 61850-90-7. These test protocols, as they are refined and validated over time, can become precursors for future certification test procedures for DER advanced grid support functions.
ESA's Planetary Science Archive: International collaborations towards transparent data access
NASA Astrophysics Data System (ADS)
Heather, David
The European Space Agency's (ESA) Planetary Science Archive (PSA) is the central repository for science data returned by all ESA planetary missions. Current holdings include data from Giotto, SMART-1, Cassini-Huygens, Mars Express, Venus Express, and Rosetta. In addition to the basic management and distribution of these data to the community through our own interfaces, ESA has been working very closely with international partners to globalize the archiving standards used and the access to our data. Part of this ongoing effort is channelled through our participation in the International Planetary Data Alliance (IPDA), whose focus is on allowing transparent and interoperable access to data holdings from participating Agencies around the globe. One major focus of this work has been the development of the Planetary Data Access Protocol (PDAP) that will allow for the interoperability of archives and sharing of data. This is already used for transparent access to data from Venus Express, and ESA are currently working with ISRO and NASA to provide interoperable access to ISRO's Chandrayaan-1 data through our systems using this protocol. Close interactions are ongoing with NASA's Planetary Data System as the standards used for planetary data archiving evolve, and two of our upcoming missions are to be the first to implement the new 'PDS4' standards in ESA: BepiColombo and ExoMars. Projects have been established within the IPDA framework to guide these implementations to try and ensure interoperability and maximise the usability of the data by the community. BepiColombo and ExoMars are both international missions, in collaboration with JAXA and IKI respectively, and a strong focus has been placed on close interaction and collaboration throughout the development of each archive. For both of these missions there is a requirement to share data between the Agencies prior to public access, as well as providing complete open access globally once the proprietary periods have elapsed. This introduces a number of additional challenges in terms of managing different access rights to data throughout the mission lifetime. Both of these missions will have data pipelines running internally to our Science Ground Segment, freeing the instrument teams to work more on science analyses. We have followed the IPDA recommendations of trying to start work on archiving with these missions very early in the life-cycle (especially on BepiColombo and now starting on JUICE), and endeavour to make sure that archiving requirements are clearly stated in official mission documentation at the time of selection. This has helped to ensure that adequate resources are available internally and within the instrument teams to support archive development. This year will also see major milestones for two of our operational missions. Venus Express will start an aerobraking phase in late spring / early summer, and will wind down science operations this year, while Rosetta will encounter the comet Churyumov-Gerasimenko, deploy the lander and start its main science phase. While these missions are at opposite ends of their science phases, many of the challenges from the archiving side are similar. Venus Express will have a full mission archive review this year and data pipelines will start to be updated / corrected where necessary in order to ensure long-term usability and interoperable access to the data.
Rosetta will start to deliver science data in earnest towards the end of the year, and the focus will be on ensuring that data pipelines are ready and robust enough to maintain deliveries throughout the main science phase. For both missions, we aim to use the lessons learned and technologies developed through our international collaborations to maximise the availability and usability of the data delivered. In 2013, ESA established a Planetary Science Archive User Group (PSA-UG) to provide independent advice on ways to improve our services and our provision of data to the community. The PSA-UG will be a key link to the international planetary science community, providing requirements and recommendations that will allow us to better meet their needs, and promoting the use of the PSA and its data holdings. This presentation will outline the many international collaborations currently in place for the PSA, both for missions in operations and for those under development. There is a strong desire to provide full transparent science data access and improved services to the planetary science community around the world, and our continuing work with our international partners brings us ever closer to achieving this goal. Many challenges still remain, and these will be outlined in the presentation.
Organisational Interoperability: Evaluation and Further Development of the OIM Model
2003-06-01
an Organizational Interoperability Maturity Model (OIM) to evaluate interoperability at the organizational level. The OIM considers the human ... activity aspects of military operations, which are not covered in other models. This paper describes how the model has been used to identify problems and to
NASA Technical Reports Server (NTRS)
Ivancic, William D.; Glover, Daniel R.; vonDeak, Thomas C.; Bhasin, Kul B.
1998-01-01
The realization of the full potential of the National Information Infrastructure (NII) and Global Information Infrastructure (GII) requires seamless interoperability of emerging satellite networks with terrestrial networks. This requires a cooperative effort between industry, academia and government agencies to develop and advocate new, satellite-friendly communication protocols and modifications to existing communication protocol standards. These groups have recently come together to participate actively in a number of standards-making bodies including: the Internet Engineering Task Force (IETF), the Asynchronous Transfer Mode (ATM) Forum, the International Telecommunication Union (ITU) and the Telecommunications Industry Association (TIA) to ensure that issues regarding efficient use of these protocols over satellite links are not overlooked. This paper will summarize the progress made toward standards development to achieve seamless integration and accelerate the deployment of multimedia applications.
Self-Assembling Texts & Courses of Study.
ERIC Educational Resources Information Center
Gibson, David
This paper describes the development of an interoperable meta-database system--a system of applications using metadata--that is intended to facilitate learner-centered collaboration, access to learning resources, and the fitness of channels of information to the emerging needs of learners at both individual and group levels. Highlights include:…
Improving Cancer-Related Outcomes with Connected Health - Action Items at a Glance
Action Item 1.1: Health IT stakeholder groups should continue to collaborate to overcome policy and technical barriers to a nationwide, interoperable health IT system. Action Item 1.2: Technical standards for information related to cancer care across the continuum should be developed, tested, disseminated, and adopted.
DOT National Transportation Integrated Search
2000-01-01
These draft standards are intended to work together to provide a high level of interoperability among regional and local traffic control centers. They provide consistent names, definitions and concepts similar to spelling and parts of speech to world...
de Arriba-Pérez, Francisco; Caeiro-Rodríguez, Manuel; Santos-Gago, Juan M
2016-09-21
Over recent years, we have witnessed the development of mobile and wearable technologies to collect data from human vital signs and activities. Nowadays, wrist wearables including sensors (e.g., heart rate, accelerometer, pedometer) that provide valuable data are common in the market. We are working on the analytic exploitation of this kind of data towards the support of learners and teachers in educational contexts. More precisely, sleep and stress indicators are defined to assist teachers and learners in the regulation of their activities. During this development, we have identified interoperability challenges related to the collection and processing of data from wearable devices. Different vendors adopt specific approaches about the way data can be collected from wearables into third-party systems. This hinders developments such as the one that we are carrying out. This paper contributes to identifying key interoperability issues in this kind of scenario and proposes guidelines to solve them. Taking these topics into account, this work is situated in the context of the standardization activities being carried out in the Internet of Things and Machine to Machine domains.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-13
..., Reliability, and Interoperability Council AGENCY: Federal Communications Commission. ACTION: Notice of public..., Reliability, and Interoperability Council (CSRIC) will hold its fifth meeting. The CSRIC will vote on... to the FCC regarding best practices and actions the FCC can take to ensure the security, reliability...
Evaluation of Interoperability Protocols in Repositories of Electronic Theses and Dissertations
ERIC Educational Resources Information Center
Hakimjavadi, Hesamedin; Masrek, Mohamad Noorman
2013-01-01
Purpose: The purpose of this study is to evaluate the status of eight interoperability protocols within repositories of electronic theses and dissertations (ETDs) as an introduction to further studies on feasibility of deploying these protocols in upcoming areas of interoperability. Design/methodology/approach: Three surveys of 266 ETD…
Examining the Relationship between Electronic Health Record Interoperability and Quality Management
ERIC Educational Resources Information Center
Purcell, Bernice M.
2013-01-01
A lack of interoperability impairs data quality among health care providers' electronic health record (EHR) systems. The problem is whether the International Organization for Standardization (ISO) 9000 principles relate to the problem of interoperability in implementation of EHR systems. The purpose of the nonexperimental quantitative research…
Interoperability of Demand Response Resources Demonstration in NY
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wellington, Andre
2014-03-31
The Interoperability of Demand Response Resources Demonstration in NY (Interoperability Project) was awarded to Con Edison in 2009. The objective of the project was to develop and demonstrate methodologies to enhance the ability of customer-sited Demand Response resources to integrate more effectively with electric delivery companies and regional transmission organizations.
Watershed and Economic Data InterOperability (WEDO) is a system of information technologies designed to publish watershed modeling studies for reuse. WEDO facilitates three aspects of interoperability: discovery, evaluation and integration of data. This increased level of interop...
Reminiscing about 15 years of interoperability efforts
Van de Sompel, Herbert; Nelson, Michael L.
2015-11-01
Over the past fifteen years, our perspective on tackling information interoperability problems for web-based scholarship has evolved significantly. In this opinion piece, we look back at three efforts that we have been involved in that aptly illustrate this evolution: OAI-PMH, OAI-ORE, and Memento. Understanding that no interoperability specification is neutral, we attempt to characterize the perspectives and technical toolkits that provided the basis for these endeavors. In that regard, we consider repository-centric and web-centric interoperability perspectives, and the use of a Linked Data or a REST/HATEOAS technology stack, respectively. In addition, we lament the lack of interoperability across nodes that play a role in web-based scholarship, but end on a constructive note with some ideas regarding a possible path forward.
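Of the three efforts mentioned, OAI-PMH is the most mechanical to demonstrate: it is plain HTTP with a handful of verbs. A minimal harvest request might look like the sketch below; the repository base URL is a placeholder, while the verb and metadataPrefix parameters are part of the actual protocol.

```python
import urllib.parse
import urllib.request
import xml.etree.ElementTree as ET

BASE_URL = "https://repository.example.org/oai"  # placeholder endpoint

# ListRecords with the Dublin Core metadata format -- standard OAI-PMH.
params = {"verb": "ListRecords", "metadataPrefix": "oai_dc"}
with urllib.request.urlopen(BASE_URL + "?" + urllib.parse.urlencode(params)) as resp:
    tree = ET.parse(resp)

# Walk the record headers in the standard OAI-PMH namespace.
OAI = "{http://www.openarchives.org/OAI/2.0/}"
for header in tree.iter(OAI + "header"):
    print(header.findtext(OAI + "identifier"), header.findtext(OAI + "datestamp"))
```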
Potential interoperability problems facing multi-site radiation oncology centers in The Netherlands
NASA Astrophysics Data System (ADS)
Scheurleer, J.; Koken, Ph; Wessel, R.
2014-03-01
Aim: To identify potential interoperability problems facing multi-site Radiation Oncology (RO) departments in the Netherlands and solutions for unambiguous multi-system workflows. Specific challenges confronting the RO department of VUmc (RO-VUmc), which is soon to open a satellite department, were characterized. Methods: A nationwide questionnaire survey was conducted to identify possible interoperability problems and solutions. Further detailed information was obtained by in-depth interviews at 3 Dutch RO institutes that already operate in more than one site. Results: The survey had a 100% response rate (n=21). Altogether 95 interoperability problems were described. Most reported problems were on a strategic and semantic level. The majority were DICOM(-RT) and HL7 related (n=65), primarily between treatment planning and verification systems or between departmental and hospital systems. Seven were identified as being relevant for RO-VUmc. Departments have overcome interoperability problems with their own, or with tailor-made vendor solutions. There was little knowledge about or utilization of solutions developed by Integrating the Healthcare Enterprise Radiation Oncology (IHE-RO). Conclusions: Although interoperability problems are still common, solutions have been identified. Awareness of IHE-RO needs to be raised. No major new interoperability problems are predicted as RO-VUmc develops into a multi-site department.
Interoperability in digital electrocardiography: harmonization of ISO/IEEE x73-PHD and SCP-ECG.
Trigo, Jesús D; Chiarugi, Franco; Alesanco, Alvaro; Martínez-Espronceda, Miguel; Serrano, Luis; Chronaki, Catherine E; Escayola, Javier; Martínez, Ignacio; García, José
2010-11-01
The ISO/IEEE 11073 (x73) family of standards is a reference frame for medical device interoperability. A draft for an ECG device specialization (ISO/IEEE 11073-10406-d02) has already been presented to the Personal Health Device (PHD) Working Group, and the Standard Communications Protocol for Computer-Assisted ElectroCardioGraphy (SCP-ECG) Standard for short-term diagnostic ECGs (EN1064:2005+A1:2007) has recently been approved as part of the x73 family (ISO 11073-91064:2009). These factors suggest the coordinated use of these two standards in foreseeable telecardiology environments, and hence the need to harmonize them. Such harmonization is the subject of this paper. Thus, a mapping of the mandatory attributes defined in the second draft of the ISO/IEEE 11073-10406-d02 and the minimum SCP-ECG fields is presented, and various other capabilities of the SCP-ECG Standard (such as the messaging part) are also analyzed from an x73-PHD point of view. As a result, this paper addresses and analyzes the implications of some inconsistencies in the coordinated use of these two standards. Finally, a proof-of-concept implementation of the draft x73-PHD ECG device specialization is presented, along with the conversion from x73-PHD to SCP-ECG. This paper, therefore, provides recommendations for future implementations of telecardiology systems that are compliant with both x73-PHD and SCP-ECG.
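The harmonization work described here is, at its core, a field-by-field mapping exercise. The sketch below shows the shape such a mapping table might take in code; the attribute names and SCP-ECG section/tag assignments are illustrative placeholders, not the normative mapping defined in the paper or in either standard.

```python
# Illustrative-only mapping from x73-PHD-style attributes to SCP-ECG
# (section, tag) slots; all names below are placeholders.
X73_TO_SCP = {
    "patient-id":       ("section1", "tag_patient_id"),
    "date-of-birth":    ("section1", "tag_date_of_birth"),
    "acquisition-time": ("section1", "tag_date_time_of_acquisition"),
    "sample-rate-hz":   ("section6", "tag_sampling_rate"),
}

def to_scp(x73_attrs):
    """Re-key an x73-style attribute dict into SCP-ECG (section, tag) slots."""
    scp = {}
    for name, value in x73_attrs.items():
        if name not in X73_TO_SCP:
            raise KeyError(f"no SCP-ECG destination defined for {name!r}")
        scp[X73_TO_SCP[name]] = value
    return scp

print(to_scp({"patient-id": "P-17", "sample-rate-hz": 500}))
```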
e-Infrastructures for Astronomy: An Integrated View
NASA Astrophysics Data System (ADS)
Pasian, F.; Longo, G.
2010-12-01
As for other disciplines, the capability of performing “Big Science” in astrophysics requires the availability of large facilities. In the field of ICT, computational resources (e.g. HPC) are important, but are far from being enough for the community: as a matter of fact, the whole set of e-infrastructures (network, computing nodes, data repositories, applications) need to work in an interoperable way. This implies the development of common (or at least compatible) user interfaces to computing resources, transparent access to observations and numerical simulations through the Virtual Observatory, integrated data processing pipelines, data mining and semantic web applications. Achieving this interoperability goal is a must to build a real “Knowledge Infrastructure” in the astrophysical domain. Also, the emergence of new professional profiles (e.g. the “astro-informatician”) is necessary to allow defining and implementing properly this conceptual schema.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chao, Tian-Jy; Kim, Younghun
An end-to-end interoperability and workflows from building architecture design to one or more simulations, in one aspect, may comprise establishing a BIM enablement platform architecture. A data model defines data entities and entity relationships for enabling the interoperability and workflows. A data definition language may be implemented that defines and creates a table schema of a database associated with the data model. Data management services and/or application programming interfaces may be implemented for interacting with the data model. Web services may also be provided for interacting with the data model via the Web. A user interface may be implemented that communicates with users and uses the BIM enablement platform architecture, the data model, the data definition language, data management services and application programming interfaces to provide functions to the users to perform work related to building information management.
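The "data definition language that defines and creates a table schema" can be pictured with a small relational sketch: entities and entity relationships from the data model become tables. The schema below is a hypothetical reduction for illustration, not the patent's actual model.

```python
import sqlite3

# Hypothetical, heavily reduced schema: data entities plus the entity
# relationships the data model calls for, rendered as two tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE entity (
    entity_id   INTEGER PRIMARY KEY,
    entity_type TEXT NOT NULL,      -- e.g. 'wall', 'floor', 'simulation_run'
    name        TEXT
);
CREATE TABLE entity_relationship (
    source_id   INTEGER REFERENCES entity(entity_id),
    target_id   INTEGER REFERENCES entity(entity_id),
    relation    TEXT NOT NULL       -- e.g. 'contains', 'feeds_simulation'
);
""")
conn.execute("INSERT INTO entity VALUES (1, 'wall', 'north facade')")
conn.execute("INSERT INTO entity VALUES (2, 'simulation_run', 'energy model v1')")
conn.execute("INSERT INTO entity_relationship VALUES (1, 2, 'feeds_simulation')")
for row in conn.execute("SELECT * FROM entity_relationship"):
    print(row)
```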
Data interoperability software solution for emergency reaction in the Europe Union
NASA Astrophysics Data System (ADS)
Casado, R.; Rubiera, E.; Sacristan, M.; Schütte, F.; Peters, R.
2015-07-01
Emergency management becomes more challenging in international crisis episodes because of cultural, semantic and linguistic differences between all stakeholders, especially first responders. Misunderstandings between first responders make decision making slower and more difficult. However, the spread and development of networks and IT-based emergency management systems (EMSs) have improved emergency responses, which have become more coordinated. Despite improvements made in recent years, EMSs have still not solved the problems related to cultural, semantic and linguistic differences which are the real cause of slower decision making. In addition, from a technical perspective, the consolidation of current EMSs and the different formats used to exchange information pose another problem to be solved by any solution proposed for information interoperability between heterogeneous EMSs in different contexts. To overcome these problems, we present a software solution based on semantic and mediation technologies. EMERGency ELements (EMERGEL) (Fundacion CTIC and AntwortING Ingenieurbüro PartG, 2013), a common and modular ontology shared by all the stakeholders, has been defined. It offers the best solution to gather all stakeholders' knowledge in a unique and flexible data model, taking into account different countries' cultural and linguistic issues. To deal with the diversity of data protocols and formats, we have designed a service-oriented architecture for data interoperability (named DISASTER: Data Interoperability Solution At STakeholders Emergency Reaction) providing a flexible extensible solution to solve the mediation issues. Web services, the technology with the most significant academic and industrial visibility and attraction, have been adopted to implement this paradigm. Contributions of this work have been validated through the design and development of a realistic cross-border prototype scenario, actively involving both emergency managers and emergency-first responders: the Netherlands-Germany border fire.
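The mediation idea (heterogeneous EMS formats in, one shared model out) can be sketched in a few lines: each source format gets an adapter into a common vocabulary, so stakeholders exchange only the common form. The field names and the "EMERGEL-like" common model below are invented for illustration and are not taken from the actual ontology.

```python
# Invented source messages from two hypothetical national EMSs.
msg_nl = {"type": "brand", "locatie": "Grensweg 12", "eenheden": 3}
msg_de = {"einsatzart": "Feuer", "ort": "Grenzweg 12", "fahrzeuge": 3}

# Adapters into an invented EMERGEL-like common model; in a real system the
# target terms would come from the shared ontology, not from these functions.
def from_nl(m):
    # 'brand' (Dutch) is normalized to the common term 'fire'.
    return {"incident": "fire", "location": m["locatie"], "units": m["eenheden"]}

def from_de(m):
    # 'Feuer' (German) is normalized to the same common term.
    return {"incident": "fire", "location": m["ort"], "units": m["fahrzeuge"]}

for common in (from_nl(msg_nl), from_de(msg_de)):
    print(common)  # every stakeholder consumes this single shared shape
```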
Connecting the clinical IT infrastructure to a service-oriented architecture of medical devices.
Andersen, Björn; Kasparick, Martin; Ulrich, Hannes; Franke, Stefan; Schlamelcher, Jan; Rockstroh, Max; Ingenerf, Josef
2018-02-23
The new medical device communication protocol known as IEEE 11073 SDC is well-suited for the integration of (surgical) point-of-care devices, as are the established Health Level Seven (HL7) V2 and Digital Imaging and Communications in Medicine (DICOM) standards for the communication of systems in the clinical IT infrastructure (CITI). An integrated operating room (OR) and other integrated clinical environments, however, need interoperability between both domains to fully unfold their potential for improving the quality of care as well as clinical workflows. This work thus presents concepts for the propagation of clinical and administrative data to medical devices, physiologic measurements and device parameters to clinical IT systems, as well as image and multimedia content in both directions. Prototypical implementations of the derived components have proven to integrate well with systems of networked medical devices and with the CITI, effectively connecting these heterogeneous domains. Our qualitative evaluation indicates that the interoperability concepts are suitable to be integrated into clinical workflows and are expected to benefit patients and clinicians alike. The upcoming HL7 Fast Healthcare Interoperability Resources (FHIR) communication standard will likely change the domain of clinical IT significantly. A straightforward mapping to its resource model thus ensures the tenability of these concepts despite a foreseeable change in demand and requirements.
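For the forward-looking FHIR remark: a device measurement propagated into the CITI could, under FHIR, travel as an Observation resource. The JSON below is a minimal, hand-reduced sketch of such a resource, not a validated instance; the LOINC code shown (8867-4, heart rate) is real, but the device reference and values are invented.

```python
import json

# Minimal FHIR-style Observation for a device-sourced heart rate (sketch only).
observation = {
    "resourceType": "Observation",
    "status": "final",
    "code": {
        "coding": [{"system": "http://loinc.org",
                    "code": "8867-4",
                    "display": "Heart rate"}]
    },
    "device": {"reference": "Device/or-monitor-01"},  # invented device id
    "valueQuantity": {"value": 72,
                      "unit": "beats/minute",
                      "system": "http://unitsofmeasure.org",
                      "code": "/min"}
}
print(json.dumps(observation, indent=2))
```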
Interoperability at ESA Heliophysics Science Archives: IVOA, HAPI and other implementations
NASA Astrophysics Data System (ADS)
Martinez-Garcia, B.; Cook, J. P.; Perez, H.; Fernandez, M.; De Teodoro, P.; Osuna, P.; Arnaud, M.; Arviset, C.
2017-12-01
The data of ESA heliophysics science missions are preserved at the ESAC Science Data Centre (ESDC). The ESDC aims for the long-term preservation of those data, covering missions such as Ulysses, Soho, Proba-2, Cluster, Double Star and, in the future, Solar Orbiter. Scientists have access to these data through web services, command line and graphical user interfaces for each of the corresponding science mission archives. The International Virtual Observatory Alliance (IVOA) provides technical standards that allow interoperability among different systems that implement them. By adopting some IVOA standards, the ESA heliophysics archives are able to share their data with those tools and services that are VO-compatible. Implementations of those standards can be found in the existing archives: the Ulysses Final Archive (UFA) and the Soho Science Archive (SSA). They already make use of the VOTable format definition and the Simple Application Messaging Protocol (SAMP). For re-engineered or new archives, the implementation of services through the Table Access Protocol (TAP) or the Universal Worker Service (UWS) will leverage this interoperability. This will be the case for the Proba-2 Science Archive (P2SA) and the Solar Orbiter Archive (SOAR). We present here the IVOA standards already used by the ESA heliophysics archives and the on-going work.
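TAP, one of the IVOA standards named above, defines a synchronous query endpoint that any VO-compatible client can call over plain HTTP. The sketch below issues an ADQL query that way; the archive URL and table name are placeholders, while the request parameters are those defined by the TAP standard.

```python
import urllib.parse
import urllib.request

TAP_SYNC = "https://archives.example.esa.int/tap/sync"  # placeholder endpoint

# Parameters defined by the IVOA TAP standard; the table name is a placeholder.
params = urllib.parse.urlencode({
    "REQUEST": "doQuery",
    "LANG": "ADQL",
    "FORMAT": "votable",
    "QUERY": "SELECT TOP 5 * FROM proba2.observations",
})
with urllib.request.urlopen(TAP_SYNC + "?" + params) as resp:
    print(resp.read()[:200])  # beginning of the VOTable response
```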
Designing for Change: Interoperability in a scaling and adapting environment
NASA Astrophysics Data System (ADS)
Yarmey, L.
2015-12-01
The Earth Science cyberinfrastructure landscape is constantly changing. Technologies advance and technical implementations are refined or replaced. Data types, volumes, packaging, and use cases evolve. Scientific requirements emerge and mature. Standards shift while systems scale and adapt. In this complex and dynamic environment, interoperability remains a critical component of successful cyberinfrastructure. Through the resource- and priority-driven iterations on systems, interfaces, and content, questions fundamental to stable and useful Earth Science cyberinfrastructure arise. For instance, how are sociotechnical changes planned, tracked, and communicated? How should operational stability balance against 'new and shiny'? How can ongoing maintenance and mitigation of technical debt be managed in an often short-term resource environment? The Arctic Data Explorer is a metadata brokering application developed to enable discovery of international, interdisciplinary Arctic data across distributed repositories. Completely dependent on interoperable third party systems, the Arctic Data Explorer publicly launched in 2013 with an original 3000+ data records from four Arctic repositories. Since then the search has scaled to 25,000+ data records from thirteen repositories at the time of writing. In the final months of original project funding, priorities shift to lean operations with a strategic eye on the future. Here we present lessons learned from four years of Arctic Data Explorer design, development, communication, and maintenance work along with remaining questions and potential directions.
A Simple XML Producer-Consumer Protocol
NASA Technical Reports Server (NTRS)
Smith, Warren; Gunter, Dan; Quesnel, Darcy; Biegel, Bryan (Technical Monitor)
2001-01-01
There are many different projects from government, academia, and industry that provide services for delivering events in distributed environments. The problem with these event services is that they are not general enough to support all uses and they speak different protocols so that they cannot interoperate. We require such interoperability when we, for example, wish to analyze the performance of an application in a distributed environment. Such an analysis might require performance information from the application, computer systems, networks, and scientific instruments. In this work we propose and evaluate a standard XML-based protocol for the transmission of events in distributed systems. One recent trend in government and academic research is the development and deployment of computational grids. Computational grids are large-scale distributed systems that typically consist of high-performance compute, storage, and networking resources. Examples of such computational grids are the DOE Science Grid, the NASA Information Power Grid (IPG), and the NSF Partnerships for Advanced Computational Infrastructure (PACIs). The major effort to deploy these grids is in the area of developing the software services to allow users to execute applications on these large and diverse sets of resources. These services include security, execution of remote applications, managing remote data, access to information about resources and services, and so on. There are several toolkits for providing these services such as Globus, Legion, and Condor. As part of these efforts to develop computational grids, the Global Grid Forum is working to standardize the protocols and APIs used by various grid services. This standardization will allow interoperability between the client and server software of the toolkits that are providing the grid services. The goal of the Performance Working Group of the Grid Forum is to standardize protocols and representations related to the storage and distribution of performance data. These standard protocols and representations must support tasks such as profiling parallel applications, monitoring the status of computers and networks, and monitoring the performance of services provided by a computational grid. This paper describes a proposed protocol and data representation for the exchange of events in a distributed system. The protocol exchanges messages formatted in XML and it can be layered atop any low-level communication protocol such as TCP or UDP. Further, we describe Java and C++ implementations of this protocol and discuss their performance. The next section will provide some further background information. Section 3 describes the main communication patterns of our protocol. Section 4 describes how we represent events and related information using XML. Section 5 describes our protocol and Section 6 discusses the performance of two implementations of the protocol. Finally, an appendix provides the XML Schema definition of our protocol and event information.
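A hedged sketch of the paper's central idea: an XML-encoded event message layered over a transport such as TCP. The element names, attributes, port, and length-prefix framing below are invented stand-ins, not the schema from the paper's appendix; a consumer is assumed to be listening on localhost:9000.

```python
import socket
import xml.etree.ElementTree as ET

# Invented event element; the real schema is defined in the paper's appendix.
event = ET.Element("event", name="cpu.load", timestamp="2001-05-01T12:00:00Z")
ET.SubElement(event, "value").text = "0.87"
payload = ET.tostring(event)

# Producer pushes the XML event over a plain TCP connection; a consumer is
# assumed to be listening on localhost:9000 for this sketch to run.
with socket.create_connection(("localhost", 9000)) as sock:
    sock.sendall(len(payload).to_bytes(4, "big"))  # simple length-prefix framing
    sock.sendall(payload)
```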
Connected Lighting System Interoperability Study Part 1: Application Programming Interfaces
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gaidon, Clement; Poplawski, Michael
First in a series of studies that focuses on interoperability as realized by the use of Application Programming Interfaces (APIs), explores the diversity of such interfaces in several connected lighting systems; characterizes the extent of interoperability that they provide; and illustrates challenges, limitations, and tradeoffs that were encountered during this exploration.
M&S supporting unmanned autonomous systems (UAxS) concept development and experimentation
NASA Astrophysics Data System (ADS)
Biagini, Marco; Scaccianoce, Alfio; Corona, Fabio; Forconi, Sonia; Byrum, Frank; Fowler, Olivia; Sidoran, James L.
2017-05-01
The development of the next generation of multi-domain unmanned semi and fully autonomous C4ISR systems involves a multitude of security concerns and interoperability challenges. Conceptual solutions to capability shortfalls and gaps can be identified through Concept Development and Experimentation (CD&E) cycles. Modelling and Simulation (M&S) is a key tool in supporting unmanned autonomous systems (UAxS) CD&E activities and addressing associated security challenges. This paper serves to illustrate the application of M&S to UAxS development and highlight initiatives made by the North Atlantic Treaty Organization (NATO) M&S Centre of Excellence (CoE) to facilitate interoperability. The NATO M&S CoE collaborates with other NATO and national bodies in order to develop UAxS projects such as the Allied Command Transformation Counter Unmanned Autonomous Systems (CUAxS) project or the work of the Science and Technology Organization (STO) panels. Some initiatives, such as the Simulated Interactive Robotics Initiative (SIRI), formed the baseline for further developments and for studying emerging technologies in the M&S and robotics fields. Artificial Intelligence algorithm modelling, Robot Operating Systems (ROS), network operations, cyber security, interoperable languages and related data models are some of the main aspects considered in this paper. In particular, the implementation of interoperable languages like C-BML and NIEM MilOps is discussed in relation to a Command and Control - Simulation Interoperability (C2SIM) paradigm. All these technologies are used to build a conceptual architecture to support UAxS CD&E. In addition, other projects that the NATO M&S CoE is involved in, such as the NATO Urbanization Project, could provide credible future operational environments and benefit UAxS project development through the dual application of UAxS technology in large urbanized areas. In conclusion, this paper contains a detailed overview regarding how applying Modelling and Simulation to support CD&E activities is a valid approach to develop and validate future capability requirements in general and the next generation of UAxS in particular.
Gay, Valerie; Leijdekkers, Peter
2015-11-18
A transformation is underway regarding how we deal with our health. Mobile devices make it possible to have continuous access to personal health information. Wearable devices, such as Fitbit and Apple's smartwatch, can collect data continuously and provide insights into our health and fitness. However, lack of interoperability and the presence of data silos prevent users and health professionals from getting an integrated view of health and fitness data. To provide better health outcomes, a complete picture is needed which combines informal health and fitness data collected by the user together with official health records collected by health professionals. Mobile apps are well positioned to play an important role in the aggregation since they can tap into these official and informal health and data silos. The objective of this paper is to demonstrate that a mobile app can be used to aggregate health and fitness data and can enable interoperability. It discusses various technical interoperability challenges encountered while integrating data into one place. For 8 years, we have worked with third-party partners, including wearable device manufacturers, electronic health record providers, and app developers, to connect an Android app to their (wearable) devices, back-end servers, and systems. The result of this research is a health and fitness app called myFitnessCompanion, which enables users to aggregate their data in one place. Over 6000 users use the app worldwide to aggregate their health and fitness data. It demonstrates that mobile apps can be used to enable interoperability. Challenges encountered in the research process included the different wireless protocols and standards used to communicate with wireless devices, the diversity of security and authorization protocols used to be able to exchange data with servers, and lack of standards usage, such as Health Level Seven, for medical information exchange. By limiting the negative effects of health data silos, mobile apps can offer a better holistic view of health and fitness data. Data can then be analyzed to offer better and more personalized advice and care.
Barbarito, Fulvio; Pinciroli, Francesco; Mason, John; Marceglia, Sara; Mazzola, Luca; Bonacina, Stefano
2012-08-01
Information technologies (ITs) have now entered the everyday workflow in a variety of healthcare providers with a certain degree of independence. This independence may be the cause of difficulty in interoperability between information systems and it can be overcome through the implementation and adoption of standards. Here we present the case of the Lombardy Region, in Italy, that has been able, in the last 10 years, to set up the Regional Social and Healthcare Information System, connecting all the healthcare providers within the region, and providing full access to clinical and health-related documents independently from the healthcare organization that generated the document itself. This goal, in a region with almost 10 million citizens, was achieved through a twofold approach: first, the political and operative push towards the adoption of the Health Level 7 (HL7) standard within single hospitals and, second, the provision of a technological infrastructure for data sharing based on interoperability specifications recognized at the regional level for messages transmitted from healthcare providers to the central domain. The adoption of such regional interoperability specifications enabled communication among heterogeneous systems placed in different hospitals in Lombardy. Integrating the Healthcare Enterprise (IHE) integration profiles, which refer to HL7 standards, are adopted within hospitals for message exchange and for the definition of integration scenarios. The IHE patient administration management (PAM) profile with its different workflows is adopted for patient management, whereas the Scheduled Workflow (SWF), the Laboratory Testing Workflow (LTW), and the Ambulatory Testing Workflow (ATW) are adopted for order management. At present, the system manages 4,700,000 pharmacological e-prescriptions and 1,700,000 e-prescriptions for laboratory exams per month. It produces, monthly, 490,000 laboratory medical reports, 180,000 radiology medical reports, 180,000 first aid medical reports, and 58,000 discharge summaries. Hence, although work is still in progress, the Lombardy Region healthcare system is a fully interoperable social healthcare system connecting patients, healthcare providers, healthcare organizations, and healthcare professionals in a large and heterogeneous territory through the implementation of international health standards. Copyright © 2012 Elsevier Inc. All rights reserved.
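A flavor of the HL7 v2 messaging underlying such regional exchange: pipe-delimited segments, parsed here with nothing but the standard library. The message content is invented; the segment and field positions follow the usual HL7 v2 conventions (MSH-9 message type, PID-5 patient name).

```python
# Invented HL7 v2 admission message; \r separates segments per the standard.
raw = ("MSH|^~\\&|HIS|OSPEDALE1|REGIONE|LOMBARDIA|20120801120000||ADT^A01|42|P|2.5\r"
       "PID|1||123456^^^OSPEDALE1||ROSSI^MARIO||19700101|M\r")

segments = [seg.split("|") for seg in raw.strip("\r").split("\r")]
fields = {seg[0]: seg for seg in segments}

msh, pid = fields["MSH"], fields["PID"]
print("message type:", msh[8])   # MSH-9 (index 8: MSH-1 is the separator itself)
print("patient name:", pid[5])   # PID-5, family^given
```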
Semantic Integration for Marine Science Interoperability Using Web Technologies
NASA Astrophysics Data System (ADS)
Rueda, C.; Bermudez, L.; Graybeal, J.; Isenor, A. W.
2008-12-01
The Marine Metadata Interoperability Project, MMI (http://marinemetadata.org) promotes the exchange, integration, and use of marine data through enhanced data publishing, discovery, documentation, and accessibility. A key effort is the definition of an Architectural Framework and Operational Concept for Semantic Interoperability (http://marinemetadata.org/sfc), which is complemented with the development of tools that realize critical use cases in semantic interoperability. In this presentation, we describe a set of such Semantic Web tools that support important interoperability tasks, ranging from the creation of controlled vocabularies and the mapping of terms across multiple ontologies, to the online registration, storage, and search services needed to work with the ontologies (http://mmisw.org). This set of services uses Web standards and technologies, including the Resource Description Framework (RDF), the Web Ontology Language (OWL), Web services, and toolkits for Rich Internet Application development. We will describe the following components: MMI Ontology Registry: The MMI Ontology Registry and Repository provides registry and storage services for ontologies. Entries in the registry are associated with projects defined by the registered users. Sophisticated search functions are also provided, for example over metadata items and vocabulary terms. Client applications can submit search requests using the W3C SPARQL Query Language for RDF. Voc2RDF: This component converts an ASCII comma-delimited set of terms and definitions into an RDF file. Voc2RDF facilitates the creation of controlled vocabularies by using a simple form-based user interface. Created vocabularies and their descriptive metadata can be submitted to the MMI Ontology Registry for versioning and community access. VINE: The Vocabulary Integration Environment component allows the user to map vocabulary terms across multiple ontologies. Various relationships can be established, for example exactMatch, narrowerThan, and subClassOf. VINE can compute inferred mappings based on the given associations. Attributes about each mapping, like comments and a confidence level, can also be included. VINE also supports registering and storing the resulting mapping files in the Ontology Registry. The presentation will describe the application of semantic technologies in general, and our planned applications in particular, to solve data management problems in the marine and environmental sciences.
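As an illustration of the Voc2RDF idea, the sketch below turns a two-column, comma-delimited vocabulary into SKOS concepts with rdflib (assumed installed); the vocabulary content and namespace URI are invented, and this is not the MMI tool itself.

```python
import csv
import io

from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, SKOS

# Invented two-column vocabulary: term, definition (no commas inside fields).
vocab_csv = ("sea_surface_temperature,Temperature of the ocean surface layer\n"
             "salinity,Dissolved salt content of seawater\n")

VOCAB = Namespace("http://example.org/vocab#")   # hypothetical namespace URI

g = Graph()
for term, definition in csv.reader(io.StringIO(vocab_csv)):
    concept = VOCAB[term]
    g.add((concept, RDF.type, SKOS.Concept))
    g.add((concept, SKOS.prefLabel, Literal(term)))
    g.add((concept, SKOS.definition, Literal(definition)))

print(g.serialize(format="turtle"))
```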
NASA Technical Reports Server (NTRS)
Stephens, J. Briscoe; Grider, Gary W.
1992-01-01
These Earth Science and Applications Division-Data and Information System (ESAD-DIS) interoperability requirements are designed to quantify the Earth Science and Applications Division's hardware and software requirements in terms of communications between personal computers, visualization workstations, and mainframe computers. The electronic mail requirements and local area network (LAN) requirements are addressed. These interoperability requirements are top-level requirements framed around defining the existing ESAD-DIS interoperability and projecting known near-term requirements for both operational support and management planning. Detailed requirements will be submitted on a case-by-case basis. This document is also intended as an overview of ESAD-DIS interoperability for newcomers and management not familiar with these activities. It is intended as background documentation to support requests for resources and support requirements.
An open repositories network development for medical teaching resources.
Soula, Gérard; Darmoni, Stefan; Le Beux, Pierre; Renard, Jean-Marie; Dahamna, Badisse; Fieschi, Marius
2010-01-01
The lack of interoperability between repositories of heterogeneous and geographically widespread data is an obstacle to the diffusion, sharing and reuse of those data. We present the development of an open repositories network that takes into account both the syntactic and semantic interoperability of the different repositories and is based on international standards in this field. The network is used by the medical community in France for the diffusion and sharing of digital teaching resources. The syntactic interoperability of the repositories is managed using the OAI-PMH protocol for the exchange of metadata describing the resources. Semantic interoperability is based, on the one hand, on the LOM standard for the description of resources and on MeSH for their indexing and, on the other hand, on semantic interoperability management designed to optimize compliance with standards and the quality of the metadata.
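Because OAI-PMH is a small HTTP-plus-XML protocol, a bare-bones harvester fits in a few lines. The sketch below assumes the requests library and a hypothetical endpoint URL; the verb, parameters, and XML namespaces are the standard OAI-PMH/oai_dc ones.

```python
import xml.etree.ElementTree as ET

import requests

BASE_URL = "http://repository.example.org/oai"   # hypothetical data provider

resp = requests.get(BASE_URL, timeout=30,
                    params={"verb": "ListRecords", "metadataPrefix": "oai_dc"})
root = ET.fromstring(resp.content)

NS = {"oai": "http://www.openarchives.org/OAI/2.0/",
      "dc": "http://purl.org/dc/elements/1.1/"}
for record in root.iterfind(".//oai:record", NS):
    title = record.find(".//dc:title", NS)
    print(title.text if title is not None else "(no title)")
```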
Beštek, Mate; Stanimirović, Dalibor
2017-08-09
The main aims of the paper are to characterize and examine potential approaches to interoperability. This includes openEHR, SNOMED, IHE, and Continua as combined interoperability approaches, possibilities for their incorporation into the eHealth environment, and identification of the main success factors in the field, which are necessary for achieving the required interoperability and, consequently, for the successful implementation of eHealth projects in general. The paper presents an in-depth analysis of the potential application of the openEHR, SNOMED, IHE and Continua approaches in the development and implementation of eHealth in Slovenia. The research method used is both exploratory and deductive in nature. The methodological framework is grounded on information retrieval, with a special focus on research and charting of existing experience in the field, and on sources, both electronic and written, which include interoperability concepts and related implementation issues. The paper tries to answer the following complementary questions: 1. Which approaches could alleviate the pertinent interoperability issues in the Slovenian eHealth context? 2. What are the possibilities (requirements) for their inclusion in the construction process for individual eHealth solutions? 3. Which main success factors in the interoperability field critically influence the efficient development and implementation of eHealth projects? The insights provided and the success factors identified could serve as strategic starting points for the continuous integration of interoperability principles into the healthcare domain. Moreover, the general implementation of the identified success factors could facilitate better penetration of ICT into the healthcare environment and enable an eHealth-based transformation of the health system, especially in countries which are still in an early phase of eHealth planning and development and are often confronted with differing interests, requirements, and contending strategies.
A Pragmatic Approach to Sustainable Interoperability for the Web 2.0 World
NASA Astrophysics Data System (ADS)
Wright, D. J.; Sankaran, S.
2015-12-01
In the geosciences, interoperability is a fundamental requirement. Members of various standards organizations such as the OGC and ISO-TC 211 have done yeoman service to promote a standards-centric approach to managing the interoperability challenges that organizations face today. Organizations face many specific challenges when adopting interoperability patterns. One approach, that of mandating the use of specific standards, has been reasonably successful. But scientific communities, as with all others, ultimately want their solutions to be widely accepted and used. To this end there is a crying need to explore all possible interoperability patterns without restricting the choices to mandated standards. Standards are created by a slow and deliberative process that sometimes takes a long time to come to fruition, and the results can therefore fall short of user expectations. Organizations are thus left with a series of seemingly orthogonal requirements when they want to pursue interoperability: they want a robust but agile solution, a mature approach that also satisfies the latest technology trends, and so on. Sustainable interoperability patterns need to be forward looking and should choose the patterns and paradigms of the Web 2.0 generation. To this end, the key is to choose platform technologies that embrace multiple interoperability mechanisms, that are built on fundamental "open" principles, and that align with popular mainstream patterns. We seek to explore data-, metadata- and web service-related interoperability patterns through the prism of building solutions that encourage strong implementer and end-user engagement, improved usability and scalability, and appealing developer frameworks that can grow the audience. The path to tread is not new, and the geocommunity only needs to observe and align its end goals with current Web 2.0 patterns to realize all the benefits that today we take for granted as part of our everyday use of technology.
Groundwater data network interoperability
Brodaric, Boyan; Booth, Nathaniel; Boisvert, Eric; Lucido, Jessica M.
2016-01-01
Water data networks are increasingly being integrated to answer complex scientific questions that often span large geographical areas and cross political borders. Data heterogeneity is a major obstacle that impedes interoperability within and between such networks. It is resolved here for groundwater data at five levels of interoperability, within a Spatial Data Infrastructure architecture. The result is a pair of distinct national groundwater data networks for the United States and Canada, and a combined data network in which they are interoperable. This combined data network enables, for the first time, transparent public access to harmonized groundwater data from both sides of the shared international border.
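Networks of this kind typically expose harmonized observations through OGC web services. The sketch below shows what a client-side request might look like, assuming an OGC SOS 2.0 endpoint that returns WaterML 2.0; the endpoint URL and the offering/property identifiers are invented.

```python
import requests

# Hypothetical endpoint and identifiers; parameter names follow the
# OGC SOS 2.0 KVP binding, with WaterML 2.0 as the response format.
ENDPOINT = "https://groundwater.example.gov/sos"
params = {
    "service": "SOS",
    "version": "2.0.0",
    "request": "GetObservation",
    "offering": "gw_level_network",          # invented offering identifier
    "observedProperty": "GroundwaterLevel",  # invented property identifier
    "responseFormat": "http://www.opengis.net/waterml/2.0",
}
resp = requests.get(ENDPOINT, params=params, timeout=30)
print(resp.status_code, resp.headers.get("Content-Type"))
```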
Report on the Second Catalog Interoperability Workshop
NASA Technical Reports Server (NTRS)
Thieman, James R.; James, Mary E.
1988-01-01
The events, resolutions, and recommendations of the Second Catalog Interoperability Workshop, held at JPL in January 1988, are discussed. This workshop dealt with the issues of standardization and communication among directories, catalogs, and inventories in the earth and space science data management environment. The Directory Interchange Format, being constructed as a standard for the exchange of directory information among participating data systems, is discussed. Involvement in the interoperability effort by NASA, NOAA, USGS, and NSF is described, and plans for future interoperability are considered. The NASA Master Directory prototype is presented and critiqued, and options for additional capabilities are debated.
A logical approach to semantic interoperability in healthcare.
Bird, Linda; Brooks, Colleen; Cheong, Yu Chye; Tun, Nwe Ni
2011-01-01
Singapore is in the process of rolling out a number of national e-health initiatives, including the National Electronic Health Record (NEHR). A critical enabler in the journey towards semantic interoperability is a Logical Information Model (LIM) that harmonises the semantics of the information structure with the terminology. The Singapore LIM uses a combination of international standards, including ISO 13606-1 (a reference model for electronic health record communication), ISO 21090 (healthcare datatypes), and SNOMED CT (healthcare terminology). The LIM is accompanied by a logical design approach, used to generate interoperability artifacts, and incorporates mechanisms for achieving unidirectional and bidirectional semantic interoperability.
Analysis of ISO/IEEE 11073 built-in security and its potential IHE-based extensibility.
Rubio, Óscar J; Trigo, Jesús D; Alesanco, Álvaro; Serrano, Luis; García, José
2016-04-01
The ISO/IEEE 11073 standard for Personal Health Devices (X73PHD) aims to ensure interoperability between Personal Health Devices and aggregators (e.g. health appliances, routers) in ambulatory setups. The Integrating the Healthcare Enterprise (IHE) initiative promotes the coordinated use of different standards in healthcare systems (e.g. Personal/Electronic Health Records, alert managers, Clinical Decision Support Systems) by defining profiles intended for medical use cases. X73PHD provides a robust syntactic model and a comprehensive terminology, but it places limited emphasis on security and on interoperability with IHE-compliant systems and frameworks. However, the implementation of eHealth/mHealth applications in environments such as health and fitness monitoring, independent living and disease management (i.e. the X73PHD domains) increasingly requires features such as secure connections to mobile aggregators (e.g. smartphones, tablets), the sharing of devices among different users with privacy, and interoperability with certain IHE-compliant healthcare systems. This work proposes a comprehensive IHE-based X73PHD extension consisting of additive layers adapted to different eHealth/mHealth applications, after having analyzed the features of X73PHD (especially its built-in security), the IHE profiles related to these applications, and other research works. Both the new features proposed for each layer and the procedures to support them have been carefully chosen to minimize the impact on X73PHD, on its architecture (in terms of delays and overhead) and on its framework. Such implications are thoroughly analyzed in this paper. As a result, an extended model of X73PHD is proposed, preserving its essential features while extending them with added value. Copyright © 2016 Elsevier Inc. All rights reserved.
Development of Extended Content Standards for Biodiversity Data
NASA Astrophysics Data System (ADS)
Hugo, Wim; Schmidt, Jochen; Saarenmaa, Hannu
2013-04-01
Interoperability in the field of Biodiversity observation has been strongly driven by the development of a number of global initiatives (GEO, GBIF, OGC, TDWG, GenBank, …) and their supporting standards (OGC-WxS, OGC-SOS, Darwin Core (DwC), NetCDF, …). To a large extent, these initiatives have focused on discoverability and on standardization of syntactic and schematic interoperability. Semantic interoperability is more complex, requiring the development of domain-dependent conceptual data models and the extension of these models with appropriate ontologies (typically manifested as controlled vocabularies). Biodiversity content has been partly standardized, for example through Darwin Core for occurrence data and associated taxonomy, and through GenBank for genetic data, but other contexts of biodiversity observation have lagged behind - making it difficult to achieve semantic interoperability between distributed data sources. With this in mind, WG8 of GEO BON (charged with data and systems interoperability) has started a work programme to address a number of concerns, one of which is the gap in content standards required to make Biodiversity data truly interoperable. The paper reports on the framework developed by WG8 for the classification of Biodiversity observation data into 'families' of use cases and its supporting data schema, on where gaps, if any, in the availability of content standards have been identified, and on how these are to be addressed by way of an abstract data model and the development of associated content standards. It is proposed that a minimum set of standards (1) will be required to address the scope of Biodiversity content, aligned with levels and dimensions of observation, and based on the 'Essential Biodiversity Variables' (2) being developed by GEO BON. The content standards are envisaged as loosely separated from the syntactic and schematic standards used for the base data exchange: typically, services would offer an existing data standard (DwC, WFS, SOS, NetCDF), with a use-case dependent 'payload' embedded into the data stream. This enables the re-use of the abstract schema, and sometimes the implementation specification (for example XML, JSON, or NetCDF conventions), across services. An explicit aim will be to make the XML implementation specification re-usable as a DwC and a GML (SOS and WFS) extension. (1) Olga Lyashevska, Keith D. Farnsworth, How many dimensions of biodiversity do we need?, Ecological Indicators, Volume 18, July 2012, Pages 485-492, ISSN 1470-160X, 10.1016/j.ecolind.2011.12.016. (2) GEO BON: Workshop on Essential Biodiversity Variables (27-29 February 2012, Frascati, Italy). (http://www.earthobservations.org/geobon_docs_20120227.shtml)
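To make the payload idea concrete, here is what a minimal occurrence record looks like when expressed with Darwin Core terms. The column names are real DwC terms; the record content itself is invented (the species happens to be the American pika, Ochotona princeps).

```python
import csv
import sys

# Column names are standard Darwin Core terms; the record itself is invented.
FIELDS = ["occurrenceID", "basisOfRecord", "scientificName",
          "eventDate", "decimalLatitude", "decimalLongitude"]
record = {
    "occurrenceID": "urn:example:occ:0001",
    "basisOfRecord": "HumanObservation",
    "scientificName": "Ochotona princeps",
    "eventDate": "2012-07-15",
    "decimalLatitude": "40.05",
    "decimalLongitude": "-116.59",
}

writer = csv.DictWriter(sys.stdout, fieldnames=FIELDS)
writer.writeheader()
writer.writerow(record)
```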
Brokerage services for Earth Science data: the EuroGEOSS legacy (Invited)
NASA Astrophysics Data System (ADS)
Nativi, S.; Craglia, M.; Pearlman, J.
2013-12-01
Global sustainability research requires an integrated multidisciplinary effort, underpinned by a collaborative environment for discovering and accessing heterogeneous data across disciplines. Traditionally, interoperability has been achieved by implementing federations of systems. The federating approach entails the adoption of a set of common technologies and standards. This presentation argues that for complex (and uncontrolled) environments (such as global, multidisciplinary, and voluntary-based infrastructures) federated solutions must be completed and enhanced by a brokering approach, making available a set of brokerage services. In fact, brokerage services allow a cyber-infrastructure to lower entry barriers (for both data producers and users) and to better address the different domain specificities. The brokering interoperability approach was successfully demonstrated by the EuroGEOSS project, funded by the European Commission in the FP7 framework (see http://www.eurogeoss.eu). The EuroGEOSS Brokering framework provided the EuroGEOSS Capacity with multidisciplinary interoperability functionalities. This platform was developed applying several of the principles/requirements that characterize the System of Systems (SoS) approach and the Internet of Services (IoS) philosophy. The framework consists of three main brokers (middleware components implementing intermediation and harmonization services): a basic Discovery Broker, an advanced Semantic Discovery Broker, and an Access Broker. They are empowered by a suite of tools developed by the ESSI-lab of the CNR-IIA, called GI-cat, GI-sem, and GI-axe. The EuroGEOSS brokering framework was considered and successfully adopted by cross-disciplinary initiatives (notably GEOSS: the Global Earth Observation System of Systems). The brokerage services have since been advanced and extended; the new brokering framework is called the GEO DAB (Discovery and Access Broker). New brokerage services have been developed in the framework of other European Commission funded projects (e.g. GeoViQua). More recently, the NSF EarthCube initiative decided to fund a project dealing with brokerage services. In the framework of the GEO AIP-6 (Architecture Implementation Pilot, Phase 6), the presented brokerage platform has been used by the Water Working Group to provide improved data access for parameterization and model development.
The European Network of Analytical and Experimental Laboratories for Geosciences
NASA Astrophysics Data System (ADS)
Freda, Carmela; Funiciello, Francesca; Meredith, Phil; Sagnotti, Leonardo; Scarlato, Piergiorgio; Troll, Valentin R.; Willingshofer, Ernst
2013-04-01
Integrating Earth Sciences infrastructures in Europe is the mission of the European Plate Observing System (EPOS). The integration of European analytical, experimental, and analogue laboratories plays a key role in this context and is the task of the EPOS Working Group 6 (WG6). Despite the presence in Europe of high performance infrastructures dedicated to geosciences, there is still limited collaboration in sharing facilities and best practices. The EPOS WG6 aims to overcome this limitation by pushing towards national and trans-national coordination, efficient use of current laboratory infrastructures, and future aggregation of facilities not yet included. This will be attained through the creation of common access and interoperability policies to foster and simplify personnel mobility. The EPOS ambition is to orchestrate European laboratory infrastructures with diverse, complementary tasks and competences into a single, but geographically distributed, infrastructure for rock physics, palaeomagnetism, analytical and experimental petrology and volcanology, and tectonic modeling. WG6 is presently organizing its thematic core services within the EPOS distributed research infrastructure with the goal of joining the other EPOS communities (geologists, seismologists, volcanologists, etc.) and stakeholders (engineers, risk managers and other geosciences investigators) to: 1) develop tools and services to enhance visitor programs that will mutually benefit visitors and hosts (transnational access); 2) improve support and training activities to make facilities equally accessible to students, young researchers, and experienced users (training and dissemination); 3) collaborate in sharing technological and scientific know-how (transfer of knowledge); 4) optimize interoperability of distributed instrumentation by standardizing data collection, archiving, and quality control standards (data preservation and interoperability); 5) implement a unified e-Infrastructure for data analysis, numerical modelling, and joint development and standardization of numerical tools (e-science implementation); 6) collect and store data in a flexible inventory database accessible within and beyond the Earth Sciences community (open access and outreach); 7) connect to environmental and hazard protection agencies, stakeholders, and the public to raise awareness of geo-hazards and geo-resources (innovation for society). We will inform scientists and industrial stakeholders of the most recent WG6 achievements in EPOS and show how our community is proceeding to design the thematic core services.
Linking User Identities Across the DataONE Federation of Data Repositories
NASA Astrophysics Data System (ADS)
Jones, M. B.; Mecum, B.; Leinfelder, B.; Jones, C. S.; Walker, L.
2016-12-01
DataONE provides services for identifying, authenticating, and authorizing researchers to access and contribute data to repositories within the DataONE federation. In the earth sciences, thousands of institutional and disciplinary repositories have created their own user identity and authentication systems, with their own user directories based on a database or web content management system. Thus, researchers have many identities that are neither linked nor interoperable, making it difficult to reference the identity of these users across systems. Key user information is hidden, and often only a non-disambiguated name is available. From a sample of 160,000 data sets within DataONE, a super-majority of references to the data creators lack even an email address. In an attempt to disambiguate these people via the GeoLink project, we conservatively estimate they represent at least 57,000 unique identities, but without a clear user identifier, there could be as many as 223,000. Interoperability among repositories is critical to improving the scope of scientific synthesis and capabilities for research collaboration. While many have focused on the convenience of Single Sign-On (SSO), we have found that sharing user identifiers is far more useful for interoperability. With an unambiguous user identity in incoming metadata, DataONE has built user profiles that present a user's data across repositories, that link users and their organizational affiliations, and that allow users to work collaboratively in private groups that span repository systems. DataONE's user identity solution leverages existing systems such as InCommon, CILogon, Google, and ORCID so as not to further proliferate user identities. DataONE provides a core service allowing users to link their multiple identities so that authenticating with one identity (e.g., ORCID) can authorize access to data protected via another identity (e.g., InCommon). Currently, DataONE is using ORCID identities to link and identify users, but challenges must still be overcome to support historical records for which ORCIDs cannot be used because the associated people are unavailable to confirm their identity. DataONE's identity systems facilitate crosslinking between user identities and scientific metadata to accelerate collaboration and synthesis.
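ORCID's public API, one of the identity sources mentioned above, can be queried without authentication. A minimal sketch follows, using the well-known example ORCID iD from ORCID's own documentation; the JSON path assumes the v3.0 record schema.

```python
import requests

# The public endpoint is real; 0000-0002-1825-0097 is the example iD used in
# ORCID's documentation. The JSON path below assumes the v3.0 record schema.
orcid_id = "0000-0002-1825-0097"
resp = requests.get(f"https://pub.orcid.org/v3.0/{orcid_id}/record",
                    headers={"Accept": "application/json"}, timeout=30)
resp.raise_for_status()
name = resp.json()["person"]["name"]
print(name["given-names"]["value"], name["family-name"]["value"])
```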
The ISO Edi Conceptual Model Activity and Its Relationship to OSI.
ERIC Educational Resources Information Center
Fincher, Judith A.
1990-01-01
The edi conceptual model is being developed to define common structures, services, and processes that syntax-specific standards like X12 and EDIFACT could adopt. Open Systems Interconnection (OSI) is of interest to edi because of its potential to help enable global interoperability across Electronic Data Interchange (EDI) functional groups. A…
Measures of Effectiveness for Rationalization, Standardization, and Interoperability
1988-09-01
Local, regional and national interoperability in hospital-level systems architecture.
Mykkänen, J; Korpela, M; Ripatti, S; Rannanheimo, J; Sorri, J
2007-01-01
Interoperability of applications in health care must address various needs of patients, health professionals, organizations and policy makers. A combination of existing and new applications is a necessity. Hospitals are in a position to drive many integration solutions, but need approaches which combine local, regional and national requirements and initiatives with open standards to support flexible processes and applications on the local hospital level. We discuss the systems architecture of hospitals in relation to various processes and applications, and highlight current challenges and prospects using a service-oriented architecture approach. We also illustrate these aspects with examples from Finnish hospitals. A set of main services and elements of service-oriented architectures for health care facilities is identified, with a medium-term focus which acknowledges existing systems as a core part of service-oriented solutions. The services and elements are grouped according to functional and interoperability cohesion. A transition towards service-oriented architecture in health care must acknowledge existing health information systems and promote the specification of central processes and software services locally and across organizations. Software industry best practices such as SOA must be combined with health care knowledge to respond to central challenges such as continuous change in health care. A service-oriented approach cannot rely entirely on common standards and frameworks; it must be locally adapted and complemented.
A cloud-based approach for interoperable electronic health records (EHRs).
Bahga, Arshdeep; Madisetti, Vijay K
2013-09-01
We present a cloud-based approach for the design of interoperable electronic health record (EHR) systems. Cloud computing environments provide several benefits to all the stakeholders in the healthcare ecosystem (patients, providers, payers, etc.). Lack of data interoperability standards and solutions has been a major obstacle in the exchange of healthcare data between different stakeholders. We propose an EHR system, the cloud health information systems technology architecture (CHISTAR), that achieves semantic interoperability through the use of a generic design methodology which uses a reference model that defines a general-purpose set of data structures and an archetype model that defines the clinical data attributes. CHISTAR application components are designed using the cloud component model approach, which comprises loosely coupled components that communicate asynchronously. In this paper, we describe the high-level design of CHISTAR and the approaches for semantic interoperability, data integration, and security.
Reuse and Interoperability of Avionics for Space Systems
NASA Technical Reports Server (NTRS)
Hodson, Robert F.
2007-01-01
The space environment presents unique challenges for avionics. Launch survivability, thermal management, radiation protection, and other factors are important for successful space designs. Many existing avionics designs use custom hardware and software to meet the requirements of space systems. Although some space vendors have moved towards a standard product-line approach to avionics, the space industry still lacks common standards and practices for avionics development. This lack of commonality manifests itself in limited reuse and a lack of interoperability. To address NASA's need for interoperable avionics that facilitate reuse, several hardware and software approaches are discussed. Experiences with existing space boards and the application of terrestrial standards are outlined. Enhancements and extensions to these standards are considered. A modular stack-based approach to space avionics is presented. Software and reconfigurable logic cores are considered for extending interoperability and reuse. Finally, some of the issues associated with the design of reusable interoperable avionics are discussed.
2008-08-01
...facilitate the use of existing architecture descriptions in performing interoperability measurement. Noting that "everything in the world can be expressed as..." ...biological, botanical, and genetic research, it has also been used with great success in the fields of ecology, medicine, the social sciences, the... ...appropriate for at least three reasons. First, systems perform different interoperations in different scenarios (i.e., they are used differently); second...
Dandanell, G
1992-01-01
The interoperator distance between a synthetic operator Os and the deoP2O2-galK fusion was varied between 46 and 176 bp. The repression of deoP2-directed galK expression as a function of the interoperator distance (center-to-center) was measured in vivo in a single-copy system. The results show that the DeoR repressor can efficiently repress transcription at all the interoperator distances tested. The degree of repression depends very little on the spacing between the operators; however, a weak periodic dependency of 8-11 bp may exist. PMID:1437558
Cloud access to interoperable IVOA-compliant VOSpace storage
NASA Astrophysics Data System (ADS)
Bertocco, S.; Dowler, P.; Gaudet, S.; Major, B.; Pasian, F.; Taffoni, G.
2018-07-01
Handling, processing and archiving the huge amount of data produced by the new generation of experiments and instruments in Astronomy and Astrophysics are among the more exciting challenges to address in designing the future data management infrastructures and computing services. We investigated the feasibility of a data management and computation infrastructure, available world-wide, with the aim of merging the FAIR data management provided by IVOA standards with the efficiency and reliability of a cloud approach. Our work involved the Canadian Advanced Network for Astronomy Research (CANFAR) infrastructure and the European EGI federated cloud (EFC). We designed and deployed a pilot data management and computation infrastructure that provides IVOA-compliant VOSpace storage resources and wide access to interoperable federated clouds. In this paper, we detail the main user requirements covered, the technical choices and the implemented solutions and we describe the resulting Hybrid cloud Worldwide infrastructure, its benefits and limitations.
An approach for the semantic interoperability of ISO EN 13606 and OpenEHR archetypes.
Martínez-Costa, Catalina; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás
2010-10-01
The communication between health information systems of hospitals and primary care organizations is currently an important challenge in improving the quality of clinical practice and patient safety. However, clinical information is usually distributed among several independent systems that may be syntactically or semantically incompatible. This fact prevents healthcare professionals from accessing clinical information on patients in an understandable and normalized way. In this work, we address the semantic interoperability of two EHR standards: OpenEHR and ISO EN 13606. Both standards follow the dual-model approach, which distinguishes information from knowledge, the latter being represented through archetypes. The solution presented here is capable of transforming OpenEHR archetypes into ISO EN 13606 and vice versa by combining Semantic Web and Model-Driven Engineering technologies. The resulting software implementation has been tested using publicly available collections of archetypes for both standards.
Standardized Representation of Clinical Study Data Dictionaries with CIMI Archetypes
Sharma, Deepak K.; Solbrig, Harold R.; Prud’hommeaux, Eric; Pathak, Jyotishman; Jiang, Guoqian
2016-01-01
Researchers commonly use a tabular format to describe and represent clinical study data. The lack of standardization of data dictionary’s metadata elements presents challenges for their harmonization for similar studies and impedes interoperability outside the local context. We propose that representing data dictionaries in the form of standardized archetypes can help to overcome this problem. The Archetype Modeling Language (AML) as developed by the Clinical Information Modeling Initiative (CIMI) can serve as a common format for the representation of data dictionary models. We mapped three different data dictionaries (identified from dbGAP, PheKB and TCGA) onto AML archetypes by aligning dictionary variable definitions with the AML archetype elements. The near complete alignment of data dictionaries helped map them into valid AML models that captured all data dictionary model metadata. The outcome of the work would help subject matter experts harmonize data models for quality, semantic interoperability and better downstream data integration. PMID:28269909
Leveraging standards to support patient-centric interdisciplinary plans of care.
Dykes, Patricia C; DaDamio, Rebecca R; Goldsmith, Denise; Kim, Hyeon-eui; Ohashi, Kumiko; Saba, Virginia K
2011-01-01
As health care systems and providers move towards meaningful use of electronic health records, the once distant vision of collaborative patient-centric, interdisciplinary plans of care, generated and updated across organizations and levels of care, may soon become a reality. Effective care planning is included in the proposed Stages 2-3 Meaningful Use quality measures. To facilitate interoperability, standardization of plan of care messaging, content, and information and terminology models is needed. This degree of standardization requires local and national coordination. The purpose of this paper is to review some existing standards that may be leveraged to support the development of interdisciplinary patient-centric plans of care. Standards are then applied to a use case to demonstrate one method for achieving patient-centric and interoperable interdisciplinary plan of care documentation. Our pilot work suggests that existing standards provide a foundation for the adoption and implementation of patient-centric plans of care that are consistent with federal requirements.
Transforming Collaborative Process Models into Interface Process Models by Applying an MDA Approach
NASA Astrophysics Data System (ADS)
Lazarte, Ivanna M.; Chiotti, Omar; Villarreal, Pablo D.
Collaborative business models among enterprises require defining collaborative business processes. Enterprises implement B2B collaborations to execute these processes. In B2B collaborations the integration and interoperability of processes and systems of the enterprises are required to support the execution of collaborative processes. From a collaborative process model, which describes the global view of the enterprise interactions, each enterprise must define the interface process that represents the role it performs in the collaborative process in order to implement the process in a Business Process Management System. Hence, in this work we propose a method for the automatic generation of the interface process model of each enterprise from a collaborative process model. This method is based on a Model-Driven Architecture to transform collaborative process models into interface process models. By applying this method, interface processes are guaranteed to be interoperable and defined according to a collaborative process.
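The projection at the heart of this transformation can be illustrated independently of the MDA tooling: from the global list of interactions, each enterprise's interface process is the ordered subsequence it participates in, viewed as sends and receives. A minimal sketch with invented roles and messages:

```python
# A global choreography (who sends what to whom), in global order.
# Roles and messages are invented for illustration.
collaborative_process = [
    ("Buyer", "Seller", "PurchaseOrder"),
    ("Seller", "Buyer", "OrderResponse"),
    ("Seller", "Carrier", "ShippingRequest"),
    ("Carrier", "Buyer", "DeliveryNotice"),
]

def interface_process(enterprise, interactions):
    """Keep only the interactions this enterprise takes part in,
    annotated with the role (send/receive) it plays in each."""
    steps = []
    for sender, receiver, message in interactions:
        if sender == enterprise:
            steps.append(("send", message, receiver))
        elif receiver == enterprise:
            steps.append(("receive", message, sender))
    return steps

for role in ("Buyer", "Seller", "Carrier"):
    print(role, interface_process(role, collaborative_process))
```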
An authentication infrastructure for today and tomorrow
DOE Office of Scientific and Technical Information (OSTI.GOV)
Engert, D.E.
1996-06-01
The Open Software Foundation's Distributed Computing Environment (OSF/DCE) was originally designed to provide a secure environment for distributed applications. By combining it with Kerberos Version 5 from MIT, it can be extended to provide network security as well. This combination can be used to build both an inter- and intra-organizational infrastructure while providing single sign-on for the user with overall improved security. The ESnet community of the Department of Energy is building just such an infrastructure. ESnet has modified these systems to improve their interoperability, while encouraging the developers to incorporate these changes and work more closely together to continue to improve the interoperability. The success of this infrastructure depends on its flexibility to meet the needs of many applications and network security requirements. The open nature of Kerberos, combined with the vendor support of OSF/DCE, provides the infrastructure for today and tomorrow.
Flexible solution for interoperable cloud healthcare systems.
Vida, Mihaela Marcella; Lupşe, Oana Sorina; Stoicu-Tivadar, Lăcrămioara; Bernad, Elena
2012-01-01
It is extremely important for the healthcare domain to have standardized communication, because it will improve the quality of information and, in the end, the resulting benefits will improve the quality of patients' lives. The standards proposed to be used are HL7 CDA and CCD. For better access to the medical data, a solution based on cloud computing (CC) is investigated. CC is a technology that supports flexibility, seamless care, and reduced costs of the medical act. To ensure interoperability between healthcare information systems, a solution creating a Web Custom Control is presented. The control shows the database tables and fields used to configure the two standards. This control will facilitate the work of the medical staff and hospital administrators, because they can configure the local system easily and prepare it for communication with other systems. The resulting information will have a higher quality and will provide knowledge that supports better patient management and diagnosis.
Marschollek, Michael; Wolf, Klaus-H; Bott, Oliver-J; Geisler, Mirko; Plischke, Maik; Ludwig, Wolfram; Hornberger, Andreas; Haux, Reinhold
2007-01-01
Despite the abundance of past home care projects and the maturity of the technologies used, there is no widespread dissemination as yet. The absence of accepted standards and thus interoperability and the inadequate integration into transinstitutional health information systems (tHIS) are perceived as key factors. Based on the respective literature and previous experiences in home care projects we propose an architectural model for home care as part of a transinstitutional health information system using the HL7 clinical document architecture (CDA) as well as the HL7 Arden Syntax for Medical Logic Systems. In two short case studies we describe the practical realization of the architecture as well as first experiences. Our work can be regarded as a first step towards an interoperable - and in our view sustainable - home care architecture based on a prominent document standard from the health information system domain.
Sinaci, A Anil; Laleci Erturkmen, Gokce B
2013-10-01
In order to enable secondary use of Electronic Health Records (EHRs) by bridging the interoperability gap between the clinical care and research domains, this paper introduces a unified methodology and the supporting framework, which bring together the power of metadata registries (MDR) and semantic web technologies. We introduce a federated semantic metadata registry framework by extending the ISO/IEC 11179 standard, and enable the integration of data element registries through Linked Open Data (LOD) principles, whereby each Common Data Element (CDE) can be uniquely referenced, queried and processed to enable syntactic and semantic interoperability. Each CDE and its components are maintained as LOD resources enabling semantic links with other CDEs, terminology systems and implementation-dependent content models, hence facilitating semantic search, more effective reuse and semantic interoperability across different application domains. There are several important efforts addressing semantic interoperability in the healthcare domain, such as the IHE DEX profile proposal, CDISC SHARE and CDISC2RDF. Our architecture complements these by providing a framework to interlink existing data element registries and repositories, multiplying their potential for semantic interoperability to a greater extent. The open source implementation of the federated semantic MDR framework presented in this paper is the core of the semantic interoperability layer of the SALUS project, which enables the execution of post-marketing safety analysis studies on top of existing EHR systems. Copyright © 2013 Elsevier Inc. All rights reserved.
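As a flavor of the LOD representation, the sketch below publishes one hypothetical Common Data Element as RDF with rdflib, linking it to a terminology system via skos:exactMatch. The registry namespace is invented, the LOINC URI base is one commonly used form, and this is not the SALUS implementation.

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, RDFS, SKOS

MDR = Namespace("http://example.org/mdr/")   # hypothetical registry namespace
LOINC = Namespace("https://loinc.org/rdf/")  # a URI base used for LOINC terms

g = Graph()
cde = MDR["cde/systolic-blood-pressure"]
g.add((cde, RDF.type, MDR.DataElement))          # ISO/IEC 11179-style element
g.add((cde, RDFS.label, Literal("Systolic blood pressure")))
g.add((cde, SKOS.exactMatch, LOINC["8480-6"]))   # semantic link to a terminology

print(g.serialize(format="turtle"))
```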
Ovies-Bernal, Diana Paola; Agudelo-Londoño, Sandra M
2014-01-01
The objective was to identify shared criteria used throughout the world in the implementation of interoperable National Health Information Systems (NHIS) and to provide validated scientific information on the dimensions affecting interoperability. This systematic review sought to identify primary articles on the implementation of interoperable NHIS published in scientific journals in English, Portuguese, or Spanish between 1990 and 2011 through a search of eight databases of electronic journals in the health sciences and informatics: MEDLINE (PubMed), Proquest, Ovid, EBSCO, MD Consult, Virtual Health Library, Metapress, and SciELO. The full texts of the articles were reviewed, and those that focused on technical computer aspects or on normative issues were excluded, as were those that did not meet the quality criteria for systematic reviews of interventions. Of 291 studies found and reviewed, only five met the inclusion criteria. These articles reported on the process of implementing an interoperable NHIS in Brazil, China, the United States, Turkey, and the Semiautonomous Region of Zanzibar, respectively. Five common basic criteria affecting implementation of the NHIS were identified: standards in place to govern the process, availability of trained human talent, financial and structural constraints, definition of standards, and assurance that the information is secure. Four dimensions affecting interoperability were defined: technical, semantic, legal, and organizational. The criteria identified have to be adapted to the actual situation in each country, and a proactive approach should be used to ensure that implementation of the interoperable NHIS is strategic, simple, and reliable.
LVC Architecture Roadmap Implementation - Results of the First Two Years
2012-03-01
NOTES: Presented at the Simulation Interoperability Standards Organization's (SISO) Spring Simulation Interoperability Workshop (SIW), 26-30 March... Results have been presented at the semi-annual Simulation Interoperability Workshops (SIWs) and the annual Interservice/Industry Training, Simulation & Education Conference (I/ITSEC), as well as other venues. For example, a full-day workshop on the initial progress of the effort was conducted at the 2010 Spring SIW [2
Informatics for multi-disciplinary ocean sciences
NASA Astrophysics Data System (ADS)
Pearlman, Jay; Delory, Eric; Pissierssens, Peter; Raymond, Lisa; Simpson, Pauline; Waldmann, Christoph; Williams 3rd, Albert; Yoder, Jim
2014-05-01
Ocean researchers must work across disciplines to provide clear and understandable assessments of the state of the ocean. With advances in technology, not only in observation, but also communication and computer science, we are in a new era where we can answer questions at the time and space scales that are relevant to our state of the art research needs. This presentation will address three areas of the informatics of the end-to-end process: sensors and information extraction in the sensing environment; using diverse data for understanding selected ocean processes; and supporting open data initiatives. A National Science Foundation funded Ocean Observations Research Coordination Network (RCN) is addressing these areas from the perspective of improving interdisciplinary research. The work includes an assessment of Open Data Access with a paper in preparation. Interoperability and sensors is a new activity that couples with European projects, COOPEUS and NeXOS, in looking at sensors and related information systems for a new generation of measurement capability. A working group on synergies of in-situ and satellite remote sensing is analyzing approaches for more effective use of these measurements. This presentation will examine the steps forward for data exchange and for addressing gaps in communication and informatics.
A Semantic Web-based System for Managing Clinical Archetypes.
Fernandez-Breis, Jesualdo Tomas; Menarguez-Tortosa, Marcos; Martinez-Costa, Catalina; Fernandez-Breis, Eneko; Herrero-Sempere, Jose; Moner, David; Sanchez, Jesus; Valencia-Garcia, Rafael; Robles, Montserrat
2008-01-01
Archetypes facilitate the sharing of clinical knowledge and are therefore a basic tool for achieving interoperability between healthcare information systems. In this paper, a Semantic Web system for managing archetypes is presented. This system allows for the semantic annotation of archetypes, as well as for performing semantic searches. The current system is capable of working with both ISO 13606 and OpenEHR archetypes.
CEOS WGISS Common Data Framework for WGISS Connected Data Assets
NASA Technical Reports Server (NTRS)
Enloe, Yonsook; Mitchell, Andrew; Albani, Mirko; Yapur, Martin
2016-01-01
This session will explore the benefits of having such a policy framework and future steps both domestically and internationally. Speakers can highlight current work being done to improve data interoperability, how the Common Framework is relevant for other data types, other countries and multinational organizations, and considerations for data management that have yet to be addressed in the Common Framework.
A Working Framework for Enabling International Science Data System Interoperability
NASA Astrophysics Data System (ADS)
Hughes, J. Steven; Hardman, Sean; Crichton, Daniel J.; Martinez, Santa; Law, Emily; Gordon, Mitchell K.
2016-07-01
For diverse scientific disciplines to interoperate they must be able to exchange information based on a shared understanding. To capture this shared understanding, we have developed a knowledge representation framework that leverages ISO level reference models for metadata registries and digital archives. This framework provides multi-level governance, evolves independent of the implementation technologies, and promotes agile development, namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. The knowledge representation is captured in an ontology through a process of knowledge acquisition. Discipline experts in the role of stewards at the common, discipline, and project levels work to design and populate the ontology model. The result is a formal and consistent knowledge base that provides requirements for data representation, integrity, provenance, context, identification, and relationship. The contents of the knowledge base are translated and written to files in suitable formats to configure system software and services, provide user documentation, validate input, and support data analytics. This presentation will provide an overview of the framework, present a use case that has been adopted by an entire science discipline at the international level, and share some important lessons learned.
Roadmap for Testing and Validation of Electric Vehicle Communication Standards
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pratt, Richard M.; Tuffner, Francis K.; Gowri, Krishnan
Vehicle-to-grid communication standards are critical to charge management and interoperability among plug-in electric vehicles (PEVs), charging stations and utility providers. The Society of Automotive Engineers (SAE), the International Organization for Standardization (ISO), the International Electrotechnical Commission (IEC) and the ZigBee Alliance are developing requirements for communication messages and protocols. While interoperability standards development has been in progress for more than two years, no definitive guidelines are available for the automobile manufacturers, charging station manufacturers or utility backhaul network systems. At present, there is a wide range of proprietary communication options developed and supported in the industry. Recent work by the Electric Power Research Institute (EPRI), in collaboration with SAE and automobile manufacturers, has identified performance requirements and developed a test plan based on possible communication pathways using power line communication (PLC). Though the communication pathways and power line communication technology options are identified, much work needs to be done in developing application software and testing of communication modules before these can be deployed in production vehicles. This paper presents a roadmap and results from testing power line communication modules developed to meet the requirements of the SAE J2847/1 standard.
2012-08-01
VICTORY Overview: the VICTORY technical approach incorporates open standards and shared services (VICTORY services... availability). VICTORY Shared Services: core services include Position, Orientation, Direction-of-Travel, and Time; Service...
Harmonising phenomics information for a better interoperability in the rare disease field.
Maiella, Sylvie; Olry, Annie; Hanauer, Marc; Lanneau, Valérie; Lourghi, Halima; Donadille, Bruno; Rodwell, Charlotte; Köhler, Sebastian; Seelow, Dominik; Jupp, Simon; Parkinson, Helen; Groza, Tudor; Brudno, Michael; Robinson, Peter N; Rath, Ana
2018-02-07
HIPBI-RD (Harmonising phenomics information for a better interoperability in the rare disease field) is a three-year project, started in 2016, funded via the E-Rare 3 ERA-NET program. This project builds on three resources largely adopted by the rare disease (RD) community: Orphanet, its ontology ORDO (the Orphanet Rare Disease Ontology), and HPO (the Human Phenotype Ontology), as well as the PhenoTips software for the capture and sharing of structured phenotypic data on RD patients. The project is further supported by resources developed by the European Bioinformatics Institute and the Garvan Institute. HIPBI-RD aims to provide the community with an integrated, RD-specific bioinformatics ecosystem that will harmonise the way phenomics information is stored in databases and patient files worldwide, and thereby contribute to interoperability. This ecosystem will consist of a suite of tools and ontologies, optimized to work together, and made available through commonly used software repositories. The HIPBI-RD ecosystem will contribute to the interpretation of variants identified through exome and full genome sequencing by harmonising the way phenotypic information is collected, thus improving the diagnosis and delineation of RD. The ultimate goal of HIPBI-RD is to provide a resource that will contribute to bridging genome-scale biology and a disease-centered view of human pathobiology. Copyright © 2018. Published by Elsevier Masson SAS.
The PSML format and library for norm-conserving pseudopotential data curation and interoperability
NASA Astrophysics Data System (ADS)
García, Alberto; Verstraete, Matthieu J.; Pouillon, Yann; Junquera, Javier
2018-06-01
Norm-conserving pseudopotentials are used by a significant number of electronic-structure packages, but the practical differences among codes in the handling of the associated data hinder their interoperability and make it difficult to compare their results. At the same time, existing formats lack provenance data, which makes it difficult to track and document computational workflows. To address these problems, we first propose a file format (PSML) that maps the basic concepts of the norm-conserving pseudopotential domain in a flexible form and supports the inclusion of provenance information and other important metadata. Second, we provide a software library (libPSML) that can be used by electronic structure codes to transparently extract the information in the file and adapt it to their own data structures, or to create converters for other formats. Support for the new file format has been already implemented in several pseudopotential generator programs (including ATOM and ONCVPSP), and the library has been linked with SIESTA and ABINIT, allowing them to work with the same pseudopotential operator (with the same local part and fully non-local projectors) thus easing the comparison of their results for the structural and electronic properties, as shown for several example systems. This methodology can be easily transferred to any other package that uses norm-conserving pseudopotentials, and offers a proof-of-concept for a general approach to interoperability.
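Since PSML is XML-based, a consumer can survey a file's structure with standard tooling before handing it to libPSML. The sketch below uses a hypothetical file name and makes no assumptions about the normative element names; it simply lists the tags present, which is a cheap way to check whether, for example, provenance blocks made it into a given file.

```python
import xml.etree.ElementTree as ET

# "example.psml" is a hypothetical file produced by a PSML-aware generator.
tree = ET.parse("example.psml")
tags = sorted({elem.tag for elem in tree.getroot().iter()})
print(tags)   # survey which elements (e.g., provenance metadata) are present
```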
NASA Astrophysics Data System (ADS)
Tarboton, D. G.; Idaszak, R.; Horsburgh, J. S.; Ames, D. P.; Goodall, J. L.; Band, L. E.; Merwade, V.; Couch, A.; Hooper, R. P.; Maidment, D. R.; Dash, P. K.; Stealey, M.; Yi, H.; Gan, T.; Castronova, A. M.; Miles, B.; Li, Z.; Morsy, M. M.; Crawley, S.; Ramirez, M.; Sadler, J.; Xue, Z.; Bandaragoda, C.
2016-12-01
How do you share and publish hydrologic data and models for a large collaborative project? HydroShare is a new, web-based system for sharing hydrologic data and models with specific functionality aimed at making collaboration easier. HydroShare has been developed with U.S. National Science Foundation support under the auspices of the Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) to support the collaboration and community cyberinfrastructure needs of the hydrology research community. Within HydroShare, we have developed new functionality for creating datasets, describing them with metadata, and sharing them with collaborators. We cast hydrologic datasets and models as "social objects" that can be shared, collaborated around, annotated, published and discovered. In addition to data and model sharing, HydroShare supports web application programs (apps) that can act on data stored in HydroShare, just as software programs on your PC act on your data locally. This can free you from some of the limitations of local computing capacity and challenges in installing and maintaining software on your own PC. HydroShare's web-based cyberinfrastructure can take work off your desk or laptop computer and onto infrastructure or "cloud" based data and processing servers. This presentation will describe HydroShare's collaboration functionality that enables both public and private sharing with individual users and collaborative user groups, and makes it easier for collaborators to iterate on shared datasets and models, creating multiple versions along the way, and publishing them with a permanent landing page, metadata description, and citable Digital Object Identifier (DOI) when the work is complete. This presentation will also describe the web app architecture that supports interoperability with third party servers functioning as application engines for analysis and processing of big hydrologic datasets. While developed to support the cyberinfrastructure needs of the hydrology community, the informatics infrastructure for programmatic interoperability of web resources has a generality beyond the solution of hydrology problems that will be discussed.
NASA Astrophysics Data System (ADS)
Kruger, Scott; Shasharina, S.; Vadlamani, S.; McCune, D.; Holland, C.; Jenkins, T. G.; Candy, J.; Cary, J. R.; Hakim, A.; Miah, M.; Pletzer, A.
2010-11-01
As various efforts to integrate fusion codes proceed worldwide, standards for sharing data have emerged. In the U.S., the SWIM project has pioneered the development of the Plasma State, which has a flat hierarchy and is dominated by its use within 1.5D transport codes. The European Integrated Tokamak Modeling effort has developed a more ambitious data interoperability effort organized around the concept of Consistent Physical Objects (CPOs). CPOs have deep hierarchies, as needed by an effort that seeks to encompass all of fusion computing. Here, we discuss ideas for implementing data interoperability in a way that is complementary to both the Plasma State and CPOs. By making use of attributes within the netCDF and HDF5 binary file formats, the goals of data interoperability can be achieved with a more informal approach. In addition, a file can be simultaneously interoperable with several standards at once. As an illustration of this approach, we discuss its application to the development of synthetic diagnostics that can be used by multiple codes.
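The attribute-based approach is easy to picture with a few lines of code. The sketch below tags a single HDF5 dataset so that it could be located under both a flat Plasma-State-style name and a deep CPO-style path; the attribute names themselves are invented for illustration, since the abstract does not fix them.

    import h5py
    import numpy as np

    # Write one dataset, tagged for two interoperability conventions at once.
    with h5py.File("synthetic_diagnostic.h5", "w") as f:
        te = f.create_dataset("electron_temperature",
                              data=np.linspace(1.0, 4.2, 50))
        te.attrs["units"] = "keV"
        te.attrs["plasma_state_name"] = "Te"        # flat, Plasma-State-style label
        te.attrs["cpo_path"] = "coreprof/te/value"  # deep, CPO-style hierarchy path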
GEOSS AIP-2 Climate Change and Biodiversity Use Scenarios: Interoperability Infrastructures
NASA Astrophysics Data System (ADS)
Nativi, Stefano; Santoro, Mattia
2010-05-01
In recent years, the scientific community has devoted great effort to studying the effects of climate change on life on Earth. In this general framework, a key role is played by the impact of climate change on biodiversity. To assess this, several use scenarios require modeling the impact of climatological change on the regional distribution of biodiversity species. Designing and developing interoperability infrastructures that enable scientists to search, discover, access and use multi-disciplinary resources (i.e. datasets, services, models, etc.) is currently one of the main research fields for Earth and Space Science Informatics. This presentation introduces and discusses an interoperability infrastructure which implements the discovery, access, and chaining of loosely-coupled resources in the climatology and biodiversity domains. This makes it possible to set up and run forecast and processing models. The presented framework was successfully developed and demonstrated in the context of the GEOSS AIP-2 (Global Earth Observation System of Systems, Architecture Implementation Pilot - Phase 2) Climate Change & Biodiversity thematic Working Group. This interoperability infrastructure comprises the following main components and services: a) GEO Portal: through this component the end user is able to search, find and access the services needed for scenario execution; b) Graphical User Interface (GUI): this component provides user interaction functionality and controls the workflow manager to perform the operations required for the scenario implementation; c) Use Scenario controller: this component acts as a workflow controller implementing the scenario business process, i.e. a typical climate change & biodiversity projection scenario; d) Service Broker implementing Mediation Services: this component realizes a distributed catalogue which federates several discovery and access components (exposing them through a unique CSW standard interface); the federated components publish climate, environmental and biodiversity datasets; e) Ecological Niche Model Server: this component is able to run one or more Ecological Niche Models (ENM) on selected biodiversity and climate datasets; f) Data Access Transaction server: this component publishes the model outputs. This framework was assessed in two use scenarios of the GEOSS AIP-2 Climate Change and Biodiversity WG. Both scenarios concern the prediction of species distributions driven by climatological change forecasts. The first scenario dealt with the regional distribution of the pika species in the Great Basin area (North America), while the second concerned the modeling of Arctic food chain species in the North Pole area, where the relationships between different environmental parameters and polar bear distribution were analyzed. Scientific patronage was provided by the University of Colorado and the University of Alaska, respectively. Results are published on the GEOSS AIP-2 web site: http://www.ogcnetwork.net/AIP2develop.
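The discovery step performed by the Service Broker can be sketched with a plain OGC CSW request. The endpoint below is a placeholder; the key-value parameters are standard CSW 2.0.2 keywords.

    import requests

    # Ask a (hypothetical) CSW broker endpoint what it can do.
    resp = requests.get("https://example.org/csw", params={
        "service": "CSW",
        "version": "2.0.2",
        "request": "GetCapabilities",
    }, timeout=30)
    print(resp.status_code, resp.headers.get("Content-Type"))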
CAD Services: an Industry Standard Interface for Mechanical CAD Interoperability
NASA Technical Reports Server (NTRS)
Claus, Russell; Weitzer, Ilan
2002-01-01
Most organizations seek to design and develop new products in increasingly shorter time periods. At the same time, increased performance demands require a team-based, multidisciplinary design process that may span several organizations. One approach to meeting these demands is 'Geometry Centric' design. In this approach, design engineers team their efforts through one unified representation of the design that is usually captured in a CAD system. Standards-based interfaces are critical to provide the uniform, simple, distributed services that enable the 'Geometry Centric' design approach. This paper describes an industry-wide effort, under the Object Management Group's (OMG) Manufacturing Domain Task Force, to define interfaces that enable the interoperability of CAD, Computer Aided Manufacturing (CAM), and Computer Aided Engineering (CAE) tools. This critical link enabling 'Geometry Centric' design is called CAD Services V1.0. This paper discusses the features of this standard and its proposed application.
CytometryML: a data standard which has been designed to interface with other standards
NASA Astrophysics Data System (ADS)
Leif, Robert C.
2007-02-01
Because of differences in the requirements, needs, and past histories (including existing standards) of the creating organizations, a single encompassing cytology-pathology standard will not, in the near future, replace the multiple standards that exist or are under development. Except for DICOM and FCS, these standardization efforts are all based on XML. CytometryML is a collection of XML schemas, which are based on the Digital Imaging and Communications in Medicine (DICOM) and Flow Cytometry Standard (FCS) datatypes. The CytometryML schemas contain attributes that link them to the DICOM standard and FCS. Interoperability with DICOM has been facilitated by, wherever reasonable, limiting the difference between CytometryML and the previous standards to syntax. In order to permit the Resource Description Framework (RDF) to reference the CytometryML datatypes, id attributes have been added to many CytometryML elements. The Laboratory Digital Imaging Project (LDIP) Data Exchange Specification and the Flowcyt standards development effort employ RDF syntax. Documentation from DICOM has been reused in CytometryML. The unity of analytical cytology was demonstrated by deriving a microscope type and a flow cytometer type from a generic cytometry instrument type. The feasibility of incorporating the Flowcyt gating schemas into CytometryML has been demonstrated. CytometryML is being extended to include many of the new DICOM Working Group 26 datatypes, which describe patients, specimens, and analytes. In situations where multiple standards are being created, interoperability can be facilitated by employing datatypes based on a common set of semantics and building in links to standards that employ a different syntax.
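The id-attribute mechanism that lets RDF reference CytometryML datatypes can be illustrated with a toy XML fragment; the element names below are invented and are not the actual CytometryML schema.

    import xml.etree.ElementTree as ET

    # Build a small instrument description whose parts carry id attributes.
    instrument = ET.Element("Flow_Cytometer", attrib={"id": "instrument-001"})
    detector = ET.SubElement(instrument, "Detector", attrib={"id": "detector-FL1"})
    detector.text = "530/30 bandpass"
    print(ET.tostring(instrument, encoding="unicode"))
    # An RDF statement could then use "#detector-FL1" as its subject.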
NASA Technical Reports Server (NTRS)
Conroy, Mike; Gill, Paul; Ingalls, John; Bengtsson, Kjell
2014-01-01
No known system is in place to allow NASA technical data interoperability throughout the whole life cycle. Life Cycle Cost (LCC) will be higher on many developing programs if action isn't taken soon to join disparate systems efficiently. Disparate technical data also increases safety risks from poorly integrated elements. NASA requires interoperability and industry standards, but breaking legacy ways is a challenge.
Interacting with Multi-Robot Systems Using BML
2013-06-01
Pullen, U. Schade, J. Simonsen & R. Gomez-Veiga, NATO MSG-048 C-BML Final Report Summary. 2010 Fall Simulation Interoperability Workshop (10F-SIW-039) ... NATO MSG-085. 2012 Spring Simulation Interoperability Workshop (12S-SIW-045), Orlando, FL, March 2012. [3] T. Remmersmann, U. Schade, L. Khimeche ... B. Grautreau & R. El Abdouni Khayari, Lessons Recognized: How to Combine BML and MSDL. 2012 Spring Simulation Interoperability Workshop (12S-SIW-012) ...
A Linguistic Foundation for Communicating Geo-Information in the context of BML and geoBML
2010-03-23
BML Standard. 2009 Spring Simulation Interoperability Workshop (09S-SIW-046). San Diego, CA. Rein, K., Schade, U. & Hieb, M.R. (2009). Battle ... Formalizing Battle Management Language: A Grammar for Specifying Orders. 2006 Spring Simulation Interoperability Workshop (06S-SIW-068). Huntsville ... Hieb, M.R. (2007). Battle Management Language: A Grammar for Specifying Reports. 2007 Spring Simulation Interoperability Workshop (07S-SIW-036) ...
Semantic and syntactic interoperability in online processing of big Earth observation data.
Sudmanns, Martin; Tiede, Dirk; Lang, Stefan; Baraldi, Andrea
2018-01-01
The challenge of enabling syntactic and semantic interoperability for comprehensive and reproducible online processing of big Earth observation (EO) data is still unsolved. Supporting both types of interoperability is one of the requirements for efficiently extracting valuable information from the large amount of available multi-temporal gridded data sets. The proposed system wraps world models (semantic interoperability) into OGC Web Processing Services (syntactic interoperability) for semantic online analyses. World models describe spatio-temporal entities and their relationships in a formal way. The proposed system serves as an enabler for (1) technical interoperability using a standardised interface to be used by all types of clients and (2) allowing experts from different domains to develop complex analyses together as a collaborative effort. Users connect the world models online to the data, which are maintained in centralised storage as 3D spatio-temporal data cubes. This also allows non-experts to extract valuable information from EO data, because data management, low-level interactions and specific software issues can be ignored. We discuss the concept of the proposed system, provide a technical implementation example and describe three use cases for extracting changes from EO images, demonstrating the usability also for non-EO, gridded, multi-temporal data sets (CORINE land cover).
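The syntactic-interoperability layer described above amounts to exposing a world model behind a standard WPS endpoint. A minimal client-side sketch, with a placeholder endpoint, process identifier and input names, might look like this:

    import requests

    # Execute a (hypothetical) world-model process via an OGC WPS KVP request.
    resp = requests.get("https://example.org/wps", params={
        "service": "WPS",
        "version": "1.0.0",
        "request": "Execute",
        "identifier": "change_detection",
        "datainputs": "cube=corine_land_cover;from=2000;to=2018",
    }, timeout=60)
    print(resp.text[:200])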
The NASA Space Communications Data Networking Architecture
NASA Technical Reports Server (NTRS)
Israel, David J.; Hooke, Adrian J.; Freeman, Kenneth; Rush, John J.
2006-01-01
The NASA Space Communications Architecture Working Group (SCAWG) has recently been developing an integrated agency-wide space communications architecture in order to provide the necessary communication and navigation capabilities to support NASA's new Exploration and Science Programs. A critical element of the space communications architecture is the end-to-end Data Networking Architecture, which must provide a wide range of services required for missions ranging from planetary rovers to human spaceflight, and from sub-orbital space to deep space. Requirements for a higher degree of user autonomy and interoperability between a variety of elements must be accommodated within an architecture that necessarily features minimum operational complexity. The architecture must also be scalable and evolvable to meet mission needs for the next 25 years. This paper will describe the recommended NASA Data Networking Architecture, present some of the rationale for the recommendations, and will illustrate an application of the architecture to example NASA missions.
Packet telemetry and packet telecommand - The new generation of spacecraft data handling techniques
NASA Technical Reports Server (NTRS)
Hooke, A. J.
1983-01-01
Because of the rising costs and reduced reliability associated with customized spacecraft and ground network hardware and software, the standardized Packet Telemetry and Packet Telecommand concepts are emerging as viable alternatives. Within each concept, autonomous packets of data are created within ground and space application processes through the use of formatting techniques, and are switched end-to-end through the space data network to their destination application processes through the use of standard transfer protocols. Because the intermediate data networks can be designed to be completely mission-independent, this approach may facilitate a high degree of automation and interoperability. The goals of the Consultative Committee for Space Data Systems are the adoption of an international guideline for future space telemetry formatting based on the Packet Telemetry concept, and the advancement of the NASA-ESA Working Group's Packet Telecommand concept to a level of maturity parallel to that of Packet Telemetry. Both the Packet Telemetry and Packet Telecommand concepts are reviewed.
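The mission-independence of the packet concept comes from a small, fixed primary header that any intermediate network element can route on without knowing the payload. The sketch below builds a six-byte header in the style of the CCSDS space packet (version, type, APID, sequence flags/count, data length) followed by an arbitrary payload; it is an illustration of the idea, not flight software.

    import struct

    def make_packet(apid: int, seq_count: int, payload: bytes) -> bytes:
        # 3-bit version, 1-bit type, 1-bit secondary-header flag, 11-bit APID
        word1 = (0 << 13) | (0 << 12) | (0 << 11) | (apid & 0x7FF)
        # 2-bit sequence flags (0b11 = unsegmented), 14-bit sequence count
        word2 = (0b11 << 14) | (seq_count & 0x3FFF)
        # data length field is one less than the payload length in octets
        word3 = len(payload) - 1
        return struct.pack(">HHH", word1, word2, word3) + payload

    pkt = make_packet(apid=0x42, seq_count=7, payload=b"\x01\x02\x03\x04")
    print(pkt.hex())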
Hopkins, Mark E; Summers-Ables, Joy E; Clifton, Shari C; Coffman, Michael A
2011-06-01
The objective was to make electronic resources available to library users while effectively harnessing intellectual capital within the library, ultimately fostering the library's use of technology to interact asynchronously with its patrons (users). The methods used in the project included: (1) developing a new library website to facilitate the creation, management, accessibility, maintenance and dissemination of library resources; and (2) establishing ownership by those who participated in the project, while creating effective work allocation strategies through the implementation of a content management system that allowed the library to manage cost, complexity and interoperability. Preliminary results indicate that contributors to the system benefit from an increased understanding of the library's resources and add content valuable to library patrons. These strategies have helped promote the manageable creation and maintenance of electronic content in accomplishing the library's goal of interacting with its patrons. Establishment of a contributive system for adding to the library's electronic resources and electronic content has been successful. Further work will look at improving asynchronous interaction, particularly highlighting accessibility of electronic content and resources.
NASA Astrophysics Data System (ADS)
Hucka, M.
2015-09-01
In common with many fields, including astronomy, a vast number of software tools for computational modeling and simulation are available today in systems biology. This wealth of resources is a boon to researchers, but it also presents interoperability problems. Despite working with different software tools, researchers want to disseminate their work widely as well as reuse and extend the models of other researchers. This situation led in the year 2000 to an effort to create a tool-independent, machine-readable file format for representing models: SBML, the Systems Biology Markup Language. SBML has since become the de facto standard for its purpose. Its success and general approach has inspired and influenced other community-oriented standardization efforts in systems biology. Open standards are essential for the progress of science in all fields, but it is often difficult for academic researchers to organize successful community-based standards. I draw on personal experiences from the development of SBML and summarize some of the lessons learned, in the hope that this may be useful to other groups seeking to develop open standards in a community-oriented fashion.
Coalition Interoperability Measurement Frameworks Literature Survey
2011-08-01
National Defence, 2011. © Her Majesty the Queen (in Right of Canada), as represented by the Minister of National Defence, 2011. Abstract ... the joint staff is currently developing this framework, and the work is sponsored by the Assistant Deputy Minister (Information Management) Group. The interest ... Assistant Deputy Minister (Information Management) Group. The type of interoperability of interest here: 1. concerns the five nations (Australia
2016-07-13
ELECTRONIC HEALTH RECORDS: VA's Efforts Raise Concerns about Interoperability Goals and Measures, Duplication with DOD ... Agencies, Committee on Appropriations, U.S. Senate, July 13, 2016 ... initiatives with the Department of Defense (DOD) that were intended to advance the ability of the two departments to share electronic health records, the
2011-07-01
Orlando, Florida, September 2009, 09F-SIW-090. [HLA (2000)-1] Modeling and Simulation Standard - High Level Architecture (HLA) - Framework and ... Simulation Interoperability Workshop, Orlando, FL, USA, September 2009, 09F-SIW-023. [MaK] www.mak.com. [MIL-STD-3011] MIL-STD-3011 ... Spring Simulation Interoperability Workshop, Norfolk, VA, USA, March 2007, 07S-SIW-072. [Ross] Ross, P. and Clark, P. (2005), "Recommended
An HLA-Based Approach to Quantify Achievable Performance for Tactical Edge Applications
2011-05-01
in: Proceedings of the 2002 Fall Simulation Interoperability Workshop, 02F-SIW-068, Nov 2002. [16] P. Knight, et al., "WBT RTI Independent ... Benchmark Tests: Design, Implementation, and Updated Results", in: Proceedings of the 2002 Spring Simulation Interoperability Workshop, 02S-SIW-081, March ... Interoperability Workshop, 98F-SIW-085, Nov 1998. [18] S. Ferenci and R. Fujimoto, "RTI Performance on Shared Memory and Message Passing Architectures", in
2010-06-01
"Military Scenario Definition Language (MSDL) for Nontraditional Warfare Scenarios," Paper 09S-SIW-001, Proceedings of the Spring Simulation ... "Update to the M&S Community," Paper 09S-SIW-002, Proceedings of the Spring Simulation Interoperability Workshop, Simulation Interoperability ... "Multiple Simulations: An Application of the Military Scenario Definition Language (MSDL)," Paper 09S-SIW-003, Proc. of the Spring Simulation
Planetary Sciences Interoperability at VO Paris Data Centre
NASA Astrophysics Data System (ADS)
Le Sidaner, P.; Aboudarham, J.; Birlan, M.; Briot, D.; Bonnin, X.; Cecconi, B.; Chauvin, C.; Erard, S.; Henry, F.; Lamy, L.; Mancini, M.; Normand, J.; Popescu, F.; Roques, F.; Savalle, R.; Schneider, J.; Shih, A.; Thuillot, W.; Vinatier, S.
2015-10-01
The astronomy community has been developing interoperability for more than 10 years by standardizing data access, data formats, and metadata. This international effort is led by the International Virtual Observatory Alliance (IVOA). Observatoire de Paris is an active participant in this project. All actions on interoperability, data and service provision are centralized in and managed by VOParis Data Centre (VOPDC). VOPDC is a coordinated project of all the scientific departments of Observatoire de Paris.
Interoperable and standard e-Health solution over Bluetooth.
Martinez, I; Del Valle, P; Munoz, P; Trigo, J D; Escayola, J; Martínez-Espronceda, M; Muñoz, A; Serrano, L; Garcia, J
2010-01-01
The new paradigm of e-Health demands open sensors and middleware components that permit transparent integration and end-to-end interoperability of new personal health devices. The use of standards seems to be the internationally adopted way to solve these problems. This paper presents the implementation of an end-to-end standards-based e-Health solution, comprising the ISO/IEEE 11073 standard for the interoperability of medical devices in the patient environment and the EN 13606 standard for the interoperable exchange of the Electronic Healthcare Record. The design strictly fulfills all the technical features of the most recent versions of both standards. The implemented prototype has been tested in a laboratory environment to demonstrate its feasibility for further transfer to the healthcare system.
Evolution of System Architectures: Where Do We Need to Fail Next?
NASA Astrophysics Data System (ADS)
Bermudez, Luis; Alameh, Nadine; Percivall, George
2013-04-01
Innovation requires testing and failing. Thomas Edison was right when he said "I have not failed. I've just found 10,000 ways that won't work". For innovation and improvement of standards to happen, service architectures have to be tested again and again. Within the Open Geospatial Consortium (OGC), testing of service architectures has occurred for the last 15 years. This talk will present the evolution of these service architectures and a possible future path. OGC is a global forum for the collaboration of developers and users of spatial data products and services, and for the advancement and development of international standards for geospatial interoperability. The OGC Interoperability Program is a series of hands-on, fast-paced engineering initiatives to accelerate the development and acceptance of OGC standards. Each initiative is organized in threads that provide focus under a particular theme. The first testbed, OGC Web Services phase 1, completed in 2003, had four threads: Common Architecture, Web Mapping, Sensor Web and Web Imagery Enablement. Common Architecture was a cross-thread theme, ensuring that the Web Mapping and Sensor Web experiments built on a common base architecture. The architecture was based on the three main SOA components: Broker, Requestor and Provider. It proposed a general service model defining service interactions and dependencies; a categorization of service types; registries to allow discovery and access of services; data models and encodings; and common services (WMS, WFS, WCS). For the latter, there was a clear distinction among the different services: Data Services (e.g. WMS), Application Services (e.g. coordinate transformation) and server-side client applications (e.g. image exploitation). The latest testbed, OGC Web Services phase 9, completed in 2012, had five threads: Aviation, Cross-Community Interoperability (CCI), Security and Services Interoperability (SSI), OWS Innovations, and Compliance & Interoperability Testing & Evaluation (CITE). Compared to the first testbed, OWS-9 did not have a separate common architecture thread; instead the emphasis was on brokering information models, securing them, and making data available efficiently on mobile devices. The outcome is an architecture based on usability and non-intrusiveness while leveraging mediation of information models from different communities. This talk will use lessons learned from the evolution from OGC Testbed phase 1 to phase 9 to better understand how global and complex infrastructures evolve to support many communities, including the Earth System Science community.
Integrating Data and Networks: Human Factors
NASA Astrophysics Data System (ADS)
Chen, R. S.
2012-12-01
The development of technical linkages and interoperability between scientific networks is a necessary but not sufficient step towards integrated use and application of networked data and information for scientific and societal benefit. A range of "human factors" must also be addressed to ensure the long-term integration, sustainability, and utility of both the interoperable networks themselves and the scientific data and information to which they provide access. These human factors encompass the behavior of both individual humans and human institutions, and include system governance, a common framework for intellectual property rights and data sharing, consensus on terminology, metadata, and quality control processes, agreement on key system metrics and milestones, the compatibility of "business models" in the short and long term, harmonization of incentives for cooperation, and minimization of disincentives. Experience with several national and international initiatives and research programs such as the International Polar Year, the Group on Earth Observations, the NASA Earth Observing Data and Information System, the U.S. National Spatial Data Infrastructure, the Global Earthquake Model, and the United Nations Spatial Data Infrastructure provide a range of lessons regarding these human factors. Ongoing changes in science, technology, institutions, relationships, and even culture are creating both opportunities and challenges for expanded interoperability of scientific networks and significant improvement in data integration to advance science and the use of scientific data and information to achieve benefits for society as a whole.
Data specifications for INSPIRE
NASA Astrophysics Data System (ADS)
Portele, Clemens; Woolf, Andrew; Cox, Simon
2010-05-01
In Europe a major recent development has been the entry into force of the INSPIRE Directive in May 2007, establishing an infrastructure for spatial information in Europe to support Community environmental policies, and policies or activities which may have an impact on the environment. INSPIRE is based on the infrastructures for spatial information established and operated by the 27 Member States of the European Union. The Directive addresses 34 spatial data themes needed for environmental applications, with key components specified through technical implementing rules. This makes INSPIRE a unique example of a legislative "regional" approach. One of the requirements of the INSPIRE Directive is to make existing spatial data sets relevant to one of the spatial data themes available in an interoperable way, i.e. so that spatial data from different sources in Europe can be combined into a coherent result. Since INSPIRE covers a wide range of spatial data themes, the first step has been the development of a modelling framework that provides a common foundation for all themes. This framework is largely based on the ISO 19100 series of standards. The use of common generic spatial modelling concepts across all themes is an important enabler for interoperability. As a second step, data specifications for the first set of themes have been developed based on the modelling framework. The themes include addresses, transport networks, protected sites, hydrography, administrative areas and others. The data specifications were developed by selected experts nominated by stakeholders from all over Europe. For each theme a working group was established in early 2008, working on its specific theme and collaborating with the other working groups on cross-theme issues. After a public review of the draft specifications starting in December 2008, an open testing process and a thorough comment resolution process, the draft technical implementing rules for these themes have been approved by the INSPIRE Committee. After they enter into force they become part of the legal framework, and European Member States have to implement these rules. The next step is the development of the remaining 25 spatial data themes, which include many themes of interest for the Earth sciences, including geology, meteorological and oceanographic geographic features, atmospheric conditions, habitats and biotopes, species distribution, environmental monitoring facilities, and land cover, to name a few. The process will in general follow the same steps as for the first themes, and the working groups are expected to start their work in March/April 2010. The first draft specifications for public comment are expected at the end of 2010 and the work is scheduled to be completed in 2012. At the same time, other initiatives like GMES (Global Monitoring for Environment and Security) and GEOSS (Global Earth Observation System of Systems) are also dealing with spatial data from the themes covered by INSPIRE. With the EU-funded project GIGAS, a support action, a step has been made towards architectural coherence between these initiatives. Recommendations to improve the coherence of the information architectures across the initiatives were discussed in January 2010 with stakeholders from all initiatives, the standards organisations and EU-funded research projects.
Based on the general agreements achieved in these discussions, the next step will be to start working towards the implementation of these recommendations, which are in line with the approach taken by the INSPIRE data specifications.
NASA Astrophysics Data System (ADS)
Arko, R. A.; Stocks, K.; Chandler, C. L.; Smith, S. R.; Miller, S. P.; Maffei, A. R.; Glaves, H. M.; Carbotte, S. M.
2013-12-01
The U.S. National Science Foundation supports a fleet of academic research vessels operating throughout the world's oceans. In addition to supporting the mission-specific goals of each expedition, these vessels routinely deploy a suite of underway environmental sensors, operating like mobile observatories. Recognizing that the data from these instruments have value beyond each cruise, NSF funded R2R in 2009 to ensure that these data are routinely captured, cataloged and described, and submitted to the appropriate national repository for long-term public access. In 2013, R2R joined the Ocean Data Interoperability Platform (ODIP; http://odip.org/). The goal of ODIP is to remove barriers to the effective sharing of data across scientific domains and international boundaries, by providing a forum to harmonize diverse regional systems. To advance this goal, ODIP organizes international workshops to foster the development of common standards and develop prototypes to evaluate and test potential standards and interoperability solutions. ODIP includes major organizations engaged in ocean data stewardship in the EU, US, and Australia, supported by the International Oceanographic Data and Information Exchange (IODE). Within the broad scope of ODIP, R2R focuses on contributions in 4 key areas:
● Implement a 'Linked Open Data' approach to disseminate data and documentation, using existing World Wide Web Consortium (W3C) specifications and machine-readable formats. Exposing content as Linked Open Data will provide a simple mechanism for ODIP collaborators to browse and compare data sets among repositories.
● Map key vocabularies used by R2R to their European and Australian counterparts. The existing heterogeneity among terms inhibits data discoverability, as a user searching on the term with which s/he is familiar may not find all data of interest. Mapping key terms across the different ODIP partners, relying on the backbone thesaurus provided by the NERC Vocabulary Server (http://vocab.nerc.ac.uk/), is a first step towards wider data discoverability.
● Upgrade existing R2R ISO metadata records to be compatible with the new SeaDataNet II Cruise Summary Report (CSR) profile, and publish the records in a standards-compliant Web portal, built on the GeoNetwork open-source package.
● Develop the future workforce. R2R is enlisting and exposing a group of five students to new informatics technologies and international collaboration. Students are undertaking a coordinated series of projects in 2013 and 2014 at each of the R2R partner institutions, combined with travel to selected meetings where they will engage in the ODIP Workshop process, present results, and exchange ideas with working scientists and software developers in Europe and Australia. Students work closely with staff at the R2R partner institutions, in projects that build on the R2R-ODIP technical work components described above.
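The vocabulary-mapping work item can be sketched directly against the NERC Vocabulary Server mentioned above, which serves its concept collections as RDF via content negotiation. The collection identifier L05 (device types) is used here as an illustrative example.

    import requests

    # Fetch one NVS concept collection as RDF/XML.
    resp = requests.get("http://vocab.nerc.ac.uk/collection/L05/current/",
                        headers={"Accept": "application/rdf+xml"}, timeout=60)
    resp.raise_for_status()
    print(resp.headers.get("Content-Type"))
    print(resp.text[:300])   # RDF describing concepts, labels and mappings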
NASA Astrophysics Data System (ADS)
Michener, W.
2010-12-01
Addressing grand environmental science challenges requires unprecedented access to easily understood data that cross the breadth of temporal, spatial, and thematic scales. From a scientist’s perspective, the big challenges lie in discovering the relevant data, dealing with extreme data heterogeneity, and converting data to information and knowledge. Addressing these challenges requires new approaches for managing, preserving, analyzing, and sharing data. DataONE is designed to be the foundation of new innovative environmental research that addresses questions of relevance to science and society. DataONE will ensure preservation and access to multi-scale, multi-discipline, and multi-national data. Operationally, DataONE encompasses a distributed global network of Member Nodes (i.e., data repositories) that provide open and persistent access to well-described and easily discovered Earth observational data. In addition, a smaller number of Coordinating Nodes (i.e., metadata repositories and service centers) support network-wide services such as data replication and access to an array of enabling tools. DataONE’s objectives are to: make biological data available from the genome to the ecosystem; make environmental data available from atmospheric, ecological, hydrological, and oceanographic sources; provide secure and long-term preservation and access; and engage scientists, land-managers, policy makers, students, educators, and the public through logical access and intuitive visualizations. The foundation for excellence of DataONE is the established collaboration among participating organizations that have multi-decade expertise in a wide range of fields that includes: existing archive initiatives, libraries, environmental observing systems and research networks, data and information management, science synthesis centers, and professional societies. DataONE is a means to serve a broad range of science domains directly and indirectly through interoperability with partnering networks. DataONE engages its community of partners through working groups focused on identifying, describing, and implementing the DataONE cyberinfrastructure, governance, and sustainability models. These working groups, which consist of a diverse group of graduate students, educators, government and industry representatives, and leading computer, information, and library scientists: (1) perform computer science, informatics, and social science research related to all stages of the data life cycle; (2) develop DataONE interfaces and prototypes; (3) adopt/adapt interoperability standards; (4) create value-added technologies (e.g., semantic mediation, scientific workflow, and visualization) that facilitate data integration, analysis, and understanding; (5) address socio-cultural barriers to sustainable data preservation and data sharing; and (6) promote the adoption of best practices for managing the full data life cycle.
Interoperability in healthcare: major challenges in the creation of the enterprise environment
NASA Astrophysics Data System (ADS)
Lindsköld, L.; Wintell, M.; Lundberg, N.
2009-02-01
There is today a lack of interoperability in healthcare, although the need for it is obvious. A new healthcare enterprise environment has been deployed for secure healthcare interoperability in the Western Region of Sweden (WRS). This paper is an empirical overview of the new enterprise environment supporting regionally shared and transparent radiology domain information in the WRS. The enterprise environment comprises 17 radiology departments serving 1.5 million inhabitants, using different RIS and PACS in a joint work-oriented network, plus additional cardiology, dentistry and clinical physiology departments. More than 160 terabytes of information are stored in the enterprise repository. Interoperability is developed according to the IHE mission, i.e. applying standards such as Digital Imaging and Communications in Medicine (DICOM) and Health Level 7 (HL7) to address specific clinical communication needs and support optimal patient care. The entire enterprise environment is implemented and used daily in WRS. The central prerequisites in the development of the enterprise environment in the western region of Sweden were: 1) information harmonization, 2) reuse of standardized messages, e.g. HL7 v2.x and v3.x, 3) development of a holistic information domain including both text and images, and 4) creation of a continuous and dynamic update functionality. The central challenges in this project were: 1) the many different vendors acting in the region and the negotiations with them to apply communication roles/profiles such as HL7 (CDA, CCR), DICOM, and XML, 2) the question of who owns the data, and 3) incomplete technical standards. This study concludes that to create a workflow that runs within an enterprise environment, a number of central prerequisites and challenges need to be addressed. This calls for negotiations on an international, national and regional level with standardization organizations, vendors, health management and health personnel.
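At the message level, much of the HL7 v2.x integration described above rests on a very simple wire format: segments separated by carriage returns, fields separated by pipes. The fabricated message below shows the idea; it is a toy example, not a record from the WRS environment.

    # Parse a minimal, fabricated HL7 v2.x message with plain string handling.
    message = ("MSH|^~\\&|RIS|WRS|PACS|WRS|200901011200||ORM^O01|12345|P|2.3\r"
               "PID|1||4711||Svensson^Anna")

    for segment in message.split("\r"):
        fields = segment.split("|")
        if fields[0] == "PID":
            print("Patient id:", fields[3], "name:", fields[5])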
Meaningful use of health information technology and declines in in-hospital adverse drug events.
Furukawa, Michael F; Spector, William D; Rhona Limcangco, M; Encinosa, William E
2017-07-01
Nationwide initiatives have promoted greater adoption of health information technology as a means to reduce adverse drug events (ADEs). Hospital adoption of electronic health records with Meaningful Use (MU) capabilities expected to improve medication safety has grown rapidly. However, evidence that MU capabilities are associated with declines in in-hospital ADEs has been lacking. Data came from the 2010-2013 Medicare Patient Safety Monitoring System and the 2008-2013 Healthcare Information and Management Systems Society (HIMSS) Analytics Database. Two-level random intercept logistic regression was used to estimate the association of MU capabilities with the occurrence of ADEs, adjusting for patient characteristics, hospital characteristics, and year of observation. Rates of in-hospital ADEs declined by 19% from 2010 to 2013. Adoption of MU capabilities was associated with 11% lower odds of an ADE (95% confidence interval [CI], 0.84-0.96). Interoperability capability was associated with 19% lower odds of an ADE (95% CI, 0.67-0.98). Adoption of MU capabilities explained 22% of the observed reduction in ADEs, or about 67,000 ADEs averted by MU. Concurrent with the rapid uptake of MU and interoperability, the occurrence of in-hospital ADEs declined significantly from 2010 to 2013. MU capabilities and interoperability were associated with lower occurrence of ADEs, but the effects did not vary by experience with MU. About one-fifth of the decline in ADEs from 2010 to 2013 was attributable to MU capabilities. The findings support the contention that adoption of MU capabilities and interoperability, spurred by the Health Information Technology for Economic and Clinical Health Act, contributed in part to the recent decline in ADEs.
Semantic Interoperability Almost Without Using The Same Vocabulary: Is It Possible?
NASA Astrophysics Data System (ADS)
Krisnadhi, A. A.
2016-12-01
Semantic interoperability, which is a key requirement in realizing cross-repository data integration, is often understood as using the same ontology or vocabulary. Consequently, within a particular domain, one can easily assume that there has to be one unifying domain ontology covering as many vocabulary terms in the domain as possible in order to realize any form of data integration across multiple data sources. Furthermore, the desire to provide very precise definitions of those many terms has led to the development of huge foundational and domain ontologies that are comprehensive, but too complicated, restrictive, monolithic, and difficult to use and reuse, which causes common data providers to avoid using them. This problem is especially acute in a domain as diverse as the geosciences, as it is virtually impossible to reach agreement on the semantics of many terms (e.g., there are hundreds of definitions of forest used throughout the world). To overcome this challenge, a modular ontology architecture has emerged in recent years, fueled, among other things, by advances in ontology design pattern research. Each ontology pattern models only one key notion and can act as a small module of a larger ontology. Such a module is developed in such a way that it is largely independent of how other notions in the same domain are modeled, which leads to increased reusability. Furthermore, an ontology formed out of such modules has improved understandability over large, monolithic ontologies. Semantic interoperability in this architecture is achieved not by enforcing the use of the same vocabulary but by promoting alignment to the same ontology patterns. In this work, we elaborate how this architecture realizes the above idea. In particular, we describe how multiple data sources with differing perspectives and vocabularies can interoperate through this architecture. Building the solution upon semantic technologies such as Linked Data and the Web Ontology Language (OWL), we demonstrate how a data integration solution based on this idea can be realized over different data repositories.
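The pattern-alignment idea can be made concrete with a few RDF triples: two communities keep their own class names, and each aligns to the same ontology design pattern. Everything below (namespaces, class names) is invented for illustration; rdflib is used only as a convenient triple store.

    from rdflib import Graph, Namespace, RDFS

    PATTERN = Namespace("http://example.org/patterns/vegetation#")
    AGENCY_A = Namespace("http://example.org/agencyA/terms#")
    AGENCY_B = Namespace("http://example.org/agencyB/terms#")

    g = Graph()
    # Neither agency adopts the other's vocabulary; both align to the pattern.
    g.add((AGENCY_A.ForestStand, RDFS.subClassOf, PATTERN.VegetatedArea))
    g.add((AGENCY_B.Woodland, RDFS.subClassOf, PATTERN.VegetatedArea))
    print(g.serialize(format="turtle"))

A query phrased against PATTERN.VegetatedArea now retrieves data described with either agency's terms, which is the sense in which interoperability is achieved almost without using the same vocabulary.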
Progress of Interoperability in Planetary Research for Geospatial Data Analysis
NASA Astrophysics Data System (ADS)
Hare, T. M.; Gaddis, L. R.
2015-12-01
For nearly a decade there has been a push in the planetary science community to support interoperable methods for accessing and working with geospatial data. Common geospatial data products for planetary research include image mosaics, digital elevation or terrain models, geologic maps, geographic location databases (e.g., craters, volcanoes) or any data that can be tied to the surface of a planetary body (including moons, comets or asteroids). Several U.S. and international cartographic research institutions have converged on mapping standards that embrace standardized image formats that retain geographic information (e.g., GeoTIFF, GeoJPEG2000), digital geologic mapping conventions, planetary extensions for symbols that comply with U.S. Federal Geographic Data Committee cartographic and geospatial metadata standards, and notably on-line mapping services as defined by the Open Geospatial Consortium (OGC). The latter include defined standards such as the OGC Web Map Service (simple image maps), Web Feature Service (feature streaming), Web Coverage Service (rich scientific data streaming), and Catalog Service for the Web (data searching and discoverability). While these standards were developed for application to Earth-based data, they have been modified to support the planetary domain. The motivation for supporting common, interoperable data format and delivery standards is not only to improve access to higher-level products but also to address the increasingly distributed nature of the rapidly growing volumes of data. The strength of the OGC approach is that it provides consistent access to data that are distributed across many facilities. While data-streaming standards are well supported by the more sophisticated tools used in the Geographic Information System (GIS) and remote sensing industries, they are also supported by many lightweight browsers, which facilitates both large and small focused science applications as well as public use. Here we provide an overview of the interoperability initiatives that are currently ongoing in the planetary research community, examples of their successful application, and challenges that remain.
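As a sketch of what such interoperable access looks like in practice, the snippet below issues a standard WMS GetMap request for a global mosaic. The endpoint and layer name are placeholders; the query parameters follow WMS 1.1.1 KVP conventions.

    import requests

    # Request a (hypothetical) Mars mosaic layer from a WMS endpoint.
    resp = requests.get("https://example.org/mars/wms", params={
        "service": "WMS", "version": "1.1.1", "request": "GetMap",
        "layers": "mola_color_shade",
        "srs": "EPSG:4326", "bbox": "-180,-90,180,90",
        "width": 1024, "height": 512, "format": "image/jpeg",
    }, timeout=60)
    with open("mars_mosaic.jpg", "wb") as f:
        f.write(resp.content)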
Enabling OpenID Authentication for VO-integrated Portals
NASA Astrophysics Data System (ADS)
Plante, R.; Yekkirala, V.; Baker, W.
2012-09-01
To support interoperating services that share proprietary data and other user-specific information, the VAO Project provides login services for browser-based portals built on the open standard, OpenID. To help portal developers take advantage of this service, we have developed a downloadable toolkit for integrating OpenID single sign-on support into any portal. This toolkit provides APIs in a few languages commonly used on the server-side as well as a command-line version for use in any language. In addition to describing how to use this toolkit, we also discuss the general VAO framework for single sign-on. While a portal may, if it wishes, support any OpenID provider, the VAO service provides a few extra features to support VO interoperability. This includes a portal's ability to retrieve (with the user's permission) an X.509 certificate representing the authenticated user so that the portal can access other restricted services on the user's behalf. Other standard features of OpenID allow portals to request other information about the user; this feature will be used in the future for sharing information about a user's group membership to enable sharing within a group of collaborating scientists.
NASA Astrophysics Data System (ADS)
Pulsifer, P. L.; Parsons, M. A.; Duerr, R. E.; Fox, P. A.; Khalsa, S. S.; McCusker, J. P.; McGuinness, D. L.
2012-12-01
To address interoperability, we first need to understand how human perspectives and worldviews influence the way people conceive of and describe geophysical phenomena. There is never a single, unambiguous description of a phenomenon - the terminology used is based on the relationship people have with it and what their interests are. So how can these perspectives be reconciled in a way that is not only clear to different people but also formally described so that information systems can interoperate? In this paper we explore conceptions of Arctic sea ice as a means of exploring these issues. We examine multiple conceptions of sea ice and related processes as fundamental components of the Earth system. Arctic sea ice is undergoing rapid and dramatic decline. This will have huge impact on climate and biological systems as well as on shipping, exploration, human culture, and geopolitics. Local hunters, operational shipping forecasters, global climate researchers, and others have critical needs for sea ice data and information, but they conceive of, and describe sea ice phenomena in very different ways. Our hypothesis is that formally representing these diverse conceptions in a suite of formal ontologies can help facilitate sharing of information across communities and enhance overall Arctic data interoperability. We present initial work to model operational, research, and Indigenous (Iñupiat and Yup'ik) concepts of sea ice phenomena and data. Our results illustrate important and surprising differences in how these communities describe and represent sea ice, and we describe our approach to resolving incongruities and inconsistencies. We begin by exploring an intriguing information artifact, the World Meteorological Organization "egg code". The egg code is a compact, information rich way of illustrating detailed ice conditions that has been used broadly for a century. There is much agreement on construction and content encoding, but there are important regional differences in its application. Furthermore, it is an analog encoding scheme whose meaning has evolved over time. By semantically modeling the egg code, its subtle variations, and how it connects to other data, we illustrate a mechanism for translating across data formats and representations. But there are limits to what semantically modeling the egg-code can achieve. The egg-code and common operational sea ice formats do not address community needs, notably the timing and processes of sea ice freeze-up and break-up which have profound impact on local hunting, shipping, oil exploration, and safety. We work with local experts from four very different Indigenous communities and scientific creators of sea ice forecasts to establish an understanding of concepts and terminology related to fall freeze-up and spring break up from the individually represented regions. This helps expand our conceptions of sea ice while also aiding in understanding across cultures and communities, and in passing knowledge to younger generations. This is an early step to expanding concepts of interoperability to very different ways of knowing to make data truly relevant and locally useful.
NASA Astrophysics Data System (ADS)
Simonis, Ingo
2015-04-01
Traditional Spatial Data Infrastructures focus on aspects such as description and discovery of geospatial data, integration of these data into processing workflows, and representation of fusion or other data analysis results. Though lots of interoperability agreements still need to be worked out to achieve a satisfying level of interoperability within large scale initiatives such as INSPIRE, new technologies, use cases and requirements are constantly emerging from the user community. This paper focuses on three aspects that came up recently: the integration of social media data into SDIs, synchronization aspects between datasets used by field workers in shared resources environments, and the generation and maintenance of data for mixed-mode online/offline situations that can be easily packed, delivered, modified, and synchronized with reference data sets. The work described in this paper results from the latest testbed executed by the Open Geospatial Consortium, OGC. The testbed is part of the interoperability program (IP), which constitutes a significant part of the OGC standards development process. The IP has a number of instruments to enhance geospatial standards and technologies, such as Testbeds, Pilot Projects, Interoperability Experiments, and Interoperability Expert Services. These activities are designed to encourage rapid development, testing, validation, demonstration and adoption of open, consensus based standards and best practices. The latest global activity, testbed-11, aims at exploring new technologies and architectural approaches to enrich and extend traditional spatial data infrastructures with data from Social Media, improved data synchronization, and the capability to take data to the field in new synchronized data containers called GeoPackages. Social media sources are a valuable supplement to providing up to date information in distributed environments. Following an uncoordinated crowdsourcing approach, social media data can be both overwhelming in volume and questionable in its accuracy and legitimacy. Testbed-11 explores how best to make use of such sources of information and how to deal with inherent issues with data from platforms such as OpenStreetMap, Twitter, tumblr, flickr, Snapchat, Facebook, Instagram, YouTube, Vimeo, Panoramio, Pinterest, Picasa or storyful. Further important aspects highlighted here are the synchronization of data and the capability to take complex data sets of any size on mobile devices to the field - and keeping them in sync with reference data stores. In particular in emergency management situations, it is crucial to ensure properly synchronized data sets across different types of data stores and applications. Often data is taken to the field on mobile devices, where it gets updated or annotated. Though bandwidth continues to improve, requirements on data quality and complexity grow in parallel. Intermittent connectivity is paired with high security requirements that have to be fulfilled. This paper discusses the latest approaches using synchronization services and synchronized GeoPackages, the new container format for geospatial data.
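Part of what makes GeoPackage attractive for the offline field scenario is that it is a single SQLite file, so its contents can be inspected with nothing but the Python standard library. The file name below is a placeholder; gpkg_contents is the mandatory catalogue table defined by the OGC GeoPackage specification.

    import sqlite3

    # List the layers carried inside a (hypothetical) GeoPackage file.
    conn = sqlite3.connect("field_data.gpkg")
    for table, dtype, ident in conn.execute(
            "SELECT table_name, data_type, identifier FROM gpkg_contents"):
        print(f"{table}: {dtype} ({ident})")
    conn.close()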
Data Modeling Challenges of Advanced Interoperability.
Blobel, Bernd; Oemig, Frank; Ruotsalainen, Pekka
2018-01-01
Progressive health paradigms, involving many different disciplines and combining multiple policy domains, require advanced interoperability solutions. This results in special challenges for modeling health systems. The paper discusses classification systems for data models and enterprise business architectures and compares them with the ISO Reference Architecture. On that basis, existing definitions, specifications and standards of data models for interoperability are evaluated and their limitations are discussed. Amendments to correctly use those models and to better meet the aforementioned challenges are offered.
Semantic interoperability--HL7 Version 3 compared to advanced architecture standards.
Blobel, B G M E; Engel, K; Pharow, P
2006-01-01
To meet the challenge for high quality and efficient care, highly specialized and distributed healthcare establishments have to communicate and co-operate in a semantically interoperable way. Information and communication technology must be open, flexible, scalable, knowledge-based and service-oriented as well as secure and safe. For enabling semantic interoperability, a unified process for defining and implementing the architecture, i.e. structure and functions of the cooperating systems' components, as well as the approach for knowledge representation, i.e. the used information and its interpretation, algorithms, etc. have to be defined in a harmonized way. Deploying the Generic Component Model, systems and their components, underlying concepts and applied constraints must be formally modeled, strictly separating platform-independent from platform-specific models. As HL7 Version 3 claims to represent the most successful standard for semantic interoperability, HL7 has been analyzed regarding the requirements for model-driven, service-oriented design of semantic interoperable information systems, thereby moving from a communication to an architecture paradigm. The approach is compared with advanced architectural approaches for information systems such as OMG's CORBA 3 or EHR systems such as GEHR/openEHR and CEN EN 13606 Electronic Health Record Communication. HL7 Version 3 is maturing towards an architectural approach for semantic interoperability. Despite current differences, there is a close collaboration between the teams involved guaranteeing a convergence between competing approaches.
Khan, Wajahat Ali; Khattak, Asad Masood; Hussain, Maqbool; Amin, Muhammad Bilal; Afzal, Muhammad; Nugent, Christopher; Lee, Sungyoung
2014-08-01
Heterogeneity in the management of complex medical data obstructs the attainment of data-level interoperability among Health Information Systems (HIS). This diversity stems from the compliance of HISs with different healthcare standards. Its solution demands a mediation system for the accurate interpretation of data in different heterogeneous formats. We propose an adaptive AdapteR Interoperability ENgine mediation system, called ARIEN, that arbitrates between HISs compliant with different healthcare standards to achieve accurate and seamless information exchange and thus data interoperability. ARIEN stores the semantic mapping information between different standards in the Mediation Bridge Ontology (MBO) using ontology matching techniques. These mappings are provided by our System for Parallel Heterogeneity (SPHeRe) matching system and a Personalized-Detailed Clinical Model (P-DCM) approach to guarantee the accuracy of the mappings. The effectiveness of the mappings stored in the MBO was evaluated by the accuracy of the transformation process among different standard formats. We evaluated our proposed system on the transformation of medical records between the Clinical Document Architecture (CDA) and Virtual Medical Record (vMR) standards. The transformation process achieved over 90% accuracy in the conversion between CDA and vMR using a pattern-oriented approach based on the MBO. The proposed mediation system improves the overall communication process between HISs. It provides accurate and seamless medical information exchange to ensure data interoperability and timely healthcare services to patients.
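The role of the MBO can be pictured as a mapping table that drives field-by-field transformation between record formats. The sketch below is a deliberately generic stand-in: the paths, the mapping and the records are all invented, and the real system works on full CDA/vMR documents rather than flat dictionaries.

    # Transform a record between two (invented) standard layouts via a mapping.
    CDA_TO_VMR = {
        "patient/name/given": "demographics/givenName",
        "patient/name/family": "demographics/familyName",
        "observation/code": "observationResult/code",
    }

    def transform(record: dict, mapping: dict) -> dict:
        return {mapping[path]: value
                for path, value in record.items() if path in mapping}

    cda_record = {"patient/name/given": "Anna", "patient/name/family": "Svensson"}
    print(transform(cda_record, CDA_TO_VMR))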
A Framework for Seamless Interoperation of Heterogeneous Distributed Software Components
2005-05-01
interoperability, b) distributed resource discovery, and c) validation of quality requirements. Principles and prototypical systems were created to demonstrate the successful completion of the research.
The BACnet Campus Challenge - Part 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Masica, Ken; Tom, Steve
2015-12-01
Here, the BACnet protocol was designed to achieve interoperability among building automation vendors and evolve over time to include new functionality as well as support new communication technologies such as the Ethernet and IP protocols as they became prevalent and economical in the marketplace. For large multi-building, multi-vendor campus environments, standardizing on the BACnet protocol as an implementation strategy can be a key component in meeting the challenge of an interoperable, flexible, and scalable building automation system. The interoperability of BACnet is especially important when large campuses with legacy equipment have DDC upgrades to facilities performed over different time frames and use different contractors that install equipment from different vendors under the guidance of different campus HVAC project managers. In these circumstances, BACnet can serve as a common foundation for interoperability when potential variability exists in approaches to the design-build process by numerous parties over time. Likewise, BACnet support for a range of networking protocols and technologies can be a key strategy for achieving flexible and scalable automation systems as campuses and enterprises expand networking infrastructures using standard interoperable protocols like IP and Ethernet.
Identity Management Systems in Healthcare: The Issue of Patient Identifiers
NASA Astrophysics Data System (ADS)
Soenens, Els
According to a recent recommendation of the European Commission, now is the time for Europe to enhance interoperability in eHealth. Although interoperability of patient identifiers seems promising for matters of patient mobility, patient empowerment and effective access to care, we see that today there is indeed a considerable lack of interoperability in the field of patient identification. Looking from a socio-technical rather than a merely technical point of view, one can understand that the development and implementation of an identity management system in a specific healthcare context is influenced by particular social practices, affected by socio-economic history and the political climate, and regulated by specific data protection legislation. Consequently, the process of making patient identification in Europe more interoperable is a development beyond the semantic and syntactic levels. In this paper, we give some examples of today's patient identifier systems in Europe, discuss the issue of interoperability of (unique) patient identifiers from a socio-technical point of view and try not to ignore the 'privacy side' of the story.
Smart Grid Interoperability Maturity Model Beta Version
DOE Office of Scientific and Technical Information (OSTI.GOV)
Widergren, Steven E.; Drummond, R.; Giroti, Tony
The GridWise Architecture Council was formed by the U.S. Department of Energy to promote and enable interoperability among the many entities that interact with the electric power system. This balanced team of industry representatives proposes principles for the development of interoperability concepts and standards. The Council provides industry guidance and tools that make it an available resource for smart grid implementations. In the spirit of advancing interoperability of an ecosystem of smart grid devices and systems, this document presents a model for evaluating the maturity of the artifacts and processes that specify the agreement of parties to collaborate across an information exchange interface. You are expected to have a solid understanding of large, complex system integration concepts and experience in dealing with software component interoperation. Those without this technical background should read the Executive Summary for a description of the purpose and contents of the document. Other documents, such as checklists, guides, and whitepapers, exist for targeted purposes and audiences. Please see the www.gridwiseac.org website for more products of the Council that may be of interest to you.
Javitt, Daniel C; Rabinowicz, Esther; Silipo, Gail; Dias, Elisa C
2007-03-01
Deficits in working memory performance are among the most widely replicated findings in schizophrenia. The respective roles of encoding vs. memory retention in working memory remain unresolved. The present study evaluated working memory performance in schizophrenia using an AX-type continuous performance test (AX-CPT) paradigm. Participants included 48 subjects with schizophrenia and 27 comparison subjects. Behavior was assessed in 3 versions of the task, which differed in ease of cue interpretability. In the simple cue version of the task, cue letters were replaced with red or green circles. In the complex cue version, letter/color conjunctions served as cues. In the base version of the task, patients showed increased rates of false alarms to invalidly cued targets, similar to prior reports. However, when the cue stimuli were replaced with green or red circles to ease interpretation, patients showed false alarm rates similar to controls. When feature conjunction cues were used, patients were again disproportionately affected relative to controls. No significant group by interstimulus interval interaction effects were observed in either the simple or complex cue conditions, suggesting normal retention of information even in the presence of overall performance decrements. These findings suggest, first, that cue manipulation disproportionately affects AX-CPT performance in schizophrenia and, second, that substantial behavioral deficits may be observed on working memory tasks even in the absence of disturbances in mnemonic retention.
Exploring NASA GES DISC Data with Interoperable Services
NASA Technical Reports Server (NTRS)
Zhao, Peisheng; Yang, Wenli; Hegde, Mahabal; Wei, Jennifer C.; Kempler, Steven; Pham, Long; Teng, William; Savtchenko, Andrey
2015-01-01
Overview of NASA GES DISC (NASA Goddard Earth Science Data and Information Services Center) data with interoperable services. Open-standard and interoperable services:
- Improve data discoverability, accessibility, and usability with metadata, catalogue and portal standards
- Achieve data, information and knowledge sharing across applications with standardized interfaces and protocols
Open Geospatial Consortium (OGC) data services and specifications:
- Web Coverage Service (WCS) -- data
- Web Map Service (WMS) -- pictures of data
- Web Map Tile Service (WMTS) -- pictures of data tiles
- Styled Layer Descriptors (SLD) -- rendered styles
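To make the listed OGC access path concrete, the sketch below issues a standard WMS 1.3.0 GetMap request using only Python's standard library. The endpoint and layer name are placeholders, not actual GES DISC identifiers.

    # Minimal OGC WMS 1.3.0 GetMap request; endpoint and layer are placeholders.
    from urllib.parse import urlencode
    from urllib.request import urlopen

    WMS_ENDPOINT = "https://example.org/wms"   # placeholder endpoint

    params = {
        "service": "WMS",
        "version": "1.3.0",
        "request": "GetMap",
        "layers": "precipitation_rate",        # illustrative layer name
        "styles": "",
        "crs": "EPSG:4326",
        "bbox": "-90,-180,90,180",             # WMS 1.3.0 axis order for EPSG:4326
        "width": "1024",
        "height": "512",
        "format": "image/png",
    }

    with urlopen(WMS_ENDPOINT + "?" + urlencode(params)) as resp:
        png_bytes = resp.read()                # a rendered "picture of data"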
NASA Technical Reports Server (NTRS)
Jones, Michael K.
1998-01-01
Various issues associated with interoperability for space mission monitor and control are presented in viewgraph form. Specific topics include: 1) Space Project Mission Operations Control Architecture (SuperMOCA) goals and methods for achieving them; 2) Specifics on the architecture: open standards and layering, enhancing interoperability, and promoting commercialization; 3) An advertisement; 4) Status of the task - government/industry cooperation and architecture and technology demonstrations; and 5) Key features of messaging services and virtual devices.
Dhaval, Rakesh; Borlawsky, Tara; Ostrander, Michael; Santangelo, Jennifer; Kamal, Jyoti; Payne, Philip R O
2008-11-06
In order to enhance interoperability between enterprise systems, and improve data validity and reliability throughout The Ohio State University Medical Center (OSUMC), we have initiated the development of an ontology-anchored metadata architecture and knowledge collection for our enterprise data warehouse. The metadata and corresponding semantic relationships stored in the OSUMC knowledge collection are intended to promote consistency and interoperability across the heterogeneous clinical, research, business and education information managed within the data warehouse.
Interoperability Outlook in the Big Data Future
NASA Astrophysics Data System (ADS)
Kuo, K. S.; Ramachandran, R.
2015-12-01
The establishment of distributed active archive centers (DAACs) as data warehouses and the standardization of file format by NASA's Earth Observing System Data Information System (EOSDIS) doubtless propelled interoperability of NASA Earth science data to unprecedented heights in the 1990s. However, we still feel wanting two decades later. We believe the inadequate interoperability we experience results from the current practice in which data are first packaged into files before distribution, and only the metadata of these files are cataloged into databases and become searchable. Data therefore cannot be efficiently filtered. Any extensive study thus requires downloading large volumes of data files to a local system for processing and analysis. The need to download data not only creates duplication and inefficiency but also further impedes interoperability, because the analysis has to be performed locally by individual researchers in individual institutions. Each institution or researcher often has its/his/her own preference in the choice of data management practice as well as programming languages. Analysis results (derived data) so produced are thus subject to the differences of these practices, which later form formidable barriers to interoperability. A number of Big Data technologies are currently being examined and tested to address Big Earth Data issues. These technologies share one common characteristic: exploiting compute and storage affinity to more efficiently analyze large volumes and great varieties of data. Distributed active "archive" centers are likely to evolve into distributed active "analysis" centers, which not only archive data but also provide analysis service right where the data reside. "Analysis" will become the more visible function of these centers. It is thus reasonable to expect interoperability to improve because analysis, in addition to data, becomes more centralized. Within a "distributed active analysis center" interoperability is almost guaranteed because data, analysis, and results can all be readily shared and reused. Effectively, with the establishment of "distributed active analysis centers", interoperation turns from a many-to-many problem into a less complicated few-to-few problem and becomes easier to solve.
NASA Astrophysics Data System (ADS)
Thomas, Paul A.; Marshall, Gillian; Faulkner, David; Kent, Philip; Page, Scott; Islip, Simon; Oldfield, James; Breckon, Toby P.; Kundegorski, Mikolaj E.; Clark, David J.; Styles, Tim
2016-05-01
Currently, most land Intelligence, Surveillance and Reconnaissance (ISR) assets (e.g. EO/IR cameras) are simply data collectors. Understanding, decision making and sensor control are performed by the human operators, involving high cognitive load. Any automation in the system has traditionally involved bespoke design of centralised systems that are highly specific for the assets/targets/environment under consideration, resulting in complex, non-flexible systems that exhibit poor interoperability. We address a concept of Autonomous Sensor Modules (ASMs) for land ISR, where these modules have the ability to make low-level decisions on their own in order to fulfil a higher-level objective, and plug in, with the minimum of preconfiguration, to a High Level Decision Making Module (HLDMM) through a middleware integration layer. The dual requisites of autonomy and interoperability create challenges around information fusion and asset management in an autonomous hierarchical system, which are addressed in this work. This paper presents the results of a demonstration system, known as Sensing for Asset Protection with Integrated Electronic Networked Technology (SAPIENT), which was shown in realistic base protection scenarios with live sensors and targets. The SAPIENT system performed sensor cueing, intelligent fusion, sensor tasking, target hand-off and compensation for compromised sensors, without human control, and enabled rapid integration of ISR assets at the time of system deployment, rather than at design-time. Potential benefits include rapid interoperability for coalition operations, situation understanding with low operator cognitive burden and autonomous sensor management in heterogeneous sensor systems.
Katayama, Toshiaki; Arakawa, Kazuharu; Nakao, Mitsuteru; Ono, Keiichiro; Aoki-Kinoshita, Kiyoko F; Yamamoto, Yasunori; Yamaguchi, Atsuko; Kawashima, Shuichi; Chun, Hong-Woo; Aerts, Jan; Aranda, Bruno; Barboza, Lord Hendrix; Bonnal, Raoul Jp; Bruskiewich, Richard; Bryne, Jan C; Fernández, José M; Funahashi, Akira; Gordon, Paul Mk; Goto, Naohisa; Groscurth, Andreas; Gutteridge, Alex; Holland, Richard; Kano, Yoshinobu; Kawas, Edward A; Kerhornou, Arnaud; Kibukawa, Eri; Kinjo, Akira R; Kuhn, Michael; Lapp, Hilmar; Lehvaslaiho, Heikki; Nakamura, Hiroyuki; Nakamura, Yasukazu; Nishizawa, Tatsuya; Nobata, Chikashi; Noguchi, Tamotsu; Oinn, Thomas M; Okamoto, Shinobu; Owen, Stuart; Pafilis, Evangelos; Pocock, Matthew; Prins, Pjotr; Ranzinger, René; Reisinger, Florian; Salwinski, Lukasz; Schreiber, Mark; Senger, Martin; Shigemoto, Yasumasa; Standley, Daron M; Sugawara, Hideaki; Tashiro, Toshiyuki; Trelles, Oswaldo; Vos, Rutger A; Wilkinson, Mark D; York, William; Zmasek, Christian M; Asai, Kiyoshi; Takagi, Toshihisa
2010-08-21
Web services have become a key technology for bioinformatics, since life science databases are globally decentralized and the exponential increase in the amount of available data demands efficient systems without the need to transfer entire databases for every step of an analysis. However, various incompatibilities among database resources and analysis services make it difficult to connect and integrate these into interoperable workflows. To resolve this situation, we invited domain specialists from web service providers, client software developers, Open Bio* projects, the BioMoby project and researchers of emerging areas where a standard exchange data format is not well established, for an intensive collaboration entitled the BioHackathon 2008. The meeting was hosted by the Database Center for Life Science (DBCLS) and Computational Biology Research Center (CBRC) and was held in Tokyo from February 11th to 15th, 2008. In this report we highlight the work accomplished and the common issues arising from this event, including the standardization of data exchange formats and services in the emerging fields of glycoinformatics, biological interaction networks, text mining, and phyloinformatics. In addition, common shared object development based on BioSQL, as well as technical challenges in large data management, asynchronous services, and security are discussed. Consequently, we improved interoperability of web services in several fields; however, further cooperation among major database centers and continued collaborative efforts between service providers and software developers are still necessary for an effective advance in bioinformatics web service technologies. PMID:20727200
Biomedical imaging ontologies: A survey and proposal for future work
Smith, Barry; Arabandi, Sivaram; Brochhausen, Mathias; Calhoun, Michael; Ciccarese, Paolo; Doyle, Scott; Gibaud, Bernard; Goldberg, Ilya; Kahn, Charles E.; Overton, James; Tomaszewski, John; Gurcan, Metin
2015-01-01
Background: Ontology is one strategy for promoting interoperability of heterogeneous data through consistent tagging. An ontology is a controlled structured vocabulary consisting of general terms (such as “cell” or “image” or “tissue” or “microscope”) that form the basis for such tagging. These terms are designed to represent the types of entities in the domain of reality that the ontology has been devised to capture; the terms are provided with logical definitions thereby also supporting reasoning over the tagged data. Aim: This paper provides a survey of the biomedical imaging ontologies that have been developed thus far. It outlines the challenges, particularly faced by ontologies in the fields of histopathological imaging and image analysis, and suggests a strategy for addressing these challenges in the example domain of quantitative histopathology imaging. Results and Conclusions: The ultimate goal is to support the multiscale understanding of disease that comes from using interoperable ontologies to integrate imaging data with clinical and genomics data. PMID:26167381
A Framework for a Decision Support System in a Hierarchical Extended Enterprise Decision Context
NASA Astrophysics Data System (ADS)
Boza, Andrés; Ortiz, Angel; Vicens, Eduardo; Poler, Raul
Decision Support System (DSS) tools provide useful information to decision makers. In an Extended Enterprise, a new goal, changes in the current objectives, or small changes in the extended enterprise configuration necessitate an adjustment of its decision system. A DSS in this context must be flexible and agile, allowing easy and quick adaptation to the new context. This paper proposes to extend the Hierarchical Production Planning (HPP) structure to an Extended Enterprise decision-making context. In this way, a framework for DSS in the Extended Enterprise context is defined using components of HPP. Interoperability details have been reviewed to identify their impact on this framework. The proposed framework helps overcome some interoperability barriers, identifies and organizes components for a DSS in the Extended Enterprise context, and works towards the definition of an architecture to be used in the design process of a flexible DSS in the Extended Enterprise context which can reuse components for future Extended Enterprise configurations.
Software design and implementation concepts for an interoperable medical communication framework.
Besting, Andreas; Bürger, Sebastian; Kasparick, Martin; Strathen, Benjamin; Portheine, Frank
2018-02-23
The new IEEE 11073 service-oriented device connectivity (SDC) standard proposals for networked point-of-care and surgical devices constitute the basis for improved interoperability due to their vendor independence. To accelerate the adoption of the standard, a reference implementation is indispensable. However, the implementation of such a framework has to overcome several non-trivial challenges. First, the high level of complexity of the underlying standard must be reflected in the software design. An efficient implementation has to consider the limited resources of the underlying hardware. Moreover, the framework's purpose of realizing a distributed system demands a high degree of reliability of the framework itself and its internal mechanisms. Additionally, a framework must provide an easy-to-use and fail-safe application programming interface (API). In this work, we address these challenges by discussing suitable software engineering principles and practical coding guidelines. A descriptive model is developed that identifies key strategies. General feasibility is shown by outlining environments in which our implementation has been utilized.
Ubiquitous healthcare computing with SEnsor Grid Enhancement with Data Management System (SEGEDMA).
Preve, Nikolaos
2011-12-01
Wireless Sensor Networks (WSNs) can be deployed to monitor the health of patients suffering from critical diseases. A wireless network consisting of biomedical sensors can also be implanted into the patient's body to monitor the patient's condition. These sensor devices, apart from having an enormous capability of collecting data from their physical surroundings, are resource-constrained in nature, with limited processing and communication ability. Therefore we have to integrate them with Grid technology in order to process and store the data collected by the sensor nodes. In this paper, we propose the SEnsor Grid Enhancement Data Management system, called SEGEDMA, ensuring the integration of different network technologies and continuous data access for system users. The main contribution of this work is to achieve the interoperability of both technologies through a novel network architecture, ensuring also the interoperability of Open Geospatial Consortium (OGC) and HL7 standards. According to the results, SEGEDMA can be applied successfully in a decentralized healthcare environment.
Hoehndorf, Robert; Dumontier, Michel; Oellrich, Anika; Rebholz-Schuhmann, Dietrich; Schofield, Paul N; Gkoutos, Georgios V
2011-01-01
Researchers design ontologies as a means to accurately annotate and integrate experimental data across heterogeneous and disparate data- and knowledge bases. Formal ontologies make the semantics of terms and relations explicit such that automated reasoning can be used to verify the consistency of knowledge. However, many biomedical ontologies do not sufficiently formalize the semantics of their relations and are therefore limited with respect to automated reasoning for large scale data integration and knowledge discovery. We describe a method to improve automated reasoning over biomedical ontologies and identify several thousand contradictory class definitions. Our approach aligns terms in biomedical ontologies with foundational classes in a top-level ontology and formalizes composite relations as class expressions. We describe the semi-automated repair of contradictions and demonstrate expressive queries over interoperable ontologies. Our work forms an important cornerstone for data integration, automatic inference and knowledge discovery based on formal representations of knowledge. Our results and analysis software are available at http://bioonto.de/pmwiki.php/Main/ReasonableOntologies.
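As a rough illustration of the kind of automated consistency checking described above (not the authors' alignment and repair pipeline), the sketch below loads an ontology with the owlready2 package, runs its bundled HermiT reasoner, and lists unsatisfiable classes. The ontology IRI is a placeholder.

    # Sketch only: detect contradictory (unsatisfiable) class definitions with
    # a DL reasoner via owlready2, which bundles HermiT. The IRI is a placeholder,
    # and the paper's alignment and repair steps are not reproduced here.
    from owlready2 import get_ontology, sync_reasoner, default_world

    onto = get_ontology("http://example.org/biomedical.owl").load()  # placeholder

    with onto:
        sync_reasoner()   # classify; unsatisfiable classes become owl:Nothing

    for cls in default_world.inconsistent_classes():
        print("contradictory class definition:", cls)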
OGC and Grid Interoperability in enviroGRIDS Project
NASA Astrophysics Data System (ADS)
Gorgan, Dorian; Rodila, Denisa; Bacu, Victor; Giuliani, Gregory; Ray, Nicolas
2010-05-01
EnviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is a 4-year FP7 project aiming to address the subjects of ecologically unsustainable development and inadequate resource management. The project develops a Spatial Data Infrastructure of the Black Sea Catchment region. Geospatial technologies offer very specialized functionality for Earth Science oriented applications, while Grid technology is able to support distributed and parallel processing. One challenge of the enviroGRIDS project is the interoperability between geospatial and Grid infrastructures by providing the basic and the extended features of both technologies. The geospatial interoperability technology has been promoted as a way of dealing with large volumes of geospatial data in distributed environments through the development of interoperable Web service specifications proposed by the Open Geospatial Consortium (OGC), with applications spread across multiple fields but especially in Earth observation research. Due to the huge volumes of data available in the geospatial domain and the additional issues introduced (data management, secure data transfer, data distribution and data computation), the need for an infrastructure capable of managing all those problems becomes an important aspect. The Grid promotes and facilitates the secure interoperation of geospatial heterogeneous distributed data within a distributed environment, the creation and management of large distributed computational jobs, and assures a security level for communication and transfer of messages based on certificates. This presentation analyses and discusses the most significant use cases for enabling OGC Web service interoperability with the Grid environment and focuses on the description and implementation of the most promising one. In these use cases we give special attention to issues such as: the relations between the computational grid and the OGC Web service protocols, the advantages offered by Grid technology - such as providing secure interoperability between distributed geospatial resources - and the issues introduced by the integration of distributed geospatial data in a secure environment: data and service discovery, management, access and computation. The enviroGRIDS project proposes a new architecture which allows a flexible and scalable approach for integrating the geospatial domain, represented by the OGC Web services, with the Grid domain, represented by the gLite middleware. The parallelism offered by the Grid technology is discussed and explored at the data level, management level and computation level. The analysis is carried out for OGC Web service interoperability in general, but specific details are emphasized for the Web Map Service (WMS), Web Feature Service (WFS), Web Coverage Service (WCS), Web Processing Service (WPS) and Catalog Service for the Web (CSW). Issues regarding the mapping and the interoperability between the OGC and the Grid standards and protocols are analyzed, as they are the base for solving the communication problems between the two environments: grid and geospatial. The presentation mainly highlights how the Grid environment and Grid application capabilities can be extended and utilized in geospatial interoperability.
Interoperability between geospatial and Grid infrastructures provides features such as the specific complex geospatial functionality and the high-power computation and security of the Grid, high spatial model resolution and geographical area coverage, and flexible combination and interoperability of the geographical models. In accordance with Service Oriented Architecture concepts and the requirements for interoperability between geospatial and Grid infrastructures, each main functionality is visible from the enviroGRIDS Portal and, consequently, from the end user applications such as Decision Maker/Citizen oriented Applications. The enviroGRIDS portal is the single way for the user to get into the system, and the portal presents a uniform graphical user interface style. Main reference for further information: [1] enviroGRIDS Project, http://www.envirogrids.net/
NASA Astrophysics Data System (ADS)
Richards, C. J.; Evans, B. J. K.; Wyborn, L. A.; Wang, J.; Trenham, C. E.; Druken, K. A.
2016-12-01
The Australian National Computational Infrastructure (NCI) has ingested over 10PB of national and international environmental, Earth systems science and geophysics reference data onto a single platform to advance high performance data (HPD) techniques that enable interdisciplinary data-intensive science. Improved data stewardship is critical to evolving both data and data services that support the increasing need for programmatic usability and that prioritise interoperability rather than just traditional data download or portal access. A data platform designed for programmatic access requires quality-checked collections that better utilise interoperable data formats and standards. Achieving this involves strategies to meet both the technical and 'social' challenges. Aggregating datasets used by different communities and organisations requires satisfying multiple use cases for the broader research community, whilst addressing existing business-as-usual (BAU) requirements. For NCI, this requires working with data stewards to manage the process of replicating data to the common platform, with community representatives and developers to confirm their requirements, and with international peers to better enable globally integrated data communities. It is particularly important to engage with representatives from each community who can work collaboratively towards a common goal, as well as capture their community needs, apply quality assurance, determine any barriers to change and understand priorities. This is critical when managing the aggregation of data collections from multiple producers with different levels of stewardship maturity, technologies and standards, and where organisational barriers can impact the transformation to interoperable and performant data access. To facilitate the management, development and operation of the HPD platform, NCI coordinates technical and domain committees made up of user representatives, data stewards and informatics experts to provide a forum to discuss, learn and advise NCI's management. This experience has been a useful collaboration and suggests that, in the age of interdisciplinary HPD research, data stewardship is evolving from a focus on the needs of a single community to one which helps balance priorities and navigate change for multiple communities.
Applying Sensor Web Technology to Marine Sensor Data
NASA Astrophysics Data System (ADS)
Jirka, Simon; del Rio, Joaquin; Mihai Toma, Daniel; Nüst, Daniel; Stasch, Christoph; Delory, Eric
2015-04-01
In this contribution we present two activities illustrating how Sensor Web technology helps to enable a flexible and interoperable sharing of marine observation data based on standards. An important foundation is the Sensor Web Architecture developed by the European FP7 project NeXOS (Next generation Low-Cost Multifunctional Web Enabled Ocean Sensor Systems Empowering Marine, Maritime and Fisheries Management). This architecture relies on the Open Geospatial Consortium's (OGC) Sensor Web Enablement (SWE) framework. It is an exemplary solution for facilitating the interoperable exchange of marine observation data within and between (research) organisations. The architecture addresses a series of functional and non-functional requirements which are fulfilled through different types of OGC SWE components. The diverse functionalities offered by the NeXOS Sensor Web architecture are shown in the following overview: - Pull-based observation data download: achieved through the OGC Sensor Observation Service (SOS) 2.0 interface standard. - Push-based delivery of observation data, allowing users to subscribe to new measurements that are relevant for them: for this purpose several specification activities are currently under evaluation (e.g. OGC Sensor Event Service, OGC Publish/Subscribe Standards Working Group). - (Web-based) visualisation of marine observation data: implemented through SOS client applications. - Configuration and controlling of sensor devices: ensured through the OGC Sensor Planning Service 2.0 interface. - Bridging between sensors/data loggers and Sensor Web components: for this purpose several components such as the "Smart Electronic Interface for Sensor Interoperability" (SEISI) concept are being developed; this is complemented by a more lightweight SOS extension (e.g. based on the W3C Efficient XML Interchange (EXI) format). To further advance this architecture, there is ongoing work to develop dedicated profiles of selected OGC SWE specifications that provide stricter guidance on how these standards shall be applied to marine data (e.g. SensorML 2.0 profiles stating which metadata elements are mandatory, building upon the ESONET Sensor Registry developments, etc.). Within the NeXOS project the presented architecture is implemented as a set of open source components. These implementations can be re-used by all interested scientists and data providers needing tools for publishing or consuming oceanographic sensor data. In further projects such as the European project FixO3 (Fixed-point Open Ocean Observatories), these software development activities are complemented by additional efforts to provide guidance on how Sensor Web technology can be applied in an efficient manner. This way, not only software components are made available but also documentation and information resources that help to understand which types of Sensor Web deployments are best suited to fulfil different types of user requirements.
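To make the pull-based download path concrete, here is a minimal OGC SOS 2.0 KVP GetObservation request issued with Python's standard library. The endpoint, offering and observed property are placeholder identifiers, not actual NeXOS resources.

    # Minimal OGC SOS 2.0 KVP GetObservation request; identifiers are placeholders.
    from urllib.parse import urlencode
    from urllib.request import urlopen

    SOS_ENDPOINT = "https://example.org/sos"   # placeholder endpoint

    params = {
        "service": "SOS",
        "version": "2.0.0",
        "request": "GetObservation",
        "offering": "ctd_station_1",                           # illustrative
        "observedProperty": "http://example.org/sea_water_temperature",
        "responseFormat": "http://www.opengis.net/om/2.0",     # O&M 2.0 XML
    }

    with urlopen(SOS_ENDPOINT + "?" + urlencode(params)) as resp:
        om_xml = resp.read()   # O&M-encoded observations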
A Framework for Resilient Remote Monitoring
2014-08-01
...of low-level observables are available, audited, and recorded. This establishes the need for a remote monitoring framework that can integrate with...
Bayer, Jennifer M.; Weltzin, Jake F.; Scully, Rebecca A.
2017-01-01
Objectives of the workshop were: 1) identify resources that support natural resource monitoring programs working across the data life cycle; 2) prioritize desired capacities and tools to facilitate monitoring design and implementation; 3) identify standards and best practices that improve discovery, accessibility, and interoperability of data across programs and jurisdictions; and 4) contribute to an emerging community of practice focused on natural resource monitoring.
Army Sustainment. Volume 47, Issue 5, September-October 2015
2016-02-29
...students with the opportunity to work directly with noncommissioned officers and warrant officers as part of their training (Maj. Brian J. ...). Issue contents also include "The Need to Expand Training and Education on Nonstandard Logistics" (Capt. Christopher J. Sheehan) and "Multinational Logistics Interoperability" (Capt. ...), along with departmental and masthead listings.
Text mining resources for the life sciences.
Przybyła, Piotr; Shardlow, Matthew; Aubin, Sophie; Bossy, Robert; Eckart de Castilho, Richard; Piperidis, Stelios; McNaught, John; Ananiadou, Sophia
2016-01-01
Text mining is a powerful technology for quickly distilling key information from vast quantities of biomedical literature. However, to harness this power the researcher must be well versed in the availability, suitability, adaptability, interoperability and comparative accuracy of current text mining resources. In this survey, we give an overview of the text mining resources that exist in the life sciences to help researchers, especially those employed in biocuration, to engage with text mining in their own work. We categorize the various resources under three sections: Content Discovery looks at where and how to find biomedical publications for text mining; Knowledge Encoding describes the formats used to represent the different levels of information associated with content that enable text mining, including those formats used to carry such information between processes; Tools and Services gives an overview of workflow management systems that can be used to rapidly configure and compare domain- and task-specific processes, via access to a wide range of pre-built tools. We also provide links to relevant repositories in each section to enable the reader to find resources relevant to their own area of interest. Throughout this work we give a special focus to resources that are interoperable: those that have the crucial ability to share information, enabling smooth integration and reusability. © The Author(s) 2016. Published by Oxford University Press.
Multiple video sequences synchronization during minimally invasive surgery
NASA Astrophysics Data System (ADS)
Belhaoua, Abdelkrim; Moreau, Johan; Krebs, Alexandre; Waechter, Julien; Radoux, Jean-Pierre; Marescaux, Jacques
2016-03-01
Hybrid operating rooms are an important development in the medical ecosystem. They allow integrating, in the same procedure, the advantages of radiological imaging and surgical tools. However, one of the challenges faced by clinical engineers is to support the connectivity and interoperability of medical-electrical point-of-care devices. A system that could enable plug-and-play connectivity and interoperability for medical devices would improve patient safety, save hospitals time and money, and provide data for electronic medical records. In this paper, we propose a hardware platform dedicated to collecting and synchronizing, in real time, multiple videos captured from medical equipment. The final objective is to integrate augmented reality technology into an operating room (OR) in order to assist the surgeon during a minimally invasive operation. To the best of our knowledge, there is no prior work dealing with hardware-based video synchronization for augmented reality applications in the OR. Hardware synchronization methods can embed a temporal value, a so-called timestamp, into each sequence on-the-fly and require no post-processing, although they require specialized hardware. However, the design of our hardware is simple and generic. This approach was adopted and implemented in this work, and its performance is evaluated by comparison to state-of-the-art methods.
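For illustration only (the paper's contribution is a hardware design, which is not reproduced here), the following software sketch shows the basic idea of aligning two timestamped streams by nearest-timestamp matching within a tolerance.

    # Toy alignment of two video streams by embedded timestamps: each frame is
    # just (timestamp_ms, frame_id); we match frames by nearest timestamp.
    from bisect import bisect_left

    def align(reference, other, tolerance_ms=20):
        """For each reference frame, find the other-stream frame closest in time."""
        times = [t for t, _ in other]
        pairs = []
        for t_ref, f_ref in reference:
            i = bisect_left(times, t_ref)
            # candidates: the neighbour on each side of the insertion point
            best = min(
                (c for c in (i - 1, i) if 0 <= c < len(other)),
                key=lambda c: abs(times[c] - t_ref),
            )
            if abs(times[best] - t_ref) <= tolerance_ms:
                pairs.append((f_ref, other[best][1]))
        return pairs

    cam  = [(0, "c0"), (33, "c1"), (66, "c2")]
    endo = [(2, "e0"), (35, "e1"), (70, "e2")]
    print(align(cam, endo))   # [('c0', 'e0'), ('c1', 'e1'), ('c2', 'e2')]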
Archetype modeling methodology.
Moner, David; Maldonado, José Alberto; Robles, Montserrat
2018-03-01
Clinical Information Models (CIMs) expressed as archetypes play an essential role in the design and development of current Electronic Health Record (EHR) information structures. Although many experiences of using archetypes are reported in the literature, a comprehensive and formal methodology for archetype modeling does not exist. Having a modeling methodology is essential to developing quality archetypes, in order to guide the development of EHR systems and to allow the semantic interoperability of health data. In this work, an archetype modeling methodology is proposed. This paper describes its phases, the inputs and outputs of each phase, and the involved participants and tools. It also includes the description of the possible strategies to organize the modeling process. The proposed methodology is inspired by existing best practices of CIM, software and ontology development. The methodology has been applied and evaluated in regional and national EHR projects. The application of the methodology provided useful feedback and improvements, and confirmed its advantages. The conclusion of this work is that having a formal methodology for archetype development facilitates the definition and adoption of interoperable archetypes, improves their quality, and facilitates their reuse among different information systems and EHR projects. Moreover, the proposed methodology can also be a reference for CIM development using any other formalism. Copyright © 2018 Elsevier Inc. All rights reserved.
Transportation communications interoperability : phase 2, resource evaluation.
DOT National Transportation Integrated Search
2006-12-01
Based on the Arizona Department of Transportation's (ADOT) previous SPR-561 Needs Assessment study, this report continues the efforts to enhance radio interoperability between Department of Public Safety (DPS) Highway Patrol officers and ...
Analysis of OPACITY and PLAID Protocols for Contactless Smart Cards
2012-09-01
Chao, Tian-Jy; Kim, Younghun
2015-02-03
Automatically translating a building architecture file format (Industry Foundation Class) to a simulation file, in one aspect, may extract data and metadata used by a target simulation tool from a building architecture file. Interoperability data objects may be created and the extracted data is stored in the interoperability data objects. A model translation procedure may be prepared to identify a mapping from a Model View Definition to a translation and transformation function. The extracted data may be transformed using the data stored in the interoperability data objects, an input Model View Definition template, and the translation and transformation function to convert the extracted data to correct geometric values needed for a target simulation file format used by the target simulation tool. The simulation file in the target simulation file format may be generated.
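A hypothetical sketch of the extract-map-transform pipeline the abstract describes: extract the needed entities, then apply a per-type transformation function selected via a Model View Definition template. The entity names, template keys and unit conversion are illustrative, not the patented method itself.

    # Hypothetical extract -> MVD-driven transform pipeline; names are illustrative.
    def extract(ifc_entities):
        """Pull the subset of IFC-like entities the target simulation tool needs."""
        return [e for e in ifc_entities if e["type"] in ("IfcWall", "IfcWindow")]

    def transform(entity, mvd_template):
        """Apply the transformation function the MVD template maps to this type."""
        convert = mvd_template[entity["type"]]
        return convert(entity)

    MVD_TEMPLATE = {   # entity type -> translation and transformation function
        "IfcWall":   lambda e: {"surface": e["name"], "area_m2": e["area_mm2"] / 1e6},
        "IfcWindow": lambda e: {"opening": e["name"], "area_m2": e["area_mm2"] / 1e6},
    }

    ifc = [{"type": "IfcWall", "name": "W1", "area_mm2": 12_500_000}]
    sim_objects = [transform(e, MVD_TEMPLATE) for e in extract(ifc)]
    print(sim_objects)   # [{'surface': 'W1', 'area_m2': 12.5}]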
The Long Road to Semantic Interoperability in Support of Public Health: Experiences from Two States
Vreeman, Daniel J.; Grannis, Shaun J.
2014-01-01
Proliferation of health information technologies creates opportunities to improve clinical and public health, including high quality, safer care and lower costs. To maximize such potential benefits, health information technologies must readily and reliably exchange information with other systems. However, evidence from public health surveillance programs in two states suggests that operational clinical information systems often fail to use available standards, a barrier to semantic interoperability. Furthermore, analysis of existing policies incentivizing semantic interoperability suggests they have limited impact and are fragmented. In this essay, we discuss three approaches for increasing semantic interoperability to support national goals for using health information technologies. A clear, comprehensive strategy requiring collaborative efforts by clinical and public health stakeholders is suggested as a guide for the long road towards better population health data and outcomes. PMID:24680985
Geoscience Information Network (USGIN) Solutions for Interoperable Open Data Access Requirements
NASA Astrophysics Data System (ADS)
Allison, M. L.; Richard, S. M.; Patten, K.
2014-12-01
The geosciences are leading development of free, interoperable open access to data. US Geoscience Information Network (USGIN) is a freely available data integration framework, jointly developed by the USGS and the Association of American State Geologists (AASG), in compliance with international standards and protocols to provide easy discovery, access, and interoperability for geoscience data. USGIN standards include the geologic exchange language GeoSciML (v3.2), which enables instant interoperability of geologic formation data and is also the base standard used by the 117-nation OneGeology consortium. The USGIN deployment of NGDS serves as a continent-scale operational demonstration of the expanded OneGeology vision to provide access to all geoscience data worldwide. USGIN is developed to accommodate a variety of applications; for example, the International Renewable Energy Agency streams data live to the Global Atlas of Renewable Energy. Alternatively, users without robust data sharing systems can download and implement a free software packet, "GINstack", to easily deploy web services for exposing data online for discovery and access. The White House Open Data Access Initiative requires all federally funded research projects and federal agencies to make their data publicly accessible in an open source, interoperable format, with metadata. USGIN currently incorporates all aspects of the Initiative, as it emphasizes interoperability. The system is successfully deployed as the National Geothermal Data System (NGDS), officially launched at the White House Energy Datapalooza in May 2014. The USGIN Foundation has been established to ensure this technology continues to be accessible and available.
NASA Astrophysics Data System (ADS)
2018-01-01
The large amount of data generated by modern space missions calls for a change of organization of data distribution and access procedures. Although long-term archives exist for telescopic and space-borne observations, high-level functions need to be developed on top of these repositories to make Planetary Science and Heliophysics data more accessible and to favor interoperability. Results of simulations and reference laboratory data also need to be integrated to support and interpret the observations. Interoperable software and interfaces have recently been developed in many scientific domains. The Virtual Observatory (VO) interoperable standards developed for Astronomy by the International Virtual Observatory Alliance (IVOA) can be adapted to Planetary Sciences, as demonstrated by the VESPA (Virtual European Solar and Planetary Access) team within the Europlanet-H2020-RI project. Other communities have developed their own standards: GIS (Geographic Information System) for Earth and planetary surface tools, SPASE (Space Physics Archive Search and Extract) for space plasma, PDS4 (NASA Planetary Data System, version 4) and IPDA (International Planetary Data Alliance) for planetary mission archives, etc., and an effort to make them interoperable altogether is starting, including automated workflows to process related data from different sources.
Interoperability of Neuroscience Modeling Software
Cannon, Robert C.; Gewaltig, Marc-Oliver; Gleeson, Padraig; Bhalla, Upinder S.; Cornelis, Hugo; Hines, Michael L.; Howell, Fredrick W.; Muller, Eilif; Stiles, Joel R.; Wils, Stefan; De Schutter, Erik
2009-01-01
Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to the duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible including the development of portable model description standards, the adoption of common simulation languages or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “Neuro-IT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19-20 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. PMID:17873374
A development framework for semantically interoperable health information systems.
Lopez, Diego M; Blobel, Bernd G M E
2009-02-01
Semantic interoperability is a basic challenge to be met for new generations of distributed, communicating and co-operating health information systems (HIS) enabling shared care and e-Health. Analysis, design, implementation and maintenance of such systems and their intrinsic architectures have to follow a unified development methodology. The Generic Component Model (GCM) is used as a framework for modeling any system to evaluate and harmonize state-of-the-art architecture development approaches and standards for health information systems, as well as to derive a coherent architecture development framework for sustainable, semantically interoperable HIS and their components. The proposed methodology is based on the Rational Unified Process (RUP), taking advantage of its flexibility to be configured for integrating other architectural approaches such as Service-Oriented Architecture (SOA), Model-Driven Architecture (MDA), ISO 10746, and the HL7 Development Framework (HDF). Existing architectural approaches have been analyzed, compared and finally harmonized towards an architecture development framework for advanced health information systems. Starting with the requirements for semantic interoperability derived from paradigm changes for health information systems, and supported by formal software process engineering methods, an appropriate development framework for semantically interoperable HIS has been provided. The usability of the framework has been exemplified in a public health scenario.
NASA Astrophysics Data System (ADS)
Simonis, I.; Alameh, N.; Percivall, G.
2012-04-01
The GEOSS Architecture Implementation Pilots (AIP) develop and pilot new process and infrastructure components for the GEOSS Common Infrastructure (GCI) and the broader GEOSS architecture through an evolutionary development process consisting of a set of phases. Each phase addresses a set of Societal Benefit Areas (SBA) and geoinformatic topics. The first three phases consisted of architecture refinements based on interactions with users; component interoperability testing; and SBA-driven demonstrations. The fourth phase (AIP-4) documented here focused on fostering interoperability arrangements and common practices for GEOSS by facilitating access to priority earth observation data sources and by developing and testing specific clients and mediation components to enable such access. Additionally, AIP-4 supported the development of a thesaurus for earth observation parameters and tutorials to guide data providers to make their data available through GEOSS. The results of AIP-4 are documented in two engineering reports and captured in a series of videos posted online. Led by the Open Geospatial Consortium (OGC), AIP-4 built on contributions from over 60 organizations. This wide portfolio helped test interoperability arrangements in a highly heterogeneous environment. AIP-4 participants cooperated closely to test available data sets, access services, and client applications in multiple workflows and set-ups. Eventually, AIP-4 improved the accessibility of GEOSS datasets identified as supporting Critical Earth Observation Priorities by the GEO User Interface Committee (UIC), and increased the use of the data through promoting the availability of new data services, clients, and applications. During AIP-4, a number of key earth observation data sources were made available online at standard service interfaces, discovered using brokered search approaches, and processed and visualized in generalized client applications. AIP-4 demonstrated the level of interoperability that can be achieved using currently available standards and corresponding products and implementations. The AIP-4 integration testing process proved that the integration of heterogeneous data resources available via interoperability arrangements such as WMS, WFS, WCS and WPS indeed works. However, the integration often required various levels of customization on the client side to accommodate variations in the service implementations. Those variations seem to stem from both malfunctioning service implementations and varying interpretations of, or inconsistencies in, existing standards. Other interoperability issues identified revolve around missing metadata or the use of unrecognized identifiers in the description of GEOSS resources. Once such issues are resolved, continuous compliance testing is necessary to minimize variability of implementations. Once data providers can choose from a set of enhanced implementations for offering their data using consistent interoperability arrangements, the barrier to client and decision support implementation developers will be lowered, leading to true leveraging of earth observation data through GEOSS. AIP-4 results, lessons learnt from the previous AIPs 1-3 and close coordination with the Infrastructure Implementation Board (IIB), the successor of the Architecture and Data Committee (ADC), form the basis of the current preparation phase for the next Architecture Implementation Pilot, AIP-5.
The Call for Participation will be launched in February and the pilot will be conducted from May to November 2012. The current planning foresees a scenario-oriented approach, with possible scenarios coming from the domains of disaster management, health (including air quality and waterborne diseases), water resource observations, energy, biodiversity and climate change, and agriculture.
Interoperable Archetypes With a Three Folded Terminology Governance.
Pederson, Rune; Ellingsen, Gunnar
2015-01-01
The use of openEHR archetypes increases the interoperability of clinical terminology, and in doing so improves the availability of clinical terminology for both primary and secondary purposes. Where clinical terminology is employed in the EPR system, research reports conflicting results for the use of structuring and standardization as measurements of success. In order to elucidate this concept, this paper focuses on the effort to establish a national repository for openEHR-based archetypes in Norway, where clinical terminology could be included, with benefits for interoperability, under a three-folded terminology governance.
CCP interoperability and system stability
NASA Astrophysics Data System (ADS)
Feng, Xiaobing; Hu, Haibo
2016-09-01
To control counterparty risk, financial regulations such as the Dodd-Frank Act increasingly require standardized derivatives trades to be cleared by central counterparties (CCPs). It is anticipated that in the near future, CCPs across the world will be linked through interoperability agreements that facilitate risk sharing but also serve as a conduit for transmitting shocks. This paper theoretically studies a networked network with CCPs that are linked through interoperability arrangements. The major finding is that different configurations of the networked network of CCPs give rise to different properties of the cascading failures.
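The following toy simulation (not the authors' model) illustrates how interoperability links can transmit shocks: each CCP absorbs losses up to a buffer, and a failed CCP passes its shortfall equally to its still-standing linked peers, which may fail in turn. All numbers are invented for illustration.

    # Toy cascade over interoperability links; buffers, links and the shock
    # are illustrative, not calibrated to the paper.
    def cascade(buffers, links, initial_shock):
        losses = dict(initial_shock)    # CCP -> accumulated loss
        failed = set()
        frontier = [c for c, loss in losses.items() if loss > buffers[c]]
        while frontier:
            for ccp in frontier:
                failed.add(ccp)
                shortfall = losses[ccp] - buffers[ccp]
                peers = [p for p in links[ccp] if p not in failed]
                for p in peers:         # share the shortfall equally
                    losses[p] = losses.get(p, 0.0) + shortfall / len(peers)
            frontier = [c for c in losses
                        if c not in failed and losses[c] > buffers[c]]
        return failed

    buffers = {"A": 1.0, "B": 0.5, "C": 2.0}
    links   = {"A": ["B", "C"], "B": ["A", "C"], "C": ["A", "B"]}
    print(cascade(buffers, links, {"A": 2.2}))   # A fails and drags B down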
The Importance of State and Context in Safe Interoperable Medical Systems
Jaffe, Michael B.; Robkin, Michael; Rausch, Tracy; Arney, David; Goldman, Julian M.
2016-01-01
This paper describes why "device state" and "patient context" information are necessary components of device models for safe interoperability. It includes a discussion of the importance of describing the roles of devices with respect to interactions (including human user workflows involving devices, and device-to-device communication) within a system, particularly those intended for use at the point of care, and how this role information is communicated. In addition, it describes the importance of clinical scenarios in creating device models for interoperable devices. PMID:27730013
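A minimal sketch of the idea, with hypothetical field names (not the paper's schema): a measurement carries its device state and patient context, so a consuming application can judge whether the value is safe to act on.

    # Illustrative data model: a reported value travels together with the
    # state of the reporting device and the patient context it was taken in.
    from dataclasses import dataclass
    from enum import Enum

    class DeviceState(Enum):
        OPERATIONAL = "operational"
        STANDBY = "standby"
        FAULT = "fault"

    @dataclass
    class Measurement:
        value: float
        unit: str
        device_state: DeviceState     # is the source trustworthy right now?
        patient_context: str          # e.g. "intraoperative", "post-anesthesia"

    def safe_to_use(m: Measurement) -> bool:
        """A consuming app should act only on data from an operational device."""
        return m.device_state is DeviceState.OPERATIONAL

    spo2 = Measurement(97.0, "%", DeviceState.OPERATIONAL, "post-anesthesia")
    print(safe_to_use(spo2))   # True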
Ivkovic, Sinisa; Simonovic, Janko; Tijanic, Nebojsa; Davis-Dusenbery, Brandi; Kural, Deniz
2016-01-01
As biomedical data has become increasingly easy to generate in large quantities, the methods used to analyze it have proliferated rapidly. Reproducible and reusable methods are required to learn from large volumes of data reliably. To address this issue, numerous groups have developed workflow specifications or execution engines, which provide a framework with which to perform a sequence of analyses. One such specification is the Common Workflow Language, an emerging standard which provides a robust and flexible framework for describing data analysis tools and workflows. In addition, reproducibility can be furthered by executors or workflow engines which interpret the specification and enable additional features, such as error logging, file organization, optimizations to computation and job scheduling, and allow for easy computing on large volumes of data. To this end, we have developed the Rabix Executor, an open-source workflow engine for the purposes of improving reproducibility through reusability and interoperability of workflow descriptions. PMID:27896971
Kaushik, Gaurav; Ivkovic, Sinisa; Simonovic, Janko; Tijanic, Nebojsa; Davis-Dusenbery, Brandi; Kural, Deniz
2017-01-01
As biomedical data has become increasingly easy to generate in large quantities, the methods used to analyze it have proliferated rapidly. Reproducible and reusable methods are required to learn from large volumes of data reliably. To address this issue, numerous groups have developed workflow specifications or execution engines, which provide a framework with which to perform a sequence of analyses. One such specification is the Common Workflow Language, an emerging standard which provides a robust and flexible framework for describing data analysis tools and workflows. In addition, reproducibility can be furthered by executors or workflow engines which interpret the specification and enable additional features, such as error logging, file organization, optimizations to computation and job scheduling, and allow for easy computing on large volumes of data. To this end, we have developed the Rabix Executor, an open-source workflow engine for the purposes of improving reproducibility through reusability and interoperability of workflow descriptions.
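As an illustration of the portability argument, the sketch below writes a minimal CWL CommandLineTool and job file and hands them to the CWL reference runner (cwltool) as a stand-in for any CWL-aware engine such as the Rabix Executor. The tool, file names, and runner choice are illustrative assumptions; the abstract does not specify a command-line interface.

# Minimal CWL tool description, executed with the reference runner as a
# stand-in for a CWL-aware engine; any conformant executor can interpret the
# same description, which is the interoperability point the paper makes.
import subprocess, pathlib, json

tool = """\
cwlVersion: v1.0
class: CommandLineTool
baseCommand: wc
arguments: ["-l"]
inputs:
  infile:
    type: File
    inputBinding: {position: 1}
outputs:
  count:
    type: stdout
stdout: line_count.txt
"""
pathlib.Path("count_lines.cwl").write_text(tool)
pathlib.Path("reads.txt").write_text("ACGT\nACGT\n")   # toy input file
pathlib.Path("job.json").write_text(
    json.dumps({"infile": {"class": "File", "path": "reads.txt"}}))

subprocess.run(["cwltool", "count_lines.cwl", "job.json"], check=True)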
Emergence of a Common Modeling Architecture for Earth System Science (Invited)
NASA Astrophysics Data System (ADS)
Deluca, C.
2010-12-01
Common modeling architecture can be viewed as a natural outcome of common modeling infrastructure. The development of model utility and coupling packages (ESMF, MCT, OpenMI, etc.) over the last decade represents the realization of a community vision for common model infrastructure. The adoption of these packages has led to increased technical communication among modeling centers and newly coupled modeling systems. However, adoption has also exposed aspects of interoperability that must be addressed before easy exchange of model components among different groups can be achieved. These aspects include common physical architecture (how a model is divided into components) and model metadata and usage conventions. The National Unified Operational Prediction Capability (NUOPC), an operational weather prediction consortium, is collaborating with weather and climate researchers to define a common model architecture that encompasses these advanced aspects of interoperability and looks to future needs. The nature and structure of the emergent common modeling architecture will be discussed along with its implications for future model development.
Interoperation transfer in Chinese-English bilinguals' arithmetic.
Campbell, Jamie I D; Dowd, Roxanne R
2012-10-01
We examined interoperation transfer of practice in adult Chinese-English bilinguals' memory for simple multiplication (6 × 8 = 48) and addition (6 + 8 = 14) facts. The purpose was to determine whether they possessed distinct number-fact representations in both Chinese (L1) and English (L2). Participants repeatedly practiced multiplication problems (e.g., 4 × 5 = ?), answering a subset in L1 and another subset in L2. Then separate groups answered corresponding addition problems (4 + 5 = ?) and control addition problems in either L1 (N = 24) or L2 (N = 24). The results demonstrated language-specific negative transfer of multiplication practice to corresponding addition problems. Specifically, large simple addition problems (sum > 10) presented a significant response time cost (i.e., retrieval-induced forgetting) after their multiplication counterparts were practiced in the same language, relative to practice in the other language. The results indicate that our Chinese-English bilinguals had multiplication and addition facts represented in distinct language-specific memory stores.
Integrating technology to improve medication administration.
Prusch, Amanda E; Suess, Tina M; Paoletti, Richard D; Olin, Stephen T; Watts, Starann D
2011-05-01
The development, implementation, and evaluation of an i.v. interoperability program to advance medication safety at the bedside are described. I.V. interoperability integrates intelligent infusion devices (IIDs), the bar-code-assisted medication administration system, and the electronic medication administration record system into a bar-code-driven workflow that populates provider-ordered, pharmacist-validated infusion parameters on IIDs. The purpose of this project was to improve medication safety through the integration of these technologies and decrease the potential for error during i.v. medication administration. Four key phases were essential to developing and implementing i.v. interoperability: (a) preparation, (b) i.v. interoperability pilot, (c) preliminary validation, and (d) expansion. The establishment of pharmacy involvement in i.v. interoperability resulted in two additional safety checks: pharmacist infusion rate oversight and nurse independent validation of the autoprogrammed rate. After instituting i.v. interoperability, monthly compliance to the telemetry drug library increased to a mean ± S.D. of 72.1% ± 2.1% from 56.5% ± 1.5%, and the medical-surgical nursing unit's drug library monthly compliance rate increased to 58.6% ± 2.9% from 34.1% ± 2.6% (p < 0.001 for both comparisons). The number of manual pump edits decreased with both telemetry and medical-surgical drug libraries, demonstrating a reduction from 56.9 ± 12.8 to 14.2 ± 3.9 and from 61.2 ± 15.4 to 14.7 ± 3.8, respectively (p < 0.001 for both comparisons). Through the integration and incorporation of pharmacist oversight for rate changes, the telemetry and medical-surgical patient care areas demonstrated a 32% reduction in reported monthly errors involving i.v. administration of heparin. By integrating two stand-alone technologies, i.v. interoperability was implemented to improve medication administration. Medication errors were reduced, nursing workflow was simplified, and pharmacists became involved in checking infusion rates of i.v. medications.
Turning Interoperability Operational with GST
NASA Astrophysics Data System (ADS)
Schaeben, Helmut; Gabriel, Paul; Gietzel, Jan; Le, Hai Ha
2013-04-01
GST - Geosciences in space and time - is being developed and implemented as a hub to facilitate the exchange of spatially and temporally indexed multi-dimensional geoscience data and corresponding geomodels amongst partners. It originates from TUBAF's contribution to the EU project "ProMine", and its prospective extensions are TUBAF's contribution to the current EU project "GeoMol". As of today, it provides the basic components of a geodata infrastructure as required to establish interoperability with respect to the geosciences. Generally, interoperability means the facilitation of cross-border and cross-sector information exchange, taking into account legal, organisational, semantic and technical aspects, cf. Interoperability Solutions for European Public Administrations (ISA), http://ec.europa.eu/isa/. Practical interoperability for partners of a joint geoscience project, say European Geological Surveys acting in a border region, means in particular the provision of IT technology to exchange spatially, and possibly also temporally, indexed multi-dimensional geoscience data and corresponding models, i.e. the objects composing geomodels capturing the geometry, topology, and various geoscience contents. Geodata Infrastructure (GDI) and interoperability are objectives of several initiatives, e.g. INSPIRE, OneGeology-Europe, and most recently EGDI-SCOPE, to name just the most prominent ones. Then there are quite a few markup languages (ML) related to geographical or geological information, like GeoSciML, EarthResourceML, BoreholeML, ResqML for reservoir characterization, earth and reservoir models, and many others featuring geoscience information. Several Web Services are focused on geographical or geoscience information. The Open Geospatial Consortium (OGC) promotes specifications of a Web Feature Service (WFS), a Web Map Service (WMS), a Web Coverage Service (WCS), a Web 3D Service (W3DS), and many more. It will be clarified how GST is related to these initiatives, especially how it complies with existing or developing standards or quasi-standards, and how it applies and extends services towards interoperability in the Earth sciences.
NASA Astrophysics Data System (ADS)
Fox, P. A.; Diviacco, P.; Busato, A.
2016-12-01
Geo-scientific research collaboration commonly faces complex systems where multiple skills and competences are needed at the same time. The efficacy of such collaboration among researchers then becomes of paramount importance. Multidisciplinary studies draw from domains that are far from each other, and researchers need to understand how to extract the data they need and eventually produce something that can be used by others. The management of information and knowledge in this perspective is non-trivial. Interoperability is frequently sought in computer-to-computer environments, so as to overcome mismatches in vocabulary, data formats, coordinate reference systems and so on. Successful researcher collaboration, however, also relies on the interoperability of the people. Smaller, synchronous, face-to-face settings are known to enhance people interoperability, but changing settings, whether geographically, temporally, or by increasing team size, diversity, and expertise, requires people-computer-people-computer (...) interoperability. To date, knowledge representation frameworks have been proposed but not proven necessary and sufficient to achieve multi-way interoperability. In this contribution, we address the epistemology and sociology of science, advocating for a fluid perspective in which science is largely a social construct, conditioned by cognitive issues, especially cognitive bias. Bias cannot be obliterated; on the contrary, it must be carefully taken into consideration. Information-centric interfaces built from different perspectives and ways of thinking by actors with different points of view, approaches and aims are proposed as a means of enhancing people interoperability in computer-based settings. The contribution provides details on the approach of augmenting, and interfacing to, knowledge representation frameworks with the cognitive-conceptual frameworks for people that are needed to meet and exceed collaborative research goals in the 21st century. A web-based collaborative portal that integrates both approaches has been developed and will be presented, together with reports on initial tests, which have shown encouraging results.
NASA Astrophysics Data System (ADS)
Glaves, H. M.
2015-12-01
In recent years marine research has become increasingly multidisciplinary in its approach with a corresponding rise in the demand for large quantities of high quality interoperable data as a result. This requirement for easily discoverable and readily available marine data is currently being addressed by a number of regional initiatives with projects such as SeaDataNet in Europe, Rolling Deck to Repository (R2R) in the USA and the Integrated Marine Observing System (IMOS) in Australia, having implemented local infrastructures to facilitate the exchange of standardised marine datasets. However, each of these systems has been developed to address local requirements and created in isolation from those in other regions. Multidisciplinary marine research on a global scale necessitates a common framework for marine data management which is based on existing data systems. The Ocean Data Interoperability Platform project is seeking to address this requirement by bringing together selected regional marine e-infrastructures for the purposes of developing interoperability across them. By identifying the areas of commonality and incompatibility between these data infrastructures, and leveraging the development activities and expertise of these individual systems, three prototype interoperability solutions are being created which demonstrate the effective sharing of marine data and associated metadata across the participating regional data infrastructures as well as with other target international systems such as GEO, COPERNICUS etc. These interoperability solutions, combined with agreed best practice and approved standards, form the basis of a common global approach to marine data management which can be adopted by the wider marine research community. To encourage implementation of these interoperability solutions by other regional marine data infrastructures an impact assessment is being conducted to determine both the technical and financial implications of deploying them alongside existing services. The associated best practice and common standards are also being disseminated to the user community through relevant accreditation processes and related initiatives such as the Research Data Alliance and the Belmont Forum.
Parel, I; Cutti, A G; Fiumana, G; Porcellini, G; Verni, G; Accardo, A P
2012-04-01
To measure the scapulohumeral rhythm (SHR) in outpatient settings, the motion analysis protocol named ISEO (INAIL Shoulder and Elbow Outpatient protocol) was developed, based on inertial and magnetic sensors. To complete the sensor-to-segment calibration, ISEO requires the involvement of an operator for sensor placement and for positioning the patient's arm in a predefined posture. Since this can affect the measure, this study aimed at quantifying ISEO intra- and inter-operator agreement. Forty subjects were considered, together with two operators, A and B. Three measurement sessions were completed for each subject: two by A and one by B. In each session, the humerus and scapula rotations were measured during sagittal and scapular plane elevation movements. ISEO intra- and inter-operator agreement was assessed by computing, between sessions: (1) the similarity of the scapulohumeral patterns through the Coefficient of Multiple Correlation (CMC(2)), both considering and excluding the difference in the initial value of the scapula rotations between two sessions (inter-session offset); (2) the 95% Smallest Detectable Difference (SDD(95)) in scapula range of motion. Results for CMC(2) showed that the intra- and inter-operator agreement is acceptable (median ≥ 0.85, lower-whisker ≥ 0.75) for most of the scapula rotations, independently of the movement and the inter-session offset. The only exception is the agreement for scapula protraction-retraction and for scapula medio-lateral rotation during abduction (inter-operator), which is acceptable only if the inter-session offset is removed. SDD(95) values ranged from 4.4° to 8.6° for the inter-operator and between 4.9° and 8.5° for the intra-operator agreement. In conclusion, ISEO shows high intra- and inter-operator agreement, particularly with the scapula inter-session offset removed. Copyright © 2011 Elsevier B.V. All rights reserved.
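For readers unfamiliar with the agreement statistics, the following sketch computes the 95% Smallest Detectable Difference between two measurement sessions under one common formulation (SDD95 = 1.96 * sqrt(2) * SEM, with SEM estimated from the standard deviation of inter-session differences). The paper does not spell out its exact estimator, and the numbers below are synthetic.

# Illustrative SDD95 computation on synthetic two-session data.
import numpy as np

rng = np.random.default_rng(0)
# Scapula range of motion (degrees) for 40 subjects, two sessions.
session_a = rng.normal(35.0, 6.0, size=40)
session_b = session_a + rng.normal(0.0, 2.5, size=40)  # session-to-session noise

diff = session_b - session_a
sem = diff.std(ddof=1) / np.sqrt(2)      # standard error of measurement
sdd95 = 1.96 * np.sqrt(2) * sem          # smallest detectable difference

print(f"SEM   = {sem:.2f} deg")
print(f"SDD95 = {sdd95:.2f} deg")        # compare with the 4.4-8.6 deg reported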
Positive train control interoperability and networking research : final report.
DOT National Transportation Integrated Search
2015-12-01
This document describes the initial development of an ITC PTC Shared Network (IPSN), a hosted environment to support the distribution, configuration management, and IT governance of Interoperable Train Control (ITC) Positive Train Control (PTC) s...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-01
... FEDERAL COMMUNICATIONS COMMISSION Federal Advisory Committee Act; Communications Security, Reliability, and Interoperability Council AGENCY: Federal Communications Commission. ACTION: Notice of public... persons that the Federal Communications Commission's (FCC or Commission) Communications Security...
CCSDS SM and C Mission Operations Interoperability Prototype
NASA Technical Reports Server (NTRS)
Lucord, Steven A.
2010-01-01
This slide presentation reviews the prototype of Spacecraft Monitor and Control (SM&C) Mission Operations for interoperability with other space agencies. This particular prototype involves the German Space Agency (DLR) to test the ideas for interagency coordination.
RESTFul based heterogeneous Geoprocessing workflow interoperation for Sensor Web Service
NASA Astrophysics Data System (ADS)
Yang, Chao; Chen, Nengcheng; Di, Liping
2012-10-01
Advanced sensors on board satellites offer detailed Earth observations. A workflow is one approach for designing, implementing and constructing a flexible and live link between these sensors' resources and users. It can coordinate, organize and aggregate distributed sensor Web services to meet the requirements of a complex Earth observation scenario. A RESTful-based workflow interoperation method is proposed to integrate heterogeneous workflows into an interoperable unit. The Atom protocols are applied to describe and manage workflow resources. The XML Process Definition Language (XPDL) and Business Process Execution Language (BPEL) workflow standards are applied to structure, separately, a workflow that accesses sensor information and one that processes it. A scenario involving nitrogen dioxide (NO2) from a volcanic eruption is then used to investigate the feasibility of the proposed method. The RESTful-based workflow interoperation system can describe, publish, discover, access and coordinate heterogeneous Geoprocessing workflows.
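A sketch of what RESTful, Atom-based management of a workflow resource could look like in practice, using the Python requests package. The collection URL, entry fields and BPEL reference are hypothetical; the paper does not publish its service addresses.

# Publish a workflow as an Atom entry over plain HTTP (AtomPub style);
# endpoint and payload are invented for illustration.
import requests

entry = """<?xml version="1.0" encoding="utf-8"?>
<entry xmlns="http://www.w3.org/2005/Atom">
  <title>NO2 plume processing workflow</title>
  <id>urn:uuid:0d1cde2a-demo</id>
  <updated>2012-10-01T00:00:00Z</updated>
  <summary>BPEL workflow chaining sensor access and WPS processing.</summary>
  <content type="application/xml" src="http://example.org/workflows/no2.bpel"/>
</entry>
"""

resp = requests.post("http://example.org/workflows",     # hypothetical collection
                     data=entry.encode("utf-8"),
                     headers={"Content-Type": "application/atom+xml"})
resp.raise_for_status()
print("Created:", resp.headers.get("Location"))  # URI of the new workflow resource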
Metadata behind the Interoperability of Wireless Sensor Networks
Ballari, Daniela; Wachowicz, Monica; Callejo, Miguel Angel Manso
2009-01-01
Wireless Sensor Networks (WSNs) produce changes of status that are frequent, dynamic and unpredictable, and cannot be represented using a linear cause-effect approach. Consequently, a new approach is needed to handle these changes in order to support dynamic interoperability. Our approach is to introduce the notion of context as an explicit representation of changes of a WSN status inferred from metadata elements, which in turn, leads towards a decision-making process about how to maintain dynamic interoperability. This paper describes the developed context model to represent and reason over different WSN status based on four types of contexts, which have been identified as sensing, node, network and organisational contexts. The reasoning has been addressed by developing contextualising and bridges rules. As a result, we were able to demonstrate how contextualising rules have been used to reason on changes of WSN status as a first step towards maintaining dynamic interoperability. PMID:22412330
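A toy sketch of a contextualising rule in the spirit of the paper's context model: a node context is inferred from metadata elements and used to flag risks to dynamic interoperability. Field names and thresholds are invented for illustration; the actual model spans sensing, node, network and organisational contexts.

# Infer a node-context status from metadata; values and rules are illustrative.
def node_context(meta: dict) -> str:
    if meta["battery_v"] < 3.3:
        return "degraded"            # node may soon drop out of the network
    if meta["last_report_s"] > 600:
        return "unreachable"         # stale metadata implies a network change
    return "operational"

readings = [
    {"node": "n01", "battery_v": 3.9, "last_report_s": 30},
    {"node": "n02", "battery_v": 3.1, "last_report_s": 45},
    {"node": "n03", "battery_v": 3.8, "last_report_s": 1200},
]
for meta in readings:
    print(meta["node"], "->", node_context(meta))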
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chao, Tian-Jy; Kim, Younghun
Automatically translating a building architecture file format (Industry Foundation Class) to a simulation file, in one aspect, may extract data and metadata used by a target simulation tool from a building architecture file. Interoperability data objects may be created and the extracted data is stored in the interoperability data objects. A model translation procedure may be prepared to identify a mapping from a Model View Definition to a translation and transformation function. The extracted data may be transformed using the data stored in the interoperability data objects, an input Model View Definition template, and the translation and transformation function to convert the extracted data to correct geometric values needed for a target simulation file format used by the target simulation tool. The simulation file in the target simulation file format may be generated.
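A sketch of the extraction step described above, using the open-source ifcopenshell package to pull walls out of an IFC file into simple interoperability data objects. The library choice, file name, and the stand-in mapping function are assumptions; the abstract names no specific tooling.

# Extract wall entities from an IFC model into plain dictionaries, then run a
# trivial stand-in for the Model View Definition translation function.
import ifcopenshell

model = ifcopenshell.open("building.ifc")        # Industry Foundation Classes file

interop_objects = []
for wall in model.by_type("IfcWall"):
    interop_objects.append({
        "guid": wall.GlobalId,
        "name": wall.Name,
        "type": "Wall",
    })

def to_simulation_record(obj: dict) -> str:
    # Stand-in for the translation/transformation function selected via the
    # Model View Definition; a real mapping would also fix up geometry.
    return f'ZONE_SURFACE,{obj["guid"]},{obj["name"] or "unnamed"}'

for obj in interop_objects:
    print(to_simulation_record(obj))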
Capurro, Daniel; Echeverry, Aisen; Figueroa, Rosa; Guiñez, Sergio; Taramasco, Carla; Galindo, César; Avendaño, Angélica; García, Alejandra; Härtel, Steffen
2017-01-01
Despite the continuous technical advancements around health information standards, a critical component to their widespread adoption involves political agreement between a diverse set of stakeholders. Countries that have addressed this issue have used diverse strategies. In this vision paper we present the path that Chile is taking to establish a national program to implement health information standards and achieve interoperability. The Chilean government established an inter-agency program to define the current interoperability situation, existing gaps, barriers, and facilitators for interoperable health information systems. As an answer to the identified issues, the government decided to fund a consortium of Chilean universities to create the National Center for Health Information Systems. This consortium should encourage the interaction between all health care stakeholders, both public and private, to advance the selection of national standards and define certification procedures for software and human resources in health information technologies.
Dynamic Business Networks: A Headache for Sustainable Systems Interoperability
NASA Astrophysics Data System (ADS)
Agostinho, Carlos; Jardim-Goncalves, Ricardo
Collaborative networked environments emerged with the spread of the internet, helping to overcome past communication barriers and identifying interoperability as an essential property. When interoperability is achieved seamlessly, efficiency increases across the entire product life cycle. Nowadays, most organizations try to attain interoperability by establishing peer-to-peer mappings with their different partners or, in optimized networks, by using international standard models as the core for information exchange. In current industrial practice, mappings are defined only once, and the morphisms that represent them are hardcoded in the enterprise systems. This solution has been effective for static environments, where enterprise and product models remain valid for decades. However, with an increasingly complex and dynamic global market, models change frequently to answer new customer requirements. This paper draws concepts from complex systems science and proposes a framework for sustainable systems interoperability in dynamic networks, enabling different organizations to evolve at their own rate.
Large scale healthcare data integration and analysis using the semantic web.
Timm, John; Renly, Sondra; Farkash, Ariel
2011-01-01
Healthcare data interoperability can only be achieved when the semantics of the content is well defined and consistently implemented across heterogeneous data sources. Achieving these objectives of interoperability requires the collaboration of experts from several domains. This paper describes tooling that integrates Semantic Web technologies with common tools to facilitate cross-domain collaborative development for the purposes of data interoperability. Our approach is divided into stages of data harmonization and representation, model transformation, and instance generation. We applied our approach in Hypergenes, an EU-funded project, where we used our method on the Essential Hypertension disease model using a CDA template. Our domain expert partners include clinical providers, clinical domain researchers, healthcare information technology experts, and a variety of clinical data consumers. We show that bringing Semantic Web technologies into the healthcare interoperability toolkit increases opportunities for beneficial collaboration, thus improving patient care and clinical research outcomes.
Operational Interoperability Challenges on the Example of GEOSS and WIS
NASA Astrophysics Data System (ADS)
Heene, M.; Buesselberg, T.; Schroeder, D.; Brotzer, A.; Nativi, S.
2015-12-01
The following poster highlights the operational interoperability challenges using the example of the Global Earth Observation System of Systems (GEOSS) and the World Meteorological Organization Information System (WIS). At the heart of both systems is a catalogue of earth observation data, products and services, but with different metadata management concepts. While WIS maintains strong governance, with its own metadata profile for hundreds of thousands of metadata records, GEOSS adopted a more open approach for its ten million records. Furthermore, the development of WIS, as an operational system, follows a roadmap with committed backwards compatibility, while the GEOSS development process is more agile. The poster discusses how interoperability can be achieved across these different metadata management concepts and how a proxy concept helps to couple two systems that follow different development methodologies. Furthermore, the poster highlights the importance of monitoring and backup concepts as a verification method for operational interoperability.
Data Management challenges in Astronomy and Astroparticle Physics
NASA Astrophysics Data System (ADS)
Lamanna, Giovanni
2015-12-01
Astronomy and Astroparticle Physics domains are experiencing a deluge of data with the next generation of facilities prioritised in the European Strategy Forum on Research Infrastructures (ESFRI), such as SKA, CTA, KM3Net and with other world-class projects, namely LSST, EUCLID, EGO, etc. The new ASTERICS-H2020 project brings together the concerned scientific communities in Europe to work together to find common solutions to their Big Data challenges, their interoperability, and their data access. The presentation will highlight these new challenges and the work being undertaken also in cooperation with e-infrastructures in Europe.
A Framework for Building and Reasoning with Adaptive and Interoperable PMESII Models
2007-11-01
Description Logic; SOA, Service Oriented Architecture; SPARQL, Simple Protocol And RDF Query Language; SQL, Standard Query Language; SROM, Stability and... another by providing a more expressive ontological structure for one of the models, e.g., semantic networks can be mapped to first-order logical... Pellet is an open-source reasoner that works with OWL-DL. It accepts the SPARQL Protocol and RDF Query Language (SPARQL) and provides a Java API to...
NASA Technical Reports Server (NTRS)
Lynnes, Christopher
2016-01-01
The NASA representative to the Unidata Strategic Committee presented a semiannual update on NASA's work with and use of Unidata technologies. The talk covered the program of cloud computing prototypes being undertaken for the Earth Observing System Data and Information System (EOSDIS). Also discussed were dataset interoperability recommendations ratified via the EOSDIS Standards Office, and the HDF Product Designer tool with respect to its possible applicability to data in network Common Data Form (NetCDF) version 4.
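For context, reading a NetCDF-4 file from Python takes only a few lines with the netCDF4 package; the file name and contents here are hypothetical.

# Inspect a NetCDF-4 file: data model, variables, dimensions, shapes.
from netCDF4 import Dataset

with Dataset("example.nc4", "r") as ds:
    print(ds.data_model)               # e.g. NETCDF4
    for name, var in ds.variables.items():
        print(name, var.dimensions, var.shape)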
When They Can't Talk, Lives Are Lost : What Public Officials Need to Know About Interoperability
DOT National Transportation Integrated Search
2003-02-01
This publication was developed to facilitate education and discussion among and between elected and appointed officials, their representative associations, and public safety representatives on public safety wireless communications interoperability. T...
Design and Implementation of a REST API for the Human Well Being Index (HWBI)
Interoperable software development uses principles of component reuse, systems integration, flexible data transfer, and standardized ontological documentation to promote access, reuse, and integration of code. While interoperability principles are increasingly considered technolo...
Community-Driven Initiatives to Achieve Interoperability for Ecological and Environmental Data
NASA Astrophysics Data System (ADS)
Madin, J.; Bowers, S.; Jones, M.; Schildhauer, M.
2007-12-01
Advances in ecology and environmental science increasingly depend on information from multiple disciplines to tackle broader and more complex questions about the natural world. Such advances, however, are hindered by data heterogeneity, which impedes the ability of researchers to discover, interpret, and integrate relevant data that have been collected by others. Here, we outline two community-building initiatives for improving data interoperability in the ecological and environmental sciences, one that is well-established (the Ecological Metadata Language [EML]), and another that is actively underway (a unified model for observations and measurements). EML is a metadata specification developed for the ecology discipline, and is based on prior work done by the Ecological Society of America and associated efforts to ensure a modular and extensible framework to document ecological data. EML "modules" are designed to describe one logical part of the total metadata that should be included with any ecological dataset. EML was developed through a series of working meetings, ongoing discussion forums and email lists, with participation from a broad range of ecological and environmental scientists, as well as computer scientists and software developers. Where possible, EML adopted syntax from the other metadata standards for other disciplines (e.g., Dublin Core, Content Standard for Digital Geospatial Metadata, and more). Although EML has not yet been ratified through a standards body, it has become the de facto metadata standard for a large range of ecological data management projects, including for the Long Term Ecological Research Network, the National Center for Ecological Analysis and Synthesis, and the Ecological Society of America. The second community-building initiative is based on work through the Scientific Environment for Ecological Knowledge (SEEK) as well as a recent workshop on multi-disciplinary data management. This initiative aims at improving interoperability by describing the semantics of data at the level of observation and measurement (rather than the traditional focus at the level of the data set) and will define the necessary specifications and technologies to facilitate semantic interpretation and integration of observational data for the environmental sciences. As such, this initiative will focus on unifying the various existing approaches for representing and describing observation data (e.g., SEEK's Observation Ontology, CUAHSI's Observation Data Model, NatureServe's Observation Data Standard, to name a few). Products of this initiative will be compatible with existing standards and build upon recent advances in knowledge representation (e.g., W3C's recommended Web Ontology Language, OWL) that have demonstrated practical utility in enhancing scientific communication and data interoperability in other communities (e.g., the genomics community). A community-sanctioned, extensible, and unified model for observational data will support metadata standards such as EML while reducing the "babel" of scientific dialects that currently impede effective data integration, which will in turn provide a strong foundation for enabling cross-disciplinary synthetic research in the ecological and environmental sciences.
Lin, Hsiu-Hsia; Chiang, Wen-Chung; Lo, Lun-Jou; Sheng-Pin Hsu, Sam; Wang, Chien-Hsuan; Wan, Shu-Yen
2013-11-01
Combining the maxillofacial cone-beam computed tomography (CBCT) model with its corresponding digital dental model enables an integrated 3-dimensional (3D) representation of skeletal structures, teeth, and occlusions. Undesired artifacts, however, introduce difficulties in the superimposition of the two models. We have proposed an artifact-resistant surface-based registration method that is robust and clinically applicable and that does not require markers. A CBCT bone model and a laser-scanned dental model obtained from the same patient were used in developing the method and examining the accuracy of the superimposition. Our method included 4 phases. The first phase was to segment the maxilla from the mandible in the CBCT model. The second phase was to conduct an initial registration to bring the digital dental model and the maxilla and mandible sufficiently close to each other. Third, we manually selected at least 3 corresponding regions on both models by smearing patches on the 3D surfaces. The last phase was to superimpose the digital dental model onto the maxillofacial model. Each superimposition process was performed twice by 2 operators on the same object to investigate the intra- and interoperator differences. All collected objects were divided into 3 groups with various degrees of artifacts: artifact-free, critical artifacts, and severe artifacts. The mean errors and root-mean-square (RMS) errors were used to evaluate the accuracy of the superimposition results. Repeated measures analysis of variance and the Wilcoxon rank sum test were used to calculate the intraoperator reproducibility and interoperator reliability. Twenty-four maxilla and mandible objects for evaluation were obtained from 14 patients. The experimental results showed that the mean errors between the 2 original models in the resulting fused model ranged from 0.10 to 0.43 mm and that the RMS errors ranged from 0.13 to 0.53 mm. These data were consistent with previously used methods and were clinically acceptable. All measurements of the proposed study exhibited desirable intraoperator reproducibility and interoperator reliability. Regarding the intra- and interoperator mean errors and RMS errors in the nonartifact or critical artifact group, no significant difference between the repeated trials or between operators (P < .05) was observed. The results of the present study show that the proposed regional surface-based registration can robustly and accurately superimpose a digital dental model onto its corresponding CBCT model. Copyright © 2013 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.
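To illustrate the reported accuracy metrics, the sketch below computes mean and RMS distances between corresponding surface points of two registered models; the point sets are synthetic stand-ins, not the study's data.

# Mean and root-mean-square error between corresponding points after registration.
import numpy as np

rng = np.random.default_rng(1)
cbct_pts = rng.uniform(0, 50, size=(1000, 3))                # mm, synthetic surface
dental_pts = cbct_pts + rng.normal(0, 0.25, size=(1000, 3))  # small residual error

dist = np.linalg.norm(dental_pts - cbct_pts, axis=1)
print(f"mean error = {dist.mean():.2f} mm")                # paper: 0.10-0.43 mm
print(f"RMS error  = {np.sqrt((dist**2).mean()):.2f} mm")  # paper: 0.13-0.53 mm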
Identification of port communication equipment needs for safety, security, and interoperability
DOT National Transportation Integrated Search
2007-12-01
Identification of port communication equipment needs for safety, security, and interoperability is a significant concern for current and future operations. The data demonstrate that two-way radios should be the most effective method of communication in both rou...
Bravo, Carlos; Suarez, Carlos; González, Carolina; López, Diego; Blobel, Bernd
2014-01-01
Healthcare information is distributed across multiple heterogeneous and autonomous systems. Access to, and sharing of, distributed information sources is a challenging task. To contribute to meeting this challenge, this paper presents a formal, complete and semi-automatic transformation service from Relational Databases to the Web Ontology Language. The proposed service makes use of an algorithm that can transform several data models of different domains, mainly by deploying inheritance rules. The paper emphasizes the relevance of integrating the proposed approach into an ontology-based interoperability service to achieve semantic interoperability.
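A minimal sketch of one such inheritance rule, rendered with the Python rdflib package: each table becomes an owl:Class, and a foreign-key link to a parent table becomes rdfs:subClassOf. The schema, namespace and rule are simplified illustrations of the kind of mapping the paper describes, not its actual algorithm.

# Table-to-class transformation with a simple inheritance rule.
from rdflib import Graph, Namespace
from rdflib.namespace import OWL, RDF, RDFS

EX = Namespace("http://example.org/hc#")   # hypothetical ontology namespace
g = Graph()
g.bind("ex", EX)

# (table, parent_table_or_None) pairs, as if read from the database catalog.
schema = [("Person", None), ("Patient", "Person"), ("Clinician", "Person")]

for table, parent in schema:
    cls = EX[table]
    g.add((cls, RDF.type, OWL.Class))
    if parent:
        g.add((cls, RDFS.subClassOf, EX[parent]))   # inheritance rule

print(g.serialize(format="turtle"))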
FLTSATCOM interoperability applications
NASA Astrophysics Data System (ADS)
Woolford, Lynn
A mobile Fleet Satellite Communications (FLTSATCOM) system called the Mobile Operational Control Center (MOCC) was developed which has demonstrated the ability to interoperate with many of the current FLTSATCOM command and control channels. This low-cost system is secure in all its communications, is lightweight, and provides a gateway for other communications formats. The major elements of this system are a personal computer, a protocol microprocessor, and off-the-shelf mobile communication components. It is concluded that, with both FLTSATCOM channel protocol and data format interoperability, the MOCC has the ability to provide vital information in or near real time, which significantly improves mission effectiveness.
Ryan, Amanda; Eklund, Peter
2008-01-01
Healthcare information is composed of many types of varying and heterogeneous data. Semantic interoperability in healthcare is especially important when all these different types of data need to interact. Presented in this paper is a solution to interoperability in healthcare based on a standards-based middleware software architecture used in enterprise solutions. This architecture has been translated into the healthcare domain using a messaging and modeling standard which upholds the ideals of the Semantic Web (HL7 V3) combined with a well-known standard terminology of clinical terms (SNOMED CT).
Scientific Digital Libraries, Interoperability, and Ontologies
NASA Technical Reports Server (NTRS)
Hughes, J. Steven; Crichton, Daniel J.; Mattmann, Chris A.
2009-01-01
Scientific digital libraries serve complex and evolving research communities. Justifications for the development of scientific digital libraries include the desire to preserve science data and the promises of information interconnectedness, correlative science, and system interoperability. Shared ontologies are fundamental to fulfilling these promises. We present a tool framework, some informal principles, and several case studies where shared ontologies are used to guide the implementation of scientific digital libraries. The tool framework, based on an ontology modeling tool, was configured to develop, manage, and keep shared ontologies relevant within changing domains and to promote the interoperability, interconnectedness, and correlation desired by scientists.
ERIC Educational Resources Information Center
Arms, William Y.; Hillmann, Diane; Lagoze, Carl; Krafft, Dean; Marisa, Richard; Saylor, John; Terizzi, Carol; Van de Sompel, Herbert; Gill, Tony; Miller, Paul; Kenney, Anne R.; McGovern, Nancy Y.; Botticelli, Peter; Entlich, Richard; Payette, Sandra; Berthon, Hilary; Thomas, Susan; Webb, Colin; Nelson, Michael L.; Allen, B. Danette; Bennett, Nuala A.; Sandore, Beth; Pianfetti, Evangeline S.
2002-01-01
Discusses digital libraries, including interoperability, metadata, and international standards; Web resource preservation efforts at Cornell University; digital preservation at the National Library of Australia; object persistence and availability; collaboration among libraries, museums and elementary schools; Asian digital libraries; and a Web…
Metadata mapping and reuse in caBIG.
Kunz, Isaac; Lin, Ming-Chin; Frey, Lewis
2009-02-05
This paper proposes that interoperability across biomedical databases can be improved by utilizing a repository of Common Data Elements (CDEs), UML model class-attributes and simple lexical algorithms to facilitate the building of domain models. This is examined in the context of an existing system, the National Cancer Institute (NCI)'s cancer Biomedical Informatics Grid (caBIG). The goal is to demonstrate the deployment of open source tools that can be used to effectively map models and enable the reuse of existing information objects and CDEs in the development of new models for translational research applications. This effort is intended to help developers reuse appropriate CDEs to enable interoperability of their systems when developing within the caBIG framework or other frameworks that use metadata repositories. The Dice (di-grams) and Dynamic algorithms are compared, and both algorithms have similar performance in matching UML model class-attributes to CDE class object-property pairs. With the algorithms used, the baselines for automatically finding the matches are reasonable for the data models examined, suggesting that automatic mapping of UML models and CDEs is feasible within the caBIG framework and potentially any framework that uses a metadata repository. This work opens up the possibility of using mapping algorithms to reduce the cost and time required to map local data models to a reference data model such as those used within caBIG. This effort contributes to facilitating the development of interoperable systems within caBIG as well as other metadata frameworks. Such efforts are critical to addressing the need to develop systems that can handle the enormous amounts of diverse data that can be leveraged from new biomedical methodologies.
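The Dice (di-grams) measure mentioned above is simple to state: twice the number of shared character bigrams divided by the total bigram count. A short illustrative sketch follows, with invented names and an invented 0.6 threshold.

# Dice similarity over character bigrams for lexical matching of element names.
def bigrams(s: str) -> set:
    s = s.lower()
    return {s[i:i + 2] for i in range(len(s) - 1)}

def dice(a: str, b: str) -> float:
    ba, bb = bigrams(a), bigrams(b)
    return 2 * len(ba & bb) / (len(ba) + len(bb)) if ba and bb else 0.0

uml = "patient date of birth"                       # normalized UML attribute name
candidates = ["patient birth date", "specimen collection date", "patient gender"]
for cde in candidates:
    score = dice(uml, cde)
    flag = "match" if score >= 0.6 else "no match"  # illustrative cutoff
    print(f"{cde:25s} {score:.2f}  {flag}")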
NASA Technical Reports Server (NTRS)
Cowen, Benjamin
2011-01-01
Simulations are essential for engineering design. These virtual realities provide characteristic data to scientists and engineers so they can understand the details and complications of the desired mission. A standard simulation development package known as Trick is used to develop source code to model a component (a federate, in HLA terms). The runtime executive is integrated into an HLA-based distributed simulation. TrickHLA is used to extend a Trick simulation for a federation execution, to develop source code for communication between federates, and to foster data input and output. The project incorporates international cooperation along with team collaboration. Interactions among federates occur throughout the simulation, thereby relying on simulation interoperability. Throughout the semester, participants communicated to work out how to implement this data exchange. The NASA intern team is designing a Lunar Rover federate and a Lunar Shuttle federate. The Lunar Rover federate supports transportation across the lunar surface and is essential for fostering interactions with other federates on the lunar surface (Lunar Shuttle, Lunar Base Supply Depot and Mobile ISRU Plant) as well as for transporting materials to the desired locations. The Lunar Shuttle federate transports materials to and from lunar orbit. Materials that it takes to the supply depot include fuel and cargo necessary to continue moon-base operations. This project analyzes modeling and simulation technologies as well as simulation interoperability. Each team from the participating universities will work on and engineer its own federate(s) to participate in the SISO Spring 2011 Workshop SIW Smackdown in Boston, Massachusetts. This paper focuses on the Lunar Rover federate.
Garcia, Macarena C; Garrett, Nedra Y; Singletary, Vivian; Brown, Sheereen; Hennessy-Burt, Tamara; Haney, Gillian; Link, Kimberly; Tripp, Jennifer; Mac Kenzie, William R; Yoon, Paula
2017-12-07
State and local public health agencies collect and use surveillance data to identify outbreaks, track cases, investigate causes, and implement measures to protect the public's health through various surveillance systems and data exchange practices. The purpose of this assessment was to better understand current practices at state and local public health agencies for collecting, managing, processing, reporting, and exchanging notifiable disease surveillance information. Over an 18-month period (January 2014-June 2015), we evaluated the process of data exchange between surveillance systems, reporting burdens, and challenges within 3 states (California, Idaho, and Massachusetts) that were using 3 different reporting systems. All 3 states use a combination of paper-based and electronic information systems for managing and exchanging data on reportable conditions within the state. The flow of data from local jurisdictions to the state health departments varies considerably. When state and local information systems are not interoperable, manual duplicative data entry and other work-arounds are often required. The results of the assessment show the complexity of disease reporting at the state and local levels and the multiple systems, processes, and resources engaged in preparing, processing, and transmitting data that limit interoperability and decrease efficiency. Through this structured assessment, the Centers for Disease Control and Prevention (CDC) has a better understanding of the complexities of conducting surveillance using commercial off-the-shelf data systems (California and Massachusetts) and the CDC-developed National Electronic Disease Surveillance System Base System. More efficient data exchange and use of data will help facilitate interoperability between National Notifiable Diseases Surveillance Systems.
Seebregts, Christopher; Dane, Pierre; Parsons, Annie Neo; Fogwill, Thomas; Rogers, Debbie; Bekker, Marcha; Shaw, Vincent; Barron, Peter
2018-01-01
MomConnect is a national initiative coordinated by the South African National Department of Health that sends text-based mobile phone messages free of charge to pregnant women who voluntarily register at any public healthcare facility in South Africa. We describe the system design and architecture of the MomConnect technical platform, planned as a nationally scalable and extensible initiative. It uses a health information exchange that can connect any standards-compliant electronic front-end application to any standards-compliant electronic back-end database. The implementation of the MomConnect technical platform, in turn, is a national reference application for electronic interoperability in line with the South African National Health Normative Standards Framework. The use of open content and messaging standards enables the architecture to include any application adhering to the selected standards. Its national implementation at scale demonstrates both the use of this technology and a key objective of global health information systems, which is to achieve implementation scale. The system’s limited clinical information, initially, allowed the architecture to focus on the base standards and profiles for interoperability in a resource-constrained environment with limited connectivity and infrastructural capacity. Maintenance of the system requires mobilisation of national resources. Future work aims to use the standard interfaces to include data from additional applications as well as to extend and interface the framework with other public health information systems in South Africa. The development of this platform has also shown the benefits of interoperability at both an organisational and technical level in South Africa. PMID:29713506
Daskalakis, S; Mantas, J
2009-01-01
This study evaluates a service-oriented prototype implementation for healthcare interoperability. A prototype framework was developed, aiming to exploit the use of service-oriented architecture (SOA) concepts for achieving healthcare interoperability and to move towards a virtual patient record (VPR) paradigm. The prototype implementation was evaluated for its hypothetical adoption. The evaluation strategy was based on the initial proposition of the DeLone and McLean model of information systems (IS) success [1], as modeled by Iivari [2]. A set of SOA and VPR characteristics were empirically encapsulated within the dimensions of the IS success model, combined with measures from previous research works. The data gathered were analyzed using partial least squares (PLS). The results highlighted that system quality is a partial predictor of system use but not of user satisfaction. On the contrary, information quality proved to be a significant predictor of user satisfaction and, partially, a strong significant predictor of system use. Moreover, system use did not prove to be a significant predictor of individual impact, and the bi-directional relation between use and user satisfaction was not confirmed. Additionally, user satisfaction was found to be a strong significant predictor of individual impact. Finally, individual impact proved to be a strong significant predictor of organizational impact. The empirical study attempted to obtain hypothetical, but still useful, beliefs and perceptions regarding the SOA prototype implementation. The deduced observations can form the basis for further investigation regarding the adaptability of SOA implementations with VPR characteristics in the healthcare domain.
Creating personalised clinical pathways by semantic interoperability with electronic health records.
Wang, Hua-Qiong; Li, Jing-Song; Zhang, Yi-Fan; Suzuki, Muneou; Araki, Kenji
2013-06-01
There is a growing realisation that clinical pathways (CPs) are vital for improving the treatment quality of healthcare organisations. However, treatment personalisation is one of the main challenges when implementing CPs, and their inadequate dynamic adaptability restricts the practicality of CPs. The purpose of this study is to improve the practicality of CPs using semantic interoperability between knowledge-based CPs and semantic electronic health records (EHRs). SPARQL (Simple Protocol and RDF Query Language) is used to gather patient information from semantic EHRs. The gathered patient information is entered into the CP ontology, represented in the Web Ontology Language (OWL). Then, after reasoning over rules described in the Semantic Web Rule Language (SWRL) in the Jena semantic framework, we adjust the standardised CPs to meet different patients' practical needs. A CP for acute appendicitis is used as an example to illustrate how to achieve CP customisation based on the semantic interoperability between knowledge-based CPs and semantic EHRs. A personalised care plan is generated by comprehensively analysing the patient's personal allergy history and past medical history, which are stored in semantic EHRs. Additionally, by monitoring the patient's clinical information, an exception is recorded and handled during CP execution. According to the execution results of the actual example, the solutions we present are shown to be technically feasible. This study contributes towards improving the clinical personalised practicality of standardised CPs. In addition, this study establishes the foundation for future work on the research and development of an independent CP system. Copyright © 2013 Elsevier B.V. All rights reserved.
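A small sketch of the information-gathering step: a SPARQL query over a toy semantic EHR graph, here executed with the Python rdflib package rather than the Jena framework the study uses; vocabulary and data are invented for illustration.

# Query a toy semantic EHR for a patient's allergy history before customising
# a clinical pathway.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF

EHR = Namespace("http://example.org/ehr#")
g = Graph()
g.add((EHR.pat42, RDF.type, EHR.Patient))
g.add((EHR.pat42, EHR.hasAllergy, Literal("penicillin")))
g.add((EHR.pat42, EHR.hasHistory, Literal("appendectomy 2009")))

q = """
PREFIX ehr: <http://example.org/ehr#>
SELECT ?allergy WHERE { ehr:pat42 ehr:hasAllergy ?allergy . }
"""
allergies = [str(row.allergy) for row in g.query(q)]
print(allergies)   # a rule engine would use this to adapt the pathway, e.g.
                   # substituting a non-penicillin antibiotic in the care plan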
NASA Astrophysics Data System (ADS)
Parsons, M. A.; Yarmey, L.; Dillo, I.
2017-12-01
Data are the foundation of a robust, efficient, and reproducible scientific enterprise. The Research Data Alliance (RDA) is a community-driven, action-oriented, virtual organization committed to enabling open sharing and reuse of data by building social and technical bridges. The international RDA community includes almost 6000 members bringing diverse perspectives, domain knowledge, and expertise to a common table for the identification of common challenges and holistic solutions. RDA members work together to identify common interests and form exploratory Interest Groups and outcome-oriented Working Groups. Participants exchange knowledge, share discoveries, discuss barriers and potential solutions, articulate policies, and align standards to enhance and facilitate global data sharing within and across domains and communities. With activities defined and led by members, RDA groups have organically been addressing issues across the full research cycle, with community-ratified Recommendations and other outputs that begin to create the components of a global data-sharing infrastructure. This paper examines how multiple RDA Recommendations can be implemented together to improve data and information discoverability, accessibility, and interconnection by both people and machines. For instance, the Persistent Identifier Types recommendation can support moving data across platforms through the Data Description Registry Interoperability framework, following the RDA/WDS Publishing Data Workflows model. The Scholix initiative, which connects scholarly literature and data across numerous stakeholders, can draw on the Practical Policy best practices for machine-actionable data policies. Where appropriate, we use a case study approach built around several flagship data sets from the Deep Carbon Observatory to examine how multiple RDA Recommendations can be implemented in actual practice.
Bellgard, Matthew I; Macgregor, Andrew; Janon, Fred; Harvey, Adam; O'Leary, Peter; Hunter, Adam; Dawkins, Hugh
2012-10-01
There is a need to develop Internet-based rare disease registries to support health care stakeholders in delivering improved quality patient outcomes. Such systems should be architected to enable multiple-level access by a range of user groups within a region or across regional/country borders in a secure and private way. However, this functionality is currently not available in many existing systems. A new approach to the design of an Internet-based architecture for disease registries has been developed for patients with clinical and genetic data in geographically disparate locations. The system addresses issues of multiple-level access by key stakeholders, security and privacy. The system has been successfully adopted for specific rare diseases in Australia and is open source. The results of this work demonstrate that it is feasible to design an open source Internet-based disease registry system in a scalable and customizable fashion, designed to facilitate interoperability with other systems. © 2012 Wiley Periodicals, Inc.
The Emerging Infrastructure of Autonomous Astronomy
NASA Astrophysics Data System (ADS)
Seaman, R.; Allan, A.; Axelrod, T.; Cook, K.; White, R.; Williams, R.
2007-10-01
Advances in the understanding of cosmic processes demand that sky transient events be confronted with statistical techniques honed on static phenomena. Time domain data sets require vast surveys such as LSST {http://www.lsst.org/lsst_home.shtml} and Pan-STARRS {http://www.pan-starrs.ifa.hawaii.edu}. A new autonomous infrastructure must close the loop from the scheduling of survey observations, through data archiving and pipeline processing, to the publication of transient event alerts and automated follow-up, and to the easy analysis of resulting data. The IVOA VOEvent {http://voevent.org} working group leads efforts to characterize sky transient alerts published through VOEventNet {http://voeventnet.org}. The Heterogeneous Telescope Networks (HTN {http://www.telescope-networks.org}) consortium is a group of observatories and robotic telescope projects seeking interoperability, with the long-term goal of creating an e-market for telescope time. Two projects relying on VOEvent and HTN are eSTAR {http://www.estar.org.uk} and the Thinking Telescope {http://www.thinkingtelescopes.lanl.gov} Project.
Millonig, Marsha K
2009-01-01
To convene a diverse group of stakeholders to discuss medication therapy management (MTM) documentation and billing standardization and its interoperability within the health care system. More than 70 stakeholders from pharmacy, health information systems, insurers/payers, quality, and standard-setting organizations met on October 7-8, 2008, in Bethesda, MD. The American Pharmacists Association (APhA) organized the invitational conference to facilitate discussion on strategic directions for meeting current market need for MTM documentation and billing interoperability and future market needs for MTM integration into electronic health records (EHRs). APhA recently adopted policy that specifically addresses technology barriers and encourages the use and development of standardized systems for the documentation and billing of MTM services. Day 1 of the conference featured six foundational presentations on health information technology (HIT) trends, perspectives on MTM from the profession and the Centers for Medicare & Medicaid Services, health care quality and medication-related outcome measures, integrating MTM workflow in EHRs, and the current state of MTM operationalization in practice. After hearing presentations on day 1 and having the opportunity to pose questions to each speaker, conference participants were divided into three breakout groups on day 2. Each group met three times for 60 minutes each and discussed five questions from the perspective of a patient, provider, or payer. Three facilitators met with each of the groups and led discussion from one perspective (i.e., patient, provider, payer). Participants then reconvened as a complete group to participate in a discussion on next steps. HIT is expected to assist in delivering safe, effective, efficient, coordinated care as health professionals strive to improve the quality of care and outcomes for individual patients. The pharmacy profession is actively contributing to quality patient care through MTM services focused on identifying and preventing medication-related problems, improving medication use, and optimizing individual therapeutic outcomes. As MTM programs continue to expand within the health care system, one important limiting factor is the lack of standardization for documentation and billing of MTM services. This lack of interoperability between technology systems, software, and system platforms presents a barrier to MTM service delivery for patients. APhA convened this invitational conference to identify strategic directions to address MTM documentation and billing standardization and interoperability. Participants viewed the meeting as highly successful in bringing together a unique, wide-ranging set of stakeholders, including the government, regulators, standards organizations, other health professions, technology firms, professional organizations, and practitioners, to share perspectives. They strongly encouraged the Association to continue this unique stakeholder dialogue. Participants provided a number of next-step suggestions for APhA to consider as a result of the event. Participants noted the pharmacy profession's success in building information technology systems for product transactions with systematic, organized, methodical thinking and the need to apply this success to patient services. A unique opportunity exists for the profession to influence and lead the HIT community in creating a workable health technology solution for MTM services.
Reaching consensus on minimum data sets for each functional area--clinical, billing, quality improvement--would be a very important short-term gain. Further, participants said it was imperative for pharmacists and the pharmacy community at large to become actively engaged in HIT standards development efforts.
Barrón-González, Héctor Gilberto; Martínez-Espronceda, Miguel; Trigo, Jesús Daniel; Led, Santiago; Serrano, Luis
2016-01-01
The Point of Care (PoC) version of the interoperability standard ISO/IEEE11073 (X73) provided a mechanism to remotely control agents through documents X73-10201 and X73-20301. The newer version of X73 oriented to Personal Health Devices (PHD) has no mechanism to do such a thing. The authors are working toward a common proposal with the PHD Working Group (PHD-WG) in order to adapt the remote control capabilities from X73PoC to X73PHD. However, this theoretical adaptation has to be implemented and tested to evaluate whether or not its inclusion entails an acceptable overhead and extra cost. Such a proof-of-concept assessment is the main objective of this paper. For the sake of simplicity, a weighing scale with a configurable operation was chosen as the use case. First, in a previous stage of the research, the model was defined. Second, the implementation methodology, both in terms of hardware and software, was defined and executed. Third, an evaluation methodology to test the remote control features was defined. Then, a thorough comparison between a weighing scale with and without remote control was performed. The results obtained indicate that, when implementing remote control in a weighing scale, the relative weight of such a feature represents an overhead of as much as 53%, whereas the number of Implementation Conformance Statements (ICSs) to be satisfied by the manufacturer represents as much as 34% relative to the implementation without remote control. The new feature facilitates remote control of PHDs but, at the same time, increases overhead and costs; manufacturers therefore need to weigh this trade-off. In conclusion, this proof-of-concept helps in fostering the evolution of the remote control proposal to extend X73PHD, promotes its inclusion as part of the standard, and illustrates the methodological steps for its extrapolation to other specializations. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Interoperability, Scaling, and the Digital Libraries Research Agenda.
ERIC Educational Resources Information Center
Lynch, Clifford; Garcia-Molina, Hector
1996-01-01
Summarizes reports and activities at the Information Infrastructure Technology and Applications workshop on digital libraries (Reston, Virginia, August 22, 1995). Defines digital library roles and identifies areas of needed research, including: interoperability; protocols for digital objects; collection management; interface design; human-computer…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-05
... Communication Interoperability Plan Implementation Report AGENCY: National Protection and Programs Directorate... Directorate/Cybersecurity and Communications/Office of Emergency Communications, has submitted the following... INFORMATION: The Office of Emergency Communications (OEC), formed under Title XVIII of the Homeland Security...
Interoperable Communications for Hierarchical Heterogeneous Wireless Networks
2016-04-01
The PhD student Nan Zou, supervised by the PI, won the Best Poster Award at the STEAM Research Symposium on March 21, 2014. ...Belief Propagation for spectrum awareness within one network for the multiple channel case in a previous study [86]... wireless networks enabled by cognitive radio technology. The PIs have been working closely with students to carry out all the proposed research tasks.
NASA Astrophysics Data System (ADS)
Stall, S.
2017-12-01
Integrity and transparency within research are solidified by a complete set of research products that are findable, accessible, interoperable, and reusable. In other words, they follow the FAIR Guidelines developed by FORCE11.org. Your datasets, images, video, software, scripts, models, physical samples, and other tools and technology are an integral part of the narrative you tell about your research. These research products increasingly are being captured through workflow tools and preserved and connected through persistent identifiers across multiple repositories that keep them safe. Together with your publications, they help secure the supporting evidence and the integrity of the scientific record. This is the direction in which Earth and space science, as well as other disciplines, is moving. Within our community, some science domains are further along, and others are taking more measured steps. AGU as a publisher is working to support the full scientific record with peer-reviewed publications. Working with our community and all the Earth and space science journals, AGU is developing new policies to encourage researchers to plan for proper data preservation, to provide data citations along with their research submissions, and to adopt best practices throughout the research workflow and data life cycle. Providing incentives, community standards, and easy-to-use tools are some important factors for helping researchers embrace the FAIR Guidelines and support transparency and integrity.
Social network of PESCA (Open Source Platform for eHealth).
Sanchez, Carlos L; Romero-Cuevas, Miguel; Lopez, Diego M; Lorca, Julio; Alcazar, Francisco J; Ruiz, Sergio; Mercado, Carmen; Garcia-Fortea, Pedro
2008-01-01
Information and Communication Technologies (ICTs) are revolutionizing how healthcare systems deliver top-quality care to citizens. Open Source Software (OSS) has proven to be an important strategy for spreading the use of ICTs. Several human and technological barriers to adopting OSS for healthcare have been identified. Human barriers include user acceptance, limited support, technical skillfulness, awareness, and resistance to change, while technological barriers include the need for open standards, heterogeneous OSS developed without normalization or metrics, the lack of initiatives to evaluate existing health OSS, and the need for quality control and functional validation. The goals of the PESCA project are to create a platform of interoperable modules to evaluate, classify and validate good practices in health OSS. Furthermore, a normalization platform will provide interoperable solutions in the fields of healthcare services, health surveillance, health literature, and health education, knowledge and research. Within the platform, the first goal to achieve is the setup of the collaborative work infrastructure. The platform is being organized as a Social Network which evaluates five aspects of existing open source tools for eHealth: open source software, quality, pedagogy, security and privacy, and internationalization (i18n). In the meantime, the knowledge collected from the networking will populate a Good Practice Repository on eHealth, promoting the effective use of ICT on behalf of citizens' health.
[The role of Integrating the Healthcare Enterprise (IHE) in telemedicine].
Bergh, B; Brandner, A; Heiß, J; Kutscha, U; Merzweiler, A; Pahontu, R; Schreiweis, B; Yüksekogul, N; Bronsch, T; Heinze, O
2015-10-01
Telemedicine systems are already used today in a variety of areas to improve patient care. The lack of standardization in these solutions results in a lack of interoperability between systems. Internationally accepted standards can help to solve this lack of system interoperability. With Integrating the Healthcare Enterprise (IHE), a worldwide initiative of users and vendors is working on the use of defined standards for specific use cases by describing those use cases in so-called IHE Profiles. The aim of this work is to determine how telemedicine applications can be implemented using IHE Profiles. Based on a literature review, exemplary telemedicine applications are described and the technical capabilities of IHE Profiles are evaluated. These IHE Profiles are examined for their usability and are then evaluated in exemplary telemedicine application architectures. Some IHE Profiles can be identified as useful for intersectoral patient records (e.g. the PEHR at Heidelberg), as well as for point-to-point communication where no patient record is involved. In the area of patient records, the IHE Profile "Cross-Enterprise Document Sharing (XDS)" is often used. Point-to-point communication can be supported using the IHE "Cross-Enterprise Document Media Interchange (XDM)" profile. IHE-based telemedicine applications offer caregivers the possibility to be informed about their patients using data from intersectoral patient records, and there are also possible savings from reusing the standardized interfaces in other scenarios.
Interoperability in Personalized Adaptive Learning
ERIC Educational Resources Information Center
Aroyo, Lora; Dolog, Peter; Houben, Geert-Jan; Kravcik, Milos; Naeve, Ambjorn; Nilsson, Mikael; Wild, Fridolin
2006-01-01
Personalized adaptive learning requires semantic-based and context-aware systems to manage the Web knowledge efficiently as well as to achieve semantic interoperability between heterogeneous information resources and services. The technological and conceptual differences can be bridged either by means of standards or via approaches based on the…
Towards Cross-Organizational Innovative Business Process Interoperability Services
NASA Astrophysics Data System (ADS)
Karacan, Ömer; Del Grosso, Enrico; Carrez, Cyril; Taglino, Francesco
This paper presents the vision and initial results of the COIN (FP7-IST-216256) European project for the development of open source Collaborative Business Process Interoperability (CBPip) in cross-organisational business collaboration environments following the Software-as-a-Service Utility (SaaS-U) paradigm.
47 CFR 0.192 - Emergency Response Interoperability Center.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Organization Public Safety and Homeland Security Bureau § 0.192 Emergency Response Interoperability Center. (a... Public Safety and Homeland Security Bureau to develop, recommend, and administer policy goals, objectives... and procedures for the 700 MHz public safety broadband wireless network and other public safety...
47 CFR 0.192 - Emergency Response Interoperability Center.
Code of Federal Regulations, 2012 CFR
2012-10-01
... Organization Public Safety and Homeland Security Bureau § 0.192 Emergency Response Interoperability Center. (a... Public Safety and Homeland Security Bureau to develop, recommend, and administer policy goals, objectives... and procedures for the 700 MHz public safety broadband wireless network and other public safety...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-12
... Alerting, E9-1-1 Location Accuracy, Network Security Best Practices, DNSSEC Implementation Practices for... FEDERAL COMMUNICATIONS COMMISSION Federal Advisory Committee Act; Communications Security... Security, Reliability, and Interoperability Council (CSRIC) meeting that was scheduled for March 6, 2013 is...
Hosseini, Masoud; Ahmadi, Maryam; Dixon, Brian E.
2014-01-01
Clinical decision support (CDS) systems can support vaccine forecasting and immunization reminders; however, immunization decision-making requires data from fragmented, independent systems. Interoperability and accurate data exchange between immunization information systems (IIS) is an essential factor in utilizing immunization CDS systems. Service oriented architecture (SOA) and Health Level 7 (HL7) are dominant standards for web-based exchange of clinical information. We implemented a system based on SOA and HL7 v3 to support immunization CDS in Iran. We evaluated system performance by exchanging 1500 immunization records for roughly 400 infants between two IISs. System turnaround time is less than a minute for synchronous operation calls, and the retrieved immunization histories of infants were always identical across systems. CDS-generated reports accorded with immunization guidelines, and the calculations for next visit times were accurate. Interoperability is rare or nonexistent between IIS. Since inter-state data exchange is rare in the United States, this approach could be a good prototype for achieving interoperability of immunization information. PMID:25954452
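The paper's message payloads are not reproduced here, but a minimal sketch of the kind of SOA-style record exchange it describes might look like the following; the endpoint URL and XML element names are illustrative assumptions, not the authors' actual HL7 v3 artifacts.

```python
# Hedged sketch of a SOA-style immunization record exchange with timing.
# Endpoint URL and element names are hypothetical, not the paper's artifacts.
import time
import xml.etree.ElementTree as ET
import requests

def build_immunization_payload(infant_id: str, vaccine_code: str) -> bytes:
    # Hypothetical HL7-v3-like structure, for illustration only.
    root = ET.Element("immunizationRecord")
    ET.SubElement(root, "patientId").text = infant_id
    ET.SubElement(root, "vaccineCode").text = vaccine_code
    return ET.tostring(root, encoding="utf-8")

def send_record(endpoint: str, payload: bytes) -> float:
    """POST one record and return the round-trip time in seconds."""
    start = time.perf_counter()
    resp = requests.post(endpoint, data=payload,
                         headers={"Content-Type": "application/xml"},
                         timeout=60)
    resp.raise_for_status()
    return time.perf_counter() - start

if __name__ == "__main__":
    payload = build_immunization_payload("infant-0001", "BCG")
    rtt = send_record("https://iis.example.org/exchange", payload)  # hypothetical IIS endpoint
    print(f"turnaround: {rtt:.3f}s")
```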
Achieving interoperability for metadata registries using comparative object modeling.
Park, Yu Rang; Kim, Ju Han
2010-01-01
Achieving data interoperability between organizations relies upon agreed meaning and representation (metadata) of data. For managing and registering metadata, many organizations have built metadata registries (MDRs) in various domains based on the international standard MDR framework, ISO/IEC 11179. Following this trend, two public MDRs in the biomedical domain have been created, the United States Health Information Knowledgebase (USHIK) and the cancer Data Standards Registry and Repository (caDSR), from the U.S. Department of Health & Human Services and the National Cancer Institute (NCI), respectively. Most MDRs are implemented with indiscriminate extensions to satisfy organization-specific needs and to work around semantic and structural limitations of ISO/IEC 11179. As a result, it is difficult to achieve interoperability among multiple MDRs. In this paper, we propose an integrated metadata object model for achieving interoperability among multiple MDRs. To evaluate this model, we developed an XML Schema Definition (XSD)-based metadata exchange format. We created an XSD-based metadata exporter, supporting both the integrated metadata object model and organization-specific MDR formats.
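The paper's XSD is not reproduced here, but a sketch of what an exporter for ISO/IEC 11179-style data element metadata could emit is shown below; the element names follow 11179 concepts (data element, object class, property, value domain) but are our assumptions, not the authors' schema.

```python
# Sketch of exporting an ISO/IEC 11179-style data element to XML.
# Element names mirror 11179 concepts but are illustrative, not the paper's XSD.
import xml.etree.ElementTree as ET

def export_data_element(identifier: str, object_class: str,
                        prop: str, value_domain: str) -> str:
    de = ET.Element("DataElement", attrib={"id": identifier})
    ET.SubElement(de, "ObjectClass").text = object_class   # e.g. "Patient"
    ET.SubElement(de, "Property").text = prop               # e.g. "birth date"
    ET.SubElement(de, "ValueDomain").text = value_domain    # e.g. "ISO 8601 date"
    return ET.tostring(de, encoding="unicode")

print(export_data_element("DE-0001", "Patient", "birth date", "ISO 8601 date"))
```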
Interoperable and accessible census and survey data from IPUMS.
Kugler, Tracy A; Fitch, Catherine A
2018-02-27
The first version of the Integrated Public Use Microdata Series (IPUMS) was released to users in 1993, and since that time IPUMS has come to stand for interoperable and accessible census and survey data. Initially created to harmonize U.S. census microdata over time, IPUMS now includes microdata from the U.S. and international censuses and from surveys on health, employment, and other topics. IPUMS also provides geo-spatial data, aggregate population data, and environmental data. IPUMS supports ten data products, each disseminating an integrated data collection with a set of tools that make complex data easy to find, access, and use. Key features are record-level integration to create interoperable datasets, user-friendly interfaces, and comprehensive metadata and documentation. The IPUMS philosophy aligns closely with the FAIR principles of findability, accessibility, interoperability, and re-usability. IPUMS data have catalyzed knowledge generation across a wide range of social science and other disciplines, as evidenced by the large volume of publications and other products created by the vast IPUMS user community.
Daniel, Christel; Ouagne, David; Sadou, Eric; Forsberg, Kerstin; Gilchrist, Mark Mc; Zapletal, Eric; Paris, Nicolas; Hussain, Sajjad; Jaulent, Marie-Christine; Kalra, Dipak
2016-01-01
With the development of platforms enabling the use of routinely collected clinical data in the context of international clinical research, scalable solutions for cross border semantic interoperability need to be developed. Within the context of the IMI EHR4CR project, we first defined the requirements and evaluation criteria of the EHR4CR semantic interoperability platform and then developed the semantic resources and supportive services and tooling to assist hospital sites in standardizing their data for allowing the execution of the project use cases. The experience gained from the evaluation of the EHR4CR platform, accessing semantically equivalent data elements across 11 participating European EHR systems from 5 countries, demonstrated how far the mediation model and mapping efforts met the expected requirements of the project. Developers of semantic interoperability platforms are beginning to address a core set of requirements in order to reach the goal of developing cross border semantic integration of data. PMID:27570649
United States Military Posture for FY 1989.
1989-01-01
...low-intensity conflict (LIC) can erode a climate of peace and stability in the world at large... force structure needs modern equipment to meet the threat... the security and interoperability of NATO. Defense Cooperation in Armaments organizations are also being... economic stabilization and growth in FY 1988...
A Future Vision for Remotely Piloted Aircraft: Leveraging Interoperability and Networked Operations
2013-06-21
...over the next 25 years. Balances the effects envisioned in the USAF UAS Flight Plan with the reality of constrained resources and ambitious... theater-level unmanned systems must detect, avoid, or counter threats, operating from permissive to highly contested access in all weather...
NASA Astrophysics Data System (ADS)
Tisdale, M.
2016-12-01
NASA's Atmospheric Science Data Center (ASDC) is operationally using the Esri ArcGIS Platform to improve data discoverability, accessibility and interoperability, meeting requirements driven by diverse government, private, public and academic communities. The ASDC is actively working to provide its mission essential datasets as ArcGIS Image Services, Open Geospatial Consortium (OGC) Web Mapping Services (WMS), and OGC Web Coverage Services (WCS), and is leveraging the ArcGIS multidimensional mosaic dataset structure. Science teams and the ASDC are utilizing these services, developing applications using the Web AppBuilder for ArcGIS and the ArcGIS API for JavaScript, and evaluating restructuring their data production and access scripts within the ArcGIS Python Toolbox framework and Geoprocessing service environment. These capabilities yield greater usage and exposure of ASDC data holdings and provide improved geospatial analytical tools for mission-critical understanding in the areas of the earth's radiation budget, clouds, aerosols, and tropospheric chemistry.
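Because the services named above follow OGC interfaces, any client can interrogate them with standard requests; the sketch below issues a WMS GetCapabilities call and lists the advertised layers. The server URL is a placeholder, not an actual ASDC endpoint.

```python
# Sketch: querying an OGC WMS endpoint for its advertised layers.
# The base URL is a placeholder, not an actual ASDC service address.
import xml.etree.ElementTree as ET
import requests

def list_wms_layers(base_url: str) -> list:
    resp = requests.get(base_url, params={
        "service": "WMS", "request": "GetCapabilities", "version": "1.3.0",
    }, timeout=60)
    resp.raise_for_status()
    root = ET.fromstring(resp.content)
    ns = {"wms": "http://www.opengis.net/wms"}   # WMS 1.3.0 namespace
    return [el.text for el in root.findall(".//wms:Layer/wms:Name", ns)]

print(list_wms_layers("https://example.gov/arcgis/services/asdc/MapServer/WMSServer"))
```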
The Italian Cloud-based brokering Infrastructure to sustain Interoperability for Operative Hydrology
NASA Astrophysics Data System (ADS)
Boldrini, E.; Pecora, S.; Bussettini, M.; Bordini, F.; Nativi, S.
2015-12-01
This work presents the informatics platform developed to implement the National Hydrological Operative Information System of Italy. In particular, the presentation focuses on the governing aspects of the cloud infrastructure and brokering software that make it possible to sustain the hydrology data flow between heterogeneous user clients and data providers. The Institute for Environmental Protection and Research, ISPRA (Istituto Superiore per la Protezione e la Ricerca Ambientale), in collaboration with the Regional Agency for Environmental Protection in the Emilia-Romagna region, ARPA-ER (Agenzia Regionale per la Prevenzione e l´Ambiente dell´Emilia-Romagna), and CNR-IIA (National Research Council of Italy), designed and developed an innovative platform for the discovery and access of hydrological data coming from 19 Italian administrative regions and 2 Italian autonomous provinces, in near real time. ISPRA has deployed and governs this system. The presentation introduces and discusses the technological barriers to interoperability as well as the social and policy ones. The adopted solutions are described, outlining the sustainability challenges and benefits.
OntoCR: A CEN/ISO-13606 clinical repository based on ontologies.
Lozano-Rubí, Raimundo; Muñoz Carrero, Adolfo; Serrano Balazote, Pablo; Pastor, Xavier
2016-04-01
To design a new semantically interoperable clinical repository, based on ontologies, conforming to CEN/ISO 13606 standard. The approach followed is to extend OntoCRF, a framework for the development of clinical repositories based on ontologies. The meta-model of OntoCRF has been extended by incorporating an OWL model integrating CEN/ISO 13606, ISO 21090 and SNOMED CT structure. This approach has demonstrated a complete evaluation cycle involving the creation of the meta-model in OWL format, the creation of a simple test application, and the communication of standardized extracts to another organization. Using a CEN/ISO 13606 based system, an indefinite number of archetypes can be merged (and reused) to build new applications. Our approach, based on the use of ontologies, maintains data storage independent of content specification. With this approach, relational technology can be used for storage, maintaining extensibility capabilities. The present work demonstrates that it is possible to build a native CEN/ISO 13606 repository for the storage of clinical data. We have demonstrated semantic interoperability of clinical information using CEN/ISO 13606 extracts. Copyright © 2016 Elsevier Inc. All rights reserved.
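As a hedged sketch of the kind of ontology-backed storage OntoCR describes, the fragment below uses rdflib to declare a CEN/ISO 13606-style ENTRY class and one stored instance; the namespace, class, and property names are our illustrative assumptions, not OntoCRF's actual meta-model.

```python
# Illustrative sketch: declaring a 13606-style ENTRY in OWL with rdflib.
# Namespace and class/property names are assumptions, not OntoCRF's meta-model.
from rdflib import Graph, Namespace, Literal, RDF, RDFS
from rdflib.namespace import OWL

EX = Namespace("http://example.org/ontocr#")  # hypothetical namespace
g = Graph()
g.bind("ex", EX)

g.add((EX.Entry, RDF.type, OWL.Class))                 # generic ENTRY class
g.add((EX.BloodPressureEntry, RDF.type, OWL.Class))    # archetype-like subclass
g.add((EX.BloodPressureEntry, RDFS.subClassOf, EX.Entry))

g.add((EX.obs1, RDF.type, EX.BloodPressureEntry))      # one stored clinical record
g.add((EX.obs1, EX.systolic, Literal(120)))
g.add((EX.obs1, EX.diastolic, Literal(80)))

print(g.serialize(format="turtle"))
```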
XML schemas for common bioinformatic data types and their application in workflow systems
Seibel, Philipp N; Krüger, Jan; Hartmeier, Sven; Schwarzer, Knut; Löwenthal, Kai; Mersch, Henning; Dandekar, Thomas; Giegerich, Robert
2006-01-01
Background Today, there is a growing need in bioinformatics to combine available software tools into chains, thus building complex applications from existing single-task tools. To create such workflows, the tools involved have to be able to work with each other's data – therefore, a common set of well-defined data formats is needed. Unfortunately, current bioinformatic tools use a great variety of heterogeneous formats. Results Acknowledging the need for common formats, the Helmholtz Open BioInformatics Technology network (HOBIT) identified several basic data types used in bioinformatics and developed appropriate format descriptions, formally defined by XML schemas, and incorporated them in a Java library (BioDOM). These schemas currently cover sequence, sequence alignment, RNA secondary structure and RNA secondary structure alignment formats in a form that is independent of any specific program, thus enabling seamless interoperation of different tools. All XML formats are available at , the BioDOM library can be obtained at . Conclusion The HOBIT XML schemas and the BioDOM library simplify adding XML support to newly created and existing bioinformatic tools, enabling these tools to interoperate seamlessly in workflow scenarios. PMID:17087823
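To make the workflow idea concrete, the sketch below validates a toy sequence document against an XML schema with lxml before handing it between tools; the schema and document contents are invented stand-ins, not the actual HOBIT schemas (whose URLs are elided in the abstract above).

```python
# Sketch: validating a document against an XML Schema before tool hand-off.
# The schema and instance below are toy stand-ins, not the HOBIT schemas.
from lxml import etree

SCHEMA_XSD = b"""<?xml version="1.0"?>
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="sequence">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="id" type="xs:string"/>
        <xs:element name="residues" type="xs:string"/>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>"""

DOC = b"<sequence><id>seq1</id><residues>ACGTACGT</residues></sequence>"

schema = etree.XMLSchema(etree.fromstring(SCHEMA_XSD))
doc = etree.fromstring(DOC)
print("valid for downstream tools:", schema.validate(doc))
```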
Dixon, Brian E; Gamache, Roland E; Grannis, Shaun J
2013-05-01
To summarize the literature describing computer-based interventions aimed at improving bidirectional communication between clinical and public health. A systematic review of English articles using MEDLINE and Google Scholar. Search terms included public health, epidemiology, electronic health records, decision support, expert systems, and decision-making. Only articles that described the communication of information regarding emerging health threats from public health agencies to clinicians or provider organizations were included. Each article was independently reviewed by two authors. Ten peer-reviewed articles highlight a nascent but promising area of research and practice related to alerting clinicians about emerging threats. Current literature suggests that additional research and development in bidirectional communication infrastructure should focus on defining a coherent architecture, improving interoperability, establishing clear governance, and creating usable systems that will effectively deliver targeted, specific information to clinicians in support of patient and population decision-making. Increasingly available clinical information systems make it possible to deliver timely, relevant knowledge to frontline clinicians in support of population health. Future work should focus on developing a flexible, interoperable infrastructure for bidirectional communications capable of integrating public health knowledge into clinical systems and workflows.
Telemonitoring of patients with Parkinson's disease using inertia sensors.
Piro, N E; Baumann, L; Tengler, M; Piro, L; Blechschmidt-Trapp, R
2014-01-01
Medical treatment in patients suffering from Parkinson's disease is very difficult as dose-finding is mainly based on selective and subjective impressions by the physician. To allow for the objective evaluation of patients' symptoms required for optimal dose-finding, a telemonitoring system tracks the motion of patients in their surroundings. The system focuses on providing interoperability and usability in order to ensure high acceptance. Patients wear inertia sensors and perform standardized motor tasks. Data are recorded, processed and then presented to the physician in a 3D animated form. In addition, the same data are rated based on the UPDRS score. Interoperability is realized by developing the system in compliance with the recommendations of the Continua Health Alliance. Detailed requirements analysis and continuous collaboration with the respective user groups help to achieve high usability. A sensor platform was developed that is capable of measuring the acceleration and angular rate of motions as well as the absolute orientation of the device itself through an included compass sensor. The system architecture was designed, and the required infrastructure and essential parts of the communication between the system components were implemented following Continua guidelines. Moreover, preliminary data analysis based on three-dimensional acceleration and angular rate data could be established. A prototype system for the telemonitoring of Parkinson's disease patients was successfully developed. The developed sensor platform fully satisfies the needs of monitoring patients with Parkinson's disease and is comparable to other sensor platforms, although these sensor platforms have yet to be tested rigorously against each other. Suitable approaches to provide interoperability and usability were identified and realized, and remain to be tested in the field.
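The paper stops short of giving its analysis code, but a minimal sketch of the kind of preliminary processing it mentions, deriving a simple tremor-magnitude summary from three-axis accelerometer samples, could look like this; the sampling rate and band limits are illustrative assumptions, not the system's actual parameters.

```python
# Sketch: a crude tremor-magnitude summary from 3-axis accelerometer data.
# Sampling rate and band edges are illustrative assumptions, not the paper's.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 100.0  # assumed sampling rate in Hz

def tremor_rms(acc_xyz: np.ndarray) -> float:
    """RMS of the 3-8 Hz band of the acceleration magnitude.

    acc_xyz: array of shape (n_samples, 3), in m/s^2.
    """
    magnitude = np.linalg.norm(acc_xyz, axis=1)
    magnitude -= magnitude.mean()                     # remove gravity/DC offset
    b, a = butter(4, [3 / (FS / 2), 8 / (FS / 2)], btype="band")
    band = filtfilt(b, a, magnitude)                  # isolate tremor-range motion
    return float(np.sqrt(np.mean(band ** 2)))

# Usage: simulated 10 s recording of a 5 Hz oscillation plus sensor noise.
t = np.arange(0, 10, 1 / FS)
sim = np.stack([0.3 * np.sin(2 * np.pi * 5 * t),
                np.zeros_like(t),
                9.81 + 0.05 * np.random.randn(t.size)], axis=1)
print(f"tremor RMS: {tremor_rms(sim):.3f} m/s^2")
```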
Kano, Yoshinobu; Nguyen, Ngan; Saetre, Rune; Yoshida, Kazuhiro; Miyao, Yusuke; Tsuruoka, Yoshimasa; Matsubayashi, Yuichiro; Ananiadou, Sophia; Tsujii, Jun'ichi
2008-01-01
Recently, several text mining programs have reached a near-practical level of performance. Some systems are already being used by biologists and database curators. However, it has also been recognized that current Natural Language Processing (NLP) and Text Mining (TM) technology is not easy to deploy, since research groups tend to develop systems that cater specifically to their own requirements. One of the major reasons for the difficulty of deployment of NLP/TM technology is that re-usability and interoperability of software tools are typically not considered during development. While some effort has been invested in making interoperable NLP/TM toolkits, the developers of end-to-end systems still often struggle to reuse NLP/TM tools, and often opt to develop similar programs from scratch instead. This is particularly the case in BioNLP, since the requirements of biologists are so diverse that NLP tools have to be adapted and re-organized in a much more extensive manner than was originally expected. Although generic frameworks like UIMA (Unstructured Information Management Architecture) provide promising ways to solve this problem, the solution that they provide is only partial. In order for truly interoperable toolkits to become a reality, we also need sharable type systems and a developer-friendly environment for software integration that includes functionality for systematic comparisons of available tools, a simple I/O interface, and visualization tools. In this paper, we describe such an environment that was developed based on UIMA, and we show its feasibility through our experience in developing a protein-protein interaction (PPI) extraction system.
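UIMA itself is Java-based, but the interoperability idea the paper argues for, annotators that agree on a shared type system and compose into a pipeline, can be sketched in a few lines of Python; the types and the toy annotator are invented illustrations, not UIMA's API.

```python
# Conceptual sketch of a shared type system for interoperable NLP tools.
# Types and annotators are invented illustrations, not UIMA's actual API.
from dataclasses import dataclass, field

@dataclass
class Annotation:
    begin: int
    end: int
    type_name: str          # shared vocabulary, e.g. "Protein", "Interaction"

@dataclass
class Document:
    text: str
    annotations: list = field(default_factory=list)

def protein_tagger(doc: Document) -> Document:
    """Toy annotator: mark capitalized tokens as Protein mentions."""
    pos = 0
    for token in doc.text.split():
        start = doc.text.index(token, pos)
        if token[0].isupper():
            doc.annotations.append(Annotation(start, start + len(token), "Protein"))
        pos = start + len(token)
    return doc

def run_pipeline(doc: Document, *annotators) -> Document:
    for annotate in annotators:     # each tool reads/writes the shared types
        doc = annotate(doc)
    return doc

result = run_pipeline(Document("RAD51 binds BRCA2"), protein_tagger)
print([(a.type_name, result.text[a.begin:a.end]) for a in result.annotations])
```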
The Development of Clinical Document Standards for Semantic Interoperability in China
Yang, Peng; Pan, Feng; Wan, Yi; Tu, Haibo; Tang, Xuejun; Hu, Jianping
2011-01-01
Objectives This study is aimed at developing a set of data groups (DGs) to be employed as reusable building blocks for the construction of the eight most common clinical documents used in China's general hospitals in order to achieve their structural and semantic standardization. Methods The Diagnostics knowledge framework, the related approaches taken from the Health Level Seven (HL7), the Integrating the Healthcare Enterprise (IHE), and the Healthcare Information Technology Standards Panel (HITSP) and 1,487 original clinical records were considered together to form the DG architecture and data sets. The internal structure, content, and semantics of each DG were then defined by mapping each DG data set to a corresponding Clinical Document Architecture data element and matching each DG data set to the metadata in the Chinese National Health Data Dictionary. By using the DGs as reusable building blocks, standardized structures and semantics regarding the clinical documents for semantic interoperability were able to be constructed. Results Altogether, 5 header DGs, 48 section DGs, and 17 entry DGs were developed. Several issues regarding the DGs, including their internal structure, identifiers, data set names, definitions, length and format, data types, and value sets, were further defined. Standardized structures and semantics regarding the eight clinical documents were structured by the DGs. Conclusions This approach of constructing clinical document standards using DGs is a feasible standard-driven solution useful in preparing documents possessing semantic interoperability among the disparate information systems in China. These standards need to be validated and refined through further study. PMID:22259722
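As a hedged illustration of the data-group idea, composing clinical documents from reusable header, section, and entry blocks, consider the following sketch; the DG names are invented examples, not the study's actual 70 data groups.

```python
# Sketch: assembling a clinical document from reusable data groups (DGs).
# DG names here are invented examples, not the study's actual DGs.
from dataclasses import dataclass

@dataclass(frozen=True)
class DataGroup:
    name: str
    kind: str              # "header", "section", or "entry"
    fields: tuple          # data set elements bound to dictionary metadata

PATIENT_HEADER = DataGroup("PatientIdentification", "header",
                           ("name", "sex", "date_of_birth"))
CHIEF_COMPLAINT = DataGroup("ChiefComplaint", "section", ("complaint_text",))
VITAL_SIGNS = DataGroup("VitalSigns", "entry",
                        ("systolic_bp", "diastolic_bp", "heart_rate"))

def build_document(title: str, *groups: DataGroup) -> dict:
    """Compose a document template from DG building blocks."""
    return {"title": title,
            "structure": [(g.kind, g.name, list(g.fields)) for g in groups]}

admission_note = build_document("Admission Note",
                                PATIENT_HEADER, CHIEF_COMPLAINT, VITAL_SIGNS)
print(admission_note)
```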
Evaluation Methodology for UML and GML Application Schemas Quality
NASA Astrophysics Data System (ADS)
Chojka, Agnieszka
2014-05-01
INSPIRE Directive implementation in Poland has caused the significant increase of interest in making spatial data and services available, particularly among public administration and private institutions. This entailed a series of initiatives that aim to harmonise different spatial data sets, so to ensure their internal logical and semantic coherence. Harmonisation lets to reach the interoperability of spatial databases, then among other things enables joining them together. The process of harmonisation requires either working out new data structures or adjusting existing data structures of spatial databases to INSPIRE guidelines and recommendations. Data structures are described with the use of UML and GML application schemas. Although working out accurate and correct application schemas isn't an easy task. There should be considered many issues, for instance recommendations of ISO 19100 series of Geographic Information Standards, appropriate regulations for given problem or topic, production opportunities and limitations (software, tools). In addition, GML application schema is deeply connected with UML application schema, it should be its translation. Not everything that can be expressed in UML, though can be directly expressed in GML, and this can have significant influence on the spatial data sets interoperability, and thereby the ability to valid data exchange. For these reasons, the capability to examine and estimate UML and GML application schemas quality, therein also the capability to explore their entropy, would be very important. The principal subject of this research is to propose an evaluation methodology for UML and GML application schemas quality prepared in the Head Office of Geodesy and Cartography in Poland within the INSPIRE Directive implementation works.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Womeldorff, Geoffrey Alan; Payne, Joshua Estes; Bergen, Benjamin Karl
These are slides for a presentation on PARTISN Research and FleCSI Updates. The following topics are covered: SNAP vs PARTISN, Background Research, Production Code (structural design and changes, kernel design and implementation, lessons learned), NuT IMC Proxy, FleCSI Update (design and lessons learned). It can all be summarized in the following manner: Kokkos was shown to be effective in FY15 in implementing a C++ version of SNAP's kernel. This same methodology was applied to a production IC code, PARTISN. This was a much more complex endeavour than in FY15 for many reasons: a C++ kernel embedded in Fortran, overloading Fortran memory allocations, general language interoperability, and a fully fleshed out production code versus a simplified proxy code. Lessons learned are legion. In no particular order: Interoperability between Fortran and C++ was really not that hard, and a useful engineering effort. Tracking down all necessary memory allocations for a kernel in a production code is pretty hard. Modifying a production code to work for more than a handful of use cases is also pretty hard. Figuring out the toolchain that will allow a successful implementation of design decisions is quite hard, if making use of "bleeding edge" design choices. In terms of performance, production code concurrency architecture can be a virtual showstopper; being too complex to easily rewrite and test in a short period of time, or depending on tool features which do not exist yet. Ultimately, while the tools used in this work were not successful in speeding up the production code, they helped to identify how work would be done, and provide requirements to tools.
The Next Stage: Moving from Isolated Digital Collections to Interoperable Digital Libraries.
ERIC Educational Resources Information Center
Besser, Howard
2002-01-01
Presents a conceptual framework for digital library development and discusses how to move from isolated digital collections to interoperable digital libraries. Topics include a history of digital libraries; user-centered architecture; stages of technological development; standards, including metadata; and best practices. (Author/LRW)
Proceedings of the ITS Standards Program Review and Interoperability Workshop
DOT National Transportation Integrated Search
1997-12-17
An ITS Standards Program Review and Interoperability Workshop was held on Dec. 17-18, 1997 in Arlington, Va. It was sponsored by the U.S. DOT, ITS America, George Mason University (GMU) and the University of Michigan. The purpose was to review the US...
75 FR 66752 - Smart Grid Interoperability Standards; Notice of Technical Conference
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-29
... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. RM11-2-000] Smart Grid... adoption of Smart Grid Interoperability Standards (Standards) in their States. On October 6, 2010, the....m. Eastern time in conjunction with the NARUC/FERC Collaborative on Smart Response (Collaborative...
76 FR 4102 - Smart Grid Interoperability Standards; Supplemental Notice of Technical Conference
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-24
... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. RM11-2-000] Smart Grid... Federal Energy Regulatory Commission announced that a Technical Conference on Smart Grid Interoperability... National Institute of Standards and Technology are ready for Commission consideration in a rulemaking...
IHE cross-enterprise document sharing for imaging: interoperability testing software.
Noumeir, Rita; Renaud, Bérubé
2010-09-21
With the deployments of Electronic Health Records (EHR), interoperability testing in healthcare is becoming crucial. EHR enables access to prior diagnostic information in order to assist in health decisions. It is a virtual system that results from the cooperation of several heterogeneous distributed systems. Interoperability between peers is therefore essential. Achieving interoperability requires various types of testing. Implementations need to be tested using software that simulates communication partners, and that provides test data and test plans. In this paper we describe a software that is used to test systems that are involved in sharing medical images within the EHR. Our software is used as part of the Integrating the Healthcare Enterprise (IHE) testing process to test the Cross Enterprise Document Sharing for imaging (XDS-I) integration profile. We describe its architecture and functionalities; we also expose the challenges encountered and discuss the elected design solutions. EHR is being deployed in several countries. The EHR infrastructure will be continuously evolving to embrace advances in the information technology domain. Our software is built on a web framework to allow for an easy evolution with web technology. The testing software is publicly available; it can be used by system implementers to test their implementations. It can also be used by site integrators to verify and test the interoperability of systems, or by developers to understand specifications ambiguities, or to resolve implementations difficulties.
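The XDS-I transactions themselves are too involved to reproduce here, but the core testing pattern the paper describes, a tool that simulates a communication partner and checks what the system under test sends, can be sketched as follows; the endpoint path and the expected fields are invented stand-ins, not the XDS-I profile.

```python
# Sketch of interoperability testing by simulating a communication partner.
# The endpoint path and required fields are invented stand-ins, not XDS-I.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
import requests

class StubRepository(BaseHTTPRequestHandler):
    """Simulated partner: accepts a submission and records it for checking."""
    received = []

    def do_POST(self):
        body = self.rfile.read(int(self.headers["Content-Length"]))
        StubRepository.received.append(json.loads(body))
        self.send_response(200)
        self.end_headers()

    def log_message(self, *args):   # keep test output quiet
        pass

server = HTTPServer(("127.0.0.1", 0), StubRepository)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# The system under test would normally send this; here we stand in for it.
requests.post(f"http://127.0.0.1:{port}/submit",
              json={"documentId": "doc-1", "patientId": "p-42"})

submission = StubRepository.received[0]
assert {"documentId", "patientId"} <= submission.keys(), "missing metadata"
print("submission conformant:", submission)
server.shutdown()
```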
Interoperable web applications for sharing data and products of the International DORIS Service
NASA Astrophysics Data System (ADS)
Soudarin, L.; Ferrage, P.
2017-12-01
The International DORIS Service (IDS) was created in 2003 under the umbrella of the International Association of Geodesy (IAG) to foster scientific research related to the French satellite tracking system DORIS and to deliver scientific products, mostly related to the International Earth rotation and Reference systems Service (IERS). Since its start, the organization has continuously evolved, leading to additional and improved operational products from an expanded set of DORIS Analysis Centers. In addition, IDS has developed services for sharing data and products with users. Metadata and interoperable web applications are proposed to explore, visualize and download key products such as the position time series of the geodetic points materialized at the ground tracking stations. The Global Geodetic Observing System (GGOS) encourages the IAG Services to develop such interoperable facilities on their websites. The objective for GGOS is to set up an interoperable portal through which the data and products produced by the IAG Services can be served to the user community. We present the web applications proposed by IDS to visualize time series of geodetic observables and to get information about the tracking ground stations and the tracked satellites. We discuss the future plans for IDS to meet the recommendations of GGOS. The presentation also addresses the need for the IAG Services to adopt a common metadata thesaurus to describe data and products, and interoperability standards to share them.
NASA Astrophysics Data System (ADS)
Mueller, Wolfgang; Mueller, Henning; Marchand-Maillet, Stephane; Pun, Thierry; Squire, David M.; Pecenovic, Zoran; Giess, Christoph; de Vries, Arjen P.
2000-10-01
While in the area of relational databases interoperability is ensured by common communication protocols (e.g. ODBC/JDBC using SQL), Content Based Image Retrieval Systems (CBIRS) and other multimedia retrieval systems lack both a common query language and a common communication protocol. Besides its obvious short term convenience, interoperability of systems is crucial for the exchange and analysis of user data. In this paper, we present and describe an extensible XML-based query markup language, called MRML (Multimedia Retrieval Markup Language). MRML is primarily designed to ensure interoperability between different content-based multimedia retrieval systems. Further, MRML allows researchers to preserve their freedom in extending their systems as needed. MRML encapsulates multimedia queries in a way that enables multimedia (MM) query languages, MM content descriptions, MM query engines, and MM user interfaces to grow independently from each other, reaching a maximum of interoperability while ensuring a maximum of freedom for the developer. To benefit from this, only a few simple design principles have to be respected when extending MRML for one's private needs. The design of extensions within the MRML framework is described in detail in the paper. MRML has been implemented and tested for the CBIRS Viper, using the user interface Snake Charmer. Both are part of the GNU project and can be downloaded at our site.
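MRML messages are XML, so a query can be assembled with any XML toolkit; the sketch below shows the general shape of a query-by-example request, with element and attribute names chosen for illustration rather than quoted from the MRML specification.

```python
# Sketch: building an MRML-style XML query-by-example message.
# Element/attribute names are illustrative, not quoted from the MRML spec.
import xml.etree.ElementTree as ET

def build_query(session_id: str, positive_examples: list, n_results: int) -> str:
    mrml = ET.Element("mrml", attrib={"session-id": session_id})
    query = ET.SubElement(mrml, "query-step",
                          attrib={"result-size": str(n_results)})
    examples = ET.SubElement(query, "user-relevance-element-list")
    for image_url in positive_examples:
        ET.SubElement(examples, "user-relevance-element",
                      attrib={"image-location": image_url,
                              "user-relevance": "1"})   # positive feedback
    return ET.tostring(mrml, encoding="unicode")

print(build_query("s-17", ["http://example.org/img/001.jpg"], 20))
```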
Standards to support information systems integration in anatomic pathology.
Daniel, Christel; García Rojo, Marcial; Bourquard, Karima; Henin, Dominique; Schrader, Thomas; Della Mea, Vincenzo; Gilbertson, John; Beckwith, Bruce A
2009-11-01
Integrating anatomic pathology information, text and images, into electronic health care records is a key challenge for enhancing clinical information exchange between anatomic pathologists and clinicians. The aim of the Integrating the Healthcare Enterprise (IHE) international initiative is precisely to ensure interoperability of clinical information systems by using existing widespread industry standards such as Digital Imaging and Communication in Medicine (DICOM) and Health Level Seven (HL7). To define standard-based informatics transactions to integrate anatomic pathology information into the Healthcare Enterprise, we used the methodology of the IHE initiative. Working groups from IHE, HL7, and DICOM, with special interest in anatomic pathology, defined consensual technical solutions to provide end-users with improved access to consistent information across multiple information systems. The IHE anatomic pathology technical framework describes a first integration profile, "Anatomic Pathology Workflow," dedicated to the diagnostic process, including basic image acquisition and reporting solutions. This integration profile relies on 10 transactions based on HL7 or DICOM standards. A common specimen model was defined to consistently identify and describe specimens in both HL7 and DICOM transactions. The IHE anatomic pathology working group has defined standard-based informatics transactions to support the basic diagnostic workflow in anatomic pathology laboratories. In further stages, the technical framework will be completed to manage whole-slide images and semantically rich structured reports in the diagnostic workflow and to integrate systems used for patient care with those used for research activities (such as tissue bank databases or tissue microarrayers).
2010-12-01
This involves zeroing and recreating the interoperability arrays and other variables used in the simulation. Since the constants do not change from run to run... Using this algorithm, the process of encrypting/decrypting data requires very little computation, and the generation of the random pads can be
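The fragment above appears to describe a precomputed-pad scheme; under that reading, a minimal XOR-pad sketch shows why encryption and decryption are computationally trivial once a random pad exists. This is our illustrative reconstruction, not code from the report.

```python
# Minimal XOR-pad sketch: once a random pad exists, encrypting and
# decrypting are a single cheap XOR pass. Illustrative reconstruction only.
import secrets

def generate_pad(n_bytes: int) -> bytes:
    """Random pad; in a pad scheme this may be generated ahead of time."""
    return secrets.token_bytes(n_bytes)

def xor_pad(data: bytes, pad: bytes) -> bytes:
    assert len(pad) >= len(data), "pad must cover the message"
    return bytes(d ^ p for d, p in zip(data, pad))

message = b"simulation state vector"
pad = generate_pad(len(message))
ciphertext = xor_pad(message, pad)
assert xor_pad(ciphertext, pad) == message   # XOR is its own inverse
print(ciphertext.hex())
```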
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-05
... FEDERAL COMMUNICATIONS COMMISSION 47 CFR Part 27 [WT Docket Nos. 12-69, 12-332; FCC 13-136] Promoting Interoperability in the 700 MHz Commercial Spectrum; Requests for Waiver and Extension of Lower 700 MHz Band Interim Construction Benchmark Deadlines AGENCY: Federal Communications Commission...
Generic Educational Knowledge Representation for Adaptive and Cognitive Systems
ERIC Educational Resources Information Center
Caravantes, Arturo; Galan, Ramon
2011-01-01
The interoperability of educational systems, encouraged by the development of specifications, standards and tools related to the Semantic Web is limited to the exchange of information in domain and student models. High system interoperability requires that a common framework be defined that represents the functional essence of educational systems.…
Global Interoperability of Broadband Networks (GIBN): Project Overview
NASA Technical Reports Server (NTRS)
DePaula, Ramon P.
1998-01-01
Various issues associated with the Global Interoperability of Broadband Networks (GIBN) are presented in viewgraph form. Specific topics include GIBN principles, objectives and goals, and background. GIBN/NASA status, the Transpacific High Definition Video experiment, GIBN experiment selection criteria, satellite industry involvement, and current experiments associated with GIBN are also discussed.
Exploring Interoperability as a Multidimensional Challenge for Effective Emergency Response
ERIC Educational Resources Information Center
Santisteban, Hiram
2010-01-01
Purpose. The purpose of this research was to further an understanding of how the federal government is addressing the challenges of interoperability for emergency response or crisis management (FEMA, 2009) by informing the development of standards through the review of current congressional law, commissions, studies, executive orders, and…
An Access Control and Trust Management Framework for Loosely-Coupled Multidomain Environments
ERIC Educational Resources Information Center
Zhang, Yue
2010-01-01
Multidomain environments where multiple organizations interoperate with each other are becoming a reality as can be seen in emerging Internet-based enterprise applications. Access control to ensure secure interoperation in such an environment is a crucial challenge. A multidomain environment can be categorized as "tightly-coupled" and…
OpenICE medical device interoperability platform overview and requirement analysis.
Arney, David; Plourde, Jeffrey; Goldman, Julian M
2018-02-23
We give an overview of OpenICE, an open source implementation of the ASTM standard F2761 for the Integrated Clinical Environment (ICE) that leverages medical device interoperability, together with an analysis of the clinical and non-functional requirements and community process that inspired its design.
47 CFR 90.525 - Administration of interoperability channels.
Code of Federal Regulations, 2010 CFR
2010-10-01
... RADIO SERVICES PRIVATE LAND MOBILE RADIO SERVICES Regulations Governing the Licensing and Use of... meeting the requirements of § 90.523 may operate mobile or portable units on the Interoperability channels... Commission provided it holds a part 90 license. All persons operating mobile or portable units under this...
2006-09-30
coastal phenomena. OBJECTIVES SURA is creating a SCOOP “Grid” that extends the interoperability enabled by the World Wide Web. The coastal ... community faces special challenges with respect to achieving a level of interoperability that can leverage emerging Grid technologies. With that in mind
Waveform Diversity and Design for Interoperating Radar Systems
2013-01-01
Università di Pisa, Dipartimento di Ingegneria dell'Informazione (Elettronica, Informatica, Telecomunicazioni), Via Girolamo Caruso 16, 56122 Pisa, Italy.
RuleML-Based Learning Object Interoperability on the Semantic Web
ERIC Educational Resources Information Center
Biletskiy, Yevgen; Boley, Harold; Ranganathan, Girish R.
2008-01-01
Purpose: The present paper aims to describe an approach for building the Semantic Web rules for interoperation between heterogeneous learning objects, namely course outlines from different universities, and one of the rule uses: identifying (in)compatibilities between course descriptions. Design/methodology/approach: As proof of concept, a rule…
75 FR 81605 - Smart Grid Interoperability Standards; Notice of Technical Conference
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-28
... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. RM11-2-000] Smart Grid Interoperability Standards; Notice of Technical Conference December 21, 2010. Take notice that the Federal Energy... National Institute of Standards and Technology and included in this proceeding are ready for Commission...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-30
...] Emission Mask Requirements for Digital Technologies on 800 MHz NPSPAC Channels; Analog FM Capability on Mutual Aid and Interoperability Channels AGENCY: Federal Communications Commission. ACTION: Proposed rule... interoperability channels. These proposals could help safeguard public safety licensees in the NPSPAC band from...
Content analysis of physical examination templates in electronic health records using SNOMED CT.
Gøeg, Kirstine Rosenbeck; Chen, Rong; Højen, Anne Randorff; Elberg, Pia
2014-10-01
Most electronic health record (EHR) systems are built on proprietary information models and terminology, which makes achieving semantic interoperability a challenge. Solving interoperability problems requires well-defined standards. In contrast, the need to support clinical work practice requires local customization of EHR systems. Consequently, contrasting goals may be evident in EHR template design because customization means that local EHR organizations can define their own templates, whereas standardization implies consensus at some level. To explore the complexity of balancing these two goals, this study analyzes the differences and similarities between templates in use today. A similarity analysis was developed on the basis of SNOMED CT. The analysis was performed on four physical examination templates from Denmark and Sweden. The semantic relationships in SNOMED CT were used to quantify similarities and differences. Moreover, the analysis used these identified similarities to investigate the common content of a physical examination template. The analysis showed that there were both similarities and differences in physical examination templates, and the size of the templates varied from 18 to 49 fields. In the SNOMED CT analysis, exact matches and terminology similarities were represented in all template pairs. The number of exact matches ranged from 7 to 24, and the number of unrelated fields varied widely, from 1/18 to 22/35. Cross-country comparisons tended to have more unrelated content than within-country comparisons. On the basis of identified similarities, it was possible to define the common content of a physical examination. Nevertheless, a complete view of the physical examination required the inclusion of both exact matches and terminology similarities. This study revealed that a core set of items representing the physical examination templates can be generated when the analysis takes into account not only exact matches but also terminology similarities. This core set of items could be a starting point for standardization and semantic interoperability. However, both unmatched terms and terminology-matched terms pose a challenge for standardization. Future work will include using local templates as a point of departure in standardization to see if local requirements can be maintained in a standardized framework. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
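A simplified version of the comparison the study describes, counting exact matches between two templates' coded fields and then matches through the terminology hierarchy, can be sketched as below; the toy concept codes and is-a links are invented, not SNOMED CT content.

```python
# Simplified sketch of the template comparison: exact matches first,
# then matches through terminology hierarchy. Codes/links are invented toys.
def parents(code: str, is_a: dict) -> set:
    """All ancestors of a concept through toy is-a links."""
    seen, stack = set(), [code]
    while stack:
        for p in is_a.get(stack.pop(), ()):
            if p not in seen:
                seen.add(p)
                stack.append(p)
    return seen

def compare_templates(t1: set, t2: set, is_a: dict):
    exact = t1 & t2
    # Fields of t1 with a hierarchically related counterpart in t2.
    related = {a for a in t1 - exact
               for b in t2 - exact
               if b in parents(a, is_a) or a in parents(b, is_a)}
    unrelated = (t1 - exact) - related
    return exact, related, unrelated

IS_A = {"heart auscultation": ["cardiac exam"],     # invented toy hierarchy
        "cardiac exam": ["physical exam"]}
dk = {"cardiac exam", "skin inspection"}
se = {"heart auscultation", "skin inspection"}
print(compare_templates(dk, se, IS_A))
```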
Terminology supported archiving and publication of environmental science data in PANGAEA.
Diepenbroek, Michael; Schindler, Uwe; Huber, Robert; Pesant, Stéphane; Stocker, Markus; Felden, Janine; Buss, Melanie; Weinrebe, Matthias
2017-11-10
Exemplified on the information system PANGAEA, we describe the application of terminologies for archiving and publishing environmental science data. A terminology catalogue (TC) was embedded into the system, with interfaces that allow terminologies to be replicated and manually curated. For data ingest and archiving, we show how the TC can improve the structuring and harmonization of lineage and content descriptions of data sets. Key to this is the conceptualization of measurement and observation types (parameters) and methods, for which we have implemented a basic syntax and rule set. For data access and dissemination, we have improved the findability of data through enrichment of metadata with TC terms. Semantic annotations, e.g. adding term concepts (including synonyms and hierarchies) or mapped terms from different terminologies, facilitate comprehensive data retrievals. The PANGAEA thesaurus of classifying terms, which is part of the TC, is used as an umbrella vocabulary that links the various domains and allows drill-downs and side drills with various facets. Furthermore, we describe how TC terms can be linked to nominal data values, which improves data harmonization and facilitates the structural transformation of heterogeneous data sets to a common schema. Technical developments are complemented by work on the metadata content. Over the last 20 years, more than 100 new parameters have been defined per week on average. Recently, PANGAEA has increasingly been submitting new terms to various terminology services. Matching terms from terminology services with our parameter or method strings is supported programmatically; however, the process ultimately needs manual input by domain experts. The quality of terminology services is an additional limiting factor, and varies with respect to content, editorial practice, interoperability, and sustainability. Good-quality terminology services are the building blocks for the conceptualization of parameters and methods. In our view, they are essential for data interoperability and arguably the most difficult hurdle for data integration. In summary, the application of terminologies has a mutually positive effect for terminology services and for information systems such as PANGAEA: on both sides it improves content, reliability, and interoperability. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
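A minimal sketch of the programmatic term matching mentioned above: local parameter strings are normalized and compared against terminology-service labels and synonyms. The term records and normalization rules are invented for illustration; in practice the final decision rests with a domain expert.

```python
import re

# Hypothetical replica of terminology-service records: label plus synonyms.
TERMS = [
    {"id": "tc:0001", "label": "water temperature",
     "synonyms": ["temp, water", "sea water temperature"]},
    {"id": "tc:0002", "label": "salinity", "synonyms": ["practical salinity"]},
]

def normalise(s: str) -> str:
    """Lowercase, strip punctuation, and collapse whitespace before comparison."""
    s = re.sub(r"[^a-z0-9 ]+", " ", s.lower())
    return re.sub(r"\s+", " ", s).strip()

def candidate_terms(parameter: str) -> list[str]:
    """Return IDs of terms whose label or any synonym matches the local string."""
    p = normalise(parameter)
    hits = []
    for term in TERMS:
        names = [term["label"], *term["synonyms"]]
        if any(normalise(n) == p for n in names):
            hits.append(term["id"])
    return hits

print(candidate_terms("Sea water temperature"))  # ['tc:0001']
```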
NASA Astrophysics Data System (ADS)
Moroni, D. F.; Armstrong, E. M.; Tauer, E.; Hausman, J.; Huang, T.; Thompson, C. K.; Chung, N.
2013-12-01
The Physical Oceanographic Distributed Active Archive Center (PO.DAAC) is one of 12 data centers sponsored by NASA's Earth Science Data and Information System (ESDIS) project. The PO.DAAC is tasked with archival and distribution of NASA Earth science missions specific to physical oceanography, many of which have interdisciplinary applications for weather forecasting/monitoring, ocean biology, ocean modeling, and climate studies. PO.DAAC has a 20-year history of cross-project and international collaborations with partners in Europe, Japan, Australia, and the UK. Domestically, the PO.DAAC has successfully established lasting partners with non-NASA institutions and projects including the National Oceanic and Atmospheric Administration (NOAA), United States Navy, Remote Sensing Systems, and Unidata. A key component of these partnerships is PO.DAAC's direct involvement with international working groups and science teams, such as the Group for High Resolution Sea Surface Temperature (GHRSST), International Ocean Vector Winds Science Team (IOVWST), Ocean Surface Topography Science Team (OSTST), and the Committee on Earth Observing Satellites (CEOS). To help bolster new and existing collaborations, the PO.DAAC has established a standardized approach to its internal Data Management and Archiving System (DMAS), utilizing a Data Dictionary to provide the baseline standard for entry and capture of dataset and granule metadata. Furthermore, the PO.DAAC has established an end-to-end Dataset Lifecycle Policy, built upon both internal and external recommendations of best practices toward data stewardship. Together, DMAS, the Data Dictionary, and the Dataset Lifecycle Policy provide the infrastructure to enable standardized data and metadata to be fully ingested and harvested to facilitate interoperability and compatibility across data access protocols, tools, and services. The Dataset Lifecycle Policy provides the checks and balances to help ensure all incoming HDF and netCDF-based datasets meet minimum compliance requirements with the Lawrence Livermore National Laboratory's actively maintained Climate and Forecast (CF) conventions with additional goals toward metadata standards provided by the Attribute Convention for Dataset Discovery (ACDD), the International Organization for Standardization (ISO) 19100-series, and the Federal Geographic Data Committee (FGDC). By default, DMAS ensures all datasets are compliant with NASA's Global Change Master Directory (GCMD) and NASA's Reverb data discovery clearinghouse (also known as ECHO). For data access, PO.DAAC offers several widely-used technologies, including File Transfer Protocol (FTP), Open-source Project for a Network Data Access Protocol (OPeNDAP), and Thematic Realtime Environmental Distributed Data Services (THREDDS). These access technologies are available directly to users or through PO.DAAC's web interfaces, specifically the High-level Tool for Interactive Data Extraction (HiTIDE), Live Access Server (LAS), and PO.DAAC's set of search, image, and Consolidated Web Services (CWS). Lastly, PO.DAAC's newly introduced, standards-based CWS provide singular endpoints for search, imaging, and extraction capabilities, respectively, across L2/L3/L4 datasets. Altogether, these tools, services and policies serve to provide flexible, interoperable functionality for both users and data providers.
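As a concrete illustration of the kind of minimum-compliance check such a lifecycle policy can automate, the sketch below verifies a handful of CF/ACDD-style attributes in a netCDF file using the netCDF4 library. The attribute lists are a small, illustrative subset chosen for this example, not PO.DAAC's actual checklist; full checking is normally delegated to a dedicated CF compliance checker.

```python
from netCDF4 import Dataset

# Illustrative subsets only; real CF/ACDD checklists are far longer.
REQUIRED_GLOBAL_ATTRS = ["Conventions", "title", "summary"]  # ACDD-style
REQUIRED_VAR_ATTRS = ["units", "long_name"]                  # CF-style

def check_minimum_compliance(path: str) -> list[str]:
    """Return a list of human-readable compliance problems (empty if none)."""
    problems = []
    with Dataset(path) as nc:
        for attr in REQUIRED_GLOBAL_ATTRS:
            if attr not in nc.ncattrs():
                problems.append(f"missing global attribute: {attr}")
        for name, var in nc.variables.items():
            for attr in REQUIRED_VAR_ATTRS:
                if attr not in var.ncattrs():
                    problems.append(f"variable {name}: missing {attr}")
    return problems

# Usage (placeholder file name): print(check_minimum_compliance("granule.nc"))
```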
Defining the Collision Avoidance Region for DAA Systems
NASA Technical Reports Server (NTRS)
Thipphavong, David; Cone, Andrew; Park, Chunki; Lee, Seung Man; Santiago, Confesor
2016-01-01
Unmanned aircraft systems (UAS) will be required to equip with a detect-and-avoid (DAA) system in order to satisfy the federal aviation regulations to maintain well clear of other aircraft, some of which may be equipped with a Traffic Collision Avoidance System (TCAS) to mitigate the possibility of mid-air collisions. As such, the minimum operational performance standards (MOPS) for UAS DAA systems are being designed with TCAS interoperability in mind by a group of industry, government, and academic institutions named RTCA Special Committee-228 (SC-228). This document will discuss the development of the spatial-temporal volume known as the collision avoidance region, in which the DAA system is not allowed to provide vertical guidance to maintain or regain DAA well clear that could conflict with resolution advisories (RAs) issued by the intruder aircraft's TCAS system. Three collision avoidance region definition candidates were developed based on the existing TCAS RA and DAA alerting definitions. They were evaluated against each other in terms of their interoperability with TCAS RAs and DAA alerts in an unmitigated factorial encounter analysis of 1.3 million simulated pairs.
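To make the idea of a spatial-temporal region concrete, here is a toy membership test based on a time-to-closest-approach (tau) threshold and a vertical separation threshold. The simple tau formula and the threshold values are illustrative placeholders, not any of the three candidate definitions evaluated in the paper.

```python
# Hypothetical thresholds, chosen only for this sketch.
TAU_THRESHOLD_S = 35.0         # RA-like time threshold, seconds
VERTICAL_THRESHOLD_FT = 800.0  # vertical separation threshold, feet

def in_collision_avoidance_region(range_ft: float,
                                  closure_rate_fps: float,
                                  rel_altitude_ft: float) -> bool:
    """True if DAA vertical guidance would be suppressed for this intruder."""
    if closure_rate_fps <= 0:          # diverging traffic: no time-based threat
        return False
    tau = range_ft / closure_rate_fps  # simple range tau, seconds
    return tau <= TAU_THRESHOLD_S and abs(rel_altitude_ft) <= VERTICAL_THRESHOLD_FT

# An intruder 6000 ft away, closing at 300 ft/s, 400 ft below: tau = 20 s.
print(in_collision_avoidance_region(6000.0, 300.0, -400.0))  # True
```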
PIML: the Pathogen Information Markup Language.
He, Yongqun; Vines, Richard R; Wattam, Alice R; Abramochkin, Georgiy V; Dickerman, Allan W; Eckart, J Dana; Sobral, Bruno W S
2005-01-01
A vast amount of information about human, animal and plant pathogens has been acquired, stored and displayed in varied formats through different resources, both electronically and otherwise. However, there is no community standard format for organizing this information, nor agreement on machine-readable format(s) for data exchange, which hampers interoperation efforts across information systems harboring such infectious disease data. The Pathogen Information Markup Language (PIML) is a free, open, XML-based format for representing pathogen information. XSLT-based visual presentations of valid PIML documents were developed and can be accessed through the PathInfo website or as part of the interoperable web services federation known as ToolBus/PathPort. Currently, detailed PIML documents are available for 21 pathogens deemed of high priority with regard to public health and national biological defense. A dynamic query system allows simple queries as well as comparisons among these pathogens. Continuing efforts are being made to encourage other groups to support PIML and to develop more PIML documents. All PIML-related information is accessible from http://www.vbi.vt.edu/pathport/pathinfo/
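For readers unfamiliar with XML-based exchange formats, the toy snippet below builds a small pathogen record with Python's standard library. The element names are invented for this sketch; the real PIML schema is defined by the project cited above.

```python
import xml.etree.ElementTree as ET

# Invented, PIML-style element names; not the actual PIML schema.
pathogen = ET.Element("pathogen", name="Bacillus anthracis")
ET.SubElement(pathogen, "hostRange").text = "human, cattle, sheep"
ET.SubElement(pathogen, "transmission").text = "inhalation of spores"

# Serialize the record to an XML string for exchange.
print(ET.tostring(pathogen, encoding="unicode"))
```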
Military Interoperable Digital Hospital Testbed (MIDHT)
2013-10-01
Selected highlights completed by Northrop Grumman during the year. Cycle 4 development:
- Increased the max_allowed_packet size in MySQL … deployment with the Java install that is required by CONNECT v3.3.1.3.
- Updated the MIDHT code base to work with the CONNECT v3.3.1.3 Core Libraries … provided TATRC the CONNECTUniversalClientGUI binaries for use with CONNECT v3.3.1.3.
- Created and deployed a common Java library for the CONNECT
Force XXI: Directions in Digitization - Symposium Report.
1996-08-21
Report-documentation and front-matter fragments only: authors include … J. Kwinn, Jr., LTC David W. Hutchison, and COL James L. Kays (performing organization: USMA Operations Research Center, West Point); the acknowledgements credit COL James L. Kays for providing the vision; appendices cover Organization, Interoperability, and Readiness, with annexes containing transcripts from the symposium.
2012-04-01
Acronym-glossary and timeline fragments only: entries for Systems Concepts and Integration, SET (Sensors and Electronics Technology), SISO (Simulation Interoperability Standards Organization), and SIW (Simulation Interoperability Workshop); the timeline records SISO/IEEE standardization milestones from 2006 to 2008, including a meeting held in conjunction with the 2006 Fall SIW, SISO Standards Activity Committee approval to begin IEEE balloting (September-October 2006), a publication in June 2008, a meeting in Edinburgh, UK in conjunction with the 2008 Euro-SIW, and work on the Composite Model in Laurel, MD, US (September-December 2008).
PharmML in Action: an Interoperable Language for Modeling and Simulation
Bizzotto, R; Smith, G; Yvon, F; Kristensen, NR; Swat, MJ
2017-01-01
PharmML is an XML-based exchange format created with a focus on nonlinear mixed-effect (NLME) models used in pharmacometrics, but providing a very general framework that also allows describing mathematical and statistical models such as single-subject or nonlinear and multivariate regression models. This tutorial provides an overview of the structure of this language, brief suggestions on how to work with it, and use cases demonstrating its power and flexibility. PMID:28575551
Marcos, Mar; Maldonado, Jose A; Martínez-Salvador, Begoña; Boscá, Diego; Robles, Montserrat
2013-08-01
Clinical decision-support systems (CDSSs) comprise systems as diverse as sophisticated platforms to store and manage clinical data, tools to alert clinicians to problematic situations, and decision-making tools to assist clinicians. Irrespective of the kind of decision-support task, CDSSs should be smoothly integrated within the clinical information system, interacting with other components, in particular with the electronic health record (EHR). However, despite decades of development, most CDSSs lack interoperability features. We deal with the interoperability problem of CDSSs and EHRs by exploiting the dual-model methodology. This methodology distinguishes a reference model and archetypes. A reference model is a stable and small object-oriented model that describes the generic properties of health record information. For their part, archetypes are reusable and domain-specific definitions of clinical concepts in the form of structured and constrained combinations of the entities of the reference model. We rely on archetypes to make the CDSS compatible with EHRs from different institutions. Concretely, we use archetypes to model the clinical concepts that the CDSS requires, in conjunction with a series of knowledge-intensive mappings relating the archetypes to the data sources (EHR and/or other archetypes) they depend on. We introduce a comprehensive approach, including a set of tools as well as methodological guidelines, to deal with the interoperability of CDSSs and EHRs based on archetypes. Archetypes are used to build a conceptual layer, in the manner of a virtual health record (VHR), over the EHR whose contents need to be integrated and used in the CDSS, associating them with structural and terminology-based semantics. Subsequently, the archetypes are mapped to the EHR by means of an expressive mapping language and specific-purpose tools. We also describe a case study in which the tools and methodology have been employed in a CDSS to support patient recruitment for a clinical trial on colorectal cancer screening. The utilisation of archetypes has not only proved satisfactory for achieving interoperability between CDSSs and EHRs but also offers various advantages, in particular from a data model perspective. First, the VHR/data models we work with are at a high level of abstraction and can incorporate semantic descriptions. Second, archetypes can potentially deal with different EHR architectures, owing to their deliberate independence from the reference model. Third, the archetype instances we obtain are valid instances of the underlying reference model, which would enable, for example, feeding the EHR back with data derived by abstraction mechanisms. Lastly, the medical and technical validity of archetype models would be assured, since in principle clinicians should be the main actors in their development. Copyright © 2013 Elsevier Inc. All rights reserved.
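The mapping idea can be illustrated schematically: a declarative table relates archetype element paths to local EHR fields, from which a virtual-record instance is assembled. All names and the record layout below are invented for this sketch; the actual work uses a dedicated mapping language and tooling.

```python
# Hypothetical mapping: archetype element path -> local EHR column name.
ARCHETYPE_TO_EHR = {
    "blood_pressure/systolic": "bp_sys_mmhg",
    "blood_pressure/diastolic": "bp_dia_mmhg",
}

def to_virtual_record(ehr_row: dict) -> dict:
    """Build a VHR-style archetype instance from one local EHR row,
    keeping only the fields the mapping knows about."""
    return {path: ehr_row[col]
            for path, col in ARCHETYPE_TO_EHR.items() if col in ehr_row}

local_row = {"bp_sys_mmhg": 128, "bp_dia_mmhg": 82, "patient_id": "p-17"}
print(to_virtual_record(local_row))
# {'blood_pressure/systolic': 128, 'blood_pressure/diastolic': 82}
```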
Metadata mapping and reuse in caBIG™
Kunz, Isaac; Lin, Ming-Chin; Frey, Lewis
2009-01-01
Background This paper proposes that interoperability across biomedical databases can be improved by utilizing a repository of Common Data Elements (CDEs), UML model class-attributes, and simple lexical algorithms to facilitate the building of domain models. This is examined in the context of an existing system, the National Cancer Institute (NCI)'s cancer Biomedical Informatics Grid (caBIG™). The goal is to demonstrate the deployment of open source tools that can be used to effectively map models and enable the reuse of existing information objects and CDEs in the development of new models for translational research applications. This effort is intended to help developers reuse appropriate CDEs to enable interoperability of their systems when developing within the caBIG™ framework or other frameworks that use metadata repositories. Results The Dice (di-gram) and Dynamic algorithms are compared, and both show similar performance in matching UML model class-attributes to CDE class object-property pairs. With the algorithms used, the baselines for automatically finding matches are reasonable for the data models examined, which suggests that automatic mapping of UML models and CDEs is feasible within the caBIG™ framework and potentially within any framework that uses a metadata repository. Conclusion This work opens up the possibility of using mapping algorithms to reduce the cost and time required to map local data models to a reference data model such as those used within caBIG™. It thereby contributes to the development of interoperable systems within caBIG™ as well as other metadata frameworks. Such efforts are critical to addressing the need for systems that can handle the enormous amounts of diverse data generated by new biomedical methodologies. PMID:19208192
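The Dice measure on character di-grams named in the results is a standard lexical similarity and easy to reproduce. The sketch below uses a common set-based variant to score candidate CDE names against a UML attribute name; the example names are invented.

```python
def bigrams(s: str) -> set[str]:
    """Set of character bigrams of a lowercased string."""
    s = s.lower()
    return {s[i:i + 2] for i in range(len(s) - 1)}

def dice(a: str, b: str) -> float:
    """Dice coefficient over bigram sets: 2*|A & B| / (|A| + |B|)."""
    ba, bb = bigrams(a), bigrams(b)
    if not ba and not bb:
        return 1.0
    return 2 * len(ba & bb) / (len(ba) + len(bb))

# Score invented CDE names against a UML class-attribute name.
uml_attribute = "patientBirthDate"
cde_candidates = ["Patient Birth Date", "Patient Gender", "Date of Death"]
for cde in cde_candidates:
    print(f"{cde!r}: {dice(uml_attribute, cde):.2f}")
```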