Daniel, Christel; Ouagne, David; Sadou, Eric; Forsberg, Kerstin; McGilchrist, Mark; Zapletal, Eric; Paris, Nicolas; Hussain, Sajjad; Jaulent, Marie-Christine; Kalra, Dipak
2016-01-01
With the development of platforms enabling the use of routinely collected clinical data in the context of international clinical research, scalable solutions for cross-border semantic interoperability need to be developed. Within the context of the IMI EHR4CR project, we first defined the requirements and evaluation criteria of the EHR4CR semantic interoperability platform and then developed the semantic resources and supportive services and tooling to assist hospital sites in standardizing their data to allow the execution of the project use cases. The experience gained from the evaluation of the EHR4CR platform accessing semantically equivalent data elements across 11 participating European EHR systems from 5 countries demonstrated how far the mediation model and mapping efforts met the expected requirements of the project. Developers of semantic interoperability platforms are beginning to address a core set of requirements in order to reach the goal of cross-border semantic integration of data. PMID:27570649
IHE cross-enterprise document sharing for imaging: interoperability testing software.
Noumeir, Rita; Bérubé, Renaud
2010-09-21
Background: With the deployment of Electronic Health Records (EHR), interoperability testing in healthcare is becoming crucial. EHR enables access to prior diagnostic information in order to assist in health decisions. It is a virtual system that results from the cooperation of several heterogeneous distributed systems. Interoperability between peers is therefore essential. Achieving interoperability requires various types of testing. Implementations need to be tested using software that simulates communication partners and provides test data and test plans. Results: In this paper we describe software used to test systems involved in sharing medical images within the EHR. Our software is used as part of the Integrating the Healthcare Enterprise (IHE) testing process to test the Cross-Enterprise Document Sharing for Imaging (XDS-I) integration profile. We describe its architecture and functionalities; we also expose the challenges encountered and discuss the design solutions chosen. Conclusions: EHR is being deployed in several countries, and the EHR infrastructure will continuously evolve to embrace advances in the information technology domain. Our software is built on a web framework to allow easy evolution with web technology. The testing software is publicly available; it can be used by system implementers to test their implementations, by site integrators to verify and test the interoperability of systems, and by developers to understand specification ambiguities or to resolve implementation difficulties. PMID:20858241
Solving Identity Management and Interoperability Problems at Pan-European Level
NASA Astrophysics Data System (ADS)
Sánchez García, Sergio; Gómez Oliva, Ana
In a globalized digital world, it is essential for persons and entities to have a recognized and unambiguous electronic identity that allows them to communicate with one another. The management of this identity by public administrations is an important challenge that becomes even more crucial when interoperability among public administrations of different countries becomes necessary, as persons and entities have different credentials depending on their own national legal frameworks. More specifically, different credentials and legal frameworks cause interoperability problems that prevent reliable access to public services in cross-border scenarios like today's European Union. The work in this doctoral thesis analyzes the problem in detail by studying existing proposals (chiefly in Europe), proposing improvements to the defined architectures, and performing practical work to test the viability of the solutions. Moreover, this thesis also addresses the long-standing security problem of identity delegation, which is especially important in complex and heterogeneous service delivery environments like those mentioned above. This is a position paper.
Interoperability after deployment: persistent challenges and regional strategies in Denmark.
Kierkegaard, Patrick
2015-04-01
The European Union has identified Denmark as one of the countries with the potential to provide leadership and inspiration for other countries in eHealth implementation and adoption. However, Denmark has historically struggled to facilitate data exchange between its public hospitals' electronic health records (EHRs). Furthermore, state-led projects failed to adequately address the challenges of interoperability after deployment. Changes in the organizational setup and division of responsibilities concerning the future of eHealth implementations in hospitals took place, granting the Danish regions full responsibility for all hospital systems, specifically the consolidation of EHRs to one system per region. The regions reduced the number of different EHRs to six systems by 2014. Additionally, the first version of the National Health Record was launched to provide health care practitioners with an overview of a patient's data stored in all EHRs across the regions and within the various health sectors. The governance of national eHealth implementation plays a crucial role in the development and diffusion of interoperable technologies. Changes in the organizational setup and redistribution of responsibilities between the Danish regions and the state play a pivotal role in producing viable and coherent solutions in a timely manner. Interoperability initiatives are best managed on a regional level or by the authorities responsible for the provision of local health care services. Cross-regional communication is essential during the initial phases of planning in order to set a common goal for countrywide harmonization, coherence and collaboration. © The Author 2015. Published by Oxford University Press in association with the International Society for Quality in Health Care; all rights reserved.
An approach to define semantics for BPM systems interoperability
NASA Astrophysics Data System (ADS)
Rico, Mariela; Caliusco, María Laura; Chiotti, Omar; Rosa Galli, María
2015-04-01
This article proposes defining semantics for Business Process Management (BPM) systems interoperability through an ontology of the Electronic Business Documents (EBD) used to interchange the information required to perform cross-organizational processes. The semantic model generated allows aligning an enterprise's business processes to support cross-organizational processes by matching the business ontology of each business partner with the EBD ontology. The result is a flexible software architecture that allows dynamically defining cross-organizational business processes by reusing the EBD ontology. For developing the semantic model, a method is presented that is based on a strategy for discovering entity features whose interpretation depends on the context, and representing them to enrich the ontology. The proposed method complements ontology learning techniques that cannot infer semantic features not represented in data sources. In order to improve the representation of these entity features, the method proposes using widely accepted ontologies for representing time entities and relations, physical quantities, measurement units, official country names, and currencies and funds, among others. When ontology reuse is not possible, the method identifies whether a feature is simple or complex and defines a strategy to be followed. An empirical validation of the approach has been performed through a case study.
A Semantic Cooperation and Interoperability Platform for the European Chambers of Commerce
NASA Astrophysics Data System (ADS)
Missikoff, Michele; Taglino, Francesco
The LD-CAST project aims at developing a semantic cooperation and interoperability platform for the European Chambers of Commerce. Some of the key issues that this platform addresses are: the variety and number of different kinds of resources (i.e., business processes, concrete services) that concur to achieve a business service; the diversity of cultural and procedural models emerging when composing articulated cross-country services; and the limited possibility of reusing similar services in different contexts (for instance, supporting the same service between different countries: an Italian-Romanian cooperation is different from an Italian-Polish one). The objective of the LD-CAST platform, and in particular of the semantic services provided therein, is to address the above problems with flexible solutions. We aim at introducing high levels of flexibility at the time of development of business processes and concrete services (c-services, i.e., operational services offered by service providers), with the possibility of dynamically binding c-services to the selected business process according to user needs. To this end, an approach based on semantic services and a reference ontology has been proposed.
Advanced orbiting systems test-bedding and protocol verification
NASA Technical Reports Server (NTRS)
Noles, James; De Gree, Melvin
1989-01-01
The Consultative Committee for Space Data Systems (CCSDS) has begun the development of a set of protocol recommendations for Advanced Orbiting Systems (AOS). The AOS validation program and formal definition of AOS protocols are reviewed, and the configuration control of the AOS formal specifications is summarized. Independent implementations of the AOS protocols by NASA and ESA are discussed, and cross-support/interoperability tests which will allow the space agencies of various countries to share AOS communication facilities are addressed.
Towards Cross-Organizational Innovative Business Process Interoperability Services
NASA Astrophysics Data System (ADS)
Karacan, Ömer; Del Grosso, Enrico; Carrez, Cyril; Taglino, Francesco
This paper presents the vision and initial results of the COIN (FP7-IST-216256) European project for the development of open source Collaborative Business Process Interoperability (CBPip) in cross-organisational business collaboration environments following the Software-as-a-Service Utility (SaaS-U) paradigm.
Friedman, Charles P; Iakovidis, Ilias; Debenedetti, Laurent; Lorenzi, Nancy M
2009-11-01
Countries on both sides of the Atlantic Ocean have invested in health information and communication technologies. Since eHealth challenges cross borders, a European Union-United States of America conference on public policies relating to health IT and eHealth was held October 20-21, 2008 in Paris, France. The conference was organized around four themes: (1) privacy and security, (2) health IT interoperability, (3) deployment and adoption of health IT, and (4) public-private collaborative governance. These four themes framed the discussion over the two days of plenary sessions and workshops, and the key findings of the conference were organized along the same lines. (1) Privacy and security: patients' access to their own data and key elements of a patient identification management framework were discussed. (2) Health IT interoperability: three significant and common interoperability challenges emerged: (a) the need to establish common or compatible standards and clear guidelines for their implementation, (b) the desirability of shared certification criteria, and (c) the need for greater awareness of the importance of interoperability. (3) Deployment and adoption of health IT: three major areas of need emerged: (a) a shared knowledge base and assessment framework, (b) public-private collaboration, and (c) effective organizational change strategies. (4) Public-private collaborative governance: sharing and communication are central to success in this area; nations can learn from one another about ways to develop harmonious, effective partnerships. Three areas identified as highest priority for collaboration were: (1) health data security, (2) developing effective strategies to ensure healthcare professionals' acceptance of health IT tools, and (3) interoperability.
Building the Synergy between Public Sector and Research Data Infrastructures
NASA Astrophysics Data System (ADS)
Craglia, Massimo; Friis-Christensen, Anders; Ostländer, Nicole; Perego, Andrea
2014-05-01
INSPIRE is a European Directive aiming to establish an EU-wide spatial data infrastructure that gives cross-border access to information that can be used to support EU environmental policies, as well as other policies and activities having an impact on the environment. In order to ensure cross-border interoperability of the data infrastructures operated by EU Member States, INSPIRE sets out a framework based on common specifications for metadata, data, network services, data and service sharing, and monitoring and reporting. The implementation of INSPIRE has reached important milestones: the INSPIRE Geoportal was launched in 2011, providing a single access point for the discovery of INSPIRE data and services across EU Member States (currently about 300K), while all the technical specifications for the interoperability of data across the 34 INSPIRE themes were adopted at the end of 2013. During this period a number of EU and international initiatives have been launched concerning cross-domain interoperability and (Linked) Open Data. In particular, the EU Open Data Portal, launched in December 2012, made provisions to access government and scientific data from EU institutions and bodies, and the EU ISA Programme (Interoperability Solutions for European Public Administrations) promotes cross-sector interoperability by sharing and re-using EU-wide and national standards and components. Moreover, the Research Data Alliance (RDA), an initiative jointly funded by the European Commission, the US National Science Foundation and the Australian Research Council, was launched in March 2013 to promote scientific data sharing and interoperability.
The Joint Research Centre of the European Commission (JRC), besides being the technical coordinator of the implementation of INSPIRE, is also actively involved in initiatives promoting cross-sector re-use of INSPIRE and sustainable approaches to the evolution of technologies - in particular, how to support Linked Data in INSPIRE and the use of global persistent identifiers. Government and scientific data infrastructures are currently facing a number of issues that have already been addressed in INSPIRE. Sharing experiences and competencies will avoid re-inventing the wheel and help promote the cross-domain adoption of consistent solutions. Indeed, one of the lessons learnt from INSPIRE and the initiatives in which JRC is involved is that government and research data are not two separate worlds: government data are commonly used as a basis to create scientific data, and vice versa. Consequently, it is fundamental to adopt a consistent approach to the interoperability and data management issues shared by both government and scientific data. The presentation illustrates some of the lessons learnt during the implementation of INSPIRE and in work on data and service interoperability coordinated with European and international initiatives. We describe a number of critical interoperability issues and barriers affecting both scientific and government data, concerning, e.g., data terminologies, quality and licensing, and propose how these problems could be effectively addressed by a closer collaboration of the government and scientific communities and the sharing of experiences and practices.
An EarthCube Roadmap for Cross-Domain Interoperability in the Geosciences: Governance Aspects
NASA Astrophysics Data System (ADS)
Zaslavsky, I.; Couch, A.; Richard, S. M.; Valentine, D. W.; Stocks, K.; Murphy, P.; Lehnert, K. A.
2012-12-01
The goal of cross-domain interoperability is to enable reuse of data and models outside the original context in which these data and models are collected and used, and to facilitate analysis and modeling of physical processes that are not confined to disciplinary or jurisdictional boundaries. A new research initiative of the U.S. National Science Foundation, called EarthCube, is developing a roadmap to address challenges of interoperability in the earth sciences and create a blueprint for community-guided cyberinfrastructure accessible to a broad range of geoscience researchers and students. Infrastructure readiness for cross-domain interoperability encompasses the capabilities that need to be in place for such secondary or derivative use of information to be both scientifically sound and technically feasible. In this initial assessment we consider the following four basic infrastructure components that need to be present to enable cross-domain interoperability in the geosciences: metadata catalogs (at the appropriate community-defined granularity) that provide standard discovery services over the datasets, data access services, models and other resources of the domain; vocabularies that support unambiguous interpretation of domain resources and metadata; services used to access data repositories and other resources, including models, visualizations and workflows; and formal information models that define the structure and semantics of the information returned on service requests. General standards for these components have been proposed; they form the backbone of large-scale integration activities in the geosciences. By utilizing these standards, EarthCube research designs can take advantage of data discovery across disciplines using the commonality in key data characteristics related to shared models of spatial features, time measurements, and observations.
Data can be discovered via federated catalogs and linked nomenclatures from neighboring domains, while standard data services can be used to transparently compile composite data products. Key questions addressed in this presentation are: (1) how to define and assess the readiness of existing domain information systems for cross-domain re-use; (2) how to determine EarthCube development priorities given a multitude of use cases that involve cross-domain data flows; and (3) how to involve a wider community of geoscientists in the development and curation of cross-domain resources and incorporate community feedback in the CI design. Answering these questions involves consideration of governance mechanisms for cross-domain interoperability: while individual domain information systems and projects have developed their own governance mechanisms, managing cross-domain CI resources and supporting cross-domain information re-use has not been a development focus at the scale of the geosciences. We present a cross-domain readiness model as a means of enabling effective communication among scientists, governance bodies, and information providers. We also present an initial readiness assessment and a cross-domain connectivity map for the geosciences, and outline processes for eliciting user requirements, setting priorities, and obtaining community consensus.
Empowering open systems through cross-platform interoperability
NASA Astrophysics Data System (ADS)
Lyke, James C.
2014-06-01
Most of the motivations for open systems lie in the expectation of interoperability, sometimes referred to as "plug-and-play". Nothing in the notion of "open-ness", however, guarantees this outcome, which makes the increased interest in open architecture more perplexing. In this paper, we explore certain themes of open architecture. We introduce the concept of "windows of interoperability", which can be used to align disparate portions of architecture. Such "windows of interoperability", which concentrate on a reduced set of protocol and interface features, might achieve many of the broader purposes assigned as benefits in open architecture. Since it is possible to engineer proprietary systems that interoperate effectively, this nuanced definition of interoperability may in fact be a more important concept to understand and nurture for effective systems engineering and maintenance.
Oluoch, Tom; Muturi, David; Kiriinya, Rose; Waruru, Anthony; Lanyo, Kevin; Nguni, Robert; Ojwang, James; Waters, Keith P; Richards, Janise
2015-01-01
Sub-Saharan Africa (SSA) bears the heaviest burden of the HIV epidemic. Health workers play a critical role in the scale-up of HIV programs. SSA also has the weakest information and communication technology (ICT) infrastructure globally. Implementing interoperable national health information systems (HIS) is a challenge, even in developed countries. Countries in resource-limited settings have yet to demonstrate that interoperable systems can be achieved and can improve the quality of healthcare through enhanced data availability and use in the deployment of the health workforce. We established interoperable HIS integrating a Master Facility List (MFL), District Health Information Software (DHIS2), and Human Resources Information Systems (HRIS) through application programming interfaces (APIs). We abstracted data on HIV care, health worker deployment, and health facility geo-coordinates. Over 95% of data elements were exchanged between the MFL-DHIS and HRIS-DHIS interfaces. The correlations between the number of HIV-positive clients and the numbers of nurses and clinical officers in 2013 were R2=0.251 and R2=0.261, respectively. Wrong MFL codes, data-type mismatches and hyphens in legacy data were key causes of data transmission errors. Lack of information exchange standards for aggregate data made programming time-consuming.
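The transmission errors reported above (wrong MFL codes, data-type mismatches, hyphens in legacy identifiers) suggest a simple pre-transmission validation step. The sketch below is illustrative only: the record layout, the 5-digit code format, and the field names are assumptions, not the actual MFL/DHIS2/HRIS schemas.

```python
# Illustrative pre-transmission validator for aggregate facility records.
# The record layout and MFL code format are hypothetical, not the actual
# DHIS2/HRIS payload schemas.
import re

MFL_CODE = re.compile(r"^\d{5}$")  # assumption: MFL codes are 5-digit strings

def validate_record(record, known_mfl_codes):
    """Return a list of error strings; an empty list means the record may be sent."""
    errors = []
    code = str(record.get("mfl_code", ""))
    if "-" in code:                      # hyphens left over in legacy data
        errors.append(f"hyphen in legacy code: {code!r}")
    elif not MFL_CODE.match(code):
        errors.append(f"malformed MFL code: {code!r}")
    elif code not in known_mfl_codes:    # wrong (unknown) MFL code
        errors.append(f"unknown MFL code: {code!r}")
    value = record.get("value")
    if not isinstance(value, int):       # data-type mismatch
        errors.append(f"expected integer value, got {type(value).__name__}")
    return errors

mfl = {"12345", "23456"}
print(validate_record({"mfl_code": "12345", "value": 40}, mfl))   # no errors
print(validate_record({"mfl_code": "12-45", "value": "40"}, mfl)) # two errors
```

Rejecting malformed records before they reach an exchange API avoids the hard-to-diagnose transmission failures the authors describe, at the cost of maintaining the master code list on the sending side.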
Ekmekci, Perihan Elif
2017-01-01
Disease outbreaks have attracted the attention of the public health community to early warning and response systems (EWRS) for communicable diseases and other cross-border threats to health. The European Union (EU) and the World Health Organization (WHO) have published regulations in this area. Decision 1082/2013/EU brought a new approach to the management of public health threats in EU member states. It introduced several innovations, including establishing a Health Security Committee; preparedness and response planning; joint procurement of medical countermeasures; ad hoc monitoring for biological, chemical, and environmental threats; EWRS; and recognition of emergency situations and interoperability between various sectors. Turkey, as an acceding country to the EU and a member of the WHO, has been improving its national public health system to meet EU legislation and WHO standards. This article first explains EWRS as defined in Decision 1082/2013/EU and Turkey's obligations to align its public health laws with the EU acquis. EWRS in Turkey are then addressed, particularly their coherence with EU policies regarding preparedness and response, alert notification, and interoperability between health and other sectors. Finally, the challenges and limitations of the current Turkish system are discussed and further improvements are suggested. PMID:27511433
Groundwater data network interoperability
Brodaric, Boyan; Booth, Nathaniel; Boisvert, Eric; Lucido, Jessica M.
2016-01-01
Water data networks are increasingly being integrated to answer complex scientific questions that often span large geographical areas and cross political borders. Data heterogeneity is a major obstacle that impedes interoperability within and between such networks. It is resolved here for groundwater data at five levels of interoperability, within a Spatial Data Infrastructure architecture. The result is a pair of distinct national groundwater data networks for the United States and Canada, and a combined data network in which they are interoperable. This combined data network enables, for the first time, transparent public access to harmonized groundwater data from both sides of the shared international border.
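Harmonizing heterogeneous national records into one common model is the crux of such a combined cross-border network. A minimal sketch follows, with invented field names and unit conventions; real groundwater networks exchange data through standards such as OGC WaterML 2 / GroundWaterML 2 rather than ad hoc mappings like these.

```python
# Minimal harmonization sketch: map two hypothetical national groundwater
# record formats onto one common model. Field names and units are invented
# for illustration only.

def from_us(rec):
    # assumed US format: water level reported in feet
    return {"site_id": rec["site_no"], "depth_m": rec["lev_ft"] * 0.3048}

def from_ca(rec):
    # assumed Canadian format: level already reported in metres
    return {"site_id": rec["station"], "depth_m": rec["niveau_m"]}

def harmonize(records, mapper):
    """Apply one nation's mapping to its records, yielding common-model dicts."""
    return [mapper(r) for r in records]

us = harmonize([{"site_no": "US-1", "lev_ft": 10.0}], from_us)
ca = harmonize([{"station": "CA-7", "niveau_m": 3.2}], from_ca)
combined = us + ca  # a single, uniformly queryable cross-border dataset
```

Once both sides emit the common model, a combined network can serve the merged records transparently, which is the level of interoperability the abstract describes.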
Achieving Interoperability in GEOSS - How Close Are We?
NASA Astrophysics Data System (ADS)
Arctur, D. K.; Khalsa, S. S.; Browdy, S. F.
2010-12-01
A primary goal of the Global Earth Observing System of Systems (GEOSS) is improving the interoperability between the observational, modelling, data assimilation, and prediction systems contributed by member countries. The GEOSS Common Infrastructure (GCI) comprises the elements designed to enable discovery of and access to these diverse data and information sources. But to what degree can the mechanisms for accessing these data, and the data themselves, be considered interoperable? Will the separate efforts by Communities of Practice within GEO to build their own portals, such as for Energy, Biodiversity, and Air Quality, lead to fragmentation or synergy? What communication and leadership do we need with these communities to improve interoperability both within and across them? The Standards and Interoperability Forum (SIF) of GEO's Architecture and Data Committee has assessed progress towards achieving the goal of global interoperability and made recommendations regarding the evolution of the architecture and overall data strategy to ensure fulfillment of the GEOSS vision. This presentation will highlight the results of this study and directions for further work.
Data interoperability software solution for emergency reaction in the Europe Union
NASA Astrophysics Data System (ADS)
Casado, R.; Rubiera, E.; Sacristan, M.; Schütte, F.; Peters, R.
2015-07-01
Emergency management becomes more challenging in international crisis episodes because of cultural, semantic and linguistic differences between all stakeholders, especially first responders. Misunderstandings between first responders make decision making slower and more difficult. However, the spread and development of networks and IT-based emergency management systems (EMSs) have improved emergency responses, which have become more coordinated. Despite improvements made in recent years, EMSs have still not solved the problems related to cultural, semantic and linguistic differences, which are the real cause of slower decision making. In addition, from a technical perspective, the consolidation of current EMSs and the different formats used to exchange information pose another problem to be solved by any solution proposed for information interoperability between heterogeneous EMSs in different contexts. To overcome these problems, we present a software solution based on semantic and mediation technologies. EMERGency ELements (EMERGEL) (Fundacion CTIC and AntwortING Ingenieurbüro PartG, 2013), a common and modular ontology shared by all the stakeholders, has been defined. It offers the best solution to gather all stakeholders' knowledge in a unique and flexible data model, taking into account different countries' cultural and linguistic issues. To deal with the diversity of data protocols and formats, we have designed a service-oriented architecture for data interoperability (named DISASTER: Data Interoperability Solution At STakeholders Emergency Reaction), providing a flexible, extensible solution to the mediation issues. Web services, the paradigm with the most significant academic and industrial visibility and attraction, have been adopted as the technology to implement this architecture.
Contributions of this work have been validated through the design and development of a realistic cross-border prototype scenario, actively involving both emergency managers and emergency first responders: the Netherlands-Germany border fire.
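The mediating role that an EMERGEL-style shared ontology plays can be pictured as a two-step term translation through a common concept. The concept identifiers and vocabulary entries below are invented for illustration; they are not actual EMERGEL content or URIs.

```python
# Toy mediation sketch: translate a stakeholder-specific emergency term into
# a shared concept, then render it in another stakeholder's vocabulary.
# All identifiers below are invented, not actual EMERGEL ontology entries.

TO_CONCEPT = {
    ("nl", "brandweer"): "concept:FireBrigade",
    ("de", "Feuerwehr"): "concept:FireBrigade",
    ("de", "Löschfahrzeug"): "concept:FireEngine",
}
FROM_CONCEPT = {
    ("concept:FireBrigade", "de"): "Feuerwehr",
    ("concept:FireBrigade", "nl"): "brandweer",
}

def mediate(term, source_lang, target_lang):
    """Map a term to the shared concept, then into the target vocabulary."""
    concept = TO_CONCEPT[(source_lang, term)]
    return FROM_CONCEPT[(concept, target_lang)]

# A Dutch dispatcher's message rendered for a German responder:
print(mediate("brandweer", "nl", "de"))  # Feuerwehr
```

Because every vocabulary maps only to and from the shared concepts, adding a new country requires one pair of mappings rather than mappings to every other country's terminology, which is the usual argument for a hub ontology over pairwise translation.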
NASA Astrophysics Data System (ADS)
Graves, S. J.; Keiser, K.; Law, E.; Yang, C. P.; Djorgovski, S. G.
2016-12-01
ECITE (EarthCube Integration and Testing Environment) provides both cloud-based computational testing resources and an Assessment Framework for Technology Interoperability and Integration. NSF's EarthCube program is funding the development of cyberinfrastructure building-block components as technologies to address Earth science research problems. These EarthCube building blocks need to support integration and interoperability objectives in order to work towards a coherent cyberinfrastructure architecture for the program. ECITE is being developed to provide capabilities to test and assess interoperability and integration across funded EarthCube technology projects. EarthCube-defined criteria for interoperability and integration are applied to use cases coordinating science problems with technology solutions. The Assessment Framework facilitates planning, execution and documentation of the technology assessments for review by the EarthCube community. This presentation describes the components of ECITE and examines the methodology of crosswalking between science and technology use cases.
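At its simplest, a crosswalk between science use cases and technology building blocks reduces to a coverage question: which capabilities required by a use case does no funded block provide? The use-case names and capability labels below are hypothetical, not actual EarthCube assessment criteria.

```python
# Hypothetical crosswalk: compare each science use case's required
# capabilities against what the technology building blocks collectively
# provide. All names are invented for illustration.

science_needs = {
    "hydrology-forecast": {"discovery", "vocabulary", "workflow"},
    "seismic-catalog": {"discovery", "access", "provenance"},
}
tech_provides = {
    "BlockA": {"discovery", "access"},
    "BlockB": {"vocabulary", "workflow"},
}

def coverage(needs, provides):
    """Return, per use case, the sorted list of capabilities nothing provides."""
    offered = set().union(*provides.values())
    return {case: sorted(req - offered) for case, req in needs.items()}

print(coverage(science_needs, tech_provides))
# {'hydrology-forecast': [], 'seismic-catalog': ['provenance']}
```

The gap list is exactly what an assessment framework surfaces for review: here the "seismic-catalog" case would be flagged because no block supplies "provenance".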
NASA Astrophysics Data System (ADS)
Tomas, Robert; Harrison, Matthew; Barredo, José I.; Thomas, Florian; Llorente Isidro, Miguel; Cerba, Otakar; Pfeiffer, Manuela
2014-05-01
The vast amount of information and data necessary for comprehensive hazard and risk assessment presents many challenges regarding the accessibility, comparability, quality, organisation and dissemination of natural hazards spatial data. To mitigate these limitations, an interoperable framework has been developed as part of the legally binding Implementing Rules of the EU INSPIRE Directive1*, which aims at the establishment of the European Spatial Data Infrastructure. The interoperability framework is described in the Data Specification on Natural risk zones - Technical Guidelines (DS) document2*, finalized and published on 10 December 2013. This framework provides means for facilitating access, integration, harmonisation and dissemination of natural hazard data from different domains and sources. The objective of this paper is twofold. Firstly, the paper demonstrates the applicability of the interoperable framework developed in the DS and highlights the key aspects of interoperability for the various natural hazards communities. Secondly, the paper "translates" into common language the main features and potential of the interoperable framework of the DS for a wider audience of scientists and practitioners in the natural hazards domain. Further in this paper the five main aspects of the interoperable framework will be presented. First, the issue of a common terminology for the natural hazards domain will be addressed. Second, a common data model to facilitate cross-domain data integration will follow. Third, the common methodology developed to provide qualitative or quantitative assessments of natural hazards will be presented. Fourth, the extensible classification schema for natural hazards, developed from a literature review and key reference documents from the contributing community of practice, will be shown.
Finally, the applicability of the interoperable framework for the various stakeholder groups will also be presented. The paper closes by discussing open issues and next steps regarding the sustainability and evolution of the interoperable framework, as well as missing aspects such as multi-hazard and multi-risk. --------------- 1*INSPIRE - Infrastructure for spatial information in Europe, http://inspire.ec.europa.eu 2*http://inspire.jrc.ec.europa.eu/documents/Data_Specifications/INSPIRE_DataSpecification_NZ_v3.0.pdf
Interconnecting Multidisciplinary Data Infrastructures: From Federation to Brokering Framework
NASA Astrophysics Data System (ADS)
Nativi, S.
2014-12-01
Standardization and federation activities have played an essential role in pushing interoperability at the disciplinary and cross-disciplinary level. However, they have proved insufficient to resolve important interoperability challenges, including disciplinary heterogeneity, cross-organization diversity, and cultural differences. Significant international initiatives like GEOSS, IODE, and CEOS have demonstrated that a federated system dealing with a global, multi-disciplinary domain turns out to be rather complex, further raising the already high entry barriers for both providers and users. In particular, GEOSS demonstrated that standardization and federation actions must be accompanied and complemented by a brokering approach. A brokering architecture and its implementing technologies can realize an effective level of interoperability among multi-disciplinary systems, lowering the entry barriers for both data providers and users. This presentation will discuss the brokering philosophy as an approach complementary to standardization and federation for interconnecting existing, heterogeneous infrastructures and systems. The GEOSS experience will be analyzed in particular.
Software support in automation of medicinal product evaluations.
Juric, Radmila; Shojanoori, Reza; Slevin, Lindi; Williams, Stephen
2005-01-01
Medicinal product evaluation is one of the most important tasks undertaken by government health departments and their regulatory authorities in every country in the world. Automation and adequate software support can improve the efficiency and interoperation of regulatory systems across the world. In this paper we propose a software solution that supports the automation of (i) the submission of licensing applications, and (ii) the evaluation of submitted licensing applications according to regulatory authorities' procedures. The novelty of our solution lies in allowing licensing applications to be submitted in any country in the world and evaluated according to any evaluation procedure (which can be chosen by either regulatory authorities or pharmaceutical companies). Consequently, submission and evaluation procedures become interoperable, and the associated data repositories/databases can be shared between various countries and regulatory authorities.
The role of basic data registers in cross-border interconnection of eHealth solutions.
Kregar, Mirjana; Marčun, Tomaž; Dovžan, Irma; Cehovin, Lojzka
2011-01-01
The increasingly close international business cooperation in the areas of production, trade, transport and activities such as tourism and education is promoting the mobility of people. This increases the need for the provision of health care services across borders. To provide safe, effective and high-quality treatment in these cases as well, it is necessary to ensure that data accompany patients even when they travel to other regions, countries or continents. eHealth solutions are one of the key tools for achieving these objectives. When building these solutions, it is necessary to take into account the different aspects and limitations brought about by the differences in the environments where the treatment of a patient takes place. In debates on the various types of cross-border interoperability of eHealth solutions, it is necessary to draw attention to the need for suitable management and interconnection of the data registers that form the basis of every information system: data on patients, health care service providers and basic code tables. It is necessary to promote well-organised, high-quality data in the patient's domestic environment and the best possible options for transferring and using those data in the foreign environment where the patient is receiving medical care at a particular moment. Many of the discussions dealing with conditions for the interoperability of health care information systems actually start with the question of how to ensure the interconnectivity of basic data registers.
Capurro, Daniel; Echeverry, Aisen; Figueroa, Rosa; Guiñez, Sergio; Taramasco, Carla; Galindo, César; Avendaño, Angélica; García, Alejandra; Härtel, Steffen
2017-01-01
Despite the continuous technical advancements around health information standards, a critical component to their widespread adoption involves political agreement between a diverse set of stakeholders. Countries that have addressed this issue have used diverse strategies. In this vision paper we present the path that Chile is taking to establish a national program to implement health information standards and achieve interoperability. The Chilean government established an inter-agency program to define the current interoperability situation, existing gaps, barriers, and facilitators for interoperable health information systems. As an answer to the identified issues, the government decided to fund a consortium of Chilean universities to create the National Center for Health Information Systems. This consortium should encourage the interaction between all health care stakeholders, both public and private, to advance the selection of national standards and define certification procedures for software and human resources in health information technologies.
Turning Interoperability Operational with GST
NASA Astrophysics Data System (ADS)
Schaeben, Helmut; Gabriel, Paul; Gietzel, Jan; Le, Hai Ha
2013-04-01
GST - Geosciences in space and time is being developed and implemented as a hub to facilitate the exchange of spatially and temporally indexed multi-dimensional geoscience data and corresponding geomodels amongst partners. It originates from TUBAF's contribution to the EU project "ProMine", and its prospective extensions are TUBAF's contribution to the current EU project "GeoMol". As of today, it provides basic components of a geodata infrastructure as required to establish interoperability with respect to geosciences. Generally, interoperability means the facilitation of cross-border and cross-sector information exchange, taking into account legal, organisational, semantic and technical aspects, cf. Interoperability Solutions for European Public Administrations (ISA), http://ec.europa.eu/isa/. Practical interoperability for partners of a joint geoscience project, say European Geological Surveys acting in a border region, means in particular the provision of IT technology to exchange spatially, and possibly also temporally, indexed multi-dimensional geoscience data and corresponding models, i.e. the objects composing geomodels capturing geometry, topology, and various geoscience contents. Geodata Infrastructure (GDI) and interoperability are objectives of several initiatives, e.g. INSPIRE, OneGeology-Europe, and most recently EGDI-SCOPE, to name just the most prominent ones. There are also quite a few markup languages (ML) related to geographical or geological information, such as GeoSciML, EarthResourceML, BoreholeML, and ResqML for reservoir characterization, earth and reservoir models, among many others featuring geoscience information. Several Web Services are focused on geographical or geoscience information. The Open Geospatial Consortium (OGC) promotes specifications of a Web Feature Service (WFS), a Web Map Service (WMS), a Web Coverage Service (WCS), a Web 3D Service (W3DS), and many more.
It will be clarified how GST is related to these initiatives, especially how it complies with existing or developing standards or quasi-standards, and how it applies and extends services towards interoperability in the Earth sciences.
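To make the service layer concrete: the OGC interfaces listed above are typically invoked as simple key-value-pair HTTP requests. The sketch below assembles a WFS GetCapabilities request in Python; the endpoint URL is a hypothetical placeholder, not an actual GST or GeoMol service.

```python
from urllib.parse import urlencode

def wfs_get_capabilities_url(base_url, version="2.0.0"):
    """Build an OGC WFS GetCapabilities request URL (KVP encoding)."""
    params = {
        "service": "WFS",              # OGC service type
        "request": "GetCapabilities",  # ask the server to describe itself
        "version": version,            # requested WFS protocol version
    }
    return f"{base_url}?{urlencode(params)}"

# Hypothetical endpoint, for illustration only.
url = wfs_get_capabilities_url("https://example.org/geoserver/wfs")
print(url)
```

Issuing an HTTP GET on such a URL returns an XML capabilities document listing the feature types the server offers, which is the usual first step in wiring two geodata infrastructures together.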
NASA Astrophysics Data System (ADS)
Allison, M. L.; Gurney, R. J.
2015-12-01
An e-infrastructure that supports data-intensive, multidisciplinary research is needed to accelerate the pace of science to address 21st century global change challenges. Data discovery, access, sharing and interoperability collectively form core elements of an emerging shared vision of e-infrastructure for scientific discovery. The pace and breadth of change in information management across the data lifecycle means that no one country or institution can unilaterally provide the leadership and resources required to use data and information effectively, or needed to support a coordinated, global e-infrastructure. An 18-month long process involving ~120 experts in domain, computer, and social sciences from more than a dozen countries resulted in a formal set of recommendations to the Belmont Forum collaboration of national science funding agencies and others on what they are best suited to implement for development of an e-infrastructure in support of global change research, including: adoption of data principles that promote a global, interoperable e-infrastructure; establishment of information and data officers for coordination of global data management and e-infrastructure efforts; promotion of effective data planning; determination of best practices; and development of a cross-disciplinary training curriculum on data management and curation. The Belmont Forum is ideally poised to play a vital and transformative leadership role in establishing a sustained human and technical international data e-infrastructure to support global change research. The international collaborative process that went into forming these recommendations is contributing to national governments, funding agencies and international bodies working together to execute them.
Ovies-Bernal, Diana Paola; Agudelo-Londoño, Sandra M
2014-01-01
Identify shared criteria used throughout the world in the implementation of interoperable National Health Information Systems (NHIS) and provide validated scientific information on the dimensions affecting interoperability. This systematic review sought to identify primary articles on the implementation of interoperable NHIS published in scientific journals in English, Portuguese, or Spanish between 1990 and 2011 through a search of eight databases of electronic journals in the health sciences and informatics: MEDLINE (PubMed), Proquest, Ovid, EBSCO, MD Consult, Virtual Health Library, Metapress, and SciELO. The full texts of the articles were reviewed, and those that focused on technical computer aspects or on normative issues were excluded, as were those that did not meet the quality criteria for systematic reviews of interventions. Of 291 studies found and reviewed, only five met the inclusion criteria. These articles reported on the process of implementing an interoperable NHIS in Brazil, China, the United States, Turkey, and the Semiautonomous Region of Zanzibar. Five common basic criteria affecting implementation of the NHIS were identified: standards in place to govern the process, availability of trained human talent, financial and structural constraints, definition of standards, and assurance that the information is secure. Four dimensions affecting interoperability were defined: technical, semantic, legal, and organizational. The criteria identified have to be adapted to the actual situation in each country, and a proactive approach should be used to ensure that implementation of the interoperable NHIS is strategic, simple, and reliable.
Large scale healthcare data integration and analysis using the semantic web.
Timm, John; Renly, Sondra; Farkash, Ariel
2011-01-01
Healthcare data interoperability can only be achieved when the semantics of the content is well defined and consistently implemented across heterogeneous data sources. Achieving these objectives requires the collaboration of experts from several domains. This paper describes tooling that integrates Semantic Web technologies with common tools to facilitate cross-domain collaborative development for the purposes of data interoperability. Our approach is divided into stages of data harmonization and representation, model transformation, and instance generation. We validated our approach on Hypergenes, an EU-funded project, applying the method to the Essential Hypertension disease model using a CDA template. Our domain expert partners include clinical providers, clinical domain researchers, healthcare information technology experts, and a variety of clinical data consumers. We show that bringing Semantic Web technologies into the healthcare interoperability toolkit increases opportunities for beneficial collaboration, thus improving patient care and clinical research outcomes.
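The three stages named above (harmonization, model transformation, instance generation) can be illustrated with a deliberately tiny sketch; the field names, mapping, and XML shape are invented for illustration and are not the Hypergenes or CDA artifacts themselves.

```python
from xml.sax.saxutils import escape

# Stage 1: harmonization - a shared vocabulary agreed for local field names
# (mapping invented for illustration).
MAPPING = {"sysBP": "systolic_blood_pressure", "diaBP": "diastolic_blood_pressure"}

def transform(record):
    """Stage 2: model transformation - rename source fields into the
    harmonized model, dropping anything outside the agreed mapping."""
    return {MAPPING[k]: v for k, v in record.items() if k in MAPPING}

def generate_instance(harmonized):
    """Stage 3: instance generation - emit a minimal XML fragment."""
    items = "".join(
        f'<observation code="{escape(k)}" value="{escape(str(v))}"/>'
        for k, v in sorted(harmonized.items())
    )
    return f"<observations>{items}</observations>"

h = transform({"sysBP": 142, "diaBP": 91, "note": "free text, not mapped"})
print(generate_instance(h))
```

In a real pipeline the mapping would be derived from the harmonized ontology and the output would be a full CDA instance, but the division of labor between the stages is the same.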
Content analysis of physical examination templates in electronic health records using SNOMED CT.
Gøeg, Kirstine Rosenbeck; Chen, Rong; Højen, Anne Randorff; Elberg, Pia
2014-10-01
Most electronic health record (EHR) systems are built on proprietary information models and terminology, which makes achieving semantic interoperability a challenge. Solving interoperability problems requires well-defined standards. In contrast, the need to support clinical work practice requires local customization of EHR systems. Consequently, contrasting goals may be evident in EHR template design, because customization means that local EHR organizations can define their own templates, whereas standardization implies consensus at some level. To explore the complexity of balancing these two goals, this study analyzes the differences and similarities between templates in use today. A similarity analysis was developed on the basis of SNOMED CT. The analysis was performed on four physical examination templates from Denmark and Sweden. The semantic relationships in SNOMED CT were used to quantify similarities and differences. Moreover, the analysis used the identified similarities to investigate the common content of a physical examination template. The analysis showed that there were both similarities and differences in the physical examination templates, and the size of the templates varied from 18 to 49 fields. In the SNOMED CT analysis, exact matches and terminology similarities were represented in all template pairs. The number of exact matches ranged from 7 to 24. Moreover, the number of unrelated fields varied widely, from 1/18 to 22/35. Cross-country comparisons tended to have more unrelated content than within-country comparisons. On the basis of the identified similarities, it was possible to define the common content of a physical examination. Nevertheless, a complete view of the physical examination required the inclusion of both exact matches and terminology similarities.
This study revealed that a core set of items representing the physical examination templates can be generated when the analysis takes into account not only exact matches but also terminology similarities. This core set of items could be a starting point for standardization and semantic interoperability. However, both unmatched terms and terminology matched terms pose a challenge for standardization. Future work will include using local templates as a point of departure in standardization to see if local requirements can be maintained in a standardized framework. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
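The matching logic can be sketched as follows. The mini is-a hierarchy below is invented and vastly simpler than SNOMED CT's concept graph; it only illustrates the distinction the study draws between exact matches, terminology similarities, and unrelated fields.

```python
# Invented mini is-a hierarchy (child -> parent); far simpler than SNOMED CT.
IS_A = {
    "palpation of abdomen": "examination of abdomen",
    "auscultation of abdomen": "examination of abdomen",
    "examination of abdomen": "physical examination",
    "examination of heart": "physical examination",
}

def ancestors(concept):
    """All concepts reachable by following is-a links upward."""
    seen = set()
    while concept in IS_A:
        concept = IS_A[concept]
        seen.add(concept)
    return seen

def classify(a, b):
    """Classify a pair of template fields, mirroring the study's categories."""
    if a == b:
        return "exact match"
    # one concept subsumes the other, or the two are direct siblings
    if b in ancestors(a) or a in ancestors(b) or \
            (a in IS_A and IS_A.get(a) == IS_A.get(b)):
        return "terminology similarity"
    return "unrelated"

print(classify("palpation of abdomen", "auscultation of abdomen"))
```

Running `classify` over every field pair of two templates and tallying the three categories gives exactly the kind of counts reported above (exact matches, terminology similarities, unrelated fields).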
Secure and interoperable communication infrastructures for PPDR organisations
NASA Astrophysics Data System (ADS)
Müller, Wilmuth; Marques, Hugo; Pereira, Luis; Rodriguez, Jonathan; Brouwer, Frank; Bouwers, Bert; Politis, Ilias; Lykourgiotis, Asimakis; Ladas, Alexandros; Adigun, Olayinka; Jelenc, David
2016-05-01
The growing number of events affecting public safety and security (PS&S) on a regional scale, with the potential to grow into large-scale cross-border disasters, puts increased pressure on the agencies and organisations responsible for PS&S. In order to respond timely and adequately to such events, Public Protection and Disaster Relief (PPDR) organisations need to cooperate, align their procedures and activities, share the needed information and be interoperable. Existing PPDR/PMR technologies such as TETRA, TETRAPOL or P25 do not currently provide broadband capability, nor are these technologies expected to be upgraded to provide it in the future. This presents a major limitation in supporting new services and information flows. Furthermore, there is no known standard that addresses interoperability of these technologies. In this contribution, the design of a next-generation communication infrastructure for PPDR organisations is presented, which fulfills the requirements of secure and seamless end-to-end communication and interoperable information exchange within the deployed communication networks. Based on the Enterprise Architecture of PPDR organisations, a next-generation PPDR network that is backward compatible with legacy communication technologies is designed and implemented, capable of providing security, privacy, seamless mobility, QoS and reliability support for mission-critical Private Mobile Radio (PMR) voice and broadband data services. The designed solution provides a robust, reliable, and secure mobile broadband communications system for a wide variety of PMR applications and services on PPDR broadband networks, including the ability of inter-system, inter-agency and cross-border operations, with emphasis on interoperability between users in PMR and LTE.
Data interoperability software solution for emergency reaction in the Europe Union
NASA Astrophysics Data System (ADS)
Casado, R.; Rubiera, E.; Sacristan, M.; Schütte, F.; Peters, R.
2014-09-01
Emergency management becomes more challenging in international crisis episodes because of cultural, semantic and linguistic differences between all stakeholders, especially first responders. Misunderstandings between first responders make decision-making slower and more difficult. However, the spread and development of networks and IT-based Emergency Management Systems (EMS) has improved emergency responses, which have become more coordinated. Despite improvements made in recent years, EMS have still not solved the problems related to cultural, semantic and linguistic differences, which are the real cause of slower decision-making. In addition, from a technical perspective, the consolidation of current EMS and the different formats used to exchange information pose another problem for any solution proposed for information interoperability between heterogeneous EMS embedded in different contexts. To overcome these problems we present a software solution based on semantic and mediation technologies. EMERGency ELements (EMERGEL) (Fundacion CTIC and AntwortING Ingenieurbüro PartG 2013), a common and modular ontology shared by all the stakeholders, has been defined. It offers the best solution to gather all stakeholders' knowledge in a unique and flexible data model, taking into account different countries' cultural and linguistic issues. To deal with the diversity of data protocols and formats, we have designed a Service Oriented Architecture for Data Interoperability (named DISASTER), providing a flexible, extensible solution to the mediation issues. Web Services, the paradigm with the most significant academic and industrial visibility and attraction, have been adopted as the specific implementation technology. Contributions of this work have been validated through the design and development of a realistic cross-border prototype scenario, actively involving both emergency managers and emergency first responders: a fire at the Netherlands-Germany border.
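The mediation idea behind a shared ontology like EMERGEL can be reduced to a small sketch: each local vocabulary maps its terms to a common concept code, and translation goes term → code → term. The vocabularies and codes below are invented placeholders, not actual EMERGEL content.

```python
# Invented local vocabularies mapped to shared concept codes (placeholders
# standing in for EMERGEL concepts).
NL = {"brandweer": "EMG:FIRE_BRIGADE", "ziekenwagen": "EMG:AMBULANCE"}
DE = {"Feuerwehr": "EMG:FIRE_BRIGADE", "Rettungswagen": "EMG:AMBULANCE"}

def mediate(term, source, target):
    """Translate a local term into the target vocabulary via the shared code."""
    code = source[term]                       # local term -> shared concept
    inverse = {v: k for k, v in target.items()}
    return inverse[code]                      # shared concept -> target term

print(mediate("brandweer", NL, DE))  # Dutch 'fire brigade', rendered in German
```

The point of the hub-and-spoke design is visible even at this scale: adding a third vocabulary requires one new mapping to the shared codes, not pairwise mappings to every existing language.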
Beštek, Mate; Stanimirović, Dalibor
2017-08-09
The main aims of the paper comprise the characterization and examination of potential approaches regarding interoperability. This includes openEHR, SNOMED, IHE, and Continua as combined interoperability approaches, possibilities for their incorporation into the eHealth environment, and identification of the main success factors in the field, which are necessary for achieving the required interoperability and, consequently, for the successful implementation of eHealth projects in general. The paper presents an in-depth analysis of the potential application of the openEHR, SNOMED, IHE and Continua approaches in the development and implementation process of eHealth in Slovenia. The research method used is both exploratory and deductive in nature. The methodological framework is grounded on information retrieval, with a special focus on research and charting of existing experience in the field, and on sources, both electronic and written, which include interoperability concepts and related implementation issues. The paper addresses the following complementary inquiries: 1. Scrutiny of the potential approaches that could alleviate the pertinent interoperability issues in the Slovenian eHealth context. 2. Analysis of the possibilities (requirements) for their inclusion in the construction process for individual eHealth solutions. 3. Identification and charting of the main success factors in the interoperability field that critically influence the development and implementation of eHealth projects in an efficient manner. The insights provided and the success factors identified could serve as a constituent of the strategic starting points for the continuous integration of interoperability principles into the healthcare domain.
Moreover, the general implementation of the identified success factors could facilitate better penetration of ICT into the healthcare environment and enable the eHealth-based transformation of the health system especially in the countries which are still in an early phase of eHealth planning and development and are often confronted with differing interests, requirements, and contending strategies.
Chiu, Tsz-chun Roxy; Ngo, Hiu-ching; Lau, Lai-wa; Leung, King-wah; Lo, Man-him; Yu, Ho-fai; Ying, Michael
2016-01-01
Aims This study was undertaken to investigate the immediate effect of static stretching on normal Achilles tendon morphology and stiffness, and the differing effects on dominant and non-dominant legs; and to evaluate the inter-operator and intra-operator reliability of using shear-wave elastography to measure Achilles tendon stiffness. Methods 20 healthy subjects (13 males, 7 females) were included in the study. Thickness, cross-sectional area and stiffness of the Achilles tendons in both legs were measured before and after 5-min static stretching using grey-scale ultrasound and shear-wave elastography. Inter-operator and intra-operator reliability of the tendon stiffness measurements of six operators were evaluated. Results Results showed that there was no significant change in the thickness and cross-sectional area of the Achilles tendon after static stretching in either the dominant or non-dominant leg (p > 0.05). Tendon stiffness showed a significant increase in the non-dominant leg (p < 0.05) but not in the dominant leg (p > 0.05). The inter-operator reliability of shear-wave elastography measurements was 0.749 and the intra-operator reliability ranged from 0.751 to 0.941. Conclusion Shear-wave elastography is a useful and non-invasive imaging tool to assess the immediate stiffness change of the Achilles tendon in response to static stretching, with high intra-operator and inter-operator reliability. PMID:27120097
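The abstract does not state which reliability statistic was computed; values in this range are commonly reported as intraclass correlation coefficients (ICC). As an illustration only, a two-way random-effects, single-measure ICC(2,1) in the Shrout and Fleiss formulation can be computed from a subjects-by-operators matrix like this, with invented data:

```python
def icc_2_1(ratings):
    """Two-way random-effects, single-measure ICC(2,1), Shrout & Fleiss form."""
    n = len(ratings)        # subjects (e.g. tendons)
    k = len(ratings[0])     # raters (e.g. operators)
    grand = sum(sum(row) for row in ratings) / (n * k)
    row_means = [sum(row) / k for row in ratings]
    col_means = [sum(row[j] for row in ratings) / n for j in range(k)]
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)   # between subjects
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)   # between raters
    ss_total = sum((x - grand) ** 2 for row in ratings for x in row)
    mse = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Invented stiffness readings: 3 tendons, each measured by 2 operators.
print(round(icc_2_1([[1, 2], [2, 2], [3, 4]]), 2))
```

An ICC near 1 means almost all variance comes from real differences between subjects rather than from operator disagreement, which is what the reported values of 0.749-0.941 indicate.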
Knowledge Discovery from Biomedical Ontologies in Cross Domains
Shen, Feichen; Lee, Yugyung
2016-01-01
In recent years, there has been an increasing demand for the sharing and integration of medical data in biomedical research. Improving a health care system requires supporting data integration by facilitating semantic interoperability systems and practices. Semantic interoperability is difficult to achieve in these systems because the conceptual models underlying datasets are not fully exploited. In this paper, we propose a semantic framework, called Medical Knowledge Discovery and Data Mining (MedKDD), that aims to build a topic hierarchy and serve semantic interoperability between different ontologies. For this purpose, we focus on the discovery of semantic patterns about the association of relations in the heterogeneous information network representing different types of objects and relationships in multiple biological ontologies, and on the creation of a topic hierarchy through the analysis of the discovered patterns. These patterns are used to cluster heterogeneous information networks into a set of smaller topic graphs in a hierarchical manner and then to conduct cross-domain knowledge discovery from the multiple biological ontologies. Thus, the patterns make a greater contribution to knowledge discovery across multiple ontologies. We have demonstrated cross-domain knowledge discovery in the MedKDD framework using a case study with 9 primary biological ontologies from Bio2RDF and compared it with the cross-domain query processing approach SLAP. We have confirmed the effectiveness of the MedKDD framework in knowledge discovery from multiple medical ontologies. PMID:27548262
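The clustering step described above, grouping a heterogeneous network's edges by their relation pattern into smaller topic graphs, can be sketched minimally; the triples below are invented examples, not Bio2RDF data.

```python
from collections import defaultdict

# Invented triples with typed endpoints: (subject, subject type, relation,
# object, object type).
triples = [
    ("aspirin", "Drug", "treats", "headache", "Disease"),
    ("ibuprofen", "Drug", "treats", "fever", "Disease"),
    ("BRCA1", "Gene", "associated_with", "breast cancer", "Disease"),
]

def topic_graphs(triples):
    """Group edges by their (subject type, relation, object type) pattern,
    yielding one small topic graph per pattern."""
    groups = defaultdict(list)
    for s, s_type, rel, o, o_type in triples:
        groups[(s_type, rel, o_type)].append((s, o))
    return dict(groups)

graphs = topic_graphs(triples)
print(len(graphs))  # two patterns -> two topic graphs
```

Each resulting group is a small, homogeneous subgraph over which cross-domain queries (e.g. Drug-treats-Disease paths joined with Gene-associated_with-Disease paths) become tractable.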
ERIC Educational Resources Information Center
Lagoze, Carl; Neylon, Eamonn; Mooney, Stephen; Warnick, Walter L.; Scott, R. L.; Spence, Karen J.; Johnson, Lorrie A.; Allen, Valerie S.; Lederman, Abe
2001-01-01
Includes four articles that discuss Dublin Core metadata, digital rights management and electronic books, including interoperability; and directed query engines, a type of search engine designed to access resources on the deep Web that is being used at the Department of Energy. (LRW)
Building a Global Earth Observation System of Systems (GEOSS) and Its Interoperability Challenges
NASA Astrophysics Data System (ADS)
Ryan, B. J.
2015-12-01
Launched in 2005 by industrialized nations, the Group on Earth Observations (GEO) began building the Global Earth Observation System of Systems (GEOSS). Consisting of both a policy framework and an information infrastructure, GEOSS was intended to link and/or integrate the multitude of Earth observation systems, primarily operated by its Member Countries and Participating Organizations, so that users could more readily benefit from global information assets for a number of society's key environmental issues. It was recognized that having ready access to observations from multiple systems was a prerequisite for both environmental decision-making and economic development. From the very start, it was also recognized that the sheer complexity of the Earth's system cannot be captured by any single observation system, and that a federated, interoperable approach was necessary. While this international effort has met with much success, primarily in advancing broad, open data policies and practices, challenges remain. In 2014 (Geneva, Switzerland) and 2015 (Mexico City, Mexico), Ministers from GEO's Member Countries, including the European Commission, came together to assess progress made during the first decade (2005 to 2015) and to approve implementation strategies and mechanisms for the second decade (2016 to 2025), respectively. The approved implementation strategies and mechanisms are intended to advance GEOSS development, thereby facilitating the increased uptake of Earth observations for informed decision-making. Clearly there are interoperability challenges that are technological in nature, and several will be discussed in this presentation. There are, however, interoperability challenges that can be better characterized as economic, governmental and/or political in nature, and these will be discussed as well.
With the emergence of the Sustainable Development Goals (SDGs) and the occurrence this year of the World Conference on Disaster Risk Reduction (WCDRR) and the United Nations Framework Convention on Climate Change (UNFCCC) conference, it will be essential that the interoperability challenges described herein, regardless of their nature, be expeditiously addressed so that Earth observations can indeed inform societal decision-making.
Interoperable cross-domain semantic and geospatial framework for automatic change detection
NASA Astrophysics Data System (ADS)
Kuo, Chiao-Ling; Hong, Jung-Hong
2016-01-01
With the increasingly diverse types of geospatial data established over the last few decades, semantic interoperability in integrated applications has attracted much interest in the field of Geographic Information Systems (GIS). This paper proposes a new strategy and framework to process cross-domain geodata at the semantic level. The framework leverages the semantic equivalence of concepts between domains through a bridge ontology and facilitates the integrated use of data from different domains, which has long been considered an essential strength of GIS but is impeded by the lack of understanding of the semantics implicitly hidden in the data. We choose the task of change detection to demonstrate how the introduction of ontology concepts can effectively make such integration possible. We analyze the common properties of geodata and change detection factors, then construct rules and summarize possible change scenarios for making final decisions. The use of topographic map data to detect changes in land use shows promising success as far as the improvement of efficiency and level of automation is concerned. We believe the ontology-oriented approach will enable a new way of integrating data across different domains from the perspective of semantic interoperability, and may even open a new dimension for future GIS.
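The bridge-ontology mediation described above can be sketched in a few lines (a hedged illustration: the class names, mappings, and change rule are hypothetical placeholders, not the paper's actual ontology):

```python
# Hypothetical bridge-ontology sketch: class names, mappings and the change
# rule are illustrative placeholders, not the paper's actual ontology.

# Each domain ontology maps its local classes to shared bridge concepts.
TOPO_TO_BRIDGE = {"Building": "BuiltUp", "Road": "Transport", "Grass": "Vegetation"}
LANDUSE_TO_BRIDGE = {"Residential": "BuiltUp", "Highway": "Transport", "Park": "Vegetation"}

def bridge_concept(domain_map, local_class):
    """Translate a domain-specific class into its bridge-ontology concept."""
    return domain_map.get(local_class, "Unknown")

def detect_change(topo_class, landuse_class):
    """Flag a semantic change only when the bridge concepts differ."""
    old_c = bridge_concept(TOPO_TO_BRIDGE, topo_class)
    new_c = bridge_concept(LANDUSE_TO_BRIDGE, landuse_class)
    return old_c != new_c, (old_c, new_c)

changed, pair = detect_change("Grass", "Residential")  # Vegetation vs. BuiltUp
same, _ = detect_change("Building", "Residential")     # both BuiltUp: no change
```

The point of the sketch is that change is declared at the level of shared bridge concepts, so a topographic "Grass" parcel becoming land-use "Residential" registers as a Vegetation-to-BuiltUp change even though the two datasets never share a vocabulary.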
U.S.C.G. communications interoperability technology assessment
DOT National Transportation Integrated Search
1997-08-01
The U.S. Coast Guard must routinely interact with other (Federal) government agencies (OGA), State and Local agencies, and various public safety organizations in the performance of its missions. In most areas of the country this was primarily local p...
Marcelo, A; Adejumo, A; Luna, D
2011-01-01
To describe the issues surrounding health informatics in developing countries and the challenges faced by practitioners in building internal capacity; from these issues, the authors propose cost-effective strategies that can fast-track health informatics development in these low- to medium-income countries (LMICs). The authors conducted a literature review and consulted key opinion leaders who have experience with health informatics implementations around the world. Despite geographic and cultural differences, many LMICs share similar challenges and opportunities in developing health informatics. Partnerships, standards, and interoperability are well-known components of successful informatics programs. Establishing partnerships may include formal inter-institutional collaborations on training and research, collaborative open source software development, and effective use of social networking. Lacking legacy systems, LMICs can discuss standards and interoperability more openly and have greater potential for success. Lastly, since cellphones are pervasive in developing countries, they can be leveraged as access points for delivering and documenting health services in remote, under-served areas. Mobile health, or mHealth, gives LMICs a unique opportunity to leapfrog most of the issues that have plagued health informatics in developed countries. By employing this proposed roadmap, LMICs can now develop capacity for health informatics using appropriate and cost-effective technologies.
Interoperability Assets for Patient Summary Components: A Gap Analysis.
Heitmann, Kai U; Cangioli, Giorgio; Melgara, Marcello; Chronaki, Catherine
2018-01-01
The International Patient Summary (IPS) standards aim to define the specifications for a minimal and non-exhaustive patient summary, which is specialty-agnostic and condition-independent, but still clinically relevant. Meanwhile, health systems are developing and implementing their own variations of a patient summary while the eHealth Digital Services Infrastructure (eHDSI) initiative is deploying patient summary services across countries in Europe. In the spirit of co-creation, flexible governance, and continuous alignment advocated by eStandards, the Trillium-II initiative promotes adoption of the patient summary by engaging standards organizations and interoperability practitioners in a community of practice for digital health to share best practices, tools, data, specifications, and experiences. This paper compares operational aspects of patient summaries in 14 case studies in Europe, the United States, and across the world, focusing on how patient summary components are used in practice, to promote the alignment and joint understanding that will improve the quality of standards and lower the costs of interoperability.
Trust Model to Enhance Security and Interoperability of Cloud Environment
NASA Astrophysics Data System (ADS)
Li, Wenjuan; Ping, Lingdi
Trust is one of the most important means to improve security and enable interoperability among current heterogeneous, independent cloud platforms. This paper first analyzes several trust models used in large, distributed environments and then introduces a novel cloud trust model to address security issues in cross-cloud environments, in which cloud customers can choose services from different providers and resources in heterogeneous domains can cooperate. The model is domain-based: it groups one cloud provider's resource nodes into the same domain and sets a trust agent for it. It distinguishes two different roles, cloud customer and cloud server, and designs different strategies for each. In our model, trust recommendation is treated as a type of cloud service, just like computation or storage. The model achieves both identity authentication and behavior authentication. The results of emulation experiments show that the proposed model can efficiently and safely construct trust relationships in cross-cloud environments.
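As a rough illustration of how a domain-based model might combine direct experience with trust-agent recommendations, consider the following sketch (the linear weighting rule and the alpha parameter are assumptions for illustration, not the paper's actual model):

```python
# Illustrative domain-based trust aggregation; the linear weighting and the
# alpha parameter are assumptions for this sketch, not the paper's model.

def aggregate_trust(direct, recommendations, alpha=0.7):
    """Combine a customer's direct experience with trust-agent recommendations.

    direct: trust value in [0, 1] from the customer's own interactions.
    recommendations: trust values reported by the domain's trust agents.
    alpha: assumed weight given to direct experience over recommendations.
    """
    if recommendations:
        rec = sum(recommendations) / len(recommendations)
    else:
        rec = 0.5  # neutral prior when no recommendations are available
    return alpha * direct + (1 - alpha) * rec

t = aggregate_trust(0.9, [0.6, 0.8])  # good direct history, mixed referrals
```

Treating recommendation itself as a service, as the paper suggests, would simply mean that the `recommendations` list is fetched from other domains' trust agents like any other cloud call.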
ERIC Educational Resources Information Center
Waters, John K.
2008-01-01
Data integration is one of the single most challenging tasks any district can face. Fortunately for school districts throughout the country with data scattered in disparate systems, an open specification known as the Schools Interoperability Framework (SIF) is mitigating that challenge. SIF has emerged as a cornerstone of K-12 data warehousing,…
An Architecture for the Integration of Clinical Data from a PEHR in a Regional Research Platform.
Schreiweis, Björn; Bronsch, Tobias; Stein, Katharina E; Nöst, Stefan; Aguduri, Lakshmi S; Brandner, Antje; Pensold, Peter; Weiss, Nicolas; Yüksekogul, Nilay; Bergh, Björn; Heinze, Oliver
2016-01-01
Making clinical information available for research is relevant not only for healthcare institutions but also for regional EHRs, as cross-sectorial information can be made accessible. In the INFOPAT (INFOrmation technology for PATient-oriented health care in the Rhine-Neckar metropolitan region) project we are thus implementing both a regional personal cross-enterprise electronic health record (PEHR) and a regional research platform (RRP) based on information from the PEHR. IHE profiles are implemented to achieve interoperability between healthcare institutions' electronic medical records (EMRs) and the PEHR on the one hand, and between the PEHR and the RRP on the other. The use case for the RRP is cross-sectorial quality assessment and improvement for colorectal cancer based on a quality indicator (QI) approach that includes patients' perspectives. For semantic interoperability, the responses are transferred in the form of HL7 CDA Level 2 documents. The resulting architecture for an RRP shows that implementing a PEHR in combination with an RRP based on international communication standards is possible. IHE XDS can also be used for the integration of patient care and biomedical research infrastructures.
NASA Astrophysics Data System (ADS)
Fox, P. A.; Diviacco, P.; Busato, A.
2016-12-01
Geo-scientific research collaboration commonly faces complex systems where multiple skills and competences are needed at the same time. The efficacy of collaboration among researchers then becomes of paramount importance. Multidisciplinary studies draw from domains that are far from each other. Researchers also need to understand how to extract the data they need and eventually produce something that can be used by others. The management of information and knowledge in this perspective is non-trivial. Interoperability is frequently sought in computer-to-computer environments, so as to overcome mismatches in vocabulary, data formats, coordinate reference systems, and so on. Successful researcher collaboration, however, also relies on the interoperability of the people. Smaller, synchronous, face-to-face settings for researchers are known to enhance people interoperability; but changing settings, whether geographically or temporally, or increasing the team size, diversity, and expertise, requires people-computer-people-computer (...) interoperability. To date, knowledge representation frameworks have been proposed but not proven necessary and sufficient to achieve multi-way interoperability. In this contribution, we address the epistemology and sociology of science, advocating for a fluid perspective in which science is largely a social construct conditioned by cognitive issues, especially cognitive bias. Bias cannot be obliterated; on the contrary, it must be carefully taken into consideration. Information-centric interfaces built from different perspectives and ways of thinking by actors with different points of view, approaches, and aims are proposed as a means of enhancing people interoperability in computer-based settings.
The contribution will provide details on the approach of augmenting and interfacing knowledge representation frameworks with the cognitive-conceptual frameworks for people that are needed to meet and exceed collaborative research goals in the 21st century. A web-based collaborative portal that integrates both approaches has been developed and will be presented. Reports will be given on initial tests, which have encouraging results.
Selecting a Learning Management System (LMS) in Developing Countries: Instructors' Evaluation
ERIC Educational Resources Information Center
Cavus, Nadire
2013-01-01
Learning management systems (LMSs) contain hidden costs, unclear user environments, bulky developer and administration manuals, and limitations with regard to interoperability, integration, localization, and bandwidth requirements. Careful evaluation is required in selecting the most appropriate LMS for use, and this is a general problem in…
Bi-National Corps of Nato’s Main Defense Forces in Central Europe: Creating Interoperability
1993-06-04
reductions of their forces after the disintegration of the Soviet Union. Moreover, the European countries, whose force structure focused solely on...Korps--Anspruch und Wirklichkeit (A House for Many Families; on the example of LANDJUT: Multinational Corps--Aspiration and Reality)," Truppenpraxis, 4/1992
On the Execution Control of HLA Federations using the SISO Space Reference FOM
NASA Technical Reports Server (NTRS)
Moller, Bjorn; Garro, Alfredo; Falcone, Alberto; Crues, Edwin Z.; Dexter, Daniel E.
2017-01-01
In the Space domain, the High Level Architecture (HLA) is one of the reference standards for Distributed Simulation. However, for the different organizations involved in the Space domain (e.g., NASA, ESA, Roscosmos, and JAXA) and their industrial partners, it is difficult to implement HLA simulators (called Federates) able to interact and interoperate in the context of a distributed HLA simulation (called a Federation). The lack of a common FOM (Federation Object Model) for the Space domain is one of the main reasons that precludes a priori interoperability between heterogeneous federates. To fill this gap, a Product Development Group (PDG) has recently been activated in the Simulation Interoperability Standards Organization (SISO) with the aim of providing a Space Reference FOM (SRFOM) for international collaboration on Space systems simulations. Members of the PDG come from several countries and contribute experience from projects within NASA, ESA and other organizations. Participants represent government, academia and industry. The paper presents an overview of the ongoing Space Reference FOM standardization initiative, focusing on the solution provided for managing the execution of an SRFOM-based Federation.
Kautsch, Marcin; Lichoń, Mateusz; Matuszak, Natalia
2017-10-01
E-health has experienced dynamic development across the European Union in recent years and enjoys support from the European Commission, which seeks to achieve interoperability of national healthcare systems in order to facilitate free movement. Differences between the member states in legal regulations, cultural approaches and technological solutions may hinder this process. This study compares the legal standing of e-health in Denmark, Poland, Spain and the UK, along with key legal acts and their implications. An academic literature review, together with an analysis of materials found through desk research (reports, legal acts, press articles, governmental web pages and so on), was performed in order to identify aspects relevant to e-health interoperability. The approach to legal regulation of e-health differs substantially by country, as do the procedures developed regarding the requirement for patients' consent to the processing of their data, their rights of access to their medical data and to change the data, data confidentiality, and the types of electronic health records. The principles governing the assignment of responsibility for data protection are also different. These legal and technological differences must be reconciled if interoperability of European national e-health systems is to be achieved. Copyright © 2016 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Loescher, H.; Fundamental Instrument Unit
2013-05-01
Ecological research addresses challenges relating to the dynamics of the planet, such as changes in climate, biodiversity, ecosystem functioning and services, carbon and energy cycles, natural and human-induced hazards, and adaptation and mitigation strategies, challenges that involve many science and engineering disciplines and cross national boundaries. Because of the global nature of these challenges, greater international collaboration is required for knowledge sharing and technology deployment to advance earth science investigations and enhance societal benefits. For example, the Working Group on Biodiversity Preservation and Ecosystem Services (PCAST 2011) noted the scale and complexity of the physical and human resources needed to address these challenges. Many of the most pressing ecological research questions require global-scale data and global-scale solutions (Suresh 2012), e.g., interdisciplinary data access from data centers managing ecological resources and hazards, drought, heat islands, or the carbon cycle, or data used to forecast the rate of spread of invasive species or zoonotic diseases. Variability and change at one location or in one region may well result from the superposition of global processes coupled with regional and local modes of variability. For example, the El Niño-Southern Oscillation, a large-scale mode of variability in the coupled terrestrial-aquatic-atmospheric system, is known to correlate with variability in regional rainfall and ecosystem function. It is therefore a high priority of government and non-government organizations to develop the necessary large-scale, world-class research infrastructures for environmental research, and the framework by which these data can be shared, discovered, and utilized by a broad user community of scientists and policymakers alike.
Given that there are many, albeit nascent, efforts to build new environmental observatories/networks globally (e.g., EU-ICOS, EU-Lifewatch, AU-TERN, China-CERN, GEOSS, GEO-BON, NutNet, etc.) and domestically (e.g., NSF-CZO, USDA-LTAR, DOE-NGEE, Soil Carbon Network, etc.), there is a strong mutual desire to assure the interoperability of data. The degree of interoperability between observatories (entities) is defined by how well each of the following is mapped between them: i) science requirements linked with science questions, ii) traceability of measurements to nationally and internationally accepted standards, iii) how data products are derived, i.e., algorithms, procedures, and methods, and iv) the bioinformatics, which broadly include data formats, metadata, controlled vocabularies, and semantics. Here, we explore the rationale and focus areas for interoperability, the governance and work structures, example projects (NSF-NEON, EU-ICOS, and AU-TERN), and the emergent roles of scientists in these endeavors.
European security framework for healthcare.
Ruotsalainen, Pekka; Pohjonen, Hanna
2003-01-01
eHealth and telemedicine services are promising business areas in Europe. It is clear that eHealth products and services will be sold and ordered from a distance and across national borders in the future. However, there are many barriers to overcome. For both national and pan-European eHealth and telemedicine applications, a common security framework is needed. Such frameworks set the security requirements for cross-border eHealth services. The next step is to build a security infrastructure that is independent of technical platforms. Most European eHealth platforms are regional or territorial. Some countries are looking at a Public Key Infrastructure, but no large-scale solutions exist in healthcare. There is no clear candidate for a Europe-wide interoperable eHealth platform. Cross-platform integration seems to be the most practical integration method at the European level in the short run. The use of the Internet as a European integration platform is a promising solution in the long run.
System architecture of communication infrastructures for PPDR organisations
NASA Astrophysics Data System (ADS)
Müller, Wilmuth
2017-04-01
The growing number of events affecting public safety and security (PS and S) on a regional scale, with the potential to grow into large-scale cross-border disasters, puts increased pressure on organizations responsible for PS and S. In order to respond in a timely and adequate manner to such events, Public Protection and Disaster Relief (PPDR) organizations need to cooperate, align their procedures and activities, share the needed information, and be interoperable. Existing PPDR/PMR technologies do not provide broadband capability, which is a major limitation in supporting new services and hence new information flows, and they currently have no successor. There is also no known standard that addresses the interoperability of these technologies. The paper at hand provides an approach to tackle the above-mentioned aspects by defining an Enterprise Architecture (EA) of PPDR organizations and a System Architecture of next-generation PPDR communication networks for a variety of applications and services on broadband networks, including the ability to conduct inter-system, inter-agency and cross-border operations. The Open Safety and Security Architecture Framework (OSSAF) provides a framework and approach to coordinate the perspectives of different types of stakeholders within a PS and S organization. It aims at bridging the silos in the chain of command and at leveraging interoperability between PPDR organizations. The framework incorporates concepts from several mature enterprise architecture frameworks, including the NATO Architecture Framework (NAF). However, OSSAF does not provide details on how NAF should be used for describing the OSSAF perspectives and views. In this contribution, a mapping of the NAF elements to the OSSAF views is provided. Based on this mapping, an EA of PPDR organizations with a focus on communication-infrastructure-related capabilities is presented.
Following the capability modeling, a system architecture for secure and interoperable communication infrastructures for PPDR organizations is presented. This architecture was implemented within a project sponsored by the European Union and successfully demonstrated in a live validation exercise in June 2016.
ERIC Educational Resources Information Center
Wang, Yanqing; Qi, Zhongying; Li, Ziru; Zhang, Lijie
2011-01-01
Engineering education has been well implemented in the majority of developed countries such as the USA, Germany, and the United Kingdom so that the gap between engineering science and engineering practice is greatly bridged. However, in China, the gap still exists, and some attempts by Chinese government, even though having made obvious progress,…
Cross-Cutting Interoperability in an Earth Science Collaboratory
NASA Technical Reports Server (NTRS)
Lynnes, Christopher; Ramachandran, Rahul; Kuo, Kuo-Sen
2011-01-01
An Earth Science Collaboratory is a rich data analysis environment that provides: (1) access to a wide spectrum of Earth Science data, (2) a diverse set of science analysis services and tools, (3) a means to collaborate on data, tools and analysis, and (4) support for sharing data, tools, results and knowledge.
Cross-language Babel structs—making scientific interfaces more efficient
NASA Astrophysics Data System (ADS)
Prantl, Adrian; Ebner, Dietmar; Epperly, Thomas G. W.
2013-01-01
Babel is an open-source language interoperability framework tailored to the needs of high-performance scientific computing. As an integral element of the Common Component Architecture, it is employed in a wide range of scientific applications where it is used to connect components written in different programming languages. In this paper we describe how we extended Babel to support interoperable tuple data types (structs). Structs are a common idiom in (mono-lingual) scientific application programming interfaces (APIs); they are an efficient way to pass tuples of nonuniform data between functions, and are supported natively by most programming languages. Using our extended version of Babel, developers of scientific codes can now pass structs as arguments between functions implemented in any of the supported languages. In C, C++, Fortran 2003/2008 and Chapel, structs can be passed without the overhead of data marshaling or copying, providing language interoperability at minimal cost. Other supported languages are Fortran 77, Fortran 90/95, Java and Python. We will show how we designed a struct implementation that is interoperable with all of the supported languages and present benchmark data to compare the performance of all language bindings, highlighting the differences between languages that offer native struct support and an object-oriented interface with getter/setter methods. A case study shows how structs can help simplify the interfaces of scientific codes significantly.
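The struct-versus-accessor contrast can be illustrated with Python's standard ctypes module (a hedged sketch of the general idea, not Babel's actual generated bindings): a ctypes.Structure has a fixed C-compatible memory layout that a C callee could read directly, whereas an accessor-based wrapper reaches each field through method calls.

```python
# Sketch of the struct-vs-accessor contrast (not Babel's actual generated
# bindings). A ctypes.Structure has a fixed, C-compatible memory layout, so
# the same bytes could be handed to a C callee without marshaling or copying.
import ctypes

class Point(ctypes.Structure):
    """A tuple of nonuniform data (double + int) with a fixed C layout."""
    _fields_ = [("x", ctypes.c_double), ("tag", ctypes.c_int)]

class PointAccessor:
    """Accessor-based alternative: languages without native struct support
    see the same data only through getter/setter method calls."""
    def __init__(self, x, tag):
        self._x, self._tag = x, tag
    def get_x(self):
        return self._x
    def get_tag(self):
        return self._tag

p = Point(x=1.5, tag=7)
raw = bytes(p)  # the exact bytes a C function would receive; no conversion step
q = PointAccessor(1.5, 7)  # each field access here costs a method call instead
```

The benchmark differences reported in the paper come down to exactly this distinction: languages with native struct support can pass `raw`-style memory directly, while the others pay for a `get_x()`-style call per field.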
Evolution of System Architectures: Where Do We Need to Fail Next?
NASA Astrophysics Data System (ADS)
Bermudez, Luis; Alameh, Nadine; Percivall, George
2013-04-01
Innovation requires testing and failing. Thomas Edison was right when he said "I have not failed. I've just found 10,000 ways that won't work". For innovation and improvement of standards to happen, service architectures have to be tested again and again. Within the Open Geospatial Consortium (OGC), testing of service architectures has occurred for the last 15 years. This talk will present the evolution of these service architectures and a possible future path. OGC is a global forum for the collaboration of developers and users of spatial data products and services, and for the advancement and development of international standards for geospatial interoperability. The OGC Interoperability Program is a series of hands-on, fast-paced engineering initiatives to accelerate the development and acceptance of OGC standards. Each initiative is organized in threads that provide focus under a particular theme. The first testbed, OGC Web Services phase 1, completed in 2003, had four threads: Common Architecture, Web Mapping, Sensor Web and Web Imagery Enablement. Common Architecture was a cross-thread theme, ensuring that the Web Mapping and Sensor Web experiments built on a common base architecture. The architecture was based on the three main SOA components: broker, requestor and provider. It proposed a general service model defining service interactions and dependencies; a categorization of service types; registries to allow discovery and access of services; data models and encodings; and common services (WMS, WFS, WCS). For the latter, there was a clear distinction among the different service types: data services (e.g. WMS), application services (e.g. coordinate transformation) and server-side client applications (e.g. image exploitation).
The latest testbed, OGC Web Service phase 9, completed in 2012 had 5 threads: Aviation, Cross-Community Interoperability (CCI), Security and Services Interoperability (SSI), OWS Innovations and Compliance & Interoperability Testing & Evaluation (CITE). Compared to the first testbed, OWS-9 did not have a separate common architecture thread. Instead the emphasis was on brokering information models, securing them and making data available efficiently on mobile devices. The outcome is an architecture based on usability and non-intrusiveness while leveraging mediation of information models from different communities. This talk will use lessons learned from the evolution from OGC Testbed phase 1 to phase 9 to better understand how global and complex infrastructures evolve to support many communities including the Earth System Science Community.
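As a concrete taste of the data services mentioned above (e.g. WMS), a GetMap request is just an HTTP URL built from standard key-value parameters; the sketch below uses the parameter names from the WMS 1.3.0 standard, while the endpoint and layer name are hypothetical placeholders:

```python
# Building a WMS 1.3.0 GetMap request from the standard key-value parameters.
# The endpoint URL and layer name are hypothetical placeholders.
from urllib.parse import urlencode

def getmap_url(base, layer, bbox, width=512, height=512):
    """Return a WMS 1.3.0 GetMap URL for one layer over a bounding box."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": "EPSG:4326",
        # note: WMS 1.3.0 with EPSG:4326 orders the bbox as lat,lon
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": str(width),
        "HEIGHT": str(height),
        "FORMAT": "image/png",
    }
    return base + "?" + urlencode(params)

url = getmap_url("https://example.org/wms", "topo", (40.0, -75.0, 41.0, -74.0))
```

This key-value interface is what makes WMS-style data services easy to broker and test across the testbed threads: any client that can build a URL can exercise any conforming server.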
XDS-I outsourcing proxy: ensuring confidentiality while preserving interoperability.
Ribeiro, Luís S; Viana-Ferreira, Carlos; Oliveira, José Luís; Costa, Carlos
2014-07-01
The interoperability of services and the sharing of health data have been a continuous goal for health professionals, patients, institutions, and policy makers. However, several issues have been hindering this goal, such as incompatible implementations of standards (e.g., HL7, DICOM), multiple ontologies, and security constraints. Cross-enterprise document sharing (XDS) workflows were proposed by Integrating the Healthcare Enterprise (IHE) to address current limitations in exchanging clinical data among organizations. To ensure data protection, XDS actors must be placed in trustworthy domains, which are normally inside such institutions. However, due to rapidly growing IT requirements, the outsourcing of resources to the Cloud is becoming very appealing. This paper presents a software proxy that enables the outsourcing of XDS architectural parts while preserving the interoperability, confidentiality, and searchability of clinical information. A key component in our architecture is a new searchable encryption (SE) scheme, Posterior Playfair Searchable Encryption (PPSE), which, besides keeping the same confidentiality levels for the stored data, hides the search patterns from the adversary, bringing improvements over the remaining practical state-of-the-art SE schemes.
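The abstract does not specify PPSE itself, but the general idea of searchable encryption can be sketched with a textbook keyed-index scheme (an illustration only: unlike PPSE, this basic deterministic form does not hide search patterns):

```python
# Simplified searchable-encryption sketch (a generic textbook SSE index, not
# the paper's PPSE scheme; unlike PPSE it does NOT hide search patterns).
import hashlib
import hmac

KEY = b"client-side secret key"  # known to the client only, never the server

def token(keyword):
    """Deterministic keyed trapdoor; the server sees tokens, never plaintext."""
    return hmac.new(KEY, keyword.encode(), hashlib.sha256).hexdigest()

def build_index(docs):
    """Outsourced index mapping trapdoors to document ids."""
    index = {}
    for doc_id, words in docs.items():
        for w in words:
            index.setdefault(token(w), []).append(doc_id)
    return index

index = build_index({"doc1": ["ct", "thorax"], "doc2": ["mri", "thorax"]})
hits = index.get(token("thorax"), [])  # matching happens on tokens alone
```

Because the same keyword always yields the same token, an outsourced server in this basic scheme can observe which queries repeat, which is precisely the search-pattern leakage that schemes like PPSE aim to eliminate.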
NASA Astrophysics Data System (ADS)
Orellana, Diego A.; Salas, Alberto A.; Solarz, Pablo F.; Medina Ruiz, Luis; Rotger, Viviana I.
2016-04-01
The production of clinical information about each patient is constantly increasing, and it is noteworthy that this information is created in different formats and at diverse points of care, resulting in fragmented, incomplete, inaccurate and isolated health information. The use of health information technology has been promoted as having a decisive impact on improving the efficiency, cost-effectiveness, quality and safety of medical care delivery. However, in developing countries the utilization of health information technology is insufficient and, among other problems, lacking in standards. In the present work we evaluate the framework EHRGen, based on the openEHR standard, as a means to achieve the generation and availability of patient-centered information. The framework has been evaluated through the tools provided for end users, that is, without the intervention of computer experts. It makes the openEHR ideas easier to adopt and provides an open source basis with a set of services, although some limitations in its current state work against interoperability and usability. Despite the described limitations with respect to usability and semantic interoperability, EHRGen is, at least regionally, a considerable step toward EHR adoption and interoperability, and as such it should be supported by academic and administrative institutions.
Interoperability prototype between hospitals and general practitioners in Switzerland.
Alves, Bruno; Müller, Henning; Schumacher, Michael; Godel, David; Abu Khaled, Omar
2010-01-01
Interoperability in data exchange has the potential to improve care processes and decrease the costs of the health care system. Many countries have eHealth initiatives in preparation or already implemented. In this area, Switzerland has yet to catch up. Its health system is fragmented because of the federated nature of the cantons, which makes it more difficult to coordinate efforts between the existing healthcare actors. In the Medicoordination project, a pragmatic approach was selected: integrating several healthcare partners on a regional scale in French-speaking Switzerland. In parallel with the Swiss eHealth strategy, currently being elaborated by the Swiss confederation, medium-sized hospitals and general practitioners in particular were targeted in Medicoordination to implement concrete scenarios of information exchange between hospitals and general practitioners with high added value. In this paper we focus our attention on a prototype implementation of one chosen scenario: the discharge summary. Although simple in concept, exchanging discharge letters presents small, hidden difficulties due to the multi-partner nature of the project. The added value of such a prototype is potentially high, and it is now important to show that interoperability can work in practice.
Electronic Health Records Data and Metadata: Challenges for Big Data in the United States.
Sweet, Lauren E; Moulaison, Heather Lea
2013-12-01
This article, written by researchers studying metadata and standards, represents a fresh perspective on the challenges of electronic health records (EHRs) and serves as a primer for big data researchers new to health-related issues. Primarily, we argue for the importance of the systematic adoption of standards in EHR data and metadata as a way of promoting big data research and benefiting patients. EHRs have the potential to include a vast amount of longitudinal health data, and metadata provides the formal structures to govern that data. In the United States, electronic medical records (EMRs) are part of the larger EHR. EHR data is submitted by a variety of clinical data providers and potentially by the patients themselves. Because data input practices are not necessarily standardized, and because of the multiplicity of current standards, basic interoperability in EHRs is hindered. Some of the issues with EHR interoperability stem from the complexities of the data they include, which can be both structured and unstructured. A number of controlled vocabularies are available to data providers. The continuity of care document standard will provide interoperability in the United States between the EMR and the larger EHR, potentially making data input by providers directly available to other providers. The data involved is nonetheless messy. In particular, the use of competing vocabularies such as the Systematized Nomenclature of Medicine-Clinical Terms, MEDCIN, and locally created vocabularies inhibits large-scale interoperability for structured portions of the records, and unstructured portions, although potentially not machine readable, remain essential. Once EMRs for patients are brought together as EHRs, the EHRs must be managed and stored. Adequate documentation should be created and maintained to assure the secure and accurate use of EHR data. There are currently a few notable international standards initiatives for EHRs. 
Organizations such as Health Level Seven International and Clinical Data Interchange Standards Consortium are developing and overseeing implementation of interoperability standards. Denmark and Singapore are two countries that have successfully implemented national EHR systems. Future work in electronic health information initiatives should underscore the importance of standards and reinforce interoperability of EHRs for big data research and for the sake of patients.
CEOS WGISS Common Data Framework for WGISS Connected Data Assets
NASA Technical Reports Server (NTRS)
Enloe, Yonsook; Mitchell, Andrew; Albani, Mirko; Yapur, Martin
2016-01-01
This session will explore the benefits of having such a policy framework and future steps both domestically and internationally. Speakers can highlight current work being done to improve data interoperability, how the Common Framework is relevant for other data types, other countries and multinational organizations, and considerations for data management that have yet to be addressed in the Common Framework.
NASA Astrophysics Data System (ADS)
Allison, M. Lee; Davis, Rowena
2016-04-01
An e-infrastructure that supports data-intensive, multidisciplinary research is needed to accelerate the pace of science to address 21st century global change challenges. Data discovery, access, sharing and interoperability collectively form core elements of an emerging shared vision of e-infrastructure for scientific discovery. The pace and breadth of change in information management across the data lifecycle means that no one country or institution can unilaterally provide the leadership and resources required to use data and information effectively, or needed to support a coordinated, global e-infrastructure. An 18-month-long process involving ~120 experts in domain, computer, and social sciences from more than a dozen countries resulted in a formal set of recommendations, adopted in fall 2015 by the Belmont Forum collaboration of national science funding agencies and international bodies, on what they are best suited to implement for development of an e-infrastructure in support of global change research, including:
• adoption of data principles that promote a global, interoperable e-infrastructure and that can be enforced
• establishment of information and data officers for coordination of global data management and e-infrastructure efforts
• promotion of effective data planning and stewardship
• determination of international and community best practices for adoption
• development of a cross-disciplinary training curriculum on data management and curation
The implementation plan is being executed under four internationally coordinated Action Themes towards a globally organized, internationally relevant e-infrastructure and data management capability drawn from existing components, protocols, and standards. The Belmont Forum anticipates opportunities to fund additional projects to fill key gaps and to integrate best practices into an e-infrastructure that supports their programs but that can also be scaled up and deployed more widely. 
Background The Belmont Forum is a global consortium established in 2009 to build on the work of the International Group of Funding Agencies for Global Change Research toward furthering collaborative efforts to deliver knowledge needed for action to avoid and adapt to detrimental environmental change, including extreme hazardous events.
Providing interoperability of eHealth communities through peer-to-peer networks.
Kilic, Ozgur; Dogac, Asuman; Eichelberg, Marco
2010-05-01
Providing an interoperability infrastructure for Electronic Healthcare Records (EHRs) is on the agenda of many national and regional eHealth initiatives. Two important integration profiles have been specified for this purpose, namely, the "Integrating the Healthcare Enterprise (IHE) Cross-enterprise Document Sharing (XDS)" and the "IHE Cross Community Access (XCA)." IHE XDS describes how to share EHRs in a community of healthcare enterprises and IHE XCA describes how EHRs are shared across communities. However, the current version of the IHE XCA integration profile does not address some of the important challenges of cross-community exchange environments. The first challenge is scalability. If every community that joins the network needs to connect to every other community, i.e., a pure peer-to-peer network, this solution will not scale. Furthermore, each community may use a different coding vocabulary for the same metadata attribute, in which case, the target community cannot interpret the query involving such an attribute. Yet another important challenge is that each community may (and typically will) have a different patient identifier domain. Querying for the patient identifiers in the target community using patient demographic data may create patient privacy concerns. In this paper, we address each of these challenges and show how they can be handled effectively in a superpeer-based peer-to-peer architecture.
A Large-Scale Study of Fingerprint Matching Systems for Sensor Interoperability Problem
AlShehri, Helala; Hussain, Muhammad; AboAlSamh, Hatim; AlZuair, Mansour
2018-01-01
The fingerprint is a commonly used biometric modality that is widely employed for authentication by law enforcement agencies and commercial applications. The designs of existing fingerprint matching methods are based on the hypothesis that the same sensor is used to capture fingerprints during enrollment and verification. Advances in fingerprint sensor technology have raised the question about the usability of current methods when different sensors are employed for enrollment and verification; this is a fingerprint sensor interoperability problem. To provide insight into this problem and assess the status of state-of-the-art matching methods to tackle this problem, we first analyze the characteristics of fingerprints captured with different sensors, which makes cross-sensor matching a challenging problem. We demonstrate the importance of fingerprint enhancement methods for cross-sensor matching. Finally, we conduct a comparative study of state-of-the-art fingerprint recognition methods and provide insight into their abilities to address this problem. We performed experiments using a public database (FingerPass) that contains nine datasets captured with different sensors. We analyzed the effects of different sensors and found that cross-sensor matching performance deteriorates when different sensors are used for enrollment and verification. In view of our analysis, we propose future research directions for this problem. PMID:29597286
Auto-Generated Semantic Processing Services
NASA Technical Reports Server (NTRS)
Davis, Rodney; Hupf, Greg
2009-01-01
Auto-Generated Semantic Processing (AGSP) Services is a suite of software tools for automated generation of other computer programs, denoted cross-platform semantic adapters, that support interoperability of computer-based communication systems that utilize a variety of both new and legacy communication software running in a variety of operating- system/computer-hardware combinations. AGSP has numerous potential uses in military, space-exploration, and other government applications as well as in commercial telecommunications. The cross-platform semantic adapters take advantage of common features of computer- based communication systems to enforce semantics, messaging protocols, and standards of processing of streams of binary data to ensure integrity of data and consistency of meaning among interoperating systems. The auto-generation aspect of AGSP Services reduces development time and effort by emphasizing specification and minimizing implementation: In effect, the design, building, and debugging of software for effecting conversions among complex communication protocols, custom device mappings, and unique data-manipulation algorithms is replaced with metadata specifications that map to an abstract platform-independent communications model. AGSP Services is modular and has been shown to be easily integrable into new and legacy NASA flight and ground communication systems.
Enabling Interoperability in Heliophysical Domains
NASA Astrophysics Data System (ADS)
Bentley, Robert
2013-04-01
There are many aspects of science in the Solar System that are overlapping - phenomena observed in one domain can have effects in other domains. However, there are many problems related to exploiting the data in cross-disciplinary studies because of a lack of interoperability of the data and services. The CASSIS project is a Coordination Action funded under FP7 with the objective of improving the interoperability of data and services related to Solar System science. CASSIS has been investigating how the data could be made more accessible with some relatively minor changes to the observational metadata. The project has been looking at the services that are used within the domain and determining whether they are interoperable with each other and, if not, what would be required to make them so. It has also been examining all types of metadata that are used when identifying and using observations and trying to make them more compliant with techniques and standards developed by bodies such as the International Virtual Observatory Alliance (IVOA). Many of the lessons being learnt in the study are applicable to domains beyond those directly involved in heliophysics. Adopting some simple standards related to the design of the service interfaces and the metadata that are used would make it much easier to investigate interdisciplinary science topics. We will report on our findings and describe a roadmap for the future. For more information about CASSIS, please visit the project Web site at cassis-vo.eu
Interoperability challenges in river discharge modelling: A cross domain application scenario
NASA Astrophysics Data System (ADS)
Santoro, Mattia; Andres, Volker; Jirka, Simon; Koike, Toshio; Looser, Ulrich; Nativi, Stefano; Pappenberger, Florian; Schlummer, Manuela; Strauch, Adrian; Utech, Michael; Zsoter, Ervin
2018-06-01
River discharge is a critical water cycle variable, as it integrates all the processes (e.g. runoff and evapotranspiration) occurring within a river basin and provides a hydrological output variable that can be readily measured. Its prediction is of invaluable help for many water-related tasks including water resources assessment and management, flood protection, and disaster mitigation. Observations of river discharge are important to calibrate and validate hydrological or coupled land, atmosphere and ocean models. This requires using datasets from different scientific domains (Water, Weather, etc.). Typically, such datasets are provided using different technological solutions. This complicates the integration of new hydrological data sources into application systems. Therefore, a considerable effort is often spent on data access issues instead of the actual scientific question. This paper describes the work performed to address multidisciplinary interoperability challenges related to river discharge modeling and validation. This includes definition and standardization of domain specific interoperability standards for hydrological data sharing and their support in global frameworks such as the Global Earth Observation System of Systems (GEOSS). The research was developed in the context of the EU FP7-funded project GEOWOW (GEOSS Interoperability for Weather, Ocean and Water), which implemented a "River Discharge" application scenario. This scenario demonstrates the combination of river discharge observations data from the Global Runoff Data Centre (GRDC) database and model outputs produced by the European Centre for Medium-Range Weather Forecasts (ECMWF) predicting river discharge based on weather forecast information in the context of the GEOSS.
Interoperability science cases with the CDPP tools
NASA Astrophysics Data System (ADS)
Nathanaël, J.; Cecconi, B.; André, N.; Bouchemit, M.; Gangloff, M.; Budnik, E.; Jacquey, C.; Pitout, F.; Durand, J.; Rouillard, A.; Lavraud, B.; Genot, V. N.; Popescu, D.; Beigbeder, L.; Toniutti, J. P.; Caussarieu, S.
2017-12-01
Data exchange protocols are never as efficient as when they are invisible to the end user, who is then able to discover data, cross-compare observations and modeled data, and finally perform in-depth analysis. Over the years these protocols, including SAMP from the IVOA and EPN-TAP from the Europlanet 2020 RI community, backed by standard web services, have been deployed in tools designed by the French Centre de Données de la Physique des Plasmas (CDPP), including AMDA, the Propagation Tool, 3DView, ... . This presentation will focus on science cases which show the capability of interoperability in the planetary and heliophysics contexts, involving both CDPP and companion tools. Europlanet 2020 RI has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 654208.
2009-06-12
Phasing Model ... Figure 2. The Continuum of ... the communist periphery. In a high-intensity conflict, doctrine at the time called for conventional forces to fight the traditional, linear fight ... operations and proximity of cross-component forces in a non-linear battlespace – rigid business rules, translator applications, or manual workarounds to
Multi-disciplinary interoperability challenges (Ian McHarg Medal Lecture)
NASA Astrophysics Data System (ADS)
Annoni, Alessandro
2013-04-01
Global sustainability research requires multi-disciplinary efforts to address the key research challenges and to increase our understanding of the complex relationships between environment and society. For this reason, dependence on ICT systems interoperability is rapidly growing; but, although some relevant technological improvement can be observed, operational interoperable solutions are still lacking in practice. Among the causes is the absence of a generally accepted definition of "interoperability" in all its broader aspects. In fact, interoperability is just a concept, and the more popular definitions do not address all the challenges of realizing operational interoperable solutions. The problem becomes even more complex when multi-disciplinary interoperability is required, because in that case solutions for the interoperability of different interoperable solutions must be envisaged. In this lecture the following definition will be used: "interoperability is the ability to exchange information and to use it". The main challenges of addressing multi-disciplinary interoperability will be presented, and a set of proposed approaches and solutions briefly introduced.
2016-09-01
training in the decisive action training environment, with rotations routinely featuring several thousand participants from many nations and operating in...teams work with exercise participants before they arrive at the training center. The goal is to ensure all formations understand — and are able to...capabilities, location, and extensive experience working with NATO and partner countries, the JMTC is uniquely positioned to implement NATO training
Small Countries’ Special Operations Forces Contribution to the NATO Response Force
2014-06-13
tasks: Military Assistance, Direct Actions and Special Reconnaissance. The Jackal Stone exercise serves as a bedrock designed to build special...operations capabilities and improve interoperability among European partner nations. In Jackal Stone 2012, Army Major General Michael S. Repass, Commander...coalition or NATO operation, and the significance of teaming up with another 48 capable partner.96 Exercise Jackal Stone is an annual event and it is
NASA Astrophysics Data System (ADS)
Asmi, Ari; Powers, Lindsay
2015-04-01
Research Infrastructures (RIs) are major long-term investments supporting innovative, bottom-up research activities. In environmental research, they range from high-atmosphere radars to field observation networks and coordinated laboratory facilities. The Earth system is highly interactive, and each part of the system is interconnected across spatial and disciplinary borders. However, due to practical and historical reasons, the RIs are built from disciplinary points of view and separately in different parts of the world, with differing standards, policies, methods and research cultures. This heterogeneity provides the diversity necessary to study the complex Earth system, but makes cross-disciplinary and/or global interoperability a challenge. Global actions towards better interoperability are surfacing, especially in the EU and US. For example, recent mandates within the US government prioritize open data for federal agencies and federally funded science, and encourage collaboration among agencies to reduce duplication of efforts and increase efficient use of resources. There are several existing initiatives working toward these goals (e.g., COOPEUS, EarthCube, RDA, ICSU-WDS, DataONE, ESIP, USGEO, GEO). However, there is no cohesive framework to coordinate efforts among these and other entities. COOPEUS and EarthCube have now begun to map the landscape of interoperability efforts across earth science domains. The COOPEUS mapping effort describes the EU and US landscape of environmental research infrastructures in order to accomplish the following: identify gaps in services (data provision) necessary to address societal priorities; provide guidance for the development of future research infrastructures; and identify opportunities for Research Infrastructures (RIs) to collaborate on issues of common interest. The EarthCube mapping effort identifies opportunities to engage a broader community by identifying scientific domain organizations and entities. 
We present the current situation of the landscape analysis to create a sustainable effort towards removing barriers to interoperability on a global scale.
Tool and data interoperability in the SSE system
NASA Technical Reports Server (NTRS)
Shotton, Chuck
1988-01-01
Information is given in viewgraph form on tool and data interoperability in the Software Support Environment (SSE). Information is given on industry problems, SSE system interoperability issues, SSE solutions to tool and data interoperability, and attainment of heterogeneous tool/data interoperability.
A Research on E-learning Resources Construction Based on Semantic Web
NASA Astrophysics Data System (ADS)
Rui, Liu; Maode, Deng
Traditional e-learning platforms have the flaw that resources are usually difficult to query and locate, and cross-platform sharing and interoperability are hard to realize. In this paper, the semantic web and metadata standards are discussed, and an e-learning system framework based on the semantic web is put forward to try to resolve the flaws of traditional e-learning platforms.
NASA Astrophysics Data System (ADS)
Crutcher, Richard I.; Jones, R. W.; Moore, Michael R.; Smith, S. F.; Tolley, Alan L.; Rochelle, Robert W.
1997-02-01
A prototype 'smart' repeater that provides interoperability capabilities for radio communication systems in multi-agency and multi-user scenarios is being developed by the Oak Ridge National Laboratory. The smart repeater functions as a deployable communications platform that can be dynamically reconfigured to cross-link the radios of participating federal, state, and local government agencies. This interconnection capability improves the coordination and execution of multi-agency operations, including coordinated law enforcement activities and general emergency or disaster response scenarios. The repeater provides multiple channels of operation in the 30-50, 118-136, 138-174, and 403-512 MHz land mobile communications and aircraft bands while providing the ability to cross-connect among multiple frequencies, bands, modulation types, and encryption formats. Additionally, two telephone interconnects provide links to the fixed and cellular telephone networks. The 800- and 900-MHz bands are not supported by the prototype, but the modular design of the system accommodates future retrofits to extend frequency capabilities with minimal impact to the system. Configuration of the repeater is through a portable personal computer with a Windows-based graphical interface control screen that provides dynamic reconfiguration of network interconnections and formats.
UHF (Ultra High Frequency) Military Satellite Communications Ground Equipment Interoperability.
1986-10-06
crisis management requires interoperability between various services. These short-term crises often arise from unforeseen circumstances in which...Scheduler Qualcomm has prepared an interoperability study for the JTC3A (Reference 15) as a TA/CE for USCINCLANT ROC 5-84 requirements. It has defined a...interoperability is fundamental. A number of operational crises have occurred where interoperable communications or the lack of interoperable
Towards technical interoperability in telemedicine.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Craft, Richard Layne, II
2004-05-01
For telemedicine to realize the vision of anywhere, anytime access to care, the question of how to create a fully interoperable technical infrastructure must be addressed. After briefly discussing how 'technical interoperability' compares with other types of interoperability being addressed in the telemedicine community today, this paper describes reasons for pursuing technical interoperability, presents a proposed framework for realizing technical interoperability, identifies key issues that will need to be addressed if technical interoperability is to be achieved, and suggests a course of action that the telemedicine community might follow to accomplish this goal.
The interoperability force in the ERP field
NASA Astrophysics Data System (ADS)
Boza, Andrés; Cuenca, Llanos; Poler, Raúl; Michaelides, Zenon
2015-04-01
Enterprise resource planning (ERP) systems participate in interoperability projects and this participation sometimes leads to new proposals for the ERP field. The aim of this paper is to identify the role that interoperability plays in the evolution of ERP systems. To go about this, ERP systems have been first identified within interoperability frameworks. Second, the initiatives in the ERP field driven by interoperability requirements have been identified from two perspectives: technological and business. The ERP field is evolving from classical ERP as information system integrators to a new generation of fully interoperable ERP. Interoperability is changing the way of running business, and ERP systems are changing to adapt to the current stream of interoperability.
NASA Technical Reports Server (NTRS)
Kwadrat, Carl F.; Horne, William D.; Edwards, Bernard L.
2002-01-01
In order to avoid selecting inadequate inter-spacecraft cross-link communications standards for Distributed Spacecraft System (DSS) missions, it is first necessary to identify cross-link communications strategies and requirements common to a cross-section of proposed missions. This paper addresses the cross-link communication strategies and requirements derived from a survey of 39 DSS mission descriptions that are projected for potential launch within the next 20 years. The inter-spacecraft communications strategies presented are derived from the topological and communications constraints from the DSS missions surveyed. Basic functional requirements are derived from an analysis of the fundamental activities that must be undertaken to establish and maintain a cross-link between two DSS spacecraft. Cross-link bandwidth requirements are derived from high-level assessments of mission science objectives and operations concepts. Finally, a preliminary assessment of possible cross-link standards is presented within the context of the basic operational and interoperability requirements.
Lindsköld, Lars; Wintell, Mikael; Edgren, Lars; Aspelin, Peter; Lundberg, Nina
2013-07-01
Challenges related to cross-organizational access to accurate and timely information about a patient's condition have become a critical issue in healthcare. Interoperability of different local sources is necessary. The aim was to identify and present missing and semantically incorrect data elements of metadata in the radiology enterprise service that supports cross-organizational sharing of dynamic information about patients' visits in Region Västra Götaland, Sweden. Quantitative data elements of metadata were collected yearly, on the first Wednesday in March from 2006 to 2011, from the 24 in-house radiology departments in Region Västra Götaland. These radiology departments were organized into four hospital groups and three stand-alone hospitals. The included data elements of metadata were the patient name, patient ID, institutional department name, referring physician's name, and examination description. The majority of missing data elements of metadata were related to the institutional department name for Hospital 2, from 87% in 2007 to 25% in 2011. All data elements of metadata except the patient ID contained semantic errors. For example, for the data element "patient name", only three names out of 3537 were semantically correct. This study shows that the semantics of metadata elements are poorly structured and inconsistently used. Although a cross-organizational solution may technically be fully functional, semantic errors may prevent it from serving as an information infrastructure for collaboration between all departments and hospitals in the region. For interoperability, it is important that the agreed semantic models are implemented in vendor systems using the information infrastructure.
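An audit of the kind the study describes can be sketched as follows. This is an illustrative reconstruction, not the authors' software; the field names and the DICOM-style "Family^Given" patient-name convention are assumptions:

```python
# Illustrative metadata audit in the spirit of the study above.
# Field names and the "Family^Given" name convention are assumptions.
REQUIRED_ELEMENTS = ["patient_name", "patient_id", "department_name",
                     "referring_physician", "exam_description"]

def audit_record(record: dict) -> dict:
    """Return a map of element -> problem for one metadata record."""
    issues = {}
    for element in REQUIRED_ELEMENTS:
        value = (record.get(element) or "").strip()
        if not value:
            issues[element] = "missing"
        elif element == "patient_name" and "^" not in value:
            # expected structured form, e.g. "Doe^John"
            issues[element] = "semantically incorrect"
    return issues
```

Running such a check yearly over every department's records would yield exactly the kind of missing/incorrect counts the study reports.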
2011-12-01
Task Based Approach to Planning.” Paper 08F- SIW -033. In Proceed- ings of the Fall Simulation Interoperability Workshop. Simulation Interoperability...Paper 06F- SIW -003. In Proceed- 2597 Blais ings of the Fall Simulation Interoperability Workshop. Simulation Interoperability Standards Organi...MSDL).” Paper 10S- SIW -003. In Proceedings of the Spring Simulation Interoperability Workshop. Simulation Interoperability Standards Organization
Maturity model for enterprise interoperability
NASA Astrophysics Data System (ADS)
Guédria, Wided; Naudet, Yannick; Chen, David
2015-01-01
Historically, progress occurs when entities communicate, share information and together create something that no one individually could do alone. Moving beyond people to machines and systems, interoperability is becoming a key factor of success in all domains. In particular, interoperability has become a challenge for enterprises, to exploit market opportunities, to meet their own objectives of cooperation or simply to survive in a growing competitive world where the networked enterprise is becoming a standard. Within this context, many research works have been conducted over the past few years and enterprise interoperability has become an important area of research, ensuring the competitiveness and growth of European enterprises. Among others, enterprises have to control their interoperability strategy and enhance their ability to interoperate. This is the purpose of the interoperability assessment. Assessing interoperability maturity allows a company to know its strengths and weaknesses in terms of interoperability with its current and potential partners, and to prioritise actions for improvement. The objective of this paper is to define a maturity model for enterprise interoperability that takes into account existing maturity models while extending the coverage of the interoperability domain. The assessment methodology is also presented. Both are demonstrated with a real case study.
A step-by-step methodology for enterprise interoperability projects
NASA Astrophysics Data System (ADS)
Chalmeta, Ricardo; Pazos, Verónica
2015-05-01
Enterprise interoperability is one of the key factors for enhancing enterprise competitiveness. Achieving enterprise interoperability is an extremely complex process which involves different technological, human and organisational elements. In this paper we present a framework to help enterprise interoperability. The framework has been developed taking into account the three domains of interoperability: Enterprise Modelling, Architecture and Platform and Ontologies. The main novelty of the framework in comparison to existing ones is that it includes a step-by-step methodology that explains how to carry out an enterprise interoperability project taking into account different interoperability views, like business, process, human resources, technology, knowledge and semantics.
Experience with abstract notation one
NASA Technical Reports Server (NTRS)
Harvey, James D.; Weaver, Alfred C.
1990-01-01
The development of computer science has produced a vast number of machine architectures, programming languages, and compiler technologies. The cross product of these three characteristics defines the spectrum of previous and present data representation methodologies. With regard to computer networks, the uniqueness of these methodologies presents an obstacle when disparate host environments are to be interconnected. Interoperability within a heterogeneous network relies upon the establishment of data representation commonality. The International Standards Organization (ISO) is currently developing the abstract syntax notation one standard (ASN.1) and the basic encoding rules standard (BER) that collectively address this problem. When used within the presentation layer of the open systems interconnection reference model, these two standards provide the data representation commonality required to facilitate interoperability. The details of a compiler that was built to automate the use of ASN.1 and BER are described. From this experience, insights into both standards are given and potential problems relating to this development effort are discussed.
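As a concrete illustration of the tag-length-value encoding that BER specifies, the following minimal Python sketch (not from the paper) encodes a non-negative integer as a BER INTEGER:

```python
def ber_encode_integer(value: int) -> bytes:
    """Encode a non-negative int as a BER INTEGER: tag 0x02, short-form
    length octet, minimal big-endian two's-complement content octets."""
    if value < 0:
        raise ValueError("this sketch handles non-negative integers only")
    content = value.to_bytes(max(1, (value.bit_length() + 7) // 8), "big")
    if content[0] & 0x80:  # top bit set would read as negative: pad with 0x00
        content = b"\x00" + content
    if len(content) >= 128:
        raise ValueError("long-form lengths not handled in this sketch")
    return bytes([0x02, len(content)]) + content

# 42 encodes as tag 0x02, length 0x01, value 0x2A
assert ber_encode_integer(42) == b"\x02\x01\x2a"
```

A compiler such as the one described generates encoders and decoders like this automatically from ASN.1 type definitions, so every host exchanges the same octet stream regardless of its native word size or byte order.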
Semantic Interoperability of Health Risk Assessments
Rajda, Jay; Vreeman, Daniel J.; Wei, Henry G.
2011-01-01
The health insurance and benefits industry has administered Health Risk Assessments (HRAs) at an increasing rate. These are used to collect data on modifiable health risk factors for wellness and disease management programs. However, there is significant variability in the semantics of these assessments, making it difficult to compare data sets from the output of 2 different HRAs. There is also an increasing need to exchange this data with Health Information Exchanges and Electronic Medical Records. To standardize the data and concepts from these tools, we outline a process to determine presence of certain common elements of modifiable health risk extracted from these surveys. This information is coded using concept identifiers, which allows cross-survey comparison and analysis. We propose that using LOINC codes or other universal coding schema may allow semantic interoperability of a variety of HRA tools across the industry, research, and clinical settings. PMID:22195174
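The cross-survey comparison described above can be sketched minimally: re-key each vendor's answers by a shared concept code. The vendor field names below are invented; LOINC 72166-2 (tobacco smoking status) is a real code, used here purely for illustration:

```python
# Hypothetical mappings from two vendors' HRA field names (invented)
# to a shared concept code.
VENDOR_A_MAP = {"smoker_q1": "72166-2"}
VENDOR_B_MAP = {"tobacco_use": "72166-2"}

def normalize(answers: dict, code_map: dict) -> dict:
    """Re-key raw survey answers by concept code, dropping unmapped items."""
    return {code_map[k]: v for k, v in answers.items() if k in code_map}

# Two differently worded surveys now land on the same key and can be compared.
a = normalize({"smoker_q1": "current smoker"}, VENDOR_A_MAP)
b = normalize({"tobacco_use": "current smoker"}, VENDOR_B_MAP)
assert a == b == {"72166-2": "current smoker"}
```

The hard part in practice, as the abstract notes, is building and validating the per-vendor maps, not applying them.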
Beyond Sister City Agreements: Exploring the Challenges of Full International Interoperability
2016-03-01
are often interconnected by more than simple proximity. They are connected through social networks, economy, culture, and shared natural resources. ...southern U.S. borders to determine how various regions address their cross-border agreements. Research indicated that unique challenges, such as liability... Despite this interdependent relationship, and in spite
NASA Astrophysics Data System (ADS)
Fulker, D. W.; Pearlman, F.; Pearlman, J.; Arctur, D. K.; Signell, R. P.
2016-12-01
A major challenge for geoscientists—and a key motivation for the National Science Foundation's EarthCube initiative—is to integrate data across disciplines, as is necessary for complex Earth-system studies such as climate change. The attendant technical and social complexities have led EarthCube participants to devise a system-of-systems architectural concept. Its centerpiece is a (virtual) interoperability workbench, around which a learning community can coalesce, supported in their evolving quests to join data from diverse sources, to synthesize new forms of data depicting Earth phenomena, and to overcome immense obstacles that arise, for example, from mismatched nomenclatures, projections, mesh geometries and spatial-temporal scales. The full architectural concept will require significant time and resources to implement, but this presentation describes a (minimal) starter kit. With a keep-it-simple mantra this workbench starter kit can fulfill the following four objectives: 1) demonstrate the feasibility of an interoperability workbench by mid-2017; 2) showcase scientifically useful examples of cross-domain interoperability, drawn, e.g., from funded EarthCube projects; 3) highlight selected aspects of EarthCube's architectural concept, such as a system of systems (SoS) linked via service interfaces; 4) demonstrate how workflows can be designed and used in a manner that enables sharing, promotes collaboration and fosters learning. The outcome, despite its simplicity, will embody service interfaces sufficient to construct—from extant components—data-integration and data-synthesis workflows involving multiple geoscience domains. Tentatively, the starter kit will build on the Jupyter Notebook web application, augmented with libraries for interfacing current services (e.g., at data centers involved in EarthCube's Council of Data Facilities) and services developed specifically for EarthCube and spanning most geoscience domains.
Designing learning management system interoperability in semantic web
NASA Astrophysics Data System (ADS)
Anistyasari, Y.; Sarno, R.; Rochmawati, N.
2018-01-01
The extensive adoption of learning management systems (LMS) has put the focus on the interoperability requirement. Interoperability is the ability of different computer systems, applications or services to communicate, share and exchange data, information, and knowledge in a precise, effective and consistent way. Semantic web technology and the use of ontologies are able to provide the required computational semantics and interoperability for the automation of tasks in an LMS. The purpose of this study is to design learning management system interoperability in the semantic web, which has not yet been investigated in depth. Moodle is utilized to design the interoperability: several database tables of Moodle are enhanced and some features are added. Semantic web interoperability is provided by exploiting an ontology of the content materials. The ontology is further utilized as a search tool to match users' queries with available courses. It is concluded that LMS interoperability in the semantic web is feasible.
Medical Device Plug-and-Play Interoperability Standards and Technology Leadership
2017-10-01
Award Number: W81XWH-09-1-0705. Title: "Medical Device Plug-and-Play Interoperability Standards and Technology Leadership." Reporting period: Sept 2016 – 20 Sept 2017. The project promotes efficiency through interoperable medical technologies and played a leadership role in interoperability safety standards (AAMI, AAMI/UL Joint...).
Representation of Nursing Terminologies in UMLS
Kim, Tae Youn; Coenen, Amy; Hardiker, Nicholas; Bartz, Claudia C.
2011-01-01
There are seven nursing terminologies or classifications that are considered a standard to support nursing practice in the U.S. Harmonizing these terminologies will enhance the interoperability of clinical data documented across nursing practice. As a first step to harmonize the nursing terminologies, the purpose of this study was to examine how nursing problems or diagnostic concepts from select terminologies were cross-mapped in Unified Medical Language System (UMLS). A comparison analysis was conducted by examining whether cross-mappings available in UMLS through concept unique identifiers were consistent with cross-mappings conducted by human experts. Of 423 concepts from three terminologies, 411 (97%) were manually cross-mapped by experts to the International Classification for Nursing Practice. The UMLS semantic mapping among the 411 nursing concepts presented 33.6% accuracy (i.e., 138 of 411 concepts) when compared to expert cross-mappings. Further research and collaboration among experts in this field are needed for future enhancement of UMLS. PMID:22195127
NASA Astrophysics Data System (ADS)
Arias, Carolina; Brovelli, Maria Antonia; Moreno, Rafael
2015-04-01
We are in an age when water resources are increasingly scarce and the impacts of human activities on them are ubiquitous. These problems do not respect administrative or political boundaries, and they must be addressed by integrating information from multiple sources at multiple spatial and temporal scales. Communication, coordination and data sharing are critical for addressing the water conservation and management issues of the 21st century. However, different countries, provinces, local authorities and agencies dealing with water resources have diverse organizational, socio-cultural, economic, environmental and information technology (IT) contexts that raise challenges to the creation of information systems capable of integrating and distributing information across their areas of responsibility in an efficient and timely manner. Tight and disparate financial resources and dissimilar IT infrastructures (data, hardware, software and personnel expertise) further complicate the creation of these systems. There is a pressing need for distributed interoperable water information systems that are user friendly, easily accessible and capable of managing and sharing large volumes of spatial and non-spatial data. In a distributed system, data and processes are created and maintained in different locations, each with competitive advantages for carrying out specific activities. Open Data (data that can be freely distributed) is available in the water domain, and it should be further promoted across countries and organizations. Compliance with Open Specifications for data collection, storage and distribution is the first step toward the creation of systems that are capable of interacting and exchanging data in a seamless (interoperable) way. The features of Free and Open Source Software (FOSS) offer low access costs that facilitate the scalability and long-term viability of information systems.
The World Wide Web (the Web) will be the platform of choice for deploying and accessing these systems. Geospatial capabilities for mapping, visualization, and spatial analysis will be important components of this new generation of Web-based interoperable information systems in the water domain. The purpose of this presentation is to increase the awareness of scientists, IT personnel and agency managers of the advantages offered by the combined use of Open Data, Open Specifications for geospatial and water-related data collection, storage and sharing, and mature FOSS projects for the creation of interoperable Web-based information systems in the water domain. A case study is used to illustrate how these principles and technologies can be integrated to create a system with the previously mentioned characteristics for managing and responding to flood events.
Supply Chain Interoperability Measurement
2015-06-19
Supply Chain Interoperability Measurement. Dissertation, June 2015. Christos E. Chalyvidis, BS, MSc, Major, Hellenic Air Force. ENS-DS-15-J-001. Presented to the Faculty, Department of Operational Sciences. Committee membership: Dr. A. W. Johnson (Chair).
Toward an E-Government Semantic Platform
NASA Astrophysics Data System (ADS)
Sbodio, Marco Luca; Moulin, Claude; Benamou, Norbert; Barthès, Jean-Paul
This chapter describes the major aspects of an e-government platform in which semantics underpins more traditional technologies in order to enable new capabilities and to overcome technical and cultural challenges. The design and development of such an e-government Semantic Platform has been conducted with the financial support of the European Commission through the Terregov research project: "Impact of e-government on Territorial Government Services" (Terregov 2008). The goal of this platform is to let local government and government agencies offer online access to their services in an interoperable way, and to allow them to participate in orchestrated processes involving services provided by multiple agencies. Implementing a business process through an electronic procedure is indeed a core goal in any networked organization. However, the field of e-government brings specific constraints to the operations allowed in procedures, especially concerning the flow of private citizens' data: because of legal reasons in most countries, such data are allowed to circulate only from agency to agency directly. In order to promote transparency and responsibility in e-government while respecting the specific constraints on data flows, Terregov supports the creation of centrally controlled orchestrated processes; while the cross agencies data flows are centrally managed, data flow directly across agencies.
Interoperability and information discovery
Christian, E.
2001-01-01
In the context of information systems, there is interoperability when the distinctions between separate information systems are not a barrier to accomplishing a task that spans those systems. Interoperability so defined implies that there are commonalities among the systems involved and that one can exploit such commonalities to achieve interoperability. The challenge of a particular interoperability task is to identify relevant commonalities among the systems involved and to devise mechanisms that exploit those commonalities. The present paper focuses on the particular interoperability task of information discovery. The Global Information Locator Service (GILS) is described as a policy, standards, and technology framework for addressing interoperable information discovery on a global and long-term basis. While there are many mechanisms for people to discover and use all manner of data and information resources, GILS initiatives exploit certain key commonalities that seem to be sufficient to realize useful information discovery interoperability at a global, long-term scale. This paper describes ten of the specific commonalities that are key to GILS initiatives. It presents some of the practical implications for organizations in various roles: content provider, system engineer, intermediary, and searcher. The paper also provides examples of interoperable information discovery as deployed using GILS in four types of information communities: bibliographic, geographic, environmental, and government.
Federal Register 2010, 2011, 2012, 2013, 2014
2015-08-03
Promoting Semantic Interoperability of Laboratory Data; Public Workshop; Request for Comments. Agency: Food and Drug Administration (FDA). Notice of a public workshop entitled "FDA/CDC/NLM Workshop on Promoting Semantic Interoperability of Laboratory Data," dedicated to promoting the semantic interoperability of laboratory data between in vitro diagnostic devices and...
1990-06-01
Contamination Marking Set: this set is designed for marking areas contaminated with nuclear, biological and chemical (NBC) agents. It consists of a metal... the ASM program, formerly the Heavy Forces Modernization (HFM) program. In March 1990, the Army Systems Acquisition Review Council (ASARC) reviewed the ASM... (AUTOKO) prior to initial fielding of MSE in Europe this year; interoperability training was conducted at Grafenwoehr, Germany, from 30 January to 1...
75 FR 63462 - Smart Grid Interoperability Standards; Notice of Docket Designation for Smart Grid...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-15
... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. RM11-2-000] Smart Grid Interoperability Standards; Notice of Docket Designation for Smart Grid Interoperability Standards October 7, 2010... directs the development of a framework to achieve interoperability of smart grid devices and systems...
Federal Register 2010, 2011, 2012, 2013, 2014
2016-10-04
Workshop on Promoting Semantic Interoperability of Laboratory Data; Public Workshop; Request for Comments. The purpose of this public workshop, entitled "Workshop on Promoting Semantic Interoperability of Laboratory Data," is to receive and... Received comments will be placed in the docket and, except...
Robotics Systems Joint Project Office (RSJPO) Interoperability Profiles (IOPS) 101
2012-07-01
Some capabilities are not themselves covered by interoperability, although they are supported by some interoperability attributes; stair climbing is one example. Stair climbing is not something that IOPs need to specify; however, the mobility and actuation related interoperable messages can be used to provide stair climbing. Interoperability can also enable management of different poses or modes, one of which may be stair climbing.
Reusing models of actors and services in smart homecare to improve sustainability.
Walderhaug, Ståle; Stav, Erlend; Mikalsen, Marius
2008-01-01
Industrial countries are faced with a growing elderly population. Homecare systems with assistive smart house technology enable the elderly to live independently at home. Development of such smart homecare systems is complex and expensive, and there is no common reference model that can facilitate service reuse. This paper proposes reusable actor and service models based on a model-driven development process in which end-user organizations and domain healthcare experts from four European countries have been involved. The models, specified using UML, can be reused actively as assets in the system design and development process; they can reduce development costs and improve the interoperability and sustainability of systems. The models are being evaluated in the European IST project MPOWER.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-25
...-01] NIST Framework and Roadmap for Smart Grid Interoperability Standards, Release 2.0 (Draft... draft version of the NIST Framework and Roadmap for Smart Grid Interoperability Standards, Release 2.0... Roadmap for Smart Grid Interoperability Standards, Release 2.0 (Release 2.0) (Draft) for public review and...
Warfighter IT Interoperability Standards Study
2012-07-22
...data (e.g., messages) between systems? (ii) What process did you use to validate and certify semantic interoperability between your... other systems at this time. There was no requirement to validate and certify semantic interoperability; the DLS program exchanges data with... semantics. Testing for system compliance with data models; verify and certify interoperability using data...
Enabling interoperability in planetary sciences and heliophysics: The case for an information model
NASA Astrophysics Data System (ADS)
Hughes, J. Steven; Crichton, Daniel J.; Raugh, Anne C.; Cecconi, Baptiste; Guinness, Edward A.; Isbell, Christopher E.; Mafi, Joseph N.; Gordon, Mitchell K.; Hardman, Sean H.; Joyner, Ronald S.
2018-01-01
The Planetary Data System has developed the PDS4 Information Model to enable interoperability across diverse science disciplines. The Information Model is based on an integration of International Organization for Standardization (ISO) level standards for trusted digital archives, information model development, and metadata registries. Whereas controlled vocabularies provide a basic level of interoperability through a common set of terms for communication between both machines and humans, the Information Model improves interoperability by means of an ontology that provides semantic information, or additional related context, for the terms. The Information Model was defined by a team of computer scientists and science experts from each of the diverse disciplines in the Planetary Science community, including Atmospheres, Geosciences, Cartography and Imaging Sciences, Navigational and Ancillary Information, Planetary Plasma Interactions, Ring-Moon Systems, and Small Bodies. The model was designed to be extensible beyond the Planetary Science community; for example, there are overlaps between certain PDS disciplines and the Heliophysics and Astrophysics disciplines. "Interoperability" can apply to many aspects of both the developer and the end-user experience, for example agency-to-agency, semantic-level, and application-level interoperability. We define these types of interoperability and focus on semantic-level interoperability, the type most directly enabled by an information model.
A Story of a Crashed Plane in US-Mexican border
NASA Astrophysics Data System (ADS)
Bermudez, Luis; Hobona, Gobe; Vretanos, Peter; Peterson, Perry
2013-04-01
A plane has crashed on the US-Mexican border. The search and rescue command center planner needs to find information about the crash site, a mountain, nearby mountains for the establishment of a communications tower, as well as ranches for setting up a local incident center. Events like this occur all over the world, and exchanging information seamlessly is key to saving lives and preventing further disasters. This abstract describes an interoperability testbed that applied this scenario using technologies based on Open Geospatial Consortium (OGC) standards. The OGC, which has about 500 members, serves as a global forum for the collaboration of developers and users of spatial data products and services, and for advancing the development of international standards for geospatial interoperability. The OGC Interoperability Program conducts international interoperability testbeds, such as the OGC Web Services Phase 9 (OWS-9) testbed, that encourage rapid development, testing, validation, demonstration and adoption of open, consensus-based standards and best practices. The Cross-Community Interoperability (CCI) thread in OWS-9 advanced the Web Feature Service for Gazetteers (WFS-G) by providing a Single Point of Entry Global Gazetteer (SPEGG), through which a user can submit a single query and access global geographic names data across multiple Federal names databases. Currently users must make two queries with differing input parameters against two separate databases to obtain authoritative cross-border geographic names data. The gazetteers in this scenario were GNIS and GNS. GNIS, the Geographic Names Information System, is managed by the USGS; it was first developed in 1964 and contains information about domestic and Antarctic names. GNS, the GeoNET Names Server, provides the Geographic Names Data Base (GNDB) and is managed by the National Geospatial-Intelligence Agency (NGA).
GNS has been in service since 1994, and serves names for areas outside the United States and its dependent areas, as well as names for undersea features. The following capabilities were advanced: cascaded WFS-G servers (allowing a "parent" WFS to query multiple WFSs), query-name filters (e.g., fuzzy search and text search), handling of multilingualism and diacritics, advanced spatial constraints (e.g., radial search and nearest-neighbor search), and semantically mediated feature types (e.g., mountain vs. hill). To enable semantic mediation, a series of semantic mappings were defined between the NGA GNS, the USGS GNIS and the Alexandria Digital Library (ADL) Gazetteer. The mappings were encoded in the Web Ontology Language (OWL) so they could be used by semantic web technologies. The semantic mappings were then published for ingestion into a semantic mediator that used them to associate location types from one gazetteer with location types in another. The semantic mediator was then able to transform requests on the fly, providing a single point of entry WFS-G to multiple gazetteers. The presentation will include a live demonstration of the work performed, highlight the main developments, and discuss future development.
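The semantic-mediation step can be sketched as follows, with plain dictionaries standing in for the OWL mappings; the feature-type names are assumptions for illustration rather than the exact GNIS/GNS/ADL vocabularies.

```python
# Hypothetical mappings from per-gazetteer feature types to shared ADL types.
# In OWS-9 these mappings were encoded in OWL; dicts are used for illustration.
GNIS_TO_ADL = {"Summit": "mountains", "Range": "mountain ranges"}
GNS_TO_ADL = {"MT": "mountains", "MTS": "mountain ranges"}

def mediate_query(adl_type: str) -> dict:
    """Translate one shared (ADL) feature type into per-gazetteer query
    terms, so a single request fans out to both underlying gazetteers."""
    return {
        "gnis": [t for t, a in GNIS_TO_ADL.items() if a == adl_type],
        "gns": [t for t, a in GNS_TO_ADL.items() if a == adl_type],
    }

print(mediate_query("mountains"))
```

A single-point-of-entry gazetteer can apply such a translation on the fly before dispatching the rewritten queries to each backing WFS-G, which is the role the semantic mediator plays in the scenario above.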
Finet, Philippe; Gibaud, Bernard; Dameron, Olivier; Le Bouquin Jeannès, Régine
2016-03-01
The number of patients with complications associated with chronic diseases increases with the ageing population. In particular, complex chronic wounds raise the re-admission rate in hospitals. In this context, the implementation of a telemedicine application in Basse-Normandie, France, helps reduce hospital stays and transport. This application requires a new collaboration among general practitioners, private duty nurses and hospital staff. However, the main constraint mentioned by the users of this system is the lack of interoperability between the application's information system and the various partners' information systems. To improve medical data exchange, the authors propose a new implementation based on the introduction of interoperable clinical documents and a digital document repository for managing document sharing among the telemedicine application's users. They then show that this technical solution is suitable for any telemedicine application and any document sharing system in a healthcare facility or network.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-18
... Docket 07-100; FCC 11-6] Implementing a Nationwide, Broadband, Interoperable Public Safety Network in the... interoperable public safety broadband network. The establishment of a common air interface for 700 MHz public safety broadband networks will create a foundation for interoperability and provide a clear path for the...
Juzwishin, Donald W M
2009-01-01
Achieving effective health informatics interoperability in a fragmented and uncoordinated health system is by definition not possible. Interoperability requires the simultaneous integration of health care processes and information across different types and levels of care (systems thinking). The fundamental argument of this paper is that information system interoperability will remain an unfulfilled hope until health reforms effectively address the governance (accountability), structural and process barriers to interoperability of health care delivery. The ascendancy of Web 2.0 and 3.0, although still unproven, signals the opportunity to accelerate patients' access to health information and their health record. Policy suggestions for simultaneously advancing health system delivery and information system interoperability are posited.
D-ATM, a working example of health care interoperability: From dirt path to gravel road.
DeClaris, John-William
2009-01-01
For many years, there have been calls for interoperability within health care systems. The technology currently exists and is being used in business areas like banking and commerce, to name a few. Yet the question remains, why has interoperability not been achieved in health care? This paper examines issues encountered and success achieved with interoperability during the development of the Digital Access To Medication (D-ATM) project, sponsored by the Substance Abuse and Mental Health Services Administration (SAMHSA). D-ATM is the first government funded interoperable patient management system. The goal of this paper is to provide lessons learned and propose one possible road map for health care interoperability within private industry and how government can help.
Interoperability of Information Systems Managed and Used by the Local Health Departments.
Shah, Gulzar H; Leider, Jonathon P; Luo, Huabin; Kaur, Ravneet
2016-01-01
In the post-Affordable Care Act era marked by interorganizational collaborations and availability of large amounts of electronic data from other community partners, it is imperative to assess the interoperability of information systems used by the local health departments (LHDs). To describe the level of interoperability of LHD information systems and identify factors associated with lack of interoperability. This mixed-methods research uses data from the 2015 Informatics Capacity and Needs Assessment Survey, with a target population of all LHDs in the United States. A representative sample of 650 LHDs was drawn using a stratified random sampling design. A total of 324 completed responses were received (50% response rate). Qualitative data were used from a key informant interview study of LHD informatics staff from across the United States. Qualitative data were independently coded by 2 researchers and analyzed thematically. Survey data were cleaned, bivariate comparisons were conducted, and a multivariable logistic regression was run to characterize factors associated with interoperability. For 30% of LHDs, no systems were interoperable, and 38% of LHD respondents indicated some of the systems were interoperable. Significant determinants of interoperability included LHDs having leadership support (adjusted odds ratio [AOR] = 3.54), control of information technology budget allocation (AOR = 2.48), control of data systems (AOR = 2.31), having a strategic plan for information systems (AOR = 1.92), and existence of business process analysis and redesign (AOR = 1.49). Interoperability of all systems may be an informatics goal, but only a small proportion of LHDs reported having interoperable systems, pointing to a substantial need among LHDs nationwide.
National electronic health record interoperability chronology.
Hufnagel, Stephen P
2009-05-01
The federal initiative for electronic health record (EHR) interoperability began in 2000 and set the stage for the establishment of the 2004 Executive Order for EHR interoperability by 2014. This article discusses the chronology from the 2001 e-Government Consolidated Health Informatics (CHI) initiative through the current congressional mandates for an aligned, interoperable, and agile DoD AHLTA and VA VistA.
On the formal definition of the systems' interoperability capability: an anthropomorphic approach
NASA Astrophysics Data System (ADS)
Zdravković, Milan; Luis-Ferreira, Fernando; Jardim-Goncalves, Ricardo; Trajanović, Miroslav
2017-03-01
The extended view of enterprise information systems in the Internet of Things (IoT) introduces additional complexity to interoperability problems. In response, the problem of systems' interoperability is revisited by taking into account different aspects of philosophy, psychology, linguistics and artificial intelligence, namely by analysing the potential analogies between the processes of human and system communication. The capability to interoperate, as a property of the system, is then defined as a complex ability to seamlessly sense and perceive a stimulus from its environment (assumedly, a message from any other system), make an informed decision about this perception and, consequently, articulate a meaningful and useful action or response based on this decision. Although this capability is defined on the basis of existing interoperability theories, the proposed approach to its definition excludes the assumption of awareness of the co-existence of the two interoperating systems. It thus establishes links between the research of interoperability of systems and of intelligent software agents, as one of the systems' digital identities.
Employing Semantic Technologies for the Orchestration of Government Services
NASA Astrophysics Data System (ADS)
Sabol, Tomáš; Furdík, Karol; Mach, Marián
The main aim of eGovernment is to provide efficient, secure, inclusive services for citizens and businesses. The necessity to integrate services and information resources, to increase accessibility, and to reduce the administrative burden on citizens and enterprises: these are only a few reasons why the eGovernment paradigm has shifted from the supply-driven approach toward connected governance, emphasizing the concept of interoperability (Archmann and Nielsen 2008). On the EU level, interoperability is explicitly addressed as one of the four main challenges, for example in the i2010 strategy (i2010 2005). The Commission's Communication (Interoperability for Pan-European eGovernment Services 2006) strongly emphasizes the necessity of interoperable eGovernment services based on standards, open specifications, and open interfaces. The Pan-European interoperability initiatives, such as the European Interoperability Framework (2004) and IDABC, as well as many projects supported by the European Commission within the IST Program and the Competitiveness and Innovation Program (CIP), illustrate the importance of interoperability on the EU level.
[The role of Integrating the Healthcare Enterprise (IHE) in telemedicine].
Bergh, B; Brandner, A; Heiß, J; Kutscha, U; Merzweiler, A; Pahontu, R; Schreiweis, B; Yüksekogul, N; Bronsch, T; Heinze, O
2015-10-01
Telemedicine systems are already used today in a variety of areas to improve patient care. The lack of standardization in those solutions creates a lack of interoperability between the systems. Internationally accepted standards can help to solve this lack of system interoperability. Within Integrating the Healthcare Enterprise (IHE), a worldwide initiative of users and vendors is working on the use of defined standards for specific use cases by describing those use cases in so-called IHE Profiles. The aim of this work is to determine how telemedicine applications can be implemented using IHE Profiles. Based on a literature review, exemplary telemedicine applications are described and the technical capabilities of IHE Profiles are evaluated. These IHE Profiles are examined for their usability and are then evaluated in exemplary telemedicine application architectures. Some IHE Profiles can be identified as useful for intersectoral patient records (e.g. the PEHR at Heidelberg), as well as for point-to-point communication where no patient record is involved. In the area of patient records, the IHE Profile "Cross-Enterprise Document Sharing (XDS)" is often used. Point-to-point communication can be supported using the IHE Profile "Cross-Enterprise Document Media Interchange (XDM)". IHE-based telemedicine applications offer caregivers the possibility of being informed about their patients using data from intersectoral patient records, and there are also possible savings from reusing the standardized interfaces in other scenarios.
Developing a Standard Method for Link-Layer Security of CCSDS Space Communications
NASA Technical Reports Server (NTRS)
Biggerstaff, Craig
2009-01-01
Communications security for space systems has been a specialized field generally far removed from considerations of mission interoperability and cross-support; in fact, these considerations often have been viewed as intrinsically opposed to security objectives. The space communications protocols defined by the Consultative Committee for Space Data Systems (CCSDS) have a twenty-five year history of successful use in over 400 missions. While the CCSDS Telemetry, Telecommand, and Advanced Orbiting Systems protocols for use at OSI Layer 2 are operationally mature, there has been no direct support within these protocols for communications security techniques. Link-layer communications security has been successfully implemented in the past using mission-unique methods, but never before with an objective of facilitating cross-support and interoperability. This paper discusses the design of a standard method for cryptographic authentication, encryption, and replay protection at the data link layer that can be integrated into existing CCSDS protocols without disruption to legacy communications services. Integrating cryptographic operations into existing data structures and processing sequences requires a careful assessment of the potential impediments within spacecraft, ground stations, and operations centers. The objective of this work is to provide a sound method for cryptographic encapsulation of frame data that also facilitates Layer 2 virtual channel switching, such that a mission may procure data transport services as needed without involving third parties in the cryptographic processing, or split independent data streams for separate cryptographic processing.
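A hedged sketch of the kind of link-layer encapsulation described, using Python's standard-library HMAC rather than the actual CCSDS algorithms and frame formats; it illustrates how authentication and replay protection can wrap the frame payload while the header stays in the clear for virtual-channel switching.

```python
import hmac
import hashlib

def protect_frame(header: bytes, data: bytes, key: bytes, seq: int) -> bytes:
    """Illustrative encapsulation (not the CCSDS specification): append an
    anti-replay sequence number and a MAC computed over header + payload.
    The header itself stays in the clear, so Layer 2 virtual-channel
    switching can route the frame without access to the key."""
    seq_octets = seq.to_bytes(4, "big")
    mac = hmac.new(key, header + seq_octets + data, hashlib.sha256).digest()
    return header + seq_octets + data + mac

def verify_frame(frame: bytes, header_len: int, key: bytes, last_seq: int):
    """Return (payload, seq) if the MAC checks and the sequence number
    advances; raise on a tampered or replayed frame."""
    header = frame[:header_len]
    seq_octets = frame[header_len:header_len + 4]
    data, mac = frame[header_len + 4:-32], frame[-32:]
    expected = hmac.new(key, header + seq_octets + data, hashlib.sha256).digest()
    if not hmac.compare_digest(mac, expected):
        raise ValueError("authentication failed")
    seq = int.from_bytes(seq_octets, "big")
    if seq <= last_seq:
        raise ValueError("replayed frame")
    return data, seq
```

Because only the MAC and sequence fields are added around the payload, a relay can switch frames by header alone, which is the cross-support property the paper emphasizes.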
Semantic Interoperability Almost Without Using The Same Vocabulary: Is It Possible?
NASA Astrophysics Data System (ADS)
Krisnadhi, A. A.
2016-12-01
Semantic interoperability, a key requirement in realizing cross-repository data integration, is often understood as using the same ontology or vocabulary. Consequently, within a particular domain, one can easily assume that there has to be one unifying domain ontology covering as many vocabulary terms in the domain as possible in order to realize any form of data integration across multiple data sources. Furthermore, the desire to provide very precise definitions of those many terms led to the development of huge foundational and domain ontologies that are comprehensive but too complicated, restrictive, monolithic, and difficult to use and reuse, which causes common data providers to avoid using them. This problem is especially acute in a domain as diverse as the geosciences, where it is virtually impossible to reach agreement on the semantics of many terms (e.g., there are hundreds of definitions of "forest" in use throughout the world). To overcome this challenge, a modular ontology architecture has emerged in recent years, fueled among other things by advances in ontology design pattern research. Each ontology pattern models only one key notion. It can act as a small module of a larger ontology. Such a module is developed in such a way that it is largely independent of how other notions in the same domain are modeled, which increases reusability. Furthermore, an ontology formed out of such modules is easier to understand than a large, monolithic ontology. Semantic interoperability in this architecture is achieved not by enforcing the use of the same vocabulary, but by promoting alignment to the same ontology patterns. In this work, we elaborate how this architecture realizes the above idea. In particular, we describe how multiple data sources with differing perspectives and vocabularies can interoperate through this architecture.
Building the solution upon semantic technologies such as Linked Data and the Web Ontology Language (OWL), we demonstrate how a data integration solution based on this idea can be realized over different data repositories.
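The idea of interoperating through alignments to a shared pattern, rather than through a shared vocabulary, can be illustrated even without semantic-web tooling. In this hypothetical sketch, two providers keep disjoint local terms for forest cover and each maintains only a mapping from its own terms to a common pattern term; the term names, records, and `integrate` helper are all invented for illustration.

```python
# Each provider keeps its own vocabulary and aligns its terms
# to a shared ontology-pattern term (here a plain string).
ALIGN_A = {"Wald": "pattern:ForestCover"}            # German-language provider
ALIGN_B = {"closed_canopy": "pattern:ForestCover"}   # ecology-oriented provider

data_a = [{"type": "Wald", "area_km2": 12.0}]
data_b = [{"type": "closed_canopy", "area_km2": 7.5}]

def integrate(pattern_term, *sources):
    """Collect records from all sources whose local type aligns to pattern_term."""
    records = []
    for alignment, data in sources:
        for rec in data:
            if alignment.get(rec["type"]) == pattern_term:
                records.append(rec)
    return records

combined = integrate("pattern:ForestCover",
                     (ALIGN_A, data_a), (ALIGN_B, data_b))
```

Neither provider had to adopt the other's vocabulary, or any single unifying ontology; agreement is needed only on the small pattern, which is the point the abstract makes.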
Managing Interoperability for GEOSS - A Report from the SIF
NASA Astrophysics Data System (ADS)
Khalsa, S. J.; Actur, D.; Nativi, S.; Browdy, S.; Eglitis, P.
2009-04-01
The Global Earth Observation System of Systems (GEOSS) is a coordinating and integrating framework for Earth observing and information systems, which are contributed on a voluntary basis by Members and Participating Organizations of the intergovernmental Group on Earth Observations (GEO). GEOSS exists to support informed decision making for the benefit of society, including the implementation of international environmental treaty obligations. GEO Members and Participating organizations use the GEOSS Common Infrastructure (GCI) to register their Earth observation resources, thereby making them discoverable and consumable by both humans and client applications. Essential to meeting GEO user needs is a process for supporting interoperability of observing, processing, modeling and dissemination capabilities. The GEO Standards and Interoperability Forum (SIF) was created to develop, implement and oversee this process. The SIF supports GEO organizations contributing resources to the GEOSS by helping them understand and work with the GEOSS interoperability guidelines and encouraging them to register their "interoperability arrangements" (standards or other ad hoc arrangements for interoperability) in the GEOSS standards registry, which is part of the GCI. These registered interoperability arrangements support the actual services used to achieve interoperability of systems. By making information about these interoperability arrangements available to users of the GEOSS the SIF enhances the understanding and utility of contributed resources. We describe the procedures that the SIF has enacted to carry out its work. To operate effectively the SIF uses a workflow system and is establishing a set of regional teams and domain experts. 
In the near term our work has focused on population and review of the GEOSS Standards Registry, but we are also developing approaches to achieving progressive convergence on, and uptake of, an optimal set of interoperability arrangements for all of GEOSS.
Interoperability of Information Systems Managed and Used by the Local Health Departments
Leider, Jonathon P.; Luo, Huabin; Kaur, Ravneet
2016-01-01
Background: In the post-Affordable Care Act era marked by interorganizational collaborations and availability of large amounts of electronic data from other community partners, it is imperative to assess the interoperability of information systems used by the local health departments (LHDs). Objectives: To describe the level of interoperability of LHD information systems and identify factors associated with lack of interoperability. Data and Methods: This mixed-methods research uses data from the 2015 Informatics Capacity and Needs Assessment Survey, with a target population of all LHDs in the United States. A representative sample of 650 LHDs was drawn using a stratified random sampling design. A total of 324 completed responses were received (50% response rate). Qualitative data were used from a key informant interview study of LHD informatics staff from across the United States. Qualitative data were independently coded by 2 researchers and analyzed thematically. Survey data were cleaned, bivariate comparisons were conducted, and a multivariable logistic regression was run to characterize factors associated with interoperability. Results: For 30% of LHDs, no systems were interoperable, and 38% of LHD respondents indicated some of the systems were interoperable. Significant determinants of interoperability included LHDs having leadership support (adjusted odds ratio [AOR] = 3.54), control of information technology budget allocation (AOR = 2.48), control of data systems (AOR = 2.31), having a strategic plan for information systems (AOR = 1.92), and existence of business process analysis and redesign (AOR = 1.49). Conclusion: Interoperability of all systems may be an informatics goal, but only a small proportion of LHDs reported having interoperable systems, pointing to a substantial need among LHDs nationwide. PMID:27684616
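The adjusted odds ratios (AORs) reported above come from multivariable logistic regression, which controls for covariates simultaneously. The underlying quantity can be illustrated with a crude (unadjusted) odds ratio computed from a 2x2 table; the counts below are invented and do not come from the survey.

```python
def odds_ratio(exposed_yes, exposed_no, unexposed_yes, unexposed_no):
    """Crude odds ratio from a 2x2 table: (a*d) / (b*c).

    Rows: exposed vs. unexposed (e.g., LHDs with vs. without leadership
    support); columns: outcome present vs. absent (e.g., interoperable
    systems). Counts here are hypothetical.
    """
    return (exposed_yes * unexposed_no) / (exposed_no * unexposed_yes)

# Hypothetical counts: 30 of 40 supported LHDs interoperable,
# 20 of 60 unsupported LHDs interoperable.
crude_or = odds_ratio(30, 10, 20, 40)  # -> 6.0
```

An adjusted OR such as the reported 3.54 answers the same question after holding the other covariates in the model constant, which is why it usually differs from the crude value.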
Approaching semantic interoperability in Health Level Seven
Alschuler, Liora
2010-01-01
‘Semantic Interoperability’ is a driving objective behind many of Health Level Seven's standards. The objective in this paper is to take a step back, and consider what semantic interoperability means, assess whether or not it has been achieved, and, if not, determine what concrete next steps can be taken to get closer. A framework for measuring semantic interoperability is proposed, using a technique called the ‘Single Logical Information Model’ framework, which relies on an operational definition of semantic interoperability and an understanding that interoperability improves incrementally. Whether semantic interoperability tomorrow will enable one computer to talk to another, much as one person can talk to another person, is a matter for speculation. It is assumed, however, that what gets measured gets improved, and in that spirit this framework is offered as a means to improvement. PMID:21106995
Advances Made in the Next Generation of Satellite Networks
NASA Technical Reports Server (NTRS)
Bhasin, Kul B.
1999-01-01
Because of the unique networking characteristics of communications satellites, global satellite networks are moving to the forefront in enhancing national and global information infrastructures. Simultaneously, broadband data services, which are emerging as the major market driver for future satellite and terrestrial networks, are being widely acknowledged as the foundation for an efficient global information infrastructure. In the past 2 years, various task forces and working groups around the globe have identified pivotal topics and key issues to address if we are to realize such networks in a timely fashion. In response, industry, government, and academia undertook efforts to address these topics and issues. A workshop was organized to provide a forum to assess the current state of the art, identify key issues, and highlight the emerging trends in next-generation architectures, data protocol development, communication interoperability, and applications. The Satellite Networks: Architectures, Applications, and Technologies Workshop was hosted by the Space Communication Program at the NASA Lewis Research Center in Cleveland, Ohio. Nearly 300 executives and technical experts from academia, industry, and government, representing the United States and eight other countries, attended the event (June 2 to 4, 1998). The program included seven panels and invited sessions and nine breakout sessions in which 42 speakers presented on technical topics. The proceedings cover a wide range of topics: access technology and protocols, architectures and network simulations, asynchronous transfer mode (ATM) over satellite networks, Internet over satellite networks, interoperability experiments and applications, multicasting, NASA interoperability experiment programs, NASA mission applications, and Transmission Control Protocol/Internet Protocol (TCP/IP) over satellite: issues, relevance, and experience.
NASA Astrophysics Data System (ADS)
Ansari, S.; Del Greco, S.
2006-12-01
In February 2005, 61 countries around the world agreed on a 10-year plan to work towards building open systems for sharing geospatial data and services across different platforms worldwide. This system is known as the Global Earth Observation System of Systems (GEOSS). The objective of GEOSS focuses on easy access to environmental data and interoperability across different systems, allowing participating countries to measure the "pulse" of the planet in an effort to advance society. In support of GEOSS goals, NOAA's National Climatic Data Center (NCDC) has developed radar visualization and data exporter tools in an open-systems environment. The NCDC Weather Radar Toolkit (WRT) loads Weather Surveillance Radar 1988 Doppler (WSR-88D) volume scan (S-band) data, known as Level-II, and derived products, known as Level-III, into an Open Geospatial Consortium (OGC) compliant environment. The application is written entirely in Java and will run on any Java-supported platform including Windows, Macintosh, and Linux/Unix. The application is launched via Java Web Start and runs on the client machine while accessing these data locally or remotely from the NCDC archive, NOAA FTP server, or any URL or THREDDS Data Server. The WRT allows the data to be manipulated to create custom mosaics, composites, and precipitation estimates. The WRT Viewer provides tools for custom data overlays, Web Map Service backgrounds, animations, and basic filtering. The export of images and movies is provided in multiple formats. The WRT Data Exporter allows for data export in both vector polygon (Shapefile, Well-Known Text) and raster (GeoTIFF, ESRI Grid, VTK, NetCDF, GrADS) formats. By decoding the various radar formats into the NetCDF Common Data Model, the exported NetCDF data become interoperable with existing software packages including the THREDDS Data Server and the Integrated Data Viewer (IDV).
The NCDC recently partnered with NOAA's National Severe Storms Lab (NSSL) to decode Sigmet C-band Doppler radar data providing the NCDC Viewer/Data Exporter the functionality to read C-Band. This also supports a bilateral agreement between the United States and Canada for data sharing and to support interoperability with the US WSR-88D and Environment Canada radar networks. In addition, the NCDC partnered with the University of Oklahoma to develop decoders to read a test bed of distributed X- band radars that are funded through the Collaborative Adaptive Sensing of the Atmosphere (CASA) project. The NCDC is also archiving the National Mosaic and Next Generation QPE (Q2) products from NSSL, which provide products such as three-dimensional reflectivity, composite reflectivity and precipitation estimates at a 1 km resolution. These three sources of Radar data are also supported in the WRT.
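Building the mosaics and composites the WRT offers starts with resampling each polar radar sweep onto a Cartesian grid. The snippet below is a deliberately simplified sketch of that step (nearest-cell assignment only, no interpolation, beam geometry ignored); the function name and grid conventions are invented for illustration and are not the WRT's implementation.

```python
import math

def polar_to_grid(sweep, max_range_km, cells):
    """Resample one radar sweep onto a cells x cells Cartesian grid.

    sweep: iterable of (azimuth_deg, range_km, value) gates, with azimuth
    measured clockwise from north. The grid is centred on the radar and
    spans +/- max_range_km in both directions. Hypothetical sketch only.
    """
    grid = [[None] * cells for _ in range(cells)]
    half = max_range_km
    cell_km = (2 * half) / cells
    for az, rng, val in sweep:
        x = rng * math.sin(math.radians(az))   # eastward offset
        y = rng * math.cos(math.radians(az))   # northward offset
        col = int((x + half) / cell_km)        # row 0 = northern edge
        row = int((half - y) / cell_km)
        if 0 <= row < cells and 0 <= col < cells:
            grid[row][col] = val
    return grid
```

Once every sweep (S-, C-, or X-band alike) is expressed on a common grid like this, mosaicking across radar networks reduces to combining grids, which is one reason a common data model makes the sources interoperable.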
Towards semantic interoperability for electronic health records.
Garde, Sebastian; Knaup, Petra; Hovenga, Evelyn; Heard, Sam
2007-01-01
In the field of open electronic health records (EHRs), openEHR as an archetype-based approach is being increasingly recognised. It is the objective of this paper to briefly describe this approach and to analyse how openEHR archetypes impact on health professionals and semantic interoperability. Analysis of current approaches to EHR systems, terminology and standards developments. In addition to literature reviews, we organised face-to-face and additional telephone interviews and tele-conferences with members of relevant organisations and committees. The openEHR archetypes approach enables syntactic interoperability and semantic interpretability -- both important prerequisites for semantic interoperability. Archetypes enable the formal definition of clinical content by clinicians. To enable comprehensive semantic interoperability, the development and maintenance of archetypes needs to be coordinated internationally and across health professions. Domain knowledge governance comprises a set of processes that enable the creation, development, organisation, sharing, dissemination, use and continuous maintenance of archetypes. It needs to be supported by information technology. To enable EHRs, semantic interoperability is essential. The openEHR archetypes approach enables syntactic interoperability and semantic interpretability. However, without coordinated archetype development and maintenance, 'rank growth' of archetypes would jeopardize semantic interoperability. We therefore believe that openEHR archetypes and domain knowledge governance together create the knowledge environment required to adopt EHRs.
The Health Service Bus: an architecture and case study in achieving interoperability in healthcare.
Ryan, Amanda; Eklund, Peter
2010-01-01
Interoperability in healthcare is a requirement for effective communication between entities, to ensure timely access to up-to-date patient information and medical knowledge, and thus facilitate consistent patient care. An interoperability framework called the Health Service Bus (HSB), based on the Enterprise Service Bus (ESB) middleware software architecture, is presented here as a solution to all three levels of interoperability as defined by the HL7 EHR Interoperability Work Group in their definitive white paper "Coming to Terms". A prototype HSB system was implemented based on the Mule open-source ESB; it is outlined and discussed, followed by a clinically based example.
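The ESB pattern underlying the HSB can be reduced to its essence: message-type-based routing between loosely coupled producers and handlers. This is a minimal hypothetical sketch, not the Mule-based prototype from the paper; the class name, the HL7-style message type string, and the handler are invented for illustration.

```python
class HealthServiceBus:
    """Minimal publish/subscribe bus in the spirit of an ESB.

    Senders and receivers never reference each other directly; they only
    share a message type, which is what makes the coupling loose.
    """

    def __init__(self):
        self.routes = {}

    def subscribe(self, message_type, handler):
        self.routes.setdefault(message_type, []).append(handler)

    def publish(self, message_type, payload):
        # Deliver to every subscribed handler; collect their results.
        return [handler(payload)
                for handler in self.routes.get(message_type, [])]

bus = HealthServiceBus()
# Hypothetical admission message (ADT^A01 is the HL7 v2 admit trigger).
bus.subscribe("ADT^A01", lambda msg: f"registered {msg['patient']}")
results = bus.publish("ADT^A01", {"patient": "example-id-123"})
```

A production ESB adds transformation, persistence, and security on top of this routing core, but the decoupling shown here is what lets new clinical systems join without changing existing ones.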
PACS/information systems interoperability using Enterprise Communication Framework.
alSafadi, Y; Lord, W P; Mankovich, N J
1998-06-01
Interoperability among healthcare applications goes beyond connectivity to allow components to exchange structured information and work together in a predictable, coordinated fashion. To facilitate building an interoperability infrastructure, an Enterprise Communication Framework (ECF) was developed by the members of the Andover Working Group for Healthcare Interoperability (AWG-OHI). The ECF consists of four models: 1) Use Case Model, 2) Domain Information Model (DIM), 3) Interaction Model, and 4) Message Model. To realize this framework, a software component called the Enterprise Communicator (EC) is used. In this paper, we will demonstrate the use of the framework in interoperating a picture archiving and communication system (PACS) with a radiology information system (RIS).
The role of architecture and ontology for interoperability.
Blobel, Bernd; González, Carolina; Oemig, Frank; Lopéz, Diego; Nykänen, Pirkko; Ruotsalainen, Pekka
2010-01-01
Turning from organization-centric to process-controlled or even personalized approaches, advanced healthcare settings have to meet special interoperability challenges. eHealth and pHealth solutions must assure interoperability between actors cooperating to achieve common business objectives. The interoperability chain includes not only individually tailored technical systems but also sensors and actuators. To enable the corresponding pervasive computing and even autonomic computing, individualized systems have to be based on an architecture framework covering many domains, scientifically managed by specialized disciplines using their specific ontologies in a formalized way. Therefore, interoperability has to advance from a communication-protocol to an architecture-centric approach, mastering ontology coordination challenges.
Jian, Wen-Shan; Hsu, Chien-Yeh; Hao, Te-Hui; Wen, Hsyien-Chia; Hsu, Min-Huei; Lee, Yen-Liang; Li, Yu-Chuan; Chang, Polun
2007-11-01
Traditional electronic health record (EHR) data are produced from various hospital information systems. They could not exist independently of an information system until the advent of XML technology. The interoperability of a healthcare system can be divided into two dimensions: functional interoperability and semantic interoperability. Currently, no single EHR standard exists that provides complete EHR interoperability. In order to establish a national EHR standard, we developed a set of local EHR templates. The Taiwan Electronic Medical Record Template (TMT) is a standard that aims to achieve semantic interoperability in EHR exchanges nationally. The TMT architecture is basically composed of forms, components, sections, and elements. Data are stored in elements, which can be referenced by code set, data type, and narrative block. The TMT was established with the following requirements in mind: (1) transformable to international standards; (2) having a minimal impact on the existing healthcare system; (3) easy to implement and deploy; and (4) compliant with Taiwan's current laws and regulations. The TMT provides a basis for building a portable, interoperable information infrastructure for EHR exchange in Taiwan.
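The form/component/section/element hierarchy described for the TMT can be pictured as a nested structure. The sketch below is a hypothetical rendering in plain Python, not the actual TMT XML schema; all names, the ICD-10 code, and the traversal helper are invented for illustration.

```python
# Hypothetical instance of a form -> components -> sections -> elements
# hierarchy, loosely modelled on the TMT description in the abstract.
template = {
    "form": "discharge-summary",
    "components": [
        {"name": "diagnosis",
         "sections": [
             {"name": "primary",
              "elements": [
                  {"code": "ICD10:J18.9",      # coded entry (code set)
                   "data_type": "CD",           # coded-data type
                   "narrative": "Pneumonia, unspecified"}]}]}],
}

def iter_elements(tmpl):
    """Walk the hierarchy and yield every leaf element."""
    for comp in tmpl["components"]:
        for sec in comp["sections"]:
            yield from sec["elements"]

codes = [el["code"] for el in iter_elements(template)]
```

Because every leaf carries a code set, data type, and narrative block, a receiving system can interpret the coded entries without knowing which hospital system produced the form, which is the semantic-interoperability property the TMT targets.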
System and methods of resource usage using an interoperable management framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heileman, Gregory L.; Jamkhedkar, Pramod A.; Lamb, Christopher C.
A generic rights expression language allows interoperability across different computing environments, including resource usage by different applications. A formal framework for usage management provides scaffolding upon which interoperable usage management systems can be built. Certain features of the framework, such as the operational semantics, are standardized, while other areas are deliberately left free of standards, necessitating choice and innovation, to achieve a balance of flexibility and usability for interoperability in usage management systems.
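The notion of standardized operational semantics can be illustrated as evaluating declarative license terms the same way in any environment. This is a hypothetical miniature, not the framework described in the record: the license structure, the action names, and the `authorize` rule are all invented for the example.

```python
# Hypothetical license terms expressed as data rather than code, so any
# conforming evaluator interprets them identically.
license_terms = {
    "resource": "report.pdf",
    "permissions": {
        "view":  {"max_uses": 3},
        "print": {"max_uses": 0},   # printing not permitted
    },
}

def authorize(terms, action, uses_so_far):
    """Standardized evaluation rule: permit an action only while its
    recorded usage count stays below the licensed maximum."""
    rule = terms["permissions"].get(action)
    if rule is None:
        return False                # unlisted actions are denied
    return uses_so_far < rule["max_uses"]
```

Because the terms are plain data and the evaluation rule is fixed, two independent usage management systems applying this rule reach the same decision, which is the interoperability property the framework's standardized semantics aim for.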
Information Management Challenges in Achieving Coalition Interoperability
2001-12-01
Session I: Architectures and Standards: Fundamental Issues (chairman: Dr I. White, UK) includes "Planning for Interoperability" by W.M. Gentleman. Establishing a framework is presented as a crucial step toward achieving coalition C4I interoperability. Topics to be covered include: (1) maintaining secure interoperability; (2) command system interfaces.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Widergren, Steven E.; Knight, Mark R.; Melton, Ronald B.
The Interoperability Strategic Vision whitepaper aims to promote a common understanding of the meaning and characteristics of interoperability and to provide a strategy to advance the state of interoperability as applied to integration challenges facing grid modernization. This includes addressing the quality of integrating devices and systems and the discipline to improve the process of successfully integrating these components as business models and information technology improve over time. The strategic vision for interoperability described in this document applies throughout the electric energy generation, delivery, and end-use supply chain. Its scope includes interactive technologies and business processes from bulk energy levels to lower voltage level equipment and the millions of appliances that are becoming equipped with processing power and communication interfaces. A transformational aspect of a vision for interoperability in the future electric system is the coordinated operation of intelligent devices and systems at the edges of grid infrastructure. This challenge offers an example for addressing interoperability concerns throughout the electric system.
Space Network Interoperability Panel (SNIP) study
NASA Technical Reports Server (NTRS)
Ryan, Thomas; Lenhart, Klaus; Hara, Hideo
1991-01-01
The Space Network Interoperability Panel (SNIP) study is a tripartite study that involves the National Aeronautics and Space Administration (NASA), the European Space Agency (ESA), and the National Space Development Agency (NASDA) of Japan. SNIP involves an ongoing interoperability study of the Data Relay Satellite (DRS) systems of the three organizations. The study is broken down into two parts: Phase One deals with S-band (2 GHz) interoperability, and Phase Two deals with Ka-band (20/30 GHz) interoperability (in addition to S-band). In 1987 the SNIP formed a Working Group to define and study operations concepts and technical subjects to assure compatibility of the international data relay systems. Since that time a number of Panel and Working Group meetings have been held to continue the study. Interoperability is of interest to the three agencies because it offers a number of potential operational and economic benefits. This paper presents the history and status of the SNIP study.
Before you make the data interoperable you have to make the people interoperable
NASA Astrophysics Data System (ADS)
Jackson, I.
2008-12-01
In February 2006 a deceptively simple concept was put forward. Could we use the International Year of Planet Earth 2008 as a stimulus to begin the creation of a digital geological map of the planet at a target scale of 1:1 million? Could we design and initiate a project that uniquely mobilises geological surveys around the world to act as the drivers and sustainable data providers of this global dataset? Further, could we synergistically use this geoscientist-friendly vehicle of creating a tangible geological map to accelerate progress of an emerging global geoscience data model and interchange standard? Finally, could we use the project to transfer know-how to developing countries and reduce the length and expense of their learning curve, while at the same time producing geoscience maps and data that could attract interest and investment? These aspirations, plus the chance to generate a global digital geological dataset to assist in the understanding of global environmental problems and the opportunity to raise the profile of geoscience as part of IYPE seemed more than enough reasons to take the proposal to the next stage. In March 2007, in Brighton, UK, 81 delegates from 43 countries gathered together to consider the creation of this global interoperable geological map dataset. The participants unanimously agreed the Brighton "Accord" and kicked off "OneGeology", an initiative that now has the support of more than 85 nations. Brighton was never designed to be a scientific or technical meeting: it was overtly about people and their interaction - would these delegates, with their diverse cultural and technical backgrounds, be prepared to work together to achieve something which, while technically challenging, was not complex in the context of leading edge geoscience informatics. Could we scale up what is a simple informatics model at national level, to deliver global coverage and access? 
The major challenges for OneGeology (and the deployment of interoperability) are rarely scientific or technical; they were and are the significantly more difficult logistical and "geopolitical-cultural" issues. OneGeology has grown and progressed rapidly to become an international project. It not only achieved its first-phase scientific and technical goals in launching its web map portal with map data from 30 nations at the International Geological Congress in August 2008, but also attracted substantial scientific, public, and media interest around the world. OneGeology is, in every sense, a child of its time: an agile Internet paradigm, a project whose informatics interoperability goals are in reality the total project ethos. The project has been allowed to grow and extend just as fast and as wide as its actors agree to take it, for the most part free from the territoriality and bureaucracy that all too often inhibit such initiatives. It is beyond doubt that a conventionally run (and thus constrained) OneGeology would not have achieved its goals. The OneGeology team has taken enormous strides in a very short space of time, and the achievements are considerable. But some new challenges now arise. How will we sustain the project? Where do we take it next? Can OneGeology continue its "liberal" modus operandi? How should we fund and provide continuity for a growing and thus more demanding infrastructure and user base? Should we expand the portal to include map data from academia, commerce, and the public (and, if so, how do we maintain authentication)? How fast do we increase the sophistication of the informatics and the resolution and diversity of the data? The presentation will describe OneGeology, its current status, and the technical and cultural issues involved in trying to move forward interoperability on a global scale.
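A portal like OneGeology's serves national map layers over OGC web services, and a client reaches such a layer with a WMS GetMap request, which is just a parameterized URL. The sketch below assembles one using the standard WMS 1.3.0 parameter names; the endpoint and layer name are placeholders invented for the example, not real OneGeology identifiers.

```python
from urllib.parse import urlencode

def wms_getmap_url(base, layer, bbox, size=(800, 600)):
    """Build an OGC WMS 1.3.0 GetMap request URL.

    bbox is (min, min, max, max) in the axis order required by the CRS.
    The base endpoint and layer name passed in here are placeholders.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": size[0],
        "HEIGHT": size[1],
        "FORMAT": "image/png",
    }
    return base + "?" + urlencode(params)

url = wms_getmap_url("https://example.org/wms", "example_geology_layer",
                     (49.0, -8.0, 61.0, 2.0))
```

Because every national survey exposes the same request interface, a single portal can stitch together maps from dozens of independently operated servers: this interface-level agreement is the "simple informatics model" the abstract says was scaled up globally.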
Implementing Interoperability in the Seafood Industry: Learning from Experiences in Other Sectors.
Bhatt, Tejas; Gooch, Martin; Dent, Benjamin; Sylvia, Gilbert
2017-08-01
Interoperability of communication and information technologies within and between businesses operating along supply chains is being pursued and implemented in numerous industries worldwide to increase the efficiency and effectiveness of operations. The desire for greater interoperability is also driven by the need to reduce business risk through more informed management decisions. Interoperability is achieved by the development of a technology architecture that guides the design and implementation of communication systems existing within individual businesses and between businesses comprising the supply chain. Technology architectures are developed through a purposeful dialogue about why the architecture is required, the benefits and opportunities that the architecture offers the industry, and how the architecture will translate into practical results. An assessment of how the finance, travel, and health industries and a sector of the food industry-fresh produce-have implemented interoperability was conducted to identify lessons learned that can aid the development of interoperability in the seafood industry. The findings include identification of the need for strong, effective governance during the establishment and operation of an interoperability initiative to ensure the existence of common protocols and standards. The resulting insights were distilled into a series of principles for enabling syntactic and semantic interoperability in any industry, which we summarize in this article. Categorized as "structural," "operational," and "integrative," the principles describe requirements and solutions that are pivotal to enabling businesses to create and capture value from full chain interoperability. The principles are also fundamental to allowing governments and advocacy groups to use traceability for public good. © 2017 Institute of Food Technologists®.
Impact of coalition interoperability on PKI
NASA Astrophysics Data System (ADS)
Krall, Edward J.
2003-07-01
This paper examines methods for providing PKI interoperability among units of a coalition of armed forces drawn from different nations. The area in question is tactical identity management, for the purposes of confidentiality, integrity and non-repudiation in such a dynamic coalition. The interoperating applications under consideration range from email and other forms of store-and-forward messaging to TLS and IPSEC-protected real-time communications. Six interoperability architectures are examined with advantages and disadvantages of each described in the paper.
Telemedicine system interoperability architecture: concept description and architecture overview.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Craft, Richard Layne, II
2004-05-01
In order for telemedicine to realize the vision of anywhere, anytime access to care, it must address the question of how to create a fully interoperable infrastructure. This paper describes the reasons for pursuing interoperability, outlines operational requirements that any interoperability approach needs to consider, proposes an abstract architecture for meeting these needs, identifies candidate technologies that might be used for rendering this architecture, and suggests a path forward that the telemedicine community might follow.
NASA Astrophysics Data System (ADS)
Haener, Rainer; Waechter, Joachim; Grellet, Sylvain; Robida, Francois
2017-04-01
Interoperability is the key factor in establishing scientific research environments and infrastructures, as well as in bringing together heterogeneous, geographically distributed risk management, monitoring, and early warning systems. Based on developments within the European Plate Observing System (EPOS), a reference architecture has been devised that comprises architectural blueprints and interoperability models regarding the specification of business processes and logic as well as the encoding of data, metadata, and semantics. The architectural blueprint is developed on the basis of the so-called service-oriented architecture (SOA) 2.0 paradigm, which combines the intelligence and proactiveness of event-driven architectures with service-oriented architectures. SOA 2.0 supports analysing (Data Mining) both static and real-time data in order to find correlations among disparate information that are not intuitively obvious at first: analysed data (e.g., seismological monitoring) can be enhanced with relationships discovered by associating them (Data Fusion) with other data (e.g., creepmeter monitoring), with digital models of geological structures, or with the simulation of geological processes. The interoperability model describes the information, communication (conversations), and interactions (choreographies) of all participants involved, as well as the processes for registering, providing, and retrieving information. It is based on the principles of functional integration, implemented via dedicated services communicating over service-oriented and message-driven infrastructures. The services provide their functionality via standardised interfaces: instead of requesting data directly, users share data via services that are built upon specific adapters. This approach replaces tight coupling at the data level with a flexible dependency on loosely coupled services.
The main component of the interoperability model is the comprehensive semantic description of the information, business logic and processes on the basis of a minimal set of well-known, established standards. It implements the representation of knowledge with the application of domain-controlled vocabularies to statements about resources, information, facts, and complex matters (ontologies). Seismic experts for example, would be interested in geological models or borehole measurements at a certain depth, based on which it is possible to correlate and verify seismic profiles. The entire model is built upon standards from the Open Geospatial Consortium (Dictionaries, Service Layer), the International Organisation for Standardisation (Registries, Metadata), and the World Wide Web Consortium (Resource Description Framework, Spatial Data on the Web Best Practices). It has to be emphasised that this approach is scalable to the greatest possible extent: All information, necessary in the context of cross-domain infrastructures is referenced via vocabularies and knowledge bases containing statements that provide either the information itself or resources (service-endpoints), the information can be retrieved from. The entire infrastructure communication is subject to a broker-based business logic integration platform where the information exchanged between involved participants, is managed on the basis of standardised dictionaries, repositories, and registries. This approach also enables the development of Systems-of-Systems (SoS), which allow the collaboration of autonomous, large scale concurrent, and distributed systems, yet cooperatively interacting as a collective in a common environment.
Challenges and potential solutions for big data implementations in developing countries.
Luna, D; Mayan, J C; García, M J; Almerares, A A; Househ, M
2014-08-15
The volume of data, the velocity with which they are generated, and their variety and lack of structure hinder their use. This creates the need to change the way information is captured, stored, processed, and analyzed, leading to the paradigm shift called Big Data. The objective of this study was to describe the challenges and possible solutions for developing countries when implementing Big Data projects in the health sector. A non-systematic review of the literature was performed in PubMed and Google Scholar using the keywords "big data", "developing countries", "data mining", "health information systems", and "computing methodologies", followed by a thematic review of the selected articles. Implementing any Big Data program involves challenges, including the exponential growth of data, special infrastructure needs, the need for a trained workforce, the need to agree on interoperability standards, privacy and security issues, and the need to include people, processes, and policies to ensure adoption. Developing countries have particular characteristics that further hinder the development of such projects. The advent of Big Data promises great opportunities for the healthcare field. In this article, we describe the challenges developing countries face and enumerate the options for achieving successful implementations of Big Data programs.
47 CFR 0.192 - Emergency Response Interoperability Center.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 47 Telecommunication 1 2014-10-01 2014-10-01 false Emergency Response Interoperability Center. 0.192 Section 0.192 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL COMMISSION ORGANIZATION Organization Public Safety and Homeland Security Bureau § 0.192 Emergency Response Interoperability Center. (a...
47 CFR 0.192 - Emergency Response Interoperability Center.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 47 Telecommunication 1 2013-10-01 2013-10-01 false Emergency Response Interoperability Center. 0.192 Section 0.192 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL COMMISSION ORGANIZATION Organization Public Safety and Homeland Security Bureau § 0.192 Emergency Response Interoperability Center. (a...
47 CFR 0.192 - Emergency Response Interoperability Center.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 47 Telecommunication 1 2011-10-01 2011-10-01 false Emergency Response Interoperability Center. 0.192 Section 0.192 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL COMMISSION ORGANIZATION Organization Public Safety and Homeland Security Bureau § 0.192 Emergency Response Interoperability Center. (a...
Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...
Komatsoulis, George A; Warzel, Denise B; Hartel, Francis W; Shanbhag, Krishnakant; Chilukuri, Ram; Fragoso, Gilberto; Coronado, Sherri de; Reeves, Dianne M; Hadfield, Jillaine B; Ludet, Christophe; Covitz, Peter A
2008-02-01
One of the requirements for a federated information system is interoperability, the ability of one computer system to access and use the resources of another system. This feature is particularly important in biomedical research systems, which need to coordinate a variety of disparate types of data. In order to meet this need, the National Cancer Institute Center for Bioinformatics (NCICB) has created the cancer Common Ontologic Representation Environment (caCORE), an interoperability infrastructure based on Model Driven Architecture. The caCORE infrastructure provides a mechanism to create interoperable biomedical information systems. Systems built using the caCORE paradigm address both aspects of interoperability: the ability to access data (syntactic interoperability) and understand the data once retrieved (semantic interoperability). This infrastructure consists of an integrated set of three major components: a controlled terminology service (Enterprise Vocabulary Services), a standards-based metadata repository (the cancer Data Standards Repository) and an information system with an Application Programming Interface (API) based on Domain Model Driven Architecture. This infrastructure is being leveraged to create a Semantic Service-Oriented Architecture (SSOA) for cancer research by the National Cancer Institute's cancer Biomedical Informatics Grid (caBIG).
Maturity Model for Advancing Smart Grid Interoperability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Knight, Mark; Widergren, Steven E.; Mater, J.
2013-10-28
Interoperability is about the properties of devices and systems to connect and work properly. Advancing interoperability eases integration and maintenance of the resulting interconnection. This leads to faster integration, lower labor and component costs, predictability of projects and the resulting performance, and evolutionary paths for upgrade. When specifications are shared and standardized, competition and novel solutions can bring new value streams to the community of stakeholders involved. Advancing interoperability involves reaching agreement for how things join at their interfaces. The quality of the agreements and the alignment of parties involved in the agreement present challenges that are best met with process improvement techniques. The GridWise® Architecture Council (GWAC) sponsored by the United States Department of Energy is supporting an effort to use concepts from capability maturity models used in the software industry to advance interoperability of smart grid technology. An interoperability maturity model has been drafted and experience is being gained through trials on various types of projects and community efforts. This paper describes the value and objectives of maturity models, the nature of the interoperability maturity model and how it compares with other maturity models, and experiences gained with its use.
Laplante-Lévesque, Ariane; Abrams, Harvey; Bülow, Maja; Lunner, Thomas; Nelson, John; Riis, Søren Kamaric; Vanpoucke, Filiep
2016-10-01
This article describes the perspectives of hearing device manufacturers regarding the exciting developments that the Internet makes possible. Specifically, it proposes to join forces toward interoperability and standardization of Internet and audiology. A summary of why such a collaborative effort is required is provided from historical and scientific perspectives. A roadmap toward interoperability and standardization is proposed. Information and communication technologies improve the flow of health care data and pave the way to better health care. However, hearing-related products, features, and services are notoriously heterogeneous and incompatible with other health care systems (no interoperability). Standardization is the process of developing and implementing technical standards (e.g., Noah hearing database). All parties involved in interoperability and standardization realize mutual gains by making mutually consistent decisions. De jure (officially endorsed) standards can be developed in collaboration with large national health care systems as well as spokespeople for hearing care professionals and hearing device users. The roadmap covers mutual collaboration; data privacy, security, and ownership; compliance with current regulations; scalability and modularity; and the scope of interoperability and standards. We propose to join forces to pave the way to the interoperable Internet and audiology products, features, and services that the world needs.
Reflections on the role of open source in health information system interoperability.
Sfakianakis, S; Chronaki, C E; Chiarugi, F; Conforti, F; Katehakis, D G
2007-01-01
This paper reflects on the role of open source in health information system interoperability. Open source is a driving force in computer science research and the development of information systems. It facilitates the sharing of information and ideas, enables evolutionary development and open collaborative testing of code, and broadens the adoption of interoperability standards. In health care, information systems have been developed largely ad hoc, following proprietary specifications and customized design. However, the wide deployment of integrated services such as Electronic Health Records (EHRs) over regional health information networks (RHINs) relies on the interoperability of the underlying information systems and medical devices. This reflection is built on the experiences of the PICNIC project, which developed shared open source software infrastructure components for RHINs, and the OpenECG network, which offers open source components to lower the implementation cost of interoperability standards such as SCP-ECG in electrocardiography. Open source components implementing standards, together with a community providing feedback from real-world use, are key enablers of health care information system interoperability. Investing in open source is investing in interoperability and a vital aspect of a long-term strategy towards comprehensive health services and clinical research.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-14
..., Reliability, and Interoperability Council AGENCY: Federal Communications Commission. ACTION: Notice of public... persons that the Federal Communications Commission's (FCC) Communications Security, Reliability, and... the security, reliability, and interoperability of communications systems. On March 19, 2011, the FCC...
UAS Integration in the NAS Project: DAA-TCAS Interoperability "mini" HITL Primary Results
NASA Technical Reports Server (NTRS)
Rorie, Conrad; Fern, Lisa; Shively, Jay; Santiago, Confesor
2016-01-01
At the May 2015 SC-228 meeting, requirements for TCAS II interoperability became elevated in priority. A TCAS interoperability workgroup was formed to identify and address key issues and questions, and it came up with an initial list of questions and a plan to address them. As part of that plan, NASA proposed to run a mini HITL to address display, alerting, and guidance issues. A TCAS Interoperability Workshop was held to determine potential display, alerting, and guidance issues that could be explored in future NASA mini HITLs. The workshop produced consensus on the main functionality of DAA guidance when a TCAS II RA occurs, a prioritized list of independent variables for the experimental design, and a set of use cases to stress TCAS interoperability.
An Ontological Solution to Support Interoperability in the Textile Industry
NASA Astrophysics Data System (ADS)
Duque, Arantxa; Campos, Cristina; Jiménez-Ruiz, Ernesto; Chalmeta, Ricardo
Significant developments in information and communication technologies and challenging market conditions have forced enterprises to adapt their way of doing business. In this context, providing mechanisms to guarantee interoperability among heterogeneous organisations has become a critical issue. Even though prolific research has already been conducted in the area of enterprise interoperability, we have found that enterprises still struggle to introduce fully interoperable solutions, especially, in terms of the development and application of ontologies. Thus, the aim of this paper is to introduce basic ontology concepts in a simple manner and to explain the advantages of the use of ontologies to improve interoperability. We will also present a case study showing the implementation of an application ontology for an enterprise in the textile/clothing sector.
77 FR 37001 - Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-20
... of the Interoperability Services Layer, Attn: Ron Chen, 400 Gigling Road, Seaside, CA 93955. Title; Associated Form; and OMB Number: Interoperability Services Layer; OMB Control Number 0704-TBD. Needs and Uses... INFORMATION: Summary of Information Collection IoLS (Interoperability Layer Services) is an application in a...
He, Yongqun; Xiang, Zuoshuang; Zheng, Jie; Lin, Yu; Overton, James A; Ong, Edison
2018-01-12
Ontologies are critical to data/metadata and knowledge standardization, sharing, and analysis. With hundreds of biological and biomedical ontologies developed, it has become critical to ensure ontology interoperability and the usage of interoperable ontologies for standardized data representation and integration. The suite of web-based Ontoanimal tools (e.g., Ontofox, Ontorat, and Ontobee) support different aspects of extensible ontology development. By summarizing the common features of Ontoanimal and other similar tools, we identified and proposed an "eXtensible Ontology Development" (XOD) strategy and its associated four principles. These XOD principles reuse existing terms and semantic relations from reliable ontologies, develop and apply well-established ontology design patterns (ODPs), and involve community efforts to support new ontology development, promoting standardized and interoperable data and knowledge representation and integration. The adoption of the XOD strategy, together with robust XOD tool development, will greatly support ontology interoperability and robust ontology applications to support data to be Findable, Accessible, Interoperable and Reusable (i.e., FAIR).
Personal Health Records: Is Rapid Adoption Hindering Interoperability?
Studeny, Jana; Coustasse, Alberto
2014-01-01
The establishment of the Meaningful Use criteria has created a critical need for robust interoperability of health records. A universal definition of a personal health record (PHR) has not been agreed upon. Standardized code sets have been built for specific entities, but integration between them has not been supported. The purpose of this research study was to explore the hindrance and promotion of interoperability standards in relationship to PHRs to describe interoperability progress in this area. The study was conducted following the basic principles of a systematic review, with 61 articles used in the study. Lagging interoperability has stemmed from slow adoption by patients, creation of disparate systems due to rapid development to meet requirements for the Meaningful Use stages, and rapid early development of PHRs prior to the mandate for integration among multiple systems. Findings of this study suggest that deadlines for implementation to capture Meaningful Use incentive payments are supporting the creation of PHR data silos, thereby hindering the goal of high-level interoperability. PMID:25214822
Semantic Mappings and Locality of Nursing Diagnostic Concepts in UMLS
Kim, Tae Youn; Coenen, Amy; Hardiker, Nicholas
2011-01-01
One solution for enhancing the interoperability between nursing information systems, given the availability of multiple nursing terminologies, is to cross-map existing nursing concepts. The Unified Medical Language System (UMLS) developed and distributed by the National Library of Medicine (NLM) is a knowledge resource containing cross-mappings of various terminologies in a unified framework. While the knowledge resource has been available for the last two decades, little research on the representation of nursing terminologies in UMLS has been conducted. As a first step, UMLS semantic mappings and concept locality were examined for nursing diagnostic concepts or problems selected from three terminologies (i.e., CCC, ICNP, and NANDA-I) along with corresponding SNOMED CT concepts. The evaluation of UMLS semantic mappings was conducted by measuring the proportion of concordance between UMLS and human expert mappings. The semantic locality of nursing diagnostic concepts was assessed by examining the associations of select concepts and the placement of the nursing concepts on the Semantic Network and Group. The study found that the UMLS mappings of CCC and NANDA-I concepts to SNOMED CT were highly concordant to expert mappings. The level of concordance in mappings of ICNP to SNOMED CT, CCC and NANDA-I within UMLS was relatively low, indicating the need for further research and development. Likewise, the semantic locality of ICNP concepts could be further improved. Various stakeholders need to collaborate to enhance the NLM knowledge resource and the interoperability of nursing data within the discipline as well as across health-related disciplines. PMID:21951759
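The concordance measure reported in this study, the proportion of source concepts on which the UMLS and expert mappings agree, can be illustrated with a minimal sketch. The concept labels and SNOMED CT-style codes below are invented for the example and are not the study's actual data.

```python
# Illustrative sketch of measuring mapping concordance between two
# mapping tables (e.g., UMLS-derived vs. human-expert mappings).
# All concepts and codes are made up for demonstration.

def concordance(map_a, map_b):
    """Proportion of shared source concepts mapped to the same target."""
    shared = set(map_a) & set(map_b)
    if not shared:
        return 0.0
    agree = sum(1 for concept in shared if map_a[concept] == map_b[concept])
    return agree / len(shared)


umls_map = {"pain": "SCT:22253000", "anxiety": "SCT:48694002",
            "fatigue": "SCT:84229001"}
expert_map = {"pain": "SCT:22253000", "anxiety": "SCT:197480006",
              "fatigue": "SCT:84229001"}

# Two of the three shared concepts agree.
print(round(concordance(umls_map, expert_map), 2))  # → 0.67
```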
Organisational Interoperability: Evaluation and Further Development of the OIM Model
2003-06-01
an Organizational Interoperability Maturity Model (OIM) to evaluate interoperability at the organizational level. The OIM considers the human ... activity aspects of military operations, which are not covered in other models. This paper describes how the model has been used to identify problems and to
Droc, Gaëtan; Larivière, Delphine; Guignon, Valentin; Yahiaoui, Nabila; This, Dominique; Garsmeur, Olivier; Dereeper, Alexis; Hamelin, Chantal; Argout, Xavier; Dufayard, Jean-François; Lengelle, Juliette; Baurens, Franc-Christophe; Cenci, Alberto; Pitollat, Bertrand; D’Hont, Angélique; Ruiz, Manuel; Rouard, Mathieu; Bocs, Stéphanie
2013-01-01
Banana is one of the world’s favorite fruits and one of the most important crops for developing countries. The banana reference genome sequence (Musa acuminata) was recently released. Given the taxonomic position of Musa, the completed genomic sequence has particular comparative value for providing fresh insights into the evolution of the monocotyledons. The study of the banana genome has been enhanced by a number of tools and resources that allow its sequence to be harnessed. First, we set up essential tools such as a Community Annotation System, phylogenomics resources, and metabolic pathways. Then, to support post-genomic efforts, we improved existing banana systems (e.g. web front end, query builder), integrated available Musa data into generic systems (e.g. markers and genetic maps, synteny blocks), made other existing systems containing Musa data (e.g. transcriptomics, rice reference genome, workflow manager) interoperable with the banana hub, and finally generated new results from sequence analyses (e.g. SNP and polymorphism analysis). Several use cases illustrate how the Banana Genome Hub can be used to study gene families. Overall, through this collaborative effort, we discuss the importance of interoperability for data integration between existing information systems. Database URL: http://banana-genome.cirad.fr/ PMID:23707967
Borland, Rob; Barasa, Mourice; Iiams-Hauser, Casey; Velez, Olivia; Kaonga, Nadi Nina; Berg, Matt
2013-01-01
The purpose of this paper is to illustrate the importance of using open source technologies and common standards for interoperability when implementing eHealth systems and illustrate this through case studies, where possible. The sources used to inform this paper draw from the implementation and evaluation of the eHealth Program in the context of the Millennium Villages Project (MVP). As the eHealth Team was tasked to deploy an eHealth architecture, the Millennium Villages Global-Network (MVG-Net), across all fourteen of the MVP sites in Sub-Saharan Africa, the team recognized the need for standards and uniformity but also realized that context would be an important factor. Therefore, the team decided to utilize open source solutions. The MVP implementation of MVG-Net provides a model for those looking to implement informatics solutions across disciplines and countries. Furthermore, there are valuable lessons learned that the eHealth community can benefit from. By sharing lessons learned and developing an accessible, open-source eHealth platform, we believe that we can more efficiently and rapidly achieve the health-related and collaborative Millennium Development Goals (MDGs). PMID:22894051
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-13
..., Reliability, and Interoperability Council AGENCY: Federal Communications Commission. ACTION: Notice of public..., Reliability, and Interoperability Council (CSRIC) will hold its fifth meeting. The CSRIC will vote on... to the FCC regarding best practices and actions the FCC can take to ensure the security, reliability...
Evaluation of Interoperability Protocols in Repositories of Electronic Theses and Dissertations
ERIC Educational Resources Information Center
Hakimjavadi, Hesamedin; Masrek, Mohamad Noorman
2013-01-01
Purpose: The purpose of this study is to evaluate the status of eight interoperability protocols within repositories of electronic theses and dissertations (ETDs) as an introduction to further studies on feasibility of deploying these protocols in upcoming areas of interoperability. Design/methodology/approach: Three surveys of 266 ETD…
Examining the Relationship between Electronic Health Record Interoperability and Quality Management
ERIC Educational Resources Information Center
Purcell, Bernice M.
2013-01-01
A lack of interoperability impairs data quality among health care providers' electronic health record (EHR) systems. The problem is whether the International Organization for Standardization (ISO) 9000 principles relate to the problem of interoperability in implementation of EHR systems. The purpose of the nonexperimental quantitative research…
Interoperability of Demand Response Resources Demonstration in NY
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wellington, Andre
2014-03-31
The Interoperability of Demand Response Resources Demonstration in NY (Interoperability Project) was awarded to Con Edison in 2009. The objective of the project was to develop and demonstrate methodologies to enhance the ability of customer sited Demand Response resources to integrate more effectively with electric delivery companies and regional transmission organizations.
Watershed and Economic Data InterOperability (WEDO) is a system of information technologies designed to publish watershed modeling studies for reuse. WEDO facilitates three aspects of interoperability: discovery, evaluation and integration of data. This increased level of interop...
Reminiscing about 15 years of interoperability efforts
Van de Sompel, Herbert; Nelson, Michael L.
2015-11-01
Over the past fifteen years, our perspective on tackling information interoperability problems for web-based scholarship has evolved significantly. In this opinion piece, we look back at three efforts that we have been involved in that aptly illustrate this evolution: OAI-PMH, OAI-ORE, and Memento. Understanding that no interoperability specification is neutral, we attempt to characterize the perspectives and technical toolkits that provided the basis for these endeavors. In that regard, we consider repository-centric and web-centric interoperability perspectives, and the use of a Linked Data or a REST/HATEOAS technology stack, respectively. In addition, we lament the lack of interoperability across nodes that play a role in web-based scholarship, but end on a constructive note with some ideas regarding a possible path forward.
The HDF Product Designer - Interoperability in the First Mile
NASA Astrophysics Data System (ADS)
Lee, H.; Jelenak, A.; Habermann, T.
2014-12-01
Interoperable data have been a long-time goal in many scientific communities. The recent growth in analysis, visualization and mash-up applications that expect data stored in a standardized manner has brought the interoperability issue to the fore. On the other hand, producing interoperable data is often regarded as a sideline task in a typical research team for which resources are not readily available. The HDF Group is developing a software tool aimed at lessening the burden of creating data in standards-compliant, interoperable HDF5 files. The tool, named HDF Product Designer, lowers the threshold needed to design such files by providing a user interface that combines the rich HDF5 feature set with applicable metadata conventions. Users can quickly devise new HDF5 files while at the same time seamlessly incorporating the latest best practices and conventions from their community. That is what the term interoperability in the first mile means: enabling generation of interoperable data in HDF5 files from the onset of their production. The tool also incorporates collaborative features, allowing team approach in the file design, as well as easy transfer of best practices as they are being developed. The current state of the tool and the plans for future development will be presented. Constructive input from interested parties is always welcome.
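The design-first idea described above, validating a dataset definition against a community convention before any HDF5 file is ever written, can be sketched as follows. This is a hypothetical illustration, not HDF Product Designer code; the required-attribute convention and dataset names are assumed for the example.

```python
# Sketch of "interoperability in the first mile": check a dataset design
# against a (hypothetical) community convention up front, so the file is
# standards-compliant from the onset of production.

CONVENTION_REQUIRED = {"units", "long_name"}  # assumed convention, for illustration


def design_dataset(name, dtype, shape, **attrs):
    """Return a dataset design dict, rejecting convention violations early."""
    missing = CONVENTION_REQUIRED - attrs.keys()
    if missing:
        raise ValueError(f"{name}: missing required attributes {sorted(missing)}")
    return {"name": name, "dtype": dtype, "shape": shape, "attrs": attrs}


# A compliant design passes validation before any file I/O happens.
ds = design_dataset("temperature", "f4", (180, 360),
                    units="K", long_name="surface temperature")
print(ds["attrs"]["units"])  # → K
```

The same design dict could later drive actual file creation with an HDF5 library, which is the separation the tool's "design first, write later" workflow suggests.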
Potential interoperability problems facing multi-site radiation oncology centers in The Netherlands
NASA Astrophysics Data System (ADS)
Scheurleer, J.; Koken, Ph; Wessel, R.
2014-03-01
Aim: To identify potential interoperability problems facing multi-site Radiation Oncology (RO) departments in the Netherlands and solutions for unambiguous multi-system workflows. Specific challenges confronting the RO department of VUmc (RO-VUmc), which is soon to open a satellite department, were characterized. Methods: A nationwide questionnaire survey was conducted to identify possible interoperability problems and solutions. Further detailed information was obtained by in-depth interviews at 3 Dutch RO institutes that already operate in more than one site. Results: The survey had a 100% response rate (n=21). Altogether 95 interoperability problems were described. Most reported problems were on a strategic and semantic level. The majority were DICOM(-RT) and HL7 related (n=65), primarily between treatment planning and verification systems or between departmental and hospital systems. Seven were identified as being relevant for RO-VUmc. Departments have overcome interoperability problems with their own, or with tailor-made vendor solutions. There was little knowledge about or utilization of solutions developed by Integrating the Healthcare Enterprise Radiation Oncology (IHE-RO). Conclusions: Although interoperability problems are still common, solutions have been identified. Awareness of IHE-RO needs to be raised. No major new interoperability problems are predicted as RO-VUmc develops into a multi-site department.
NASA Astrophysics Data System (ADS)
Torres, Y.; Escalante, M. P.
2009-04-01
This work illustrates the advantages of using a Geographic Information System in a cooperative project with researchers from different countries, such as the RESIS II project (financed by the Norwegian Government and managed by CEPREDENAC) for seismic hazard assessment of Central America. As the input data come in different formats, cover distinct geographical areas, and are subject to different interpretations, data inconsistencies may appear and their management becomes complicated. Achieving data homogenization and integrating the data in a GIS first requires developing a conceptual model. This is accomplished in two phases: requirements analysis and conceptualization. The Unified Modeling Language (UML) is used to compose the conceptual model of the GIS. UML complies with ISO 19100 norms and allows the designer to define the model architecture and interoperability. The GIS provides a framework for combining large volumes of geographically based data with a uniform geographic reference and without duplication. All this information carries its own metadata following the ISO 19115 standard. In this work, integrating active fault and subduction slab geometries with epicentre locations in the same environment has facilitated the definition of seismogenetic regions, greatly easing teamwork among national specialists from different countries. The GIS capacity for queries (by location and by attributes) and geostatistical analyses is used to interpolate discrete data resulting from seismic hazard calculations, to create continuous maps, and to check and validate partial results of the study. GIS-based products, such as complete, homogenised databases and thematic cartography of the region, are distributed to all researchers, facilitating cross-national communication, project execution, and results dissemination.
Connected Lighting System Interoperability Study Part 1: Application Programming Interfaces
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gaidon, Clement; Poplawski, Michael
The first in a series of studies focusing on interoperability as realized by the use of Application Programming Interfaces (APIs), this report explores the diversity of such interfaces in several connected lighting systems; characterizes the extent of interoperability that they provide; and illustrates challenges, limitations, and tradeoffs that were encountered during this exploration.
Smart Grid Interoperability Maturity Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Widergren, Steven E.; Levinson, Alex; Mater, J.
2010-04-28
The integration of automation associated with electricity resources (including transmission and distribution automation and demand-side resources operated by end-users) is key to supporting greater efficiencies and incorporating variable renewable resources and electric vehicles into the power system. The integration problems faced by this community are analogous to those faced in the health industry, emergency services, and other complex communities with many stakeholders. To highlight this issue and encourage communication and the development of a smart grid interoperability community, the GridWise Architecture Council (GWAC) created an Interoperability Context-Setting Framework. This "conceptual model" has been helpful to explain the importance of organizational alignment in addition to technical and informational interface specifications for "smart grid" devices and systems. As a next step to building a community sensitive to interoperability, the GWAC is investigating an interoperability maturity model (IMM) based on work done by others to address similar circumstances. The objective is to create a tool or set of tools that encourages a culture of interoperability in this emerging community. The tools would measure status and progress, analyze gaps, and prioritize efforts to improve the situation.
Enabling model checking for collaborative process analysis: from BPMN to `Network of Timed Automata'
NASA Astrophysics Data System (ADS)
Mallek, Sihem; Daclin, Nicolas; Chapurlat, Vincent; Vallespir, Bruno
2015-04-01
Interoperability is a prerequisite for partners involved in performing collaboration. As a consequence, the lack of interoperability is now considered a major obstacle. The research work presented in this paper aims to develop an approach that allows specifying and verifying a set of interoperability requirements to be satisfied by each partner in the collaborative process prior to process implementation. To enable the verification of these interoperability requirements, it is necessary first and foremost to generate a model of the targeted collaborative process; for this research effort, the standardised language BPMN 2.0 is used. Afterwards, a verification technique must be introduced, and model checking is the preferred option herein. This paper focuses on application of the model checker UPPAAL in order to verify interoperability requirements for the given collaborative process model. At first, this step entails translating the collaborative process model from BPMN into a UPPAAL modelling language called 'Network of Timed Automata'. Second, it becomes necessary to formalise interoperability requirements into properties with the dedicated UPPAAL language, i.e. the temporal logic TCTL.
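The verification step can be illustrated with a minimal, untimed sketch: the collaborative process becomes a finite transition system, and an interoperability requirement becomes a reachability query, an untimed analogue of the TCTL property E<> goal. The two-partner process below is invented for illustration; UPPAAL itself explores networks of timed automata with clock constraints.

```python
from collections import deque

# Hypothetical toy model: a two-partner exchange, abstracted as a finite
# transition system. Each state is (partner A's state, partner B's state).
TRANSITIONS = {
    ("idle", "idle"): [("sent", "idle")],          # A sends a request
    ("sent", "idle"): [("sent", "received")],      # B receives it
    ("sent", "received"): [("done", "replied")],   # B replies, A finishes
}

def reachable(initial, goal_predicate):
    """Breadth-first search for a state satisfying the predicate,
    i.e. an untimed analogue of checking the TCTL property E<> goal."""
    seen, frontier = {initial}, deque([initial])
    while frontier:
        state = frontier.popleft()
        if goal_predicate(state):
            return True
        for nxt in TRANSITIONS.get(state, []):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return False

# Interoperability requirement: "B eventually receives A's message".
print(reachable(("idle", "idle"), lambda s: s[1] == "received"))  # True
```

A real property in UPPAAL would additionally constrain clocks, e.g. that the reply arrives within a deadline, which is what the "Timed" in `Network of Timed Automata` buys over this sketch.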
NASA Astrophysics Data System (ADS)
Cole, M.; Alameh, N.; Bambacus, M.
2006-05-01
The Applied Sciences Program at NASA focuses on extending the results of NASA's Earth-Sun system science research beyond the science and research communities to contribute to national priority applications with societal benefits. By employing a systems engineering approach, supporting interoperable data discovery and access, and developing partnerships with federal agencies and national organizations, the Applied Sciences Program facilitates the transition from research to operations in national applications. In particular, the Applied Sciences Program identifies twelve national applications, listed at http://science.hq.nasa.gov/earth-sun/applications/, which can be best served by the results of NASA aerospace research and development of science and technologies. The ability to use and integrate NASA data and science results into these national applications results in enhanced decision support and significant socio-economic benefits for each of the applications. This paper focuses on leveraging the power of interoperability and specifically open standard interfaces in providing efficient discovery, retrieval, and integration of NASA's science research results. Interoperability (the ability to access multiple, heterogeneous geoprocessing environments, either local or remote by means of open and standard software interfaces) can significantly increase the value of NASA-related data by increasing the opportunities to discover, access and integrate that data in the twelve identified national applications (particularly in non-traditional settings). Furthermore, access to data, observations, and analytical models from diverse sources can facilitate interdisciplinary and exploratory research and analysis. To streamline this process, the NASA GeoSciences Interoperability Office (GIO) is developing the NASA Earth-Sun System Gateway (ESG) to enable access to remote geospatial data, imagery, models, and visualizations through open, standard web protocols. 
The gateway (online at http://esg.gsfc.nasa.gov) acts as a flexible and searchable registry of NASA-related resources (files, services, models, etc) and allows scientists, decision makers and others to discover and retrieve a wide variety of observations and predictions of natural and human phenomena related to Earth Science from NASA and other sources. To support the goals of the Applied Sciences national applications, GIO staff is also working with the national applications communities to identify opportunities where open standards-based discovery and access to NASA data can enhance the decision support process of the national applications. This paper describes the work performed to-date on that front, and summarizes key findings in terms of identified data sources and benefiting national applications. The paper also highlights the challenges encountered in making NASA-related data accessible in a cross-cutting fashion and identifies areas where interoperable approaches can be leveraged.
NASA Astrophysics Data System (ADS)
Bambacus, M.; Alameh, N.; Cole, M.
2006-12-01
The Applied Sciences Program at NASA focuses on extending the results of NASA's Earth-Sun system science research beyond the science and research communities to contribute to national priority applications with societal benefits. By employing a systems engineering approach, supporting interoperable data discovery and access, and developing partnerships with federal agencies and national organizations, the Applied Sciences Program facilitates the transition from research to operations in national applications. In particular, the Applied Sciences Program identifies twelve national applications, listed at http://science.hq.nasa.gov/earth-sun/applications/, which can be best served by the results of NASA aerospace research and development of science and technologies. The ability to use and integrate NASA data and science results into these national applications results in enhanced decision support and significant socio-economic benefits for each of the applications. This paper focuses on leveraging the power of interoperability and specifically open standard interfaces in providing efficient discovery, retrieval, and integration of NASA's science research results. Interoperability (the ability to access multiple, heterogeneous geoprocessing environments, either local or remote by means of open and standard software interfaces) can significantly increase the value of NASA-related data by increasing the opportunities to discover, access and integrate that data in the twelve identified national applications (particularly in non-traditional settings). Furthermore, access to data, observations, and analytical models from diverse sources can facilitate interdisciplinary and exploratory research and analysis. To streamline this process, the NASA GeoSciences Interoperability Office (GIO) is developing the NASA Earth-Sun System Gateway (ESG) to enable access to remote geospatial data, imagery, models, and visualizations through open, standard web protocols. 
The gateway (online at http://esg.gsfc.nasa.gov) acts as a flexible and searchable registry of NASA-related resources (files, services, models, etc) and allows scientists, decision makers and others to discover and retrieve a wide variety of observations and predictions of natural and human phenomena related to Earth Science from NASA and other sources. To support the goals of the Applied Sciences national applications, GIO staff is also working with the national applications communities to identify opportunities where open standards-based discovery and access to NASA data can enhance the decision support process of the national applications. This paper describes the work performed to-date on that front, and summarizes key findings in terms of identified data sources and benefiting national applications. The paper also highlights the challenges encountered in making NASA-related data accessible in a cross-cutting fashion and identifies areas where interoperable approaches can be leveraged.
NASA Technical Reports Server (NTRS)
Stephens, J. Briscoe; Grider, Gary W.
1992-01-01
These Earth Science and Applications Division-Data and Information System (ESAD-DIS) interoperability requirements are designed to quantify the Earth Science and Applications Division's hardware and software requirements in terms of communications between personal computers, visualization workstations, and mainframe computers. The electronic mail requirements and local area network (LAN) requirements are addressed. These interoperability requirements are top-level requirements framed around defining the existing ESAD-DIS interoperability and projecting known near-term requirements for both operational support and management planning. Detailed requirements will be submitted on a case-by-case basis. This document is also intended as an overview of ESAD-DIS interoperability for newcomers and managers not familiar with these activities, and as background documentation to support requests for resources and support requirements.
An open repositories network development for medical teaching resources.
Soula, Gérard; Darmoni, Stefan; Le Beux, Pierre; Renard, Jean-Marie; Dahamna, Badisse; Fieschi, Marius
2010-01-01
The lack of interoperability between repositories of heterogeneous and geographically widespread data is an obstacle to the diffusion, sharing and reuse of those data. We present the development of an open repositories network taking into account both the syntactic and semantic interoperability of the different repositories, based on international standards in this field. The network is used by the medical community in France for the diffusion and sharing of digital teaching resources. The syntactic interoperability of the repositories is managed using the OAI-PMH protocol for the exchange of metadata describing the resources. Semantic interoperability is based, on the one hand, on the LOM standard for the description of resources and on MeSH for their indexing and, on the other hand, on semantic interoperability management designed to optimize compliance with standards and the quality of the metadata.
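The OAI-PMH exchange such a network relies on can be sketched as follows: a harvester issues verb-based HTTP requests and parses Dublin Core metadata out of the XML response. This is a minimal sketch; the repository URL and the sample record are invented, and a real harvester would also follow resumptionToken paging and handle OAI error responses.

```python
import urllib.parse
import xml.etree.ElementTree as ET

DC = "{http://purl.org/dc/elements/1.1/}"  # Dublin Core namespace

def list_records_url(base_url, metadata_prefix="oai_dc", resumption=None):
    """Build an OAI-PMH ListRecords request URL, the harvesting verb
    used to pull metadata records from a repository."""
    if resumption:
        params = {"verb": "ListRecords", "resumptionToken": resumption}
    else:
        params = {"verb": "ListRecords", "metadataPrefix": metadata_prefix}
    return base_url + "?" + urllib.parse.urlencode(params)

def record_titles(response_xml):
    """Extract dc:title values from a ListRecords response."""
    root = ET.fromstring(response_xml)
    return [t.text for t in root.iter(DC + "title")]

# Invented sample response, trimmed to the elements we parse.
sample = """<OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/">
  <ListRecords><record><metadata>
    <oai_dc:dc xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/"
               xmlns:dc="http://purl.org/dc/elements/1.1/">
      <dc:title>Cardiology teaching module</dc:title>
    </oai_dc:dc>
  </metadata></record></ListRecords></OAI-PMH>"""

print(list_records_url("http://repo.example.org/oai"))
print(record_titles(sample))  # ['Cardiology teaching module']
```

The division of labour in the abstract maps directly onto this sketch: OAI-PMH carries the records (syntactic interoperability), while the LOM/MeSH content inside the metadata element carries the meaning (semantic interoperability).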
Ensuring Sustainable Data Interoperability Across the Natural and Social Sciences
NASA Astrophysics Data System (ADS)
Downs, R. R.; Chen, R. S.
2015-12-01
Both the natural and social science data communities are attempting to address the long-term sustainability of their data infrastructures in rapidly changing research, technological, and policy environments. Many parts of these communities are also considering how to improve the interoperability and integration of their data and systems across natural, social, health, and other domains. However, these efforts have generally been undertaken in parallel, with little thought about how different sustainability approaches may impact long-term interoperability from scientific, legal, or economic perspectives, or vice versa, i.e., how improved interoperability could enhance—or threaten—infrastructure sustainability. Scientific progress depends substantially on the ability to learn from the legacy of previous work available for current and future scientists to study, often by integrating disparate data not previously assembled. Digital data are less likely than scientific publications to be usable in the future unless they are managed by science-oriented repositories that can support long-term data access with the documentation and services needed for future interoperability. We summarize recent discussions in the social and natural science communities on emerging approaches to sustainability and relevant interoperability activities, including efforts by the Belmont Forum E-Infrastructures project to address global change data infrastructure needs; the Group on Earth Observations to further implement data sharing and improve data management across diverse societal benefit areas; and the Research Data Alliance to develop legal interoperability principles and guidelines and to address challenges faced by domain repositories. 
We also examine emerging needs for data interoperability in the context of the post-2015 development agenda and the expected set of Sustainable Development Goals (SDGs), which set ambitious targets for sustainable development, poverty reduction, and environmental stewardship by 2030. These efforts suggest the need for a holistic approach towards improving and implementing strategies, policies, and practices that will ensure long-term sustainability and interoperability of scientific data repositories and networks across multiple scientific domains.
A Pragmatic Approach to Sustainable Interoperability for the Web 2.0 World
NASA Astrophysics Data System (ADS)
Wright, D. J.; Sankaran, S.
2015-12-01
In the geosciences, interoperability is a fundamental requirement. Members of various standards organizations such as the OGC and ISO/TC 211 have done yeoman service to promote a standards-centric approach to managing the interoperability challenges that organizations face today. The specific challenges that organizations face when adopting interoperability patterns are many. One approach, that of mandating the use of specific standards, has been reasonably successful. But scientific communities, as with all others, ultimately want their solutions to be widely accepted and used. To this end there is a crying need to explore all possible interoperability patterns without restricting the choices to mandated standards. Standards are created by a slow and deliberative process that sometimes takes a long time to come to fruition and can therefore seem to fall short of user expectations. Organizations are thus left with a series of seemingly orthogonal requirements when they want to pursue interoperability: they want a robust but agile solution, a mature approach that also satisfies the latest technology trends, and so on. Sustainable interoperability patterns need to be forward looking and should choose the patterns and paradigms of the Web 2.0 generation. To this end, the key is to choose platform technologies that embrace multiple interoperability mechanisms that are built on fundamental "open" principles and which align with popular mainstream patterns. We seek to explore data-, metadata- and web service-related interoperability patterns through the prism of building solutions that encourage strong implementer and end-user engagement, improved usability and scalability considerations, and appealing developer frameworks that can grow the audience.
The path to tread is not new, and the geocommunity only needs to observe and align its end goals with current Web 2.0 patterns to realize all the benefits that today we all take for granted as part of our everyday use of technology.
NASA Astrophysics Data System (ADS)
Oggioni, A.; Tagliolato, P.; Schleidt, K.; Carrara, P.; Grellet, S.; Sarretta, A.
2016-02-01
The state of the art in biodiversity data management unfortunately encompasses a plethora of diverse data formats. Compared to other research fields, there is a lack of harmonization and standardization of these data. While data from traditional biodiversity collections (e.g. from museums) can easily be represented by existing standards as provided by TDWG, the growing number of field observations stemming both from VGI activities (e.g. iNaturalist) and from automated systems (e.g. animal biotelemetry) would at the very least require upgrades of current formats. Moreover, from an eco-informatics perspective, the integration and use of data from different scientific fields is the norm (abiotic data, geographic information, etc.); the possibility to represent this information and biodiversity data in a homogeneous way would be an advantage for interoperability, allowing for easy integration across environmental media. We will discuss the possibility to exploit the Open Geospatial Consortium/ISO standard Observations and Measurements (O&M) [1], a generic conceptual model developed for observation data but with strong analogies to the biodiversity-oriented OBOE ontology [2]. The applicability of OGC O&M for the provision of biodiversity occurrence data has been suggested by the INSPIRE Cross Thematic Working Group on Observations & Measurements [3], the INSPIRE Environmental Monitoring Facilities Thematic Working Group [4] and the New Zealand Environmental Information Interoperability Framework [5]. This approach, in our opinion, could be an advantage for the biodiversity community. We will provide some examples of encoding biodiversity occurrence data using the O&M standard, in addition to highlighting the advantages offered by O&M in comparison to other representation formats. [1] Cox, S. (2013). Geographic information - Observations and measurements - OGC and ISO 19156. [2] Madin, J., Bowers, S., Schildhauer, M., Krivov, S., Pennington, D., & Villa, F. (2007). An ontology for describing and synthesizing ecological observation data. Ecological Informatics, 2(3), 279-296. [3] INSPIRE_D2.9_O&M_Guidelines_v2.0rc3.pdf [4] INSPIRE_DataSpecification_EF_v3.0.pdf [5] Watkins, A. (2012) Biodiversity Interoperability through Open Geospatial Standards
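A hedged sketch of what casting an occurrence record into O&M's core slots might look like (the field values and identifiers below are invented; a real encoding would use the standard's XML or JSON schemas rather than a plain data class):

```python
from dataclasses import dataclass

# The five core slots of an ISO 19156 / OGC O&M observation, reduced to
# a plain Python structure for illustration.
@dataclass
class Observation:
    feature_of_interest: str   # the sampled feature, e.g. a survey site
    observed_property: str     # what was observed
    procedure: str             # the sensor or method that produced it
    phenomenon_time: str       # when the phenomenon occurred (ISO 8601)
    result: dict               # the observation result

# An invented species-occurrence record: the "result" carries the
# biodiversity payload, while the other slots carry the observational
# context shared with abiotic domains (weather, hydrology, ...).
occurrence = Observation(
    feature_of_interest="http://example.org/site/lake-maggiore",
    observed_property="species-occurrence",
    procedure="citizen-science-mobile-app",
    phenomenon_time="2015-06-21T09:30:00Z",
    result={"taxon": "Alcedo atthis", "count": 2},
)
print(occurrence.result["taxon"])  # Alcedo atthis
```

The point of the mapping is visible even at this scale: because the context slots are domain-neutral, the same structure can hold a water-temperature reading or a bird sighting, which is exactly the cross-media integration the abstract argues for.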
An interoperability experiment for sharing hydrological rating tables
NASA Astrophysics Data System (ADS)
Lemon, D.; Taylor, P.; Sheahan, P.
2013-12-01
The increasing demand on freshwater resources is requiring authorities to produce more accurate and timely estimates of their available water. Calculation of continuous time-series of river discharge and storage volumes generally requires rating tables. These approximate relationships between two phenomena, such as river level and discharge, and allow us to produce continuous estimates of a phenomenon that may be impractical or impossible to measure directly. Standardised information models or access mechanisms for rating tables are required to support sharing and exchange of water flow data. An Interoperability Experiment (IE) is underway to test an information model that describes rating tables, the observations made to build these ratings, and river cross-section data. The IE is an initiative of the joint World Meteorological Organisation/Open Geospatial Consortium's Hydrology Domain Working Group (HydroDWG) and the model will be published as WaterML2.0 part 2. Interoperability Experiments (IEs) are low overhead, multiple member projects that are run under the OGC's interoperability program to test existing and emerging standards. The HydroDWG has previously run IEs to test early versions of OGC WaterML2.0 part 1 - timeseries. This IE is focussing on two key exchange scenarios: Sharing rating tables and gauging observations between water agencies. Through the use of standard OGC web services, rating tables and associated data will be made available from water agencies. The (Australian) Bureau of Meteorology will retrieve rating tables on-demand from water authorities, allowing the Bureau to run conversions of data within their own systems. Exposing rating tables and gaugings for online analysis and educational purposes. A web client will be developed to enable exploration and visualization of rating tables, gaugings and related metadata for monitoring points. 
The client gives a quick view into available rating tables, their periods of applicability and the standard deviation of observations against the relationship. An example of this client running can be seen at the link provided. The result of the IE will form the basis for the standardisation of WaterML2.0 part 2. The use of the standard will lead to increased transparency and accessibility of rating tables, while also improving general understanding of this important hydrological concept.
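The role of a rating table can be made concrete with a small sketch: gauged (level, discharge) pairs plus interpolation turn a continuous river-level record into a continuous discharge estimate. The table values below are invented, and real ratings are often piecewise power laws with shift corrections rather than linear segments.

```python
from bisect import bisect_left

# Invented stage-discharge rating table: (river level in m, discharge in m^3/s)
RATING = [
    (0.50, 1.2),
    (1.00, 6.8),
    (2.00, 30.5),
    (3.00, 82.0),
]

def discharge(level):
    """Convert a river level to a discharge estimate by linear
    interpolation between the gauged points of the rating table."""
    levels = [p[0] for p in RATING]
    if not levels[0] <= level <= levels[-1]:
        raise ValueError("level outside the rated range")
    i = bisect_left(levels, level)
    if levels[i] == level:
        return RATING[i][1]
    (l0, q0), (l1, q1) = RATING[i - 1], RATING[i]
    return q0 + (q1 - q0) * (level - l0) / (l1 - l0)

print(round(discharge(1.50), 2))  # 18.65, midway between 6.8 and 30.5
```

Exchanging the table itself (plus the gaugings behind it), rather than only pre-converted discharge series, is what lets a downstream agency such as the Bureau rerun this conversion inside its own systems, which is the first exchange scenario of the IE.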
NASA Astrophysics Data System (ADS)
Arney, David; Goldman, Julian M.; Whitehead, Susan F.; Lee, Insup
When an x-ray image is needed during surgery, clinicians may stop the anesthesia machine ventilator while the exposure is made. If the ventilator is not restarted promptly, the patient may experience severe complications. This paper explores the interconnection of a ventilator and a simulated x-ray machine into a prototype plug-and-play medical device system. This work assists ongoing standards efforts on interoperability frameworks in developing functional and non-functional requirements, and it illustrates the potential patient safety benefits of interoperable medical device systems by implementing a solution to a clinical use case requiring interoperability.
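The safety logic of the use case can be sketched as a tiny supervisor that only triggers the exposure inside a controlled pause and always restarts the ventilator. Class and method names are invented for illustration; a real implementation would run on an interoperability platform like the prototype the paper describes, with timeouts and fault handling.

```python
class Ventilator:
    """Minimal stand-in for a networked anesthesia ventilator."""
    def __init__(self):
        self.running = True
    def pause(self):
        self.running = False
    def resume(self):
        self.running = True

class XRaySupervisor:
    """Coordinates the exposure with a ventilation pause, so the
    ventilator cannot be left off by human error."""
    def __init__(self, ventilator):
        self.ventilator = ventilator
        self.events = []
    def acquire_image(self):
        self.ventilator.pause()            # synchronize: pause breathing
        self.events.append("exposure")     # make the exposure
        self.ventilator.resume()           # always restart automatically
        return self.ventilator.running

vent = Ventilator()
print(XRaySupervisor(vent).acquire_image())  # True: ventilator restarted
```

The interoperability point is that this guarantee only exists when the two devices expose controllable interfaces to a common supervisor; with standalone devices, restarting the ventilator depends entirely on the clinician remembering to do it.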
Report on the Second Catalog Interoperability Workshop
NASA Technical Reports Server (NTRS)
Thieman, James R.; James, Mary E.
1988-01-01
The events, resolutions, and recommendations of the Second Catalog Interoperability Workshop, held at JPL in January 1988, are discussed. This workshop dealt with the issues of standardization and communication among directories, catalogs, and inventories in the earth and space science data management environment. The Directory Interchange Format, being constructed as a standard for the exchange of directory information among participating data systems, is discussed. Involvement in the interoperability effort by NASA, NOAA, USGS, and NSF is described, and plans for future interoperability are considered. The NASA Master Directory prototype is presented and critiqued, and options for additional capabilities are debated.
A logical approach to semantic interoperability in healthcare.
Bird, Linda; Brooks, Colleen; Cheong, Yu Chye; Tun, Nwe Ni
2011-01-01
Singapore is in the process of rolling out a number of national e-health initiatives, including the National Electronic Health Record (NEHR). A critical enabler in the journey towards semantic interoperability is a Logical Information Model (LIM) that harmonises the semantics of the information structure with the terminology. The Singapore LIM uses a combination of international standards, including ISO 13606-1 (a reference model for electronic health record communication), ISO 21090 (healthcare datatypes), and SNOMED CT (healthcare terminology). The LIM is accompanied by a logical design approach, used to generate interoperability artifacts, and incorporates mechanisms for achieving unidirectional and bidirectional semantic interoperability.
Kilańska, D; Gaworska-Krzemińska, A; Grabowska, H; Gorzkowicz, B
2016-09-01
The development of nursing practice, improvements in nurses' autonomy, and increased professional and personal responsibility for the medical services provided all require professional documentation with records of health status assessments, decisions undertaken, actions and their outcomes for each patient. The International Classification for Nursing Practice is a tool that meets all of these needs, and although it requires continuous evaluation, it offers professional documentation and communication in the practitioner and researcher community. The aim of this paper is to present a theoretical critique of an issue related to policy and experience of the current situation in Polish nursing - especially of the efforts to standardize nursing practices through the introduction and development of the Classification in Poland. Despite extensive promotion and training by International Council of Nurses members worldwide, there are still many countries where the Classification has not been implemented as a standard tool in healthcare facilities. Recently, a number of initiatives were undertaken in cooperation with the local and state authorities to disseminate the Classification in healthcare facilities. Thanks to intense efforts by the Polish Nurses Association and the International Council of Nurses Accredited Center for ICNP(®) Research & Development at the Medical University of Łódź, the Classification is known in Poland and has been tested at several centres. Nevertheless, an actual implementation that would allow for national and international interoperability requires strategic governmental decisions and close cooperation with information technology companies operating in the country. Discussing the barriers to the implementation of the Classification can improve understanding of it and its use. At a policy level, decision makers need to understand that use of the Classification in eHealth services and tools is necessary to achieve interoperability.
© 2016 International Council of Nurses.
Key pillars of data interoperability in Earth Sciences - INSPIRE and beyond
NASA Astrophysics Data System (ADS)
Tomas, Robert; Lutz, Michael
2013-04-01
The well-known heterogeneity and fragmentation of data models, formats and controlled vocabularies of environmental data limit potential data users from utilising the wealth of environmental information available today across Europe. The main aim of INSPIRE is to improve this situation and give users the possibility to access, use and correctly interpret environmental data. Over the past years, a number of INSPIRE technical guidelines (TG) and implementing rules (IR) for interoperability have been developed, involving hundreds of domain experts from across Europe. The data interoperability specifications, which have been developed for all 34 INSPIRE spatial data themes, are the central component of the TG and IR. Several of these themes are related to the earth sciences, e.g. geology (including hydrogeology, geophysics and geomorphology), mineral and energy resources, soil science, natural hazards, meteorology, oceanography, hydrology and land cover. The following main pillars for data interoperability and harmonisation have been identified during the development of the specifications: Conceptual data models describe the spatial objects and their properties and relationships for the different spatial data themes. To achieve cross-domain harmonisation, the data models for all themes are based on a common modelling framework (the INSPIRE Generic Conceptual Model) and managed in a common UML repository. Harmonised vocabularies (or code lists) are to be used in data exchange in order to overcome interoperability issues caused by heterogeneous free-text and/or multi-lingual content. Since a mapping to a harmonised vocabulary can be difficult, the INSPIRE data models typically allow the provision of more specific terms from local vocabularies in addition to the harmonised terms, utilising either the extensibility options or additional terminological attributes. Encoding.
Currently, specific XML profiles of the Geography Markup Language (GML) are promoted as the standard encoding. However, since the conceptual models are independent of concrete encodings, it is also possible to derive other encodings (e.g. based on RDF). Registers provide unique and persistent identifiers for a number of different types of information items (e.g. terms from a controlled vocabulary or units of measure) and allow their consistent management and versioning. By using these identifiers in data, references to specific information items can be made unique and unambiguous. It is important that these interoperability solutions are not developed in isolation, for Europe only. This was recognized from the beginning, and therefore international standards have been taken into account and widely referred to in INSPIRE. This mutual cooperation with international standardisation activities needs to be maintained or even extended. For example, where INSPIRE has gone beyond existing standards, the INSPIRE interoperability solutions should be introduced to the international standardisation initiatives. However, in some cases it is difficult to choose the appropriate international organization or standardisation body (e.g. where several organizations overlap in scope) or to achieve international agreements that accommodate European specifics. Furthermore, the development of the INSPIRE specifications (to be legally adopted in 2013) is only the beginning of the effort to make environmental data interoperable. Their actual implementation by data providers across Europe, as well as rapid developments in the earth sciences (e.g. new simulation models, scientific advances) and in ICT, will lead to requests for changes. It is therefore crucial to ensure the long-term sustainable maintenance and further development of the proposed infrastructure.
This task cannot be achieved by the INSPIRE coordination team of the European Commission alone. It is therefore crucial to closely involve relevant (where possible, umbrella) organisations in the earth sciences, who can provide the necessary domain knowledge and expert networks.
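The register mechanism described above, where a persistent identifier dereferences to a versioned, multilingual code-list entry, can be sketched as follows. The identifier, labels, and register layout are invented examples in the style of INSPIRE code-list URIs, not actual registry content.

```python
# A toy code-list register: persistent identifiers map to versioned,
# multilingual entries, so data can reference a term unambiguously.
REGISTER = {
    "http://example.eu/codelist/LithologyValue/sandstone": {
        "label": {"en": "sandstone", "de": "Sandstein"},
        "status": "valid",
        "version": 2,
    },
}

def resolve(identifier, lang="en"):
    """Dereference a code-list identifier to its current label in the
    requested language, refusing retired terms."""
    entry = REGISTER[identifier]
    if entry["status"] != "valid":
        raise LookupError("term retired; follow its successor link")
    return entry["label"][lang]

print(resolve("http://example.eu/codelist/LithologyValue/sandstone", "de"))
```

Because a dataset stores only the identifier, not the free-text label, two providers in different languages still refer to demonstrably the same concept, which is the interoperability problem the harmonised vocabularies pillar addresses.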
Adoption of Electronic Health Records: A Roadmap for India
2016-01-01
Objectives The objective of the study was to create a roadmap for the adoption of Electronic Health Records (EHR) in India based on an analysis of the strategies of other countries and national scenarios of ICT use in India. Methods The strategies for adoption of EHR in other countries were analyzed to find the crucial steps taken. Apart from reports collected from stakeholders in the country, the study relied on the experience of the author in handling several e-health projects. Results It was found that there are four major areas where the countries considered have made substantial efforts: ICT infrastructure, Policy & regulations, Standards & interoperability, and Research, development & education. A set of crucial activities were identified in each area. Based on the analysis, a roadmap is suggested. It includes the creation of a secure health network; health information exchange; and the use of open-source software, a national health policy, privacy laws, an agency for health IT standards, R&D, human resource development, etc. Conclusions Although some steps have been initiated, several new steps need to be taken up for the successful adoption of EHR. It requires a coordinated effort from all the stakeholders. PMID:27895957
Tello-Leal, Edgar; Chiotti, Omar; Villarreal, Pablo David
2012-12-01
The paper presents a methodology that follows a top-down approach based on a Model-Driven Architecture for integrating and coordinating healthcare services through cross-organizational processes, enabling organizations to provide high-quality healthcare services and continuous process improvements. The methodology provides a modeling language that enables organizations to conceptualize an integration agreement, and to identify and design cross-organizational process models. These models are used for the automatic generation of: the private view of processes each organization should perform to fulfill its role in cross-organizational processes, and Colored Petri Net specifications to implement these processes. A multi-agent system platform provides agents able to interpret Colored Petri Nets to enable the communication between the Healthcare Information Systems for executing the cross-organizational processes. Clinical documents are defined using the HL7 Clinical Document Architecture. This methodology guarantees that important requirements for healthcare services integration and coordination are fulfilled: interoperability between heterogeneous Healthcare Information Systems; the ability to cope with changes in cross-organizational processes; alignment between the integrated healthcare service solution defined at the organizational level and the solution defined at the technological level; and the distributed execution of cross-organizational processes while keeping the organizations' autonomy.
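The execution model the agents interpret can be illustrated with a minimal (colourless, hence simplified) Petri net firing rule; the net below, a two-step referral exchange between organizations, is an invented example rather than one from the paper.

```python
# Toy Petri net: transition -> (input places, output places).
# Tokens in places represent process state shared across organizations.
NET = {
    "send_referral": (["referral_ready"], ["referral_sent"]),
    "receive_referral": (["referral_sent"], ["under_review"]),
}

def fire(marking, transition):
    """Fire a transition if it is enabled (all input places hold a
    token), returning the new marking without mutating the old one."""
    inputs, outputs = NET[transition]
    if not all(marking.get(p, 0) > 0 for p in inputs):
        raise RuntimeError(f"{transition} is not enabled")
    marking = dict(marking)
    for p in inputs:
        marking[p] -= 1
    for p in outputs:
        marking[p] = marking.get(p, 0) + 1
    return marking

m = {"referral_ready": 1}
m = fire(m, "send_referral")      # sending organization acts
m = fire(m, "receive_referral")   # receiving organization acts
print(m)  # {'referral_ready': 0, 'referral_sent': 0, 'under_review': 1}
```

Colored Petri Nets extend this rule by attaching typed data (e.g. an HL7 CDA document) to each token, which is what lets the generated specifications carry clinical content and not just control flow.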
A cloud-based approach for interoperable electronic health records (EHRs).
Bahga, Arshdeep; Madisetti, Vijay K
2013-09-01
We present a cloud-based approach for the design of interoperable electronic health record (EHR) systems. Cloud computing environments provide several benefits to all the stakeholders in the healthcare ecosystem (patients, providers, payers, etc.). The lack of data interoperability standards and solutions has been a major obstacle in the exchange of healthcare data between different stakeholders. We propose an EHR system, the cloud health information systems technology architecture (CHISTAR), that achieves semantic interoperability through the use of a generic design methodology which uses a reference model that defines a general-purpose set of data structures and an archetype model that defines the clinical data attributes. CHISTAR application components are designed using the cloud component model approach, which comprises loosely coupled components that communicate asynchronously. In this paper, we describe the high-level design of CHISTAR and the approaches for semantic interoperability, data integration, and security.
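The two-level approach, a generic reference model constrained by clinical archetypes, can be sketched as follows. The blood-pressure archetype and its attribute names are invented for illustration and are not taken from CHISTAR.

```python
# An archetype pins down clinical semantics on top of generic storage:
# which attributes exist, their types, and their plausible ranges.
BP_ARCHETYPE = {
    "systolic": {"type": float, "range": (0.0, 300.0)},
    "diastolic": {"type": float, "range": (0.0, 200.0)},
}

def validate(entry, archetype):
    """Check a generic attribute/value entry against the constraints of
    an archetype; the storage layer itself stays clinically agnostic."""
    for attr, rule in archetype.items():
        value = entry.get(attr)
        if not isinstance(value, rule["type"]):
            return False
        lo, hi = rule["range"]
        if not lo <= value <= hi:
            return False
    return True

print(validate({"systolic": 120.0, "diastolic": 80.0}, BP_ARCHETYPE))  # True
print(validate({"systolic": 500.0, "diastolic": 80.0}, BP_ARCHETYPE))  # False
```

The design payoff is that new clinical concepts are added by authoring archetypes, not by changing the reference model or the storage schema, which is the usual argument for two-level modelling in EHR systems.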
Reuse and Interoperability of Avionics for Space Systems
NASA Technical Reports Server (NTRS)
Hodson, Robert F.
2007-01-01
The space environment presents unique challenges for avionics. Launch survivability, thermal management, radiation protection, and other factors are important for successful space designs. Many existing avionics designs use custom hardware and software to meet the requirements of space systems. Although some space vendors have moved towards a standard product-line approach to avionics, the space industry still lacks similar standards and common practices for avionics development. This lack of commonality manifests itself in limited reuse and a lack of interoperability. To address NASA's need for interoperable avionics that facilitate reuse, several hardware and software approaches are discussed. Experiences with existing space boards and the application of terrestrial standards are outlined. Enhancements and extensions to these standards are considered. A modular stack-based approach to space avionics is presented. Software and reconfigurable logic cores are considered for extending interoperability and reuse. Finally, some of the issues associated with the design of reusable interoperable avionics are discussed.
2008-08-01
facilitate the use of existing architecture descriptions in performing interoperability measurement. Noting that “everything in the world can be expressed as...biological, botanical, and genetic research, it has also been used with great success in the fields of ecology, medicine, the social sciences, the...appropriate for at least three reasons. First, systems perform different interoperations in different scenarios (i.e., they are used differently); second
Commanding Heterogeneous Multi-Robot Teams
2014-06-01
Coalition Battle Management Language (C-BML) Study Group Report. 2005 Fall Simulation Interoperability Workshop (05F- SIW - 041), Orlando, FL, September...NMSG-085 CIG Land Operation Demonstration. 2013 Spring Simulation Interoperability Workshop (13S- SIW -031), San Diego, CA. April 2013. [4] K...Simulation Interoperability Workshop (10F- SIW -039), Orlando, FL, September 2010. [5] M. Langerwisch, M. Ax, S. Thamke, T. Remmersmann, A. Tiderko
Dandanell, G
1992-01-01
The interoperator distance between a synthetic operator Os and the deoP2O2-galK fusion was varied between 46 and 176 bp. The repression of deoP2-directed galK expression as a function of the interoperator distance (center-to-center) was measured in vivo in a single-copy system. The results show that the DeoR repressor can efficiently repress transcription at all the interoperator distances tested. The degree of repression depends very little on the spacing between the operators; however, a weak periodic dependency of 8-11 bp may exist. PMID:1437558
NASA Astrophysics Data System (ADS)
Kuo, K.
2010-12-01
As a practitioner in the field of atmospheric remote sensing, the author, like many similar science users, depends heavily on NASA Earth Science remote sensing data. The author was therefore asked by the NASA Earth Science Data Information System Project (ESDIS) to assess the capabilities of the Earth Observing System Data and Information System (EOSDIS) in order to provide suggestions and recommendations for the evolution of EOSDIS on the path towards its 2015 Vision Tenets. As NASA's Earth science data system, EOSDIS provides data processing, archiving and distribution services for EOS missions. The science operations of EOSDIS, i.e. data archiving and distribution, are the focus of this report; they are performed within a distributed system of many interconnected nodes, namely the Science Investigator-led Processing Systems (SIPS) and distributed data centers. Since its inception in the early 1990s, EOSDIS has represented a democratization of data, a break from the past when data dissemination was at the discretion of project scientists. Its “open data” policy is so highly valued and well received by its user communities that it has influenced other agencies, even those of other countries, to adopt the same open policy. In the last ~10 years EOSDIS has matured to serve very well users of any given science community in which the varieties of data being used change infrequently. The unpleasant effects of interoperability barriers are now felt most often by users who try to use new data outside their existing familiar set. This paper first defines interoperability and identifies the purposes for achieving it. The sources of interoperability barriers, classified by the author into software, hardware, and human categories, are examined. For a subset of issues related to software, it presents diagnoses drawn from the author's experience and his survey of the EOSDIS data finding, ordering, retrieving, and extraction services.
It also reports on an analysis of his survey regarding tools provided by EOSDIS or its user communities that are intended to make routine data manipulations easier. Barriers in the hardware category are those resulting from differences in orbit configurations of the spacecraft and differences in remote sensing modality (active or passive), spectral and spatial resolutions, scanning strategies, etc. of the instruments. Such differences are best understood by considering the nature of remotely sensed observations. Human factors are further classified into institutional and individual subcategories. The former includes factors such as NASA's funding practices, and the latter relates to individuals' propensity for adopting new technologies. Finally, a strategy for overcoming these barriers is proposed.
Liaw, S T; Rahimi, A; Ray, P; Taggart, J; Dennis, S; de Lusignan, S; Jalaludin, B; Yeo, A E T; Talaei-Khoei, A
2013-01-01
Effective use of routine data to support integrated chronic disease management (CDM) and population health is dependent on underlying data quality (DQ) and, for cross-system use of data, semantic interoperability. An ontological approach to DQ is a potential solution, but research in this area is limited and fragmented. The objectives were to identify mechanisms, including ontologies, to manage DQ in integrated CDM and to determine whether improved DQ will better measure health outcomes. We conducted a realist review of English-language studies (January 2001-March 2011) which addressed data quality, used ontology-based approaches and were relevant to CDM. We screened 245 papers and excluded 26 duplicates, 135 on abstract review and 31 on full-text review, leaving 61 papers for critical appraisal. Of the 33 papers that examined ontologies in chronic disease management, 13 defined data quality and 15 used ontologies for DQ. Most saw DQ as a multidimensional construct, the most used dimensions being completeness, accuracy, correctness, consistency and timeliness. The majority of studies reported tool design and development (80%), implementation (23%), and descriptive evaluations (15%). Ontological approaches were used to address semantic interoperability, decision support, flexibility of information management and integration/linkage, and complexity of information models. DQ lacks a consensus conceptual framework and definition. DQ and ontological research is relatively immature, with few rigorous evaluation studies published. Ontology-based applications could support automated processes to address DQ and semantic interoperability in repositories of routinely collected data to deliver integrated CDM. We advocate moving to ontology-based design of information systems to enable more reliable use of routine data to measure health mechanisms and impacts. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Sinaci, A Anil; Laleci Erturkmen, Gokce B
2013-10-01
In order to enable secondary use of Electronic Health Records (EHRs) by bridging the interoperability gap between the clinical care and research domains, this paper introduces a unified methodology and the supporting framework that bring together the power of metadata registries (MDR) and semantic web technologies. We introduce a federated semantic metadata registry framework by extending the ISO/IEC 11179 standard, and enable integration of data element registries through Linked Open Data (LOD) principles, where each Common Data Element (CDE) can be uniquely referenced, queried and processed to enable syntactic and semantic interoperability. Each CDE and its components are maintained as LOD resources enabling semantic links with other CDEs, terminology systems and implementation-dependent content models, hence facilitating semantic search, more effective reuse and semantic interoperability across different application domains. There are several important efforts addressing semantic interoperability in the healthcare domain, such as the IHE DEX profile proposal, CDISC SHARE and CDISC2RDF. Our architecture complements these by providing a framework to interlink existing data element registries and repositories, multiplying their potential for semantic interoperability to a greater extent. The open source implementation of the federated semantic MDR framework presented in this paper is the core of the semantic interoperability layer of the SALUS project, which enables the execution of post-marketing safety analysis studies on top of existing EHR systems. Copyright © 2013 Elsevier Inc. All rights reserved.
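A minimal illustration of a Common Data Element published as Linked Open Data might look like the following. The URIs, predicates and the tiny triple store are invented examples in the spirit of ISO/IEC 11179, not the SALUS project's actual namespaces or implementation.

```python
# A CDE represented as RDF-style triples (subject, predicate, object).
# All URIs below are invented for illustration.
CDE = "http://example.org/mdr/cde/systolic-bp"

triples = [
    (CDE, "rdf:type", "mdr:DataElement"),
    (CDE, "mdr:definition", "Systolic blood pressure measurement"),
    # A semantic link to a terminology system: this is what lets a federated
    # registry resolve the same concept across different repositories.
    (CDE, "skos:exactMatch", "http://example.org/loinc/8480-6"),
]

def query(triples, subject=None, predicate=None):
    """Minimal triple-pattern match (None acts as a wildcard)."""
    return [t for t in triples
            if (subject is None or t[0] == subject)
            and (predicate is None or t[1] == predicate)]

print(query(triples, predicate="skos:exactMatch"))
```

In a real deployment the triples would live in an RDF store and be queried with SPARQL; the point here is only that uniquely referenceable CDEs plus typed links are enough to interlink registries.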
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hardin, Dave; Stephan, Eric G.; Wang, Weimin
Through its Building Technologies Office (BTO), the United States Department of Energy's Office of Energy Efficiency and Renewable Energy (DOE-EERE) is sponsoring an effort to advance interoperability for the integration of intelligent buildings equipment and automation systems, understanding the importance of integration frameworks and product ecosystems to this cause. This is important to BTO's mission to enhance energy efficiency and save energy for economic and environmental purposes. For connected-buildings ecosystems of products and services from various manufacturers to flourish, the ICT aspects of the equipment need to integrate and operate simply and reliably. Within the concepts of interoperability lie the specification, development, and certification of equipment with standards-based interfaces that connect and work. Beyond this, a healthy community of stakeholders that contribute to and use interoperability work products must be developed. On May 1, 2014, the DOE convened a technical meeting to take stock of the current state of interoperability of connected equipment and systems in buildings. Several insights from that meeting helped facilitate a draft description of the landscape of interoperability for connected buildings, which focuses mainly on small and medium commercial buildings. This document revises the February 2015 landscape document to address reviewer comments, incorporate important insights from the Buildings Interoperability Vision technical meeting, and capture thoughts from that meeting about the topics to be addressed in a buildings interoperability vision. In particular, greater attention is paid to the state of information modeling in buildings and the great potential for near-term benefits in this area from progress and community alignment.
Developing a National-Level Concept Dictionary for EHR Implementations in Kenya.
Keny, Aggrey; Wanyee, Steven; Kwaro, Daniel; Mulwa, Edwin; Were, Martin C
2015-01-01
The increasing adoption of Electronic Health Records (EHR) by developing countries comes with the need to develop common terminology standards to assure semantic interoperability. In Kenya, where the Ministry of Health has rolled out an EHR at 646 sites, several challenges have emerged, including variable dictionaries across implementations, inability to easily share data across systems, lack of expertise in dictionary management, lack of central coordination and custody of a terminology service, inadequately defined policies and processes, and insufficient infrastructure, among others. A Concept Working Group was constituted to address these challenges. The country settled on a common Kenya data dictionary, initially derived as a subset of the Columbia International eHealth Laboratory (CIEL)/Millennium Villages Project (MVP) dictionary. The initial dictionary scope largely focuses on clinical needs. Processes and policies around dictionary management are being guided by the framework developed by Bakhshi-Raiez et al. Technical and infrastructure-based approaches are also underway to streamline workflows for dictionary management and distribution across implementations. Kenya's approach to a comprehensive common dictionary can serve as a model for other countries in similar settings.
Persistent Identifiers, Discoverability and Open Science (Communication)
NASA Astrophysics Data System (ADS)
Murphy, Fiona; Lehnert, Kerstin; Hanson, Brooks
2016-04-01
Early in 2016, the American Geophysical Union announced it was incorporating ORCIDs into its submission workflows. This was accompanied by a strong statement supporting the use of other persistent identifiers - such as IGSNs, and the CrossRef open registry 'funding data'. This was partly in response to funders' desire to track and manage their outputs. However the more compelling argument, and the reason why the AGU has also signed up to the Center for Open Science's Transparency and Openness Promotion (TOP) Guidelines (http://cos.io/top), is that ultimately science and scientists will be the richer for these initiatives due to increased opportunities for interoperability, reproducibility and accreditation. The AGU has appealed to the wider community to engage with these initiatives, recognising that - unlike the introduction of Digital Object Identifiers (DOIs) for articles by CrossRef - full, enriched use of persistent identifiers throughout the scientific process requires buy-in from a range of scholarly communications stakeholders. At the same time, across the general research landscape, initiatives such as Project CRediT (contributor roles taxonomy), Publons (reviewer acknowledgements) and the forthcoming CrossRef DOI Event Tracker are contributing to our understanding and accreditation of contributions and impact. More specifically for earth science and scientists, the cross-functional Coalition for Publishing Data in the Earth and Space Sciences (COPDESS) was formed in October 2014 and is working to 'provide an organizational framework for Earth and space science publishers and data facilities to jointly implement and promote common policies and procedures for the publication and citation of data across Earth Science journals'. Clearly, the judicious integration of standards, registries and persistent identifiers such as ORCIDs and International Geo Sample Numbers (IGSNs) into the research and research output processes is key to the success of this venture.
However, these also give rise to a number of logistical, technological and cultural challenges. This poster seeks to identify these challenges and advance our understanding of them. The authors are keen to build knowledge from the gathering of case studies (successful or otherwise) and to hear from potential collaborators in order to develop a robust structure that will empower both earth science and earth scientists and enable more nuanced, trustworthy, interoperable research in the near future.
Sass, Julian; Becker, Kim; Ludmann, Dominik; Pantazoglou, Elisabeth; Dewenter, Heike; Thun, Sylvia
2018-01-01
A nationally uniform medication plan has recently become part of German legislation. The specification for the German medication plan was developed in cooperation between various stakeholders of the healthcare system. Its goal is to enhance usability and interoperability while also providing patients and physicians with the necessary information they require for a safe and high-quality therapy. Within the research and development project named Medication Plan PLUS, the specification of the medication plan was tested and reviewed, in particular for semantic interoperability. In this study, the list of pharmaceutical dose forms provided in the specification was mapped to the standard terms of the European Directorate for the Quality of Medicines & HealthCare (EDQM) by different coders. The level of agreement between coders was calculated using Cohen's kappa (κ). Results show that fewer than half of the dose forms could be coded with EDQM standard terms. In addition, kappa was found to be moderate, indicating rather unconvincing agreement among coders. In conclusion, there is still vast room for improvement in the utilization of standardized international vocabulary, and unused potential for cross-border eHealth implementations in the future.
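Cohen's kappa, the agreement statistic used in the study, corrects observed agreement for the agreement expected by chance. A self-contained sketch follows; the example dose-form codings are invented for illustration and are not the study's data.

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders' categorical ratings of the same items:
    kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement and
    p_e is chance agreement from each coder's marginal category frequencies."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    counts_a, counts_b = Counter(coder_a), Counter(coder_b)
    categories = set(coder_a) | set(coder_b)
    p_e = sum(counts_a[c] * counts_b[c] for c in categories) / n ** 2
    return (p_o - p_e) / (1 - p_e)

# Illustrative codings of six dose forms by two coders (invented data):
a = ["tablet", "capsule", "tablet", "syrup", "tablet", "capsule"]
b = ["tablet", "tablet",  "tablet", "syrup", "tablet", "syrup"]
print(round(cohens_kappa(a, b), 3))
```

With the invented data above the value falls in the 0.41-0.60 band conventionally labelled "moderate" agreement, the same qualitative result the abstract reports.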
NASA Astrophysics Data System (ADS)
Tudose, Alexandru; Terstyansky, Gabor; Kacsuk, Peter; Winter, Stephen
Grid Application Repositories vary greatly in terms of access interface, security system, implementation technology, communication protocols and repository model. This diversity has become a significant limitation in terms of interoperability and inter-repository access. This paper presents the Grid Application Meta-Repository System (GAMRS) as a solution that offers better options for the management of Grid applications. GAMRS proposes a generic repository architecture, which allows any Grid Application Repository (GAR) to be connected to the system independent of their underlying technology. It also presents applications in a uniform manner and makes applications from all connected repositories visible to web search engines, OGSI/WSRF Grid Services and other OAI (Open Archive Initiative)-compliant repositories. GAMRS can also function as a repository in its own right and can store applications under a new repository model. With the help of this model, applications can be presented as embedded in virtual machines (VM) and therefore they can be run in their native environments and can easily be deployed on virtualized infrastructures allowing interoperability with new generation technologies such as cloud computing, application-on-demand, automatic service/application deployments and automatic VM generation.
A Framework for Integration of Heterogeneous Medical Imaging Networks
Viana-Ferreira, Carlos; Ribeiro, Luís S; Costa, Carlos
2014-01-01
Medical imaging is of increasing importance in medical diagnosis and treatment support. Much of this is due to computers, which have revolutionized medical imaging not only in the acquisition process but also in the way images are visualized, stored, exchanged and managed. Picture Archiving and Communication Systems (PACS) are an example of how medical imaging takes advantage of computers. To solve problems of interoperability between PACS and medical imaging equipment, the Digital Imaging and Communications in Medicine (DICOM) standard was defined and widely implemented in current solutions. More recently, the need to exchange medical data between distinct institutions resulted in the Integrating the Healthcare Enterprise (IHE) initiative, which contains a content profile especially conceived for medical imaging exchange: Cross Enterprise Document Sharing for imaging (XDS-I). Moreover, due to application requirements, many solutions developed private networks to support their services. For instance, some applications support enhanced query and retrieve over DICOM object metadata. This paper proposes an integration framework for medical imaging networks that provides protocol interoperability and data federation services. It is an extensible plugin system that supports standard approaches (DICOM and XDS-I), but is also capable of supporting private protocols. The framework is being used in the Dicoogle Open Source PACS. PMID:25279021
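The extensible-plugin idea behind such a framework can be sketched as below. The class and method names are invented for illustration and are not the Dicoogle API; real plugins would speak DICOM or XDS-I on the wire rather than return strings.

```python
# Plugin registry sketch: each protocol (standard or private) is wrapped in a
# plugin exposing a common query interface, and a federation layer fans a
# query out to all registered protocols.
class ProtocolPlugin:
    name = "base"
    def query(self, criteria):
        raise NotImplementedError

PLUGINS = {}

def register(cls):
    """Class decorator that instantiates and registers a plugin by name."""
    PLUGINS[cls.name] = cls()
    return cls

@register
class DicomPlugin(ProtocolPlugin):
    name = "dicom"
    def query(self, criteria):
        return [f"DICOM object matching {criteria}"]

@register
class XdsIPlugin(ProtocolPlugin):
    name = "xds-i"
    def query(self, criteria):
        return [f"XDS-I document matching {criteria}"]

def federated_query(criteria):
    """Data federation: merge results from every registered protocol."""
    results = []
    for plugin in PLUGINS.values():
        results.extend(plugin.query(criteria))
    return results

print(federated_query("PatientID=123"))
```

Adding support for a private protocol then means writing one more `ProtocolPlugin` subclass, with no change to the federation layer.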
LVC Architecture Roadmap Implementation - Results of the First Two Years
2012-03-01
NOTES Presented at the Simulation Interoperability Standards Organization?s (SISO) Spring Simulation Interoperability Workshop ( SIW ), 26-30 March...presented at the semi-annual Simulation Interoperability Workshops ( SIWs ) and the annual Interservice/Industry Training, Simulation & Education Conference...I/ITSEC), as well as other venues. For example, a full-day workshop on the initial progress of the effort was conducted at the 2010 Spring SIW [2
An Interoperable Architecture for Air Pollution Early Warning System Based on Sensor Web
NASA Astrophysics Data System (ADS)
Samadzadegan, F.; Zahmatkesh, H.; Saber, M.; Ghazi khanlou, H. J.
2013-09-01
Environmental monitoring systems deal with time-sensitive issues which require quick responses in emergency situations. Handling sensor observations in near real time and obtaining valuable information from them are challenging issues in these systems from a technical and scientific point of view. Ever-increasing population growth in urban areas has caused certain problems in developing countries which have a direct or indirect impact on human life. One applicable solution for controlling and managing air quality in megacities is to use real-time, up-to-date air quality information gathered by spatially distributed sensors, employing Sensor Web technology to develop monitoring and early warning systems. Urban air quality monitoring systems use the functionalities of geospatial information systems as a platform for analysing, processing and visualizing data, in combination with the Sensor Web, to support decision support systems in disaster management and emergency situations. This system uses the Sensor Web Enablement (SWE) framework of the Open Geospatial Consortium (OGC), which offers a standard framework that allows the integration of sensors and sensor data into spatial data infrastructures. The SWE framework introduces standards for services to access sensor data and discover events from sensor data streams, as well as a set of standards for the description of sensors and the encoding of measurements. The presented system provides capabilities to collect, transfer, share and process air quality sensor data and to disseminate air quality status in real time. Interoperability challenges can be overcome by using this standard framework. In a routine scenario, air quality data measured by in-situ sensors are communicated to a central station where they are analysed and processed. The extracted air quality status is processed to discover emergency situations, and if necessary air quality reports are sent to the authorities.
This research proposes an architecture that shows how to integrate air quality sensor data streams into a geospatial data infrastructure, presenting an interoperable air quality monitoring system that supports disaster management systems with real-time information. The developed system was tested on Tehran air pollution sensors, calculating the Air Quality Index (AQI) for the CO pollutant and subsequently notifying registered users in emergency cases by sending warning e-mails. An air quality monitoring portal is used to retrieve and visualize sensor observations through the interoperable framework. The system provides capabilities to retrieve SOS observations using WPS in a cascaded service-chaining pattern for monitoring trends in timely sensor observations.
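The AQI computation at the heart of the warning step is a piecewise-linear interpolation over pollutant breakpoints. The sketch below uses the widely published EPA form of the formula with EPA 8-hour CO breakpoints in ppm; the thresholds actually used in the Tehran deployment may differ.

```python
# EPA-style AQI breakpoint table for CO (8-hour average, ppm):
# each row is (C_lo, C_hi, I_lo, I_hi).
CO_BREAKPOINTS = [
    (0.0, 4.4, 0, 50),
    (4.5, 9.4, 51, 100),
    (9.5, 12.4, 101, 150),
    (12.5, 15.4, 151, 200),
]

def aqi(concentration):
    """Linear interpolation within the breakpoint band containing the reading:
    AQI = (I_hi - I_lo) / (C_hi - C_lo) * (C - C_lo) + I_lo."""
    for c_lo, c_hi, i_lo, i_hi in CO_BREAKPOINTS:
        if c_lo <= concentration <= c_hi:
            return round((i_hi - i_lo) / (c_hi - c_lo) * (concentration - c_lo) + i_lo)
    raise ValueError("concentration outside breakpoint table")

print(aqi(7.0))  # 76: falls in the 51-100 ('moderate') band
```

A monitoring service would evaluate this on each incoming SOS observation and trigger the e-mail notification when the index crosses a configured alert threshold.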
BCube: A Broker Framework for Next Generation Geoscience
NASA Astrophysics Data System (ADS)
Khalsa, S. S.; Pearlman, J.; Nativi, S.
2013-12-01
EarthCube is an NSF initiative that aims to transform the conduct of research through the creation of community-guided cyberinfrastructure enabling the integration of information and data across the geosciences. Following an initial phase of concept and community development activities, NSF has made awards for the development of cyberinfrastructure 'building blocks.' In this talk we describe the goals and methods of one of these projects - BCube, for Brokering Building Blocks. BCube addresses the need for effective and efficient multi-disciplinary collaboration and interoperability through the introduction of brokering technologies. Brokers, as information systems middleware, have existed for many years and are found in diverse domains and industries such as financial systems, business-to-business interfaces, medicine and the automotive industry, to name a few. However, the emergence of brokers in science is relatively new and is now being piloted with great promise in cyberinfrastructure and science communities in the U.S., Europe, and elsewhere. Brokers act as intermediaries between information systems that implement well-defined interfaces, providing a bridge between communities using different specifications. The BCube project is helping to build a truly cross-disciplinary, global platform for data providers, cyberinfrastructure developers, and data users to make data more available and interoperable through a brokering framework.
Building on the GEOSS Discovery and Access Broker (DAB), BCube will develop new modules and services including:
* Expanded semantic brokering
* Business model support for workflows
* Automated metadata generation
* Automated linking to services discovered via web crawling
* Plug and play for most community service buses
* Credential passing for seamless access to data
* Ranking of search results from brokered catalogs
Because facilitating cross-discipline research involves cultural as well as technical challenges, BCube is also addressing the sociological and educational components of infrastructure development. Our research is initially focused on four disciplines: hydrology, oceans, polar and weather, with an emphasis on connecting existing domain infrastructure elements to facilitate cross-domain communications.
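The brokering pattern the abstract describes, an intermediary that translates between communities' conventions so neither side has to change, can be sketched minimally. All field names and records below are invented for illustration; a real broker such as the DAB mediates standard catalog and access protocols, not Python dicts.

```python
# Broker sketch: per-provider adapters normalize metadata into a common
# model, and search runs over the common model only.
def from_provider_a(record):
    return {"title": record["dataset_name"], "bbox": record["extent"]}

def from_provider_b(record):
    return {"title": record["ttl"], "bbox": record["bounding_box"]}

ADAPTERS = {"a": from_provider_a, "b": from_provider_b}

def broker_search(records_by_provider, keyword):
    """Translate every provider's records to the common model, then filter."""
    hits = []
    for provider, records in records_by_provider.items():
        adapt = ADAPTERS[provider]
        hits.extend(
            common for common in map(adapt, records)
            if keyword.lower() in common["title"].lower()
        )
    return hits

catalogs = {
    "a": [{"dataset_name": "Arctic sea ice extent", "extent": [-180, 60, 180, 90]}],
    "b": [{"ttl": "River discharge, Danube", "bounding_box": [8, 42, 30, 50]}],
}
print(broker_search(catalogs, "ice"))
```

Supporting a new community then means adding one adapter, which is the core economic argument for brokering over asking every provider to adopt a single specification.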
Nagelhout, Gera E; van den Putte, Bas; Allwright, Shane; Mons, Ute; McNeill, Ann; Guignard, Romain; Beck, François; Siahpush, Mohammad; Joossens, Luk; Fong, Geoffrey T; de Vries, Hein; Willemsen, Marc C
2014-03-01
Legal tobacco tax avoidance strategies such as cross-border cigarette purchasing may attenuate the impact of tax increases on tobacco consumption. Little is known about socioeconomic and country variations in cross-border purchasing. To describe socioeconomic and country variations in cross-border cigarette purchasing in six European countries. Cross-sectional data from adult smokers (n=7873) from the International Tobacco Control (ITC) Surveys in France (2006/2007), Germany (2007), Ireland (2006), The Netherlands (2008), Scotland (2006) and the rest of the UK (2007/2008) were used. Respondents were asked whether they had bought cigarettes outside their country in the last 6 months and how often. In French and German provinces/states bordering countries with lower cigarette prices, 24% and 13% of smokers, respectively, reported purchasing cigarettes frequently outside their country. In non-border regions of France and Germany, and in Ireland, Scotland, the rest of the UK and The Netherlands, frequent purchasing of cigarettes outside the country was reported by 2-7% of smokers. Smokers with higher levels of education or income, younger smokers, daily smokers, heavier smokers and smokers not planning to quit smoking were more likely to purchase cigarettes outside their country. Cross-border cigarette purchasing is more common in European regions bordering countries with lower cigarette prices and is more often reported by smokers with higher education and income. Increasing taxes in countries with lower cigarette prices, and reducing the number of cigarettes that can be legally imported across borders could help to avoid cross-border purchasing.
The European Location Framework - from National to European
NASA Astrophysics Data System (ADS)
Pauknerova, E.; Sidlichovsky, P.; Urbanas, S.; Med, M.
2016-06-01
The European Location Framework (ELF) is a technical infrastructure which will deliver authoritative, interoperable geospatial reference data from all over Europe for analysing and understanding information connected to places and features. The ELF has been developed and set up through the ELF Project, which has been realized by a consortium of partners (public, private and academic organisations) since March 2013. Their number increased from thirty to forty in 2016, together with a project extension from 36 to 44 months. The project is co-funded by the European Commission's Competitiveness and Innovation Framework Programme (CIP) and will end in October 2016. In broad terms, the ELF Project will deliver a unique gateway to authoritative reference geospatial information for Europe (harmonised pan-European maps, geographic and land information) sourced from the National Mapping and Cadastral Authorities (NMCAs) around Europe, including transparent licensing. This will be provided as an online ELF web service that delivers an up-to-date topographic base map, and as view and download services for access to the ELF datasets. To develop and build up the ELF, the NMCAs collaborate with several research and academic institutes, a standardisation body, system integrators, software developers and application providers. Harmonisation is in progress, developing and triggering a number of geo-tools such as edge-matching, generalisation, transformation and others. ELF will also provide some centralised tools, such as the Geo Locator for searching locations based on geographical names, addresses and administrative units, and the GeoProduct Finder for discovering the available web services and licensing them. ELF combines national reference geo-information through the ELF platform. ELF web services will be offered to users and application developers through open source (OSKARI) and proprietary (ArcGIS Online) cloud platforms.
Recently, 29 NMCAs plus EuroGeographics, their pan-European umbrella association, contribute to the ELF through an enrichment of data coverage. As a result, over 20 European countries will be covered by the ELF topo Base Map in 2016. Most countries will also contribute other harmonised thematic data for viewing or downloading. To overcome the heterogeneity of data resources and the diversity of languages across tens of European countries, ELF builds on the existing INSPIRE rules and its own coordination and interoperability measures. ELF realisation empowers the implementation of INSPIRE in Europe and complements related activities of European NMCAs, e.g. the Czech Office for Surveying, Mapping and Cadastre (CUZK), which provides a large portfolio of spatial data and services and contributes significantly to the NSDI of the Czech Republic. CUZK is also responsible for the Base Register of Territorial Identification, Addresses and Real Estates (RUIAN) - an important pillar of Czech e-Government. CUZK became an early adopter of INSPIRE and provides to the ELF a number of compliant datasets and web services. CUZK and the Polish NMCA (GUGiK) collaborate in the Central-European ELF Pilot (cluster) and test various cross-border prototypes. The presentation combines the national and cross-border views and experiences of CUZK with the European perspective of EuroGeographics.
The Microbial Resource Research Infrastructure MIRRI: Strength through Coordination
Stackebrandt, Erko; Schüngel, Manuela; Martin, Dunja; Smith, David
2015-01-01
Microbial resources have been recognized as essential raw materials for the advancement of health and later for biotechnology, agriculture, food technology and for research in the life sciences, as their enormous abundance and diversity offer an unparalleled source of unexplored solutions. Microbial domain biological resource centres (mBRC) provide live cultures and associated data to foster and support the development of basic and applied science in countries worldwide and especially in Europe, where the density of highly advanced mBRCs is high. The not-for-profit and distributed project MIRRI (Microbial Resource Research Infrastructure) aims to coordinate access to hitherto individually managed resources by developing a pan-European platform which takes the interoperability and accessibility of resources and data to a higher level. Providing a wealth of additional information and linking to datasets such as literature, environmental data, sequences and chemistry will enable researchers to select organisms suitable for their research and enable innovative solutions to be developed. The current independent policies and managed processes will be adapted by partner mBRCs to harmonize holdings, services, training, and accession policy and to share expertise. The infrastructure will improve access to enhanced quality microorganisms in an appropriate legal framework and to resource-associated data in a more interoperable way. PMID:27682123
The Next Generation of Interoperability Agents in Healthcare
Cardoso, Luciana; Marins, Fernando; Portela, Filipe; Santos, Manuel; Abelha, António; Machado, José
2014-01-01
Interoperability in health information systems is increasingly a requirement rather than an option. Standards and technologies, such as multi-agent systems, have proven to be powerful tools in interoperability issues. In the last few years, the authors have worked on developing the Agency for Integration, Diffusion and Archive of Medical Information (AIDA), which is an intelligent, agent-based platform to ensure interoperability in healthcare units. It is increasingly important to ensure the high availability and reliability of systems. The functions provided by the systems that treat interoperability cannot fail. This paper shows the importance of monitoring and controlling intelligent agents as a tool to anticipate problems in health information systems. The interaction between humans and agents through an interface that allows the user to create new agents easily and to monitor their activities in real time is also an important feature, as health systems evolve by adopting more features and solving new problems. A module was installed in Centro Hospitalar do Porto, increasing the functionality and the overall usability of AIDA. PMID:24840351
The role of markup for enabling interoperability in health informatics.
McKeever, Steve; Johnson, David
2015-01-01
Interoperability is the faculty of making information systems work together. In this paper we distinguish a number of different forms that interoperability can take and show how they are realized in a variety of physiological and health care use cases. The last 15 years have seen the rise of very cheap digital storage, both on and off site. With the advent of the Internet of Things, people's expectations are for greater interconnectivity and seamless interoperability. The potential impact of these technologies on healthcare is dramatic: from improved diagnoses through immediate access to a patient's electronic health record, to in silico modeling of organs and early-stage drug trials, to predictive medicine based on top-down modeling of disease progression and treatment. We begin by looking at the underlying technology, classify the various kinds of interoperability that exist in the field, and discuss how they are realized. We conclude with a discussion of future possibilities that big data and further standardization will enable.
NASA Astrophysics Data System (ADS)
Kruger, Scott; Shasharina, S.; Vadlamani, S.; McCune, D.; Holland, C.; Jenkins, T. G.; Candy, J.; Cary, J. R.; Hakim, A.; Miah, M.; Pletzer, A.
2010-11-01
As various efforts to integrate fusion codes proceed worldwide, standards for sharing data have emerged. In the U.S., the SWIM project has pioneered the development of the Plasma State, which has a flat-hierarchy and is dominated by its use within 1.5D transport codes. The European Integrated Tokamak Modeling effort has developed a more ambitious data interoperability effort organized around the concept of Consistent Physical Objects (CPOs). CPOs have deep hierarchies as needed by an effort that seeks to encompass all of fusion computing. Here, we discuss ideas for implementing data interoperability that is complementary to both the Plasma State and CPOs. By making use of attributes within the netcdf and HDF5 binary file formats, the goals of data interoperability can be achieved with a more informal approach. In addition, a file can be simultaneously interoperable to several standards at once. As an illustration of this approach, we discuss its application to the development of synthetic diagnostics that can be used for multiple codes.
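The attribute-based approach described above can be sketched informally: a file or variable carries self-describing attributes, and a reader checks which conventions those attributes satisfy, so one file can conform to several standards at once. The attribute names and convention labels below are invented for illustration and are not taken from the Plasma State or CPO specifications.

```python
# Invented convention labels and required-attribute sets; real netCDF/HDF5
# conventions (e.g. CF) define much richer rules than shown here.
REQUIRED_ATTRS = {
    "plasma_state_like": {"units", "long_name"},
    "cpo_like": {"units", "long_name", "hierarchy_path"},
}

def conformant_standards(var_attrs):
    """Return the set of conventions whose required attributes are all present."""
    present = set(var_attrs)
    return {std for std, req in REQUIRED_ATTRS.items() if req <= present}

# A variable carrying enough attributes to satisfy both conventions at once.
attrs = {"units": "keV", "long_name": "electron temperature",
         "hierarchy_path": "/core_profiles/Te"}
both = conformant_standards(attrs)
```

With HDF5 or netCDF, the same idea maps onto per-variable attribute dictionaries read directly from the file.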
NASA Technical Reports Server (NTRS)
Conroy, Mike; Gill, Paul; Ingalls, John; Bengtsson, Kjell
2014-01-01
No known system is in place to allow NASA technical data interoperability throughout the whole life cycle. Life Cycle Cost (LCC) will be higher on many developing programs if action isn't taken soon to join disparate systems efficiently. Disparate technical data also increases safety risks from poorly integrated elements. NASA requires interoperability and industry standards, but breaking legacy ways is a challenge.
Interacting with Multi-Robot Systems Using BML
2013-06-01
Pullen, U. Schade, J. Simonsen & R. Gomez-Veiga, NATO MSG-048 C-BML Final Report Summary. 2010 Fall Simulation Interoperability Workshop (10F-SIW-039...NATO MSG-085. 2012 Spring Simulation Interoperability Workshop (12S-SIW-045), Orlando, FL, March 2012. [3] T. Remmersmann, U. Schade, L. Khimeche...B. Grautreau & R. El Abdouni Khayari, Lessons Recognized: How to Combine BML and MSDL. 2012 Spring Simulation Interoperability Workshop (12S-SIW-012
A Linguistic Foundation for Communicating Geo-Information in the context of BML and geoBML
2010-03-23
BML Standard. 2009 Spring Simulation Interoperability Workshop (09S-SIW-046). San Diego, CA. Rein, K., Schade, U. & Hieb, M.R. (2009). Battle...Formalizing Battle Management Language: A Grammar for Specifying Orders. 2006 Spring Simulation Interoperability Workshop (06S-SIW-068). Huntsville...Hieb, M.R. (2007). Battle Management Language: A Grammar for Specifying Reports. 2007 Spring Simulation Interoperability Workshop (07S-SIW-036
Semantic and syntactic interoperability in online processing of big Earth observation data
Sudmanns, Martin; Tiede, Dirk; Lang, Stefan; Baraldi, Andrea
2018-01-01
The challenge of enabling syntactic and semantic interoperability for comprehensive and reproducible online processing of big Earth observation (EO) data is still unsolved. Supporting both types of interoperability is one of the requirements to efficiently extract valuable information from the large amount of available multi-temporal gridded data sets. The proposed system wraps world models (semantic interoperability) into OGC Web Processing Services (syntactic interoperability) for semantic online analyses. World models describe spatio-temporal entities and their relationships in a formal way. The proposed system serves as an enabler for (1) technical interoperability, using a standardised interface that can be used by all types of clients, and (2) allowing experts from different domains to develop complex analyses together as a collaborative effort. Users connect the world models online to the data, which are maintained in centralised storage as 3D spatio-temporal data cubes. This also allows non-experts to extract valuable information from EO data, because data management, low-level interactions and specific software issues can be ignored. We discuss the concept of the proposed system, provide a technical implementation example, and describe three use cases for extracting changes from EO images, demonstrating usability also for non-EO, gridded, multi-temporal data sets (CORINE land cover). PMID:29387171
IHE based interoperability - benefits and challenges.
Wozak, Florian; Ammenwerth, Elske; Hörbst, Alexander; Sögner, Peter; Mair, Richard; Schabetsberger, Thomas
2008-01-01
Optimized workflows and communication between institutions involved in a patient's treatment process can lead to improved quality and efficiency in the healthcare sector. Electronic Health Records (EHRs) provide patient-centered access to clinical data across institutional boundaries, supporting the above-mentioned aspects. Interoperability is regarded as a vital success factor; however, a clear definition of interoperability does not exist. The aim of this work is to define and assess interoperability criteria as required for EHRs. The definition and assessment of interoperability criteria is supported by the analysis of existing literature and personal experience, as well as by discussions with several domain experts. The criteria for interoperability address the following aspects: interfaces, semantics, legal and organizational aspects, and security. The Integrating the Healthcare Enterprise (IHE) initiative's profiles make a major contribution to these aspects, but they also raise new problems. Flexibility for adaptation to different organizational, regional or other specific conditions is missing. Regional or national initiatives should be given the possibility to realize their specific needs within the boundaries of IHE profiles. Security is so far an optional element, which is one of IHE's greatest omissions; an integrated security approach seems preferable. Irrespective of the practical significance of the IHE profiles so far, it appears to be of great importance that the profiles are constantly checked against practical experience and continuously adapted.
A Survey on Next-generation Power Grid Data Architecture
DOE Office of Scientific and Technical Information (OSTI.GOV)
You, Shutang; Zhu, Dr. Lin; Liu, Yong
2015-01-01
The operation and control of power grids will increasingly rely on data. A high-speed, reliable, flexible and secure data architecture is the prerequisite of the next-generation power grid. This paper summarizes the challenges in collecting and utilizing power grid data, and then provides a reference data architecture for future power grids. Based on the data architecture deployment, related research on data architecture is reviewed and summarized in several categories, including data measurement/actuation, data transmission, the data service layer and data utilization, as well as two cross-cutting issues, interoperability and cyber security. Research gaps and future work are also presented.
Policy-Based Negotiation Engine for Cross-Domain Interoperability
NASA Technical Reports Server (NTRS)
Vatan, Farrokh; Chow, Edward T.
2012-01-01
A successful policy negotiation scheme for Policy-Based Management (PBM) has been implemented. Policy negotiation is the process of determining the "best" communication policy that all of the parties involved can agree on. Specifically, the problem is how to reconcile the various (and possibly conflicting) communication protocols used by different divisions. The solution must use protocols available to all parties involved, and should attempt to do so in the best way possible. Which protocols are commonly available, and what the definition of "best" is will be dependent on the parties involved and their individual communications priorities.
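As a minimal sketch of the negotiation step described above (not the actual PBM engine, whose policy language is far richer), one can intersect the protocol sets of all parties and pick the highest-priority member of the intersection; the protocol names and ranking below are assumed for illustration.

```python
def negotiate(parties, preference_order):
    """Pick the best communication protocol available to every party.

    parties: list of sets of protocols each party supports.
    preference_order: protocols ranked from most to least preferred.
    Returns the highest-ranked protocol common to all parties, or None
    if the parties share no protocol at all.
    """
    common = set.intersection(*parties)
    for proto in preference_order:
        if proto in common:
            return proto
    return None

# Hypothetical divisions with possibly conflicting protocol support.
division_a = {"https", "sftp", "ftp"}
division_b = {"https", "ftp"}
division_c = {"sftp", "ftp", "https"}
best = negotiate([division_a, division_b, division_c],
                 ["https", "sftp", "ftp"])
# best is "https": the most preferred protocol every division supports
```

A real engine would also weigh per-party priorities and policy constraints rather than a single global ranking.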
Nagelhout, Gera E.; van den Putte, Bas; Allwright, Shane; Mons, Ute; McNeill, Ann; Guignard, Romain; Beck, François; Siahpush, Mohammad; Joossens, Luk; Fong, Geoffrey T.; de Vries, Hein; Willemsen, Marc C.
2014-01-01
Background Legal tobacco tax avoidance strategies such as cross-border cigarette purchasing may attenuate the impact of tax increases on tobacco consumption. Little is known about socioeconomic and country variations in cross-border purchasing. Objective To describe socioeconomic and country variations in cross-border cigarette purchasing in six European countries. Methods Cross-sectional data from adult smokers (n = 7,873) from the International Tobacco Control (ITC) Surveys in France (2006/7), Germany (2007), Ireland (2006), the Netherlands (2008), Scotland (2006), and the rest of the United Kingdom (2007/8) were used. Respondents were asked whether they had bought cigarettes outside their country in the last six months and how often. Findings In French and German provinces/states bordering countries with lower cigarette prices, 24% and 13% of smokers respectively reported purchasing cigarettes frequently outside their country. In non-border regions of France and Germany and in Ireland, Scotland, the rest of the United Kingdom, and the Netherlands, frequent purchasing of cigarettes outside the country was reported by 2% to 7% of smokers. Smokers with higher levels of education or income, younger smokers, daily smokers, heavier smokers, and smokers not planning to quit smoking were more likely to purchase cigarettes outside their country. Conclusion Cross-border cigarette purchasing is more common in European regions bordering countries with lower cigarette prices and is more often reported by smokers with higher education and income. Increasing taxes in countries with lower cigarette prices and reducing the number of cigarettes that can be legally imported across borders could help to avoid cross-border purchasing. PMID:23644287
Stargardt, Tom; Schreyögg, Jonas
2006-01-01
Several EU countries determine reimbursement prices of pharmaceuticals by cross-referencing the prices of foreign countries. Our objective is to quantify the theoretical cross-border spill-over effects of cross-reference pricing schemes on pharmaceutical prices in the former EU-15 countries. An analytical model was developed to estimate the impact of pharmaceutical price changes in Germany on pharmaceutical prices in the other former EU-15 countries using cross-reference pricing. We differentiated between the direct impact (from referencing Germany directly) and the indirect impact (from referencing other countries that operate their own cross-reference pricing schemes). The relationship between the direct and indirect impact of a price change depends mainly on the method applied to set reimbursement prices. Under cross-reference pricing, the reimbursement price is determined either by the lowest of the foreign prices (e.g. Portugal), the average of the foreign prices (e.g. Ireland) or a weighted average of the foreign prices (e.g. Italy). If the respective drug is marketed in all referenced countries and prices are regularly updated, a price reduction of 1.00 euro in Germany will reduce maximum reimbursement prices in the former EU-15 countries by between 0.15 euros (Austria) and 0.36 euros (Italy). On the one hand, the cross-border spill-over effects of price reductions are undoubtedly welcomed by decision makers and may be favourable to the healthcare system in general. On the other hand, these spill-over effects also provide strong incentives for strategic product launches, launch delays and lobbying activities, and can affect the effectiveness of regulation. To avoid the negative effects of cross-reference pricing, a weighted index of prices from as many countries as possible should be used to determine reimbursement prices, in order to reduce the direct and indirect impact of individual countries.
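The three pricing rules mentioned above (lowest, simple average, weighted average of foreign prices) can be illustrated with a small worked sketch. The prices and weights are invented; the point is that under a simple average over n referenced countries, a 1.00 euro cut in one of them propagates as a 1/n reduction.

```python
def reference_price(foreign_prices, method="average", weights=None):
    """Compute a reimbursement price from foreign prices.

    Illustrative only; actual national rules include many extra
    provisions (currency conversion, update cycles, margins, etc.).
    """
    prices = list(foreign_prices)
    if method == "lowest":
        return min(prices)
    if method == "average":
        return sum(prices) / len(prices)
    if method == "weighted":
        total = sum(weights)
        return sum(p * w for p, w in zip(prices, weights)) / total
    raise ValueError(method)

prices = [10.0, 12.0, 14.0]          # invented foreign prices
low = reference_price(prices, "lowest")              # 10.0
avg = reference_price(prices, "average")             # 12.0
wavg = reference_price(prices, "weighted", weights=[1, 1, 2])  # 12.5

# Direct spill-over of a 1.00 euro cut in one referenced country:
# under a simple average over n = 3 countries the price falls by 1/3.
spillover = avg - reference_price([9.0, 12.0, 14.0], "average")
```

The indirect effect then compounds when other countries reference each other's already-adjusted prices.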
2016-07-13
ELECTRONIC HEALTH RECORDS: VA’s Efforts Raise Concerns about Interoperability Goals and Measures, Duplication with DOD...Agencies, Committee on Appropriations, U.S. Senate, July 13, 2016. ELECTRONIC HEALTH RECORDS: VA’s Efforts Raise Concerns about Interoperability Goals...initiatives with the Department of Defense (DOD) that were intended to advance the ability of the two departments to share electronic health records, the
Enabling Medical Device Interoperability for the Integrated Clinical Environment
2016-12-01
else who is eager to work together to mature the healthcare technology ecosystem to enable the next generation of safe and intelligent medical device...Award Number: W81XWH-12-C-0154 TITLE: “Enabling Medical Device Interoperability for the Integrated Clinical Environment” PRINCIPAL INVESTIGATOR...SUBTITLE 5a. CONTRACT NUMBER W81XWH-12-C-0154 “Enabling Medical Device Interoperability for the Integrated Clinical Environment” 5b. GRANT NUMBER 5c
2011-07-01
Orlando, Florida, September 2009, 09F-SIW-090. [HLA (2000) - 1] - Modeling and Simulation Standard - High Level Architecture (HLA) – Framework and...Simulation Interoperability Workshop, Orlando, FL, USA, September 2009, 09F-SIW-023. [MaK] - www.mak.com [MIL-STD-3011] - MIL-STD-3011...Spring Simulation Interoperability Workshop, Norfolk, VA, USA, March 2007, 07S-SIW-072. [Ross] - Ross, P. and Clark, P. (2005), “Recommended
An HLA-Based Approach to Quantify Achievable Performance for Tactical Edge Applications
2011-05-01
in: Proceedings of the 2002 Fall Simulation Interoperability Workshop, 02F-SIW-068, Nov 2002. [16] P. Knight, et al. “WBT RTI Independent...Benchmark Tests: Design, Implementation, and Updated Results”, in: Proceedings of the 2002 Spring Simulation Interoperability Workshop, 02S-SIW-081, March...Interoperability Workshop, 98F-SIW-085, Nov 1998. [18] S. Ferenci and R. Fujimoto. “RTI Performance on Shared Memory and Message Passing Architectures”, in
2010-06-01
Military Scenario Definition Language (MSDL) for Nontraditional Warfare Scenarios," Paper 09S-SIW-001, Proceedings of the Spring Simulation...Update to the M&S Community," Paper 09S-SIW-002, Proceedings of the Spring Simulation Interoperability Workshop, Simulation Interoperability...Multiple Simulations: An Application of the Military Scenario Definition Language (MSDL)," Paper 09S-SIW-003, Proc. of the Spring Simulation
Planetary Sciences Interoperability at VO Paris Data Centre
NASA Astrophysics Data System (ADS)
Le Sidaner, P.; Aboudarham, J.; Birlan, M.; Briot, D.; Bonnin, X.; Cecconi, B.; Chauvin, C.; Erard, S.; Henry, F.; Lamy, L.; Mancini, M.; Normand, J.; Popescu, F.; Roques, F.; Savalle, R.; Schneider, J.; Shih, A.; Thuillot, W.; Vinatier, S.
2015-10-01
The Astronomy community has been developing interoperability for more than 10 years by standardizing data access, data formats and metadata. This international action is led by the International Virtual Observatory Alliance (IVOA). Observatoire de Paris is an active participant in this project. All actions on interoperability, data and service provision are centralized in and managed by the VOParis Data Centre (VOPDC). VOPDC is a coordinated project of all scientific departments of Observatoire de Paris.
Interoperable and standard e-Health solution over Bluetooth.
Martinez, I; Del Valle, P; Munoz, P; Trigo, J D; Escayola, J; Martínez-Espronceda, M; Muñoz, A; Serrano, L; Garcia, J
2010-01-01
The new paradigm of e-Health demands open sensors and middleware components that permit transparent integration and end-to-end interoperability of new personal health devices. The use of standards seems to be the internationally adopted way to solve these problems. This paper presents the implementation of an end-to-end standards-based e-Health solution. This includes the ISO/IEEE 11073 standard for the interoperability of medical devices in the patient environment and the EN 13606 standard for the interoperable exchange of Electronic Healthcare Records. The design strictly fulfills all the technical features of the most recent versions of both standards. The implemented prototype has been tested in a laboratory environment to demonstrate its feasibility for further transfer to the healthcare system.
NASA Technical Reports Server (NTRS)
Fern, Lisa; Rorie, Conrad; Shively, Jay
2016-01-01
At the May 2015 SC-228 meeting, requirements for TCAS II interoperability became elevated in priority. A TCAS interoperability work group was formed to identify and address key issues/questions. The work group came up with an initial list of questions and a plan to address them. As part of that plan, NASA proposed to run a mini HITL to address display, alerting and guidance issues. A TCAS Interoperability Workshop was held to determine potential display/alerting/guidance issues that could be explored in future NASA mini HITLs. Outcomes: consensus on the main functionality of DAA guidance when a TCAS II RA occurs; a prioritized list of independent variables for experimental design; and a set of use cases to stress TCAS interoperability.
Kuijpers, Rowella C. W. M.; Otten, Roy; Vermulst, Ad A.; Bitfoi, Adina; Goelitz, Dietmar; Koç, Ceren; Mihova, Zlatka; Pez, Ondine; Carta, Mauro; Keyes, Katherine; Lesinskiene, Sigita; Engels, Rutger C. M. E.; Kovess, Viviane
2015-01-01
Large-scale international surveys are important to globally evaluate, monitor, and promote children's mental health. However, use of young children's self-reports in these studies is still controversial. The Dominic Interactive, a computerized DSM-IV–based child mental health self-report questionnaire, has unique characteristics that may make it preeminently appropriate for usage in cross-country comparisons. This study aimed to determine scale score reliabilities (omega) of the Dominic Interactive in a sample of 8,135 primary school children, ages 6–11 years old, in 7 European countries, to confirm the proposed 7-scale factor structure, and to test for measurement invariance of scale and item scores across countries. Omega reliability values for scale scores were good to high in every country, and the factor structure was confirmed for all countries. A thorough examination of measurement invariance provided evidence for cross-country test score comparability of 5 of the 7 scales and partial scale score invariance of 2 anxiety scales. Possible explanations for this partial invariance include cross-country differences in conceptualizing items and defining what is socially and culturally acceptable anxiety. The convincing evidence for validity of score interpretation makes the Dominic Interactive an indispensable tool for cross-country screening purposes. PMID:26237209
ERIC Educational Resources Information Center
Green, Andy; Green, Francis; Pensiero, Nicola
2015-01-01
This article examines cross-country variations in adult skills inequality and asks why skills in Anglophone countries are so unequal. Drawing on the Organization for Economic Cooperation and Development's recent Survey of Adult Skills and other surveys, it investigates the differences across countries and country groups in inequality in both…
ERIC Educational Resources Information Center
Caldwell, John
This book presents changes in cross country skiing which have taken place in the last several years and is directed toward both beginning and seasoned tour skiers. Discussed are the following topics: (1) the cross-country revolution (new fiberglass skis); (2) equipment (how to choose from the new waxless touring skis); (3) care of equipment; (4)…
14 CFR 61.93 - Solo cross-country flight requirements.
Code of Federal Regulations, 2011 CFR
2011-01-01
... preflight planning and preparation is correct and that the student is prepared to make the flight safely... instructor has: (1) Determined that the student's cross-country planning is correct for the flight; (2... 14 Aeronautics and Space 2 2011-01-01 2011-01-01 false Solo cross-country flight requirements. 61...
14 CFR 61.93 - Solo cross-country flight requirements.
Code of Federal Regulations, 2010 CFR
2010-01-01
... preflight planning and preparation is correct and that the student is prepared to make the flight safely... instructor has: (1) Determined that the student's cross-country planning is correct for the flight; (2... 14 Aeronautics and Space 2 2010-01-01 2010-01-01 false Solo cross-country flight requirements. 61...
14 CFR 61.93 - Solo cross-country flight requirements.
Code of Federal Regulations, 2014 CFR
2014-01-01
... preflight planning and preparation is correct and that the student is prepared to make the flight safely... instructor has: (1) Determined that the student's cross-country planning is correct for the flight; (2... 14 Aeronautics and Space 2 2014-01-01 2014-01-01 false Solo cross-country flight requirements. 61...
14 CFR 61.93 - Solo cross-country flight requirements.
Code of Federal Regulations, 2013 CFR
2013-01-01
... preflight planning and preparation is correct and that the student is prepared to make the flight safely... instructor has: (1) Determined that the student's cross-country planning is correct for the flight; (2... 14 Aeronautics and Space 2 2013-01-01 2013-01-01 false Solo cross-country flight requirements. 61...
14 CFR 61.93 - Solo cross-country flight requirements.
Code of Federal Regulations, 2012 CFR
2012-01-01
... preflight planning and preparation is correct and that the student is prepared to make the flight safely... instructor has: (1) Determined that the student's cross-country planning is correct for the flight; (2... 14 Aeronautics and Space 2 2012-01-01 2012-01-01 false Solo cross-country flight requirements. 61...
Data Modeling Challenges of Advanced Interoperability.
Blobel, Bernd; Oemig, Frank; Ruotsalainen, Pekka
2018-01-01
Progressive health paradigms, involving many different disciplines and combining multiple policy domains, require advanced interoperability solutions. This results in special challenges for modeling health systems. The paper discusses classification systems for data models and enterprise business architectures and compares them with the ISO Reference Architecture. On that basis, existing definitions, specifications and standards of data models for interoperability are evaluated and their limitations discussed. Amendments to use those models correctly and to better meet the aforementioned challenges are offered.
Challenges and Potential Solutions for Big Data Implementations in Developing Countries
Mayan, J.C.; García, M.J.; Almerares, A.A.; Househ, M.
2014-01-01
Background The volume of data, the velocity with which they are generated, and their variety and lack of structure hinder their use. This creates the need to change the way information is captured, stored, processed, and analyzed, leading to the paradigm shift called Big Data. Objectives To describe the challenges and possible solutions for developing countries when implementing Big Data projects in the health sector. Methods A non-systematic review of the literature was performed in PubMed and Google Scholar. The following keywords were used: “big data”, “developing countries”, “data mining”, “health information systems”, and “computing methodologies”. A thematic review of selected articles was performed. Results There are challenges when implementing any Big Data program, including exponential growth of data, special infrastructure needs, the need for a trained workforce, the need to agree on interoperability standards, privacy and security issues, and the need to include people, processes, and policies to ensure adoption. Developing countries have particular characteristics that hinder further development of these projects. Conclusions The advent of Big Data promises great opportunities for the healthcare field. In this article, we attempt to describe the challenges developing countries would face and enumerate the options to be used to achieve successful implementations of Big Data programs. PMID:25123719
Semantic interoperability--HL7 Version 3 compared to advanced architecture standards.
Blobel, B G M E; Engel, K; Pharow, P
2006-01-01
To meet the challenge for high quality and efficient care, highly specialized and distributed healthcare establishments have to communicate and co-operate in a semantically interoperable way. Information and communication technology must be open, flexible, scalable, knowledge-based and service-oriented as well as secure and safe. For enabling semantic interoperability, a unified process for defining and implementing the architecture, i.e. structure and functions of the cooperating systems' components, as well as the approach for knowledge representation, i.e. the used information and its interpretation, algorithms, etc. have to be defined in a harmonized way. Deploying the Generic Component Model, systems and their components, underlying concepts and applied constraints must be formally modeled, strictly separating platform-independent from platform-specific models. As HL7 Version 3 claims to represent the most successful standard for semantic interoperability, HL7 has been analyzed regarding the requirements for model-driven, service-oriented design of semantic interoperable information systems, thereby moving from a communication to an architecture paradigm. The approach is compared with advanced architectural approaches for information systems such as OMG's CORBA 3 or EHR systems such as GEHR/openEHR and CEN EN 13606 Electronic Health Record Communication. HL7 Version 3 is maturing towards an architectural approach for semantic interoperability. Despite current differences, there is a close collaboration between the teams involved guaranteeing a convergence between competing approaches.
Khan, Wajahat Ali; Khattak, Asad Masood; Hussain, Maqbool; Amin, Muhammad Bilal; Afzal, Muhammad; Nugent, Christopher; Lee, Sungyoung
2014-08-01
Heterogeneity in the management of complex medical data obstructs the attainment of data-level interoperability among Health Information Systems (HIS). This diversity depends on the compliance of HISs with different healthcare standards. Its solution demands a mediation system for the accurate interpretation of data in different heterogeneous formats to achieve data interoperability. We propose an adaptive AdapteR Interoperability ENgine mediation system called ARIEN, which arbitrates between HISs compliant with different healthcare standards for accurate and seamless information exchange to achieve data interoperability. ARIEN stores the semantic mapping information between different standards in the Mediation Bridge Ontology (MBO) using ontology matching techniques. These mappings are provided by our System for Parallel Heterogeneity (SPHeRe) matching system and Personalized-Detailed Clinical Model (P-DCM) approach to guarantee the accuracy of mappings. The effectiveness of the mappings stored in the MBO is realized by evaluating the accuracy of the transformation process between different standard formats. We evaluated our proposed system on the transformation of medical records between the Clinical Document Architecture (CDA) and Virtual Medical Record (vMR) standards. The transformation process achieved over 90% accuracy in conversion between the CDA and vMR standards using a pattern-oriented approach drawn from the MBO. The proposed mediation system improves the overall communication process between HISs. It provides accurate and seamless medical information exchange to ensure data interoperability and timely healthcare services to patients.
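The core mechanism the abstract describes, a stored set of cross-standard mappings driving record transformation, can be sketched in a few lines. This is a hypothetical illustration only: the field paths below are invented and do not come from the actual CDA or vMR specifications, and the dict stands in for the Mediation Bridge Ontology.

```python
# Hypothetical sketch of mapping-driven record transformation, loosely
# modeled on the ARIEN idea of a Mediation Bridge Ontology (MBO) that
# stores mappings between standards. All field paths are invented for
# illustration; real CDA/vMR structures differ.

# A minimal "bridge" mapping: source (CDA-like) path -> target (vMR-like) path.
MBO_MAPPINGS = {
    "recordTarget.patientRole.id": "patient.id",
    "recordTarget.patientRole.patient.name": "patient.demographics.name",
    "component.observation.value": "clinicalStatement.observationResult.value",
}

def get_path(record, dotted_path):
    """Walk a nested dict along a dotted path and return the leaf value."""
    node = record
    for key in dotted_path.split("."):
        node = node[key]
    return node

def set_path(record, dotted_path, value):
    """Create nested dicts along a dotted path and set the leaf value."""
    keys = dotted_path.split(".")
    node = record
    for key in keys[:-1]:
        node = node.setdefault(key, {})
    node[keys[-1]] = value

def transform(source_record, mappings):
    """Produce a target-standard record by applying each stored mapping."""
    target = {}
    for src, dst in mappings.items():
        set_path(target, dst, get_path(source_record, src))
    return target

cda_like = {
    "recordTarget": {"patientRole": {"id": "12345",
                                     "patient": {"name": "Jane Doe"}}},
    "component": {"observation": {"value": "120/80 mmHg"}},
}
vmr_like = transform(cda_like, MBO_MAPPINGS)
```

The design point is that adding support for a new standard pair only requires new mapping entries, not new transformation code.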
A Framework for Seamless Interoperation of Heterogeneous Distributed Software Components
2005-05-01
interoperability, b) distributed resource discovery, and c) validation of quality requirements. Principles and prototypical systems were created to demonstrate the successful completion of the research.
The BACnet Campus Challenge - Part 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Masica, Ken; Tom, Steve
2015-12-01
Here, the BACnet protocol was designed to achieve interoperability among building automation vendors and evolve over time to include new functionality as well as support new communication technologies such as the Ethernet and IP protocols as they became prevalent and economical in the marketplace. For large multi-building, multi-vendor campus environments, standardizing on the BACnet protocol as an implementation strategy can be a key component in meeting the challenge of an interoperable, flexible, and scalable building automation system. The interoperability of BACnet is especially important when large campuses with legacy equipment have DDC upgrades to facilities performed over different time frames and use different contractors that install equipment from different vendors under the guidance of different campus HVAC project managers. In these circumstances, BACnet can serve as a common foundation for interoperability when potential variability exists in approaches to the design-build process by numerous parties over time. Likewise, BACnet support for a range of networking protocols and technologies can be a key strategy for achieving flexible and scalable automation systems as campuses and enterprises expand networking infrastructures using standard interoperable protocols like IP and Ethernet.
Study and validation of tools interoperability in JPSEC
NASA Astrophysics Data System (ADS)
Conan, V.; Sadourny, Y.; Jean-Marie, K.; Chan, C.; Wee, S.; Apostolopoulos, J.
2005-08-01
Digital imagery is important in many applications today, and the security of digital imagery is important today and is likely to gain in importance in the near future. The emerging international standard ISO/IEC JPEG-2000 Security (JPSEC) is designed to provide security for digital imagery, and in particular digital imagery coded with the JPEG-2000 image coding standard. One of the primary goals of a standard is to ensure interoperability between creators and consumers produced by different manufacturers. The JPSEC standard, similar to the popular JPEG and MPEG family of standards, specifies only the bitstream syntax and the receiver's processing, and not how the bitstream is created or the details of how it is consumed. This paper examines the interoperability for the JPSEC standard, and presents an example JPSEC consumption process which can provide insights in the design of JPSEC consumers. Initial interoperability tests between different groups with independently created implementations of JPSEC creators and consumers have been successful in providing the JPSEC security services of confidentiality (via encryption) and authentication (via message authentication codes, or MACs). Further interoperability work is on-going.
NASA Astrophysics Data System (ADS)
Yang, Gongping; Zhou, Guang-Tong; Yin, Yilong; Yang, Xiukun
2010-12-01
A critical step in an automatic fingerprint recognition system is the segmentation of fingerprint images. Existing methods are usually designed to segment fingerprint images originated from a certain sensor. Thus their performances are significantly affected when dealing with fingerprints collected by different sensors. This work studies the sensor interoperability of fingerprint segmentation algorithms, which refers to the algorithm's ability to adapt to the raw fingerprints obtained from different sensors. We empirically analyze the sensor interoperability problem, and effectively address the issue by proposing a k-means based segmentation method called SKI. SKI clusters foreground and background blocks of a fingerprint image based on the k-means algorithm, where a fingerprint block is represented by a 3-dimensional feature vector consisting of block-wise coherence, mean, and variance (abbreviated as CMV). SKI also employs morphological postprocessing to achieve favorable segmentation results. We perform SKI on each fingerprint to ensure sensor interoperability. The interoperability and robustness of our method are validated by experiments performed on a number of fingerprint databases which are obtained from various sensors.
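The clustering step the abstract describes, k-means over 3-D CMV vectors with k = 2 (foreground vs. background), can be sketched as follows. This is a minimal illustration, not the authors' implementation: the feature values are synthetic, the initialization is a naive deterministic choice, and the morphological postprocessing step is omitted.

```python
# Minimal sketch of the SKI idea: cluster fingerprint blocks into
# foreground/background with k-means on a 3-D CMV feature vector
# (block-wise coherence, mean, variance). Pure-Python k-means, k = 2;
# the CMV values below are synthetic, for illustration only.

def kmeans(points, k=2, iters=50):
    # Naive deterministic init for the sketch: first k points as centers.
    centers = [tuple(p) for p in points[:k]]
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        # Assignment step: each point goes to the nearest center
        # (squared Euclidean distance).
        clusters = [[] for _ in range(k)]
        for p in points:
            d = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centers]
            clusters[d.index(min(d))].append(p)
        # Update step: move each center to the mean of its cluster.
        new_centers = []
        for i, cl in enumerate(clusters):
            if cl:
                new_centers.append(tuple(sum(x) / len(cl) for x in zip(*cl)))
            else:
                new_centers.append(centers[i])
        if new_centers == centers:  # converged
            break
        centers = new_centers
    return centers, clusters

# Synthetic CMV vectors: foreground blocks (high coherence and variance)
# vs. background blocks (low coherence and variance).
foreground = [(0.9, 120.0, 900.0), (0.85, 110.0, 850.0), (0.8, 130.0, 950.0)]
background = [(0.1, 200.0, 30.0), (0.15, 210.0, 40.0), (0.05, 190.0, 20.0)]
centers, clusters = kmeans(foreground + background)
```

On data this well separated the two clusters recover the foreground and background groups; real fingerprint blocks would first need the CMV features computed per block from the image.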
Identity Management Systems in Healthcare: The Issue of Patient Identifiers
NASA Astrophysics Data System (ADS)
Soenens, Els
According to a recent recommendation of the European Commission, now is the time for Europe to enhance interoperability in eHealth. Although interoperability of patient identifiers seems promising for matters of patient mobility, patient empowerment and effective access to care, we see that today there is indeed a considerable lack of interoperability in the field of patient identification. Looking from a socio-technical rather than a merely technical point of view, one can understand the fact that the development and implementation of an identity management system in a specific healthcare context is influenced by particular social practices, affected by socio-economical history and the political climate and regulated by specific data protection legislations. Consequently, the process of making patient identification in Europe more interoperable is a development beyond semantic and syntactic levels. In this paper, we give some examples of today’s patient identifier systems in Europe, discuss the issue of interoperability of (unique) patient identifiers from a socio-technical point of view and try not to ignore the ‘privacy side’ of the story.
Smart Grid Interoperability Maturity Model Beta Version
DOE Office of Scientific and Technical Information (OSTI.GOV)
Widergren, Steven E.; Drummond, R.; Giroti, Tony
The GridWise Architecture Council was formed by the U.S. Department of Energy to promote and enable interoperability among the many entities that interact with the electric power system. This balanced team of industry representatives proposes principles for the development of interoperability concepts and standards. The Council provides industry guidance and tools that make it an available resource for smart grid implementations. In the spirit of advancing interoperability of an ecosystem of smart grid devices and systems, this document presents a model for evaluating the maturity of the artifacts and processes that specify the agreement of parties to collaborate across an information exchange interface. You are expected to have a solid understanding of large, complex system integration concepts and experience in dealing with software component interoperation. Those without this technical background should read the Executive Summary for a description of the purpose and contents of the document. Other documents, such as checklists, guides, and whitepapers, exist for targeted purposes and audiences. Please see the www.gridwiseac.org website for more products of the Council that may be of interest to you.
Exploring NASA GES DISC Data with Interoperable Services
NASA Technical Reports Server (NTRS)
Zhao, Peisheng; Yang, Wenli; Hegde, Mahabal; Wei, Jennifer C.; Kempler, Steven; Pham, Long; Teng, William; Savtchenko, Andrey
2015-01-01
Overview of NASA GES DISC (NASA Goddard Earth Science Data and Information Services Center) data with interoperable services. Open-standard, interoperable services improve data discoverability, accessibility, and usability with metadata, catalogue, and portal standards, and achieve data, information, and knowledge sharing across applications with standardized interfaces and protocols. Open Geospatial Consortium (OGC) data services and specifications include: Web Coverage Service (WCS) for data; Web Map Service (WMS) for pictures of data; Web Map Tile Service (WMTS) for pictures of data tiles; and Styled Layer Descriptors (SLD) for rendered styles.
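To make the service alphabet soup concrete, here is how a client might form a request against an OGC WMS endpoint like those listed above. The host URL and layer identifier are placeholders, not a real GES DISC address; the parameter names follow the standard WMS 1.3.0 GetMap key-value-pair convention.

```python
# Sketch of constructing an OGC WMS 1.3.0 GetMap request URL.
# Endpoint and layer name are hypothetical placeholders.
from urllib.parse import urlencode

def wms_getmap_url(endpoint, layer, bbox, size=(512, 256),
                   fmt="image/png", crs="EPSG:4326"):
    params = {
        "service": "WMS",
        "version": "1.3.0",
        "request": "GetMap",
        "layers": layer,
        "crs": crs,
        # WMS 1.3.0 bbox order depends on the CRS axis order;
        # values are joined as a comma-separated list.
        "bbox": ",".join(str(v) for v in bbox),
        "width": size[0],
        "height": size[1],
        "format": fmt,
    }
    return endpoint + "?" + urlencode(params)

url = wms_getmap_url("https://example.gov/wms",     # placeholder host
                     layer="hypothetical_layer_id",  # placeholder layer
                     bbox=(-90, -180, 90, 180))
```

The same key-value-pair pattern carries over to WCS GetCoverage and WMTS GetTile requests, which is much of what "interoperable" means in practice here: any standards-aware client can form these URLs without server-specific code.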
NASA Technical Reports Server (NTRS)
Jones, Michael K.
1998-01-01
Various issues associated with interoperability for space mission monitor and control are presented in viewgraph form. Specific topics include: 1) Space Project Mission Operations Control Architecture (SuperMOCA) goals and methods for achieving them; 2) Specifics on the architecture: open standards and layering, enhancing interoperability, and promoting commercialization; 3) An advertisement; 4) Status of the task - government/industry cooperation and architecture and technology demonstrations; and 5) Key features of messaging services and virtual devices.
Dhaval, Rakesh; Borlawsky, Tara; Ostrander, Michael; Santangelo, Jennifer; Kamal, Jyoti; Payne, Philip R O
2008-11-06
In order to enhance interoperability between enterprise systems, and improve data validity and reliability throughout The Ohio State University Medical Center (OSUMC), we have initiated the development of an ontology-anchored metadata architecture and knowledge collection for our enterprise data warehouse. The metadata and corresponding semantic relationships stored in the OSUMC knowledge collection are intended to promote consistency and interoperability across the heterogeneous clinical, research, business and education information managed within the data warehouse.
Kick, Glide, Pole! Cross-Country Skiing Fun (Part II)
ERIC Educational Resources Information Center
Duoos, Bridget A.
2012-01-01
Part I of Kick, Glide, Pole! Cross-Country Skiing Fun, which was published in last issue, discussed how to select cross-country ski equipment, dress for the activity and the biomechanics of the diagonal stride. Part II focuses on teaching the diagonal stride technique and begins with a progression of indoor activities. Incorporating this fun,…
Kick, Glide, Pole! Cross-Country Skiing Fun (Part I)
ERIC Educational Resources Information Center
Duoos, Bridget A.
2011-01-01
Cross-country skiing is a great activity for taking a physical education class outside during the cold winter months. It is also a diverse activity that appeals to students of all ages, and is an excellent cardio-respiratory activity to keep students active. This article has provided the first steps in preparing a cross-country skiing lesson in…
Implementation of the Cross-border Care Directive in EU Member States: Luxembourg.
Schwebag, Mike
2014-03-01
The Cross-border Care Directive sets up basic patient rights in case of cross-border healthcare. These rights concern both the country of affiliation and the country of treatment of the patient. The article briefly describes the state of the transposition in Luxembourg, with a focus on the draft act on patients' rights and obligations. This new act on patient rights and obligations will apply without distinction to domestic and cross-border patients, thus transposing most of Luxembourg's obligations as a country of treatment of a cross-border patient.
Interoperability Outlook in the Big Data Future
NASA Astrophysics Data System (ADS)
Kuo, K. S.; Ramachandran, R.
2015-12-01
The establishment of distributed active archive centers (DAACs) as data warehouses and the standardization of file formats by NASA's Earth Observing System Data and Information System (EOSDIS) had doubtlessly propelled interoperability of NASA Earth science data to unprecedented heights in the 1990s. However, we obviously still feel wanting two decades later. We believe the inadequate interoperability we experience is a result of the current practice that data are first packaged into files before distribution and only the metadata of these files are cataloged into databases and become searchable. Data therefore cannot be efficiently filtered. Any extensive study thus requires downloading large volumes of data files to a local system for processing and analysis. The need to download data not only creates duplication and inefficiency but also further impedes interoperability, because the analysis has to be performed locally by individual researchers in individual institutions. Each institution or researcher often has its/his/her own preference in the choice of data management practice as well as programming languages. Analysis results (derived data) so produced are thus subject to the differences of these practices, which later form formidable barriers to interoperability. A number of Big Data technologies are currently being examined and tested to address Big Earth Data issues. These technologies share one common characteristic: exploiting compute and storage affinity to more efficiently analyze large volumes and great varieties of data. Distributed active "archive" centers are likely to evolve into distributed active "analysis" centers, which not only archive data but also provide analysis services right where the data reside. "Analysis" will become the more visible function of these centers. It is thus reasonable to expect interoperability to improve because analysis, in addition to data, becomes more centralized.
Within a "distributed active analysis center" interoperability is almost guaranteed because data, analysis, and results all can be readily shared and reused. Effectively, with the establishment of "distributed active analysis centers", interoperation turns from a many-to-many problem into a less complicated few-to-few problem and becomes easier to solve.
Seeking the Path to Metadata Nirvana
NASA Astrophysics Data System (ADS)
Graybeal, J.
2008-12-01
Scientists have always found reusing other scientists' data challenging. Computers did not fundamentally change the problem, but enabled more and larger instances of it. In fact, by removing human mediation and time delays from the data sharing process, computers emphasize the contextual information that must be exchanged in order to exchange and reuse data. This requirement for contextual information has two faces: "interoperability" when talking about systems, and "the metadata problem" when talking about data. As much as any single organization, the Marine Metadata Interoperability (MMI) project has been tagged with the mission "Solve the metadata problem." Of course, if that goal is achieved, then sustained, interoperable data systems for interdisciplinary observing networks can be easily built -- pesky metadata differences, like which protocol to use for data exchange, or what the data actually measures, will be a thing of the past. Alas, as you might imagine, there will always be complexities and incompatibilities that are not addressed, and data systems that are not interoperable, even within a science discipline. So should we throw up our hands and surrender to the inevitable? Not at all. Rather, we try to minimize metadata problems as much as we can. In this we increasingly progress, despite natural forces that pull in the other direction. Computer systems let us work with more complexity, build community knowledge and collaborations, and preserve and publish our progress and (dis-)agreements. Funding organizations, science communities, and technologists see the importance of interoperable systems and metadata, and direct resources toward them. With the new approaches and resources, projects like IPY and MMI can simultaneously define, display, and promote effective strategies for sustainable, interoperable data systems. This presentation will outline the role metadata plays in durable interoperable data systems, for better or worse.
It will describe times when "just choosing a standard" can work, and when it probably won't work. And it will point out signs that suggest a metadata storm is coming to your community project, and how you might avoid it. From these lessons we will seek a path to producing interoperable, interdisciplinary, metadata-enlightened environment observing systems.
Luchsinger, Harri; Sandbakk, Øyvind; Schubert, Michael; Ettema, Gertjan; Baumeister, Jochen
2016-01-01
Background Previous studies using electroencephalography (EEG) to monitor brain activity have linked higher frontal theta activity to more focused attention and superior performance in goal-directed precision tasks. In biathlon, shooting performance requires focused attention after high-intensity cross-country skiing. Purpose To compare biathletes (serving as experts) and cross-country skiers (novices) and examine the effect of vigorous exercise on frontal theta activity during shooting. Methods EEG frontal theta (4–7 Hz) activity was compared between nine biathletes and eight cross-country skiers at comparable skiing performance levels who fired 100 shots on a 5-m indoor shooting range in quiescent condition followed by 20 shots after each of five 6-min high-intensity roller skiing sessions in the skating technique on a treadmill. Results Biathletes hit 80±14% and 81±10% before and after the roller skiing sessions, respectively. For the cross-country skiers, these values were significantly lower than for the biathletes and amounted to 39±13% and 44±11% (p<0.01). Biathletes had on average 6% higher frontal theta activity during shooting as compared to cross-country skiers (F(1,15) = 4.82, p = 0.044), but no significant effect of vigorous exercise on frontal theta activity in either of the two groups was found (F(1,15) = 0.14, p = 0.72). Conclusions Biathletes had significantly higher frontal theta activity than cross-country skiers during shooting, indicating higher focused attention in biathletes. Vigorous exercise did not decrease shooting performance or frontal theta activity during shooting in biathletes and cross-country skiers. PMID:26981639
OGC and Grid Interoperability in enviroGRIDS Project
NASA Astrophysics Data System (ADS)
Gorgan, Dorian; Rodila, Denisa; Bacu, Victor; Giuliani, Gregory; Ray, Nicolas
2010-05-01
EnviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is a 4-year FP7 project aiming to address the subjects of ecologically unsustainable development and inadequate resource management. The project develops a Spatial Data Infrastructure of the Black Sea Catchment region. Geospatial technologies offer very specialized functionality for Earth Science oriented applications, while Grid technology is able to support distributed and parallel processing. One challenge of the enviroGRIDS project is the interoperability between geospatial and Grid infrastructures by providing the basic and the extended features of both technologies. The geospatial interoperability technology has been promoted as a way of dealing with large volumes of geospatial data in distributed environments through the development of interoperable Web service specifications proposed by the Open Geospatial Consortium (OGC), with applications spread across multiple fields but especially in Earth observation research. Due to the huge volumes of data available in the geospatial domain and the additional issues introduced (data management, secure data transfer, data distribution and data computation), the need for an infrastructure capable of managing all those problems becomes an important aspect. The Grid promotes and facilitates the secure interoperations of geospatial heterogeneous distributed data within a distributed environment, the creation and management of large distributed computational jobs and assures a security level for communication and transfer of messages based on certificates. This presentation analyzes and discusses the most significant use cases for enabling the OGC Web services interoperability with the Grid environment and focuses on the description and implementation of the most promising one.
In these use cases we give special attention to issues such as: the relations between the computational grid and the OGC Web service protocols, the advantages offered by the Grid technology - such as providing secure interoperability between the distributed geospatial resources - and the issues introduced by the integration of distributed geospatial data in a secure environment: data and service discovery, management, access and computation. The enviroGRIDS project proposes a new architecture which allows a flexible and scalable approach for integrating the geospatial domain represented by the OGC Web services with the Grid domain represented by the gLite middleware. The parallelism offered by the Grid technology is discussed and explored at the data level, management level and computation level. The analysis is carried out for OGC Web service interoperability in general but specific details are emphasized for Web Map Service (WMS), Web Feature Service (WFS), Web Coverage Service (WCS), Web Processing Service (WPS) and Catalog Service for Web (CSW). Issues regarding the mapping and the interoperability between the OGC and the Grid standards and protocols are analyzed as they are the basis for solving the communication problems between the two environments: grid and geospatial. The presentation mainly highlights how the Grid environment and Grid application capabilities can be extended and utilized in geospatial interoperability. Interoperability between geospatial and Grid infrastructures provides features such as the specific geospatial complex functionality and the high power computation and security of the Grid, high spatial model resolution and geographical area covering, and flexible combination and interoperability of the geographical models.
In accordance with the Service Oriented Architecture concepts and the requirements of interoperability between geospatial and Grid infrastructures, each of the main functionalities is visible from the enviroGRIDS Portal and consequently from the end-user applications such as Decision Maker/Citizen oriented Applications. The enviroGRIDS portal is the single entry point for the user into the system, and the portal presents a uniform graphical user interface style. Main reference for further information: [1] enviroGRIDS Project, http://www.envirogrids.net/
Zhang, Mingyuan; Velasco, Ferdinand T.; Musser, R. Clayton; Kawamoto, Kensaku
2013-01-01
Enabling clinical decision support (CDS) across multiple electronic health record (EHR) systems has been a desired but largely unattained aim of clinical informatics, especially in commercial EHR systems. A potential opportunity for enabling such scalable CDS is to leverage vendor-supported, Web-based CDS development platforms along with vendor-supported application programming interfaces (APIs). Here, we propose a potential staged approach for enabling such scalable CDS, starting with the use of custom EHR APIs and moving towards standardized EHR APIs to facilitate interoperability. We analyzed three commercial EHR systems for their capabilities to support the proposed approach, and we implemented prototypes in all three systems. Based on these analyses and prototype implementations, we conclude that the approach proposed is feasible, already supported by several major commercial EHR vendors, and potentially capable of enabling cross-platform CDS at scale. PMID:24551426
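The staged approach the abstract proposes, coding CDS logic against a common interface while adapters absorb each vendor's custom API, is essentially the adapter pattern. A minimal sketch follows; every class and method name is invented for illustration (real vendor APIs differ), though LOINC 2160-0 is the actual code for serum creatinine.

```python
# Illustrative sketch: hide vendor-specific EHR APIs behind one common
# interface so a single CDS rule runs unchanged across systems.
# Class/method names are hypothetical; only the LOINC code is real.
from abc import ABC, abstractmethod

class EHRAdapter(ABC):
    """Common interface the CDS logic codes against."""
    @abstractmethod
    def latest_lab(self, patient_id, loinc_code):
        ...

class VendorAAdapter(EHRAdapter):
    """One per vendor; wraps that vendor's custom API (faked here)."""
    def __init__(self, records):
        self.records = records  # stand-in for a vendor-specific API client
    def latest_lab(self, patient_id, loinc_code):
        labs = self.records[patient_id]["labs"]
        return max((l for l in labs if l["loinc"] == loinc_code),
                   key=lambda l: l["time"])["value"]

def creatinine_alert(ehr: EHRAdapter, patient_id):
    """One CDS rule, reusable across adapters: flag creatinine > 1.3 mg/dL."""
    return ehr.latest_lab(patient_id, "2160-0") > 1.3  # 2160-0 = serum creatinine

fake = VendorAAdapter({"p1": {"labs": [
    {"loinc": "2160-0", "time": 1, "value": 1.1},
    {"loinc": "2160-0", "time": 2, "value": 1.6},
]}})
alert = creatinine_alert(fake, "p1")
```

Moving from custom to standardized EHR APIs, as the paper suggests, would shrink each vendor adapter until only the shared interface remains.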
Interoperable Data Access Services for NOAA IOOS
NASA Astrophysics Data System (ADS)
de La Beaujardiere, J.
2008-12-01
The Integrated Ocean Observing System (IOOS) is intended to enhance our ability to collect, deliver, and use ocean information. The goal is to support research and decision-making by providing data on our open oceans, coastal waters, and Great Lakes in the formats, rates, and scales required by scientists, managers, businesses, governments, and the public. The US National Oceanic and Atmospheric Administration (NOAA) is the lead agency for IOOS. NOAA's IOOS office supports the development of regional coastal observing capability and promotes data management efforts to increase data accessibility. Geospatial web services have been established at NOAA data providers including the National Data Buoy Center (NDBC), the Center for Operational Oceanographic Products and Services (CO-OPS), and CoastWatch, and at regional data provider sites. Services established include Open-source Project for a Network Data Access Protocol (OpenDAP), Open Geospatial Consortium (OGC) Sensor Observation Service (SOS), and OGC Web Coverage Service (WCS). These services provide integrated access to data holdings that have been aggregated at each center from multiple sources. We wish to collaborate with other groups to improve our service offerings to maximize interoperability and enhance cross-provider data integration, and to share common service components such as registries, catalogs, data conversion, and gateways. This paper will discuss the current status of NOAA's IOOS efforts and possible next steps.
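As a concrete taste of one of the services named above, an OPeNDAP request carries a constraint expression after the `?` that selects a variable and index ranges, so only the requested subset crosses the network. The sketch below builds such a URL; the dataset address and variable name are placeholders, not real NDBC or CO-OPS endpoints.

```python
# Sketch of an OPeNDAP-style subset URL. The constraint expression
# "var[start:stride:stop]..." selects one variable and index ranges
# per dimension. Base URL and variable name are hypothetical.

def opendap_constraint(base_url, var, ranges):
    """ranges: one (start, stride, stop) index triplet per dimension."""
    idx = "".join("[%d:%d:%d]" % r for r in ranges)
    return "%s.ascii?%s%s" % (base_url, var, idx)

url = opendap_constraint("https://example.gov/dods/sst_dataset",  # placeholder
                         "sea_surface_temperature",               # placeholder
                         [(0, 1, 0), (10, 1, 20), (30, 1, 40)])
```

Server-side subsetting like this is one of the main levers for the cross-provider data integration the paper aims at: clients never need to download whole files to extract a region or time slice.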
Moving Towards a Common Ground and Flight Data Systems Architecture for NASA's Exploration Missions
NASA Technical Reports Server (NTRS)
Rader, Steve; Kearney, Mike; McVittie, Thom; Smith, Dan
2006-01-01
The National Aeronautics and Space Administration has embarked on an ambitious effort to return man to the moon and then on to Mars. The Exploration Vision requires development of major new space and ground assets and poses challenges well beyond those faced by many of NASA's recent programs. New crewed vehicles must be developed. Compatible supply vehicles, surface mobility modules and robotic exploration capabilities will supplement the manned exploration vehicle. New launch systems will be developed as well as a new ground communications and control infrastructure. The development must take place in a cost-constrained environment and must advance along an aggressive schedule. Common solutions and system interoperability will be critical to the successful development of the Exploration data systems for this wide variety of flight and ground elements. To this end, NASA has assembled a team of engineers from across the agency to identify the key challenges for Exploration data systems and to establish the most beneficial strategic approach to be followed. Key challenges and the planned NASA approach for flight and ground systems will be discussed in the paper. The described approaches will capitalize on new technologies, and will result in cross-program interoperability between spacecraft and ground systems, from multiple suppliers and agencies.
Transportation communications interoperability : phase 2, resource evaluation.
DOT National Transportation Integrated Search
2006-12-01
Based on the Arizona Department of Transportation's (ADOT) previous SPR-561 Needs Assessment study, this report continues the efforts to enhance radio interoperability between Department of Public Safety (DPS) Highway Patrol officers and ...
Analysis of OPACITY and PLAID Protocols for Contactless Smart Cards
2012-09-01
Plugfest 2009: Global Interoperability in Telerobotics and Telemedicine
King, H. Hawkeye; Hannaford, Blake; Kwok, Ka-Wai; Yang, Guang-Zhong; Griffiths, Paul; Okamura, Allison; Farkhatdinov, Ildar; Ryu, Jee-Hwan; Sankaranarayanan, Ganesh; Arikatla, Venkata; Tadano, Kotaro; Kawashima, Kenji; Peer, Angelika; Schauß, Thomas; Buss, Martin; Miller, Levi; Glozman, Daniel; Rosen, Jacob; Low, Thomas
2014-01-01
Despite the great diversity of teleoperator designs and applications, their underlying control systems have many similarities. These similarities can be exploited to enable interoperability between heterogeneous systems. We have developed a network data specification, the Interoperable Telerobotics Protocol, that can be used for Internet-based control of a wide range of teleoperators. In this work we test interoperable telerobotics on the global Internet, focusing on the telesurgery application domain. Fourteen globally dispersed telerobotic master and slave systems were connected in thirty trials in one twenty-four-hour period. Users performed common manipulation tasks to demonstrate effective master-slave operation. With twenty-eight (93%) successful, unique connections, the results show a high potential for standardizing telerobotic operation. Furthermore, new paradigms for telesurgical operation and training are presented, including a networked surgery trainer and upper-limb exoskeleton control of micro-manipulators. PMID:24748993
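What makes heterogeneous masters and slaves interoperable in practice is agreement on one wire layout for the messages they exchange. The sketch below shows the general idea with a fixed binary pose message; the field layout is invented for illustration and is not the actual Interoperable Telerobotics Protocol specification.

```python
# Hypothetical fixed wire format for a teleoperation pose command.
# Any master/slave pair that agrees on this layout can interoperate,
# regardless of their internal designs. Layout is invented for
# illustration, not taken from the actual protocol.
import struct

# Big-endian, no padding: sequence number (u32), timestamp (f64),
# position x,y,z (3 x f64), orientation quaternion w,x,y,z (4 x f64),
# gripper angle (f64).
POSE_FMT = ">Id3d4dd"

def pack_pose(seq, t, pos, quat, grip):
    """Serialize one pose command to bytes for the network."""
    return struct.pack(POSE_FMT, seq, t, *pos, *quat, grip)

def unpack_pose(data):
    """Deserialize bytes received from the network back into fields."""
    seq, t, px, py, pz, qw, qx, qy, qz, grip = struct.unpack(POSE_FMT, data)
    return {"seq": seq, "t": t, "pos": (px, py, pz),
            "quat": (qw, qx, qy, qz), "grip": grip}

msg = pack_pose(7, 1234.5, (0.1, 0.2, 0.3), (1.0, 0.0, 0.0, 0.0), 0.5)
decoded = unpack_pose(msg)
```

Explicit byte order and field sizes are the whole point: a C++ slave and a Python master decode the same 76-byte message identically.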
Semantically Interoperable XML Data
Vergara-Niedermayr, Cristobal; Wang, Fusheng; Pan, Tony; Kurc, Tahsin; Saltz, Joel
2013-01-01
XML is ubiquitously used as an information exchange platform for web-based applications in healthcare, life sciences, and many other domains. Proliferating XML data are now managed through the latest native XML database technologies. XML data sources conforming to common XML schemas can be shared and integrated with syntactic interoperability. Semantic interoperability can be achieved through semantic annotations of data models using common data elements linked to concepts from ontologies. In this paper, we present a framework and software system to support the development of semantically interoperable XML-based data sources that can be shared through a Grid infrastructure. We also present our work on supporting semantically validated XML data through semantic annotations for XML Schema, semantic validation and semantic authoring of XML data. We demonstrate the use of the system for a biomedical database of medical image annotations and markups. PMID:25298789
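The annotation mechanism described above, linking schema elements to ontology concepts, can be sketched with the standard xs:annotation/xs:appinfo extension point of XML Schema. The annotation namespace and concept URI below are hypothetical, not the paper's actual vocabulary:

```python
import xml.etree.ElementTree as ET

XS = "http://www.w3.org/2001/XMLSchema"
# Hypothetical annotation namespace; the paper's actual vocabulary
# for semantic annotations may differ.
SEM = "urn:example:semantic-annotation"

def annotate_element(schema_xml, element_name, concept_uri):
    """Attach an ontology concept reference to one xs:element via
    xs:annotation/xs:appinfo, the schema's standard extension point."""
    ET.register_namespace("xs", XS)
    root = ET.fromstring(schema_xml)
    for el in root.iter(f"{{{XS}}}element"):
        if el.get("name") == element_name:
            ann = ET.SubElement(el, f"{{{XS}}}annotation")
            app = ET.SubElement(ann, f"{{{XS}}}appinfo")
            concept = ET.SubElement(app, f"{{{SEM}}}concept")
            concept.text = concept_uri
    return ET.tostring(root, encoding="unicode")

schema = """<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="tumorGrade" type="xs:string"/>
</xs:schema>"""

# The concept URI is an invented example identifier.
annotated = annotate_element(schema, "tumorGrade",
                             "http://purl.example.org/ncit#C18000")
```

A semantic validator can then check that each annotated element carries a value consistent with the referenced concept, rather than validating syntax alone.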
Architecture for interoperable software in biology.
Bare, James Christopher; Baliga, Nitin S
2014-07-01
Understanding biological complexity demands a combination of high-throughput data and interdisciplinary skills. One way to bring to bear the necessary combination of data types and expertise is by encapsulating domain knowledge in software and composing that software to create a customized data analysis environment. To this end, simple flexible strategies are needed for interconnecting heterogeneous software tools and enabling data exchange between them. Drawing on our own work and that of others, we present several strategies for interoperability and their consequences, in particular, a set of simple data structures--list, matrix, network, table and tuple--that have proven sufficient to achieve a high degree of interoperability. We provide a few guidelines for the development of future software that will function as part of an interoperable community of software tools for biological data analysis and visualization. © The Author 2012. Published by Oxford University Press.
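The core claim, that a handful of plain structures suffices for tool-to-tool exchange, can be illustrated by converting between two of them. The gene names are made up:

```python
# A minimal sketch of the paper's idea that simple shared data
# structures (list, matrix, network, table, tuple) are enough for
# heterogeneous tools to exchange results. Here a "table" of pairwise
# interactions is converted to a "network" and back, with no
# tool-specific types involved.

def table_to_network(table):
    """table: list of {'source':..., 'target':...} rows -> adjacency dict."""
    network = {}
    for row in table:
        network.setdefault(row["source"], set()).add(row["target"])
    return network

def network_to_table(network):
    """Inverse conversion, producing rows sorted for reproducibility."""
    return [{"source": s, "target": t}
            for s in sorted(network) for t in sorted(network[s])]

edges = [{"source": "geneA", "target": "geneB"},
         {"source": "geneA", "target": "geneC"}]
net = table_to_network(edges)
```

Because both representations are plain language-level structures, a visualization tool and an analysis tool need only agree on this shape, not on each other's internals.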
Chao, Tian-Jy; Kim, Younghun
2015-02-03
Automatically translating a building architecture file format (Industry Foundation Class) to a simulation file, in one aspect, may extract data and metadata used by a target simulation tool from a building architecture file. Interoperability data objects may be created and the extracted data is stored in the interoperability data objects. A model translation procedure may be prepared to identify a mapping from a Model View Definition to a translation and transformation function. The extracted data may be transformed using the data stored in the interoperability data objects, an input Model View Definition template, and the translation and transformation function to convert the extracted data to correct geometric values needed for a target simulation file format used by the target simulation tool. The simulation file in the target simulation file format may be generated.
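The extract-map-transform pipeline the abstract outlines can be sketched as follows; the quantity names, the millimetre-to-metre conversion and the mapping table are assumptions for illustration, not the patent's actual Model View Definition:

```python
# Illustrative sketch of the pipeline: extract IFC data, store it in
# interoperability objects (plain dicts here), then apply a mapping
# from a Model View Definition entry to a translation function that
# yields values in the target simulation tool's units.

def mm_to_m(value):
    """IFC lengths are often millimetres; convert to metres."""
    return value / 1000.0

# Hypothetical MVD-to-translation mapping: which function converts
# each extracted quantity into what the simulation tool expects.
MVD_MAPPING = {
    "WallHeight": mm_to_m,
    "WallWidth": mm_to_m,
}

def translate(extracted):
    """Apply the mapped transformation to every extracted quantity;
    unmapped quantities pass through unchanged."""
    return {name: MVD_MAPPING.get(name, lambda v: v)(value)
            for name, value in extracted.items()}

simulation_input = translate({"WallHeight": 2700.0, "WallWidth": 5400.0})
```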
The Long Road to Semantic Interoperability in Support of Public Health: Experiences from Two States
Vreeman, Daniel J.; Grannis, Shaun J.
2014-01-01
Proliferation of health information technologies creates opportunities to improve clinical and public health, including high quality, safer care and lower costs. To maximize such potential benefits, health information technologies must readily and reliably exchange information with other systems. However, evidence from public health surveillance programs in two states suggests that operational clinical information systems often fail to use available standards, a barrier to semantic interoperability. Furthermore, analysis of existing policies incentivizing semantic interoperability suggests they have limited impact and are fragmented. In this essay, we discuss three approaches for increasing semantic interoperability to support national goals for using health information technologies. A clear, comprehensive strategy requiring collaborative efforts by clinical and public health stakeholders is suggested as a guide for the long road towards better population health data and outcomes. PMID:24680985
Geoscience Information Network (USGIN) Solutions for Interoperable Open Data Access Requirements
NASA Astrophysics Data System (ADS)
Allison, M. L.; Richard, S. M.; Patten, K.
2014-12-01
The geosciences are leading the development of free, interoperable open access to data. The US Geoscience Information Network (USGIN) is a freely available data integration framework, jointly developed by the USGS and the Association of American State Geologists (AASG), in compliance with international standards and protocols to provide easy discovery, access, and interoperability for geoscience data. USGIN standards include the geologic exchange language GeoSciML (v3.2), which enables instant interoperability of geologic formation data and is also the base standard used by the 117-nation OneGeology consortium. The USGIN deployment of the National Geothermal Data System (NGDS) serves as a continent-scale operational demonstration of the expanded OneGeology vision to provide access to all geoscience data worldwide. USGIN is developed to accommodate a variety of applications; for example, the International Renewable Energy Agency streams data live to the Global Atlas of Renewable Energy. Alternatively, users without robust data sharing systems can download and implement a free software packet, "GINstack", to easily deploy web services for exposing data online for discovery and access. The White House Open Data Access Initiative requires all federally funded research projects and federal agencies to make their data publicly accessible in an open source, interoperable format, with metadata. USGIN currently incorporates all aspects of the Initiative, as it emphasizes interoperability. The system is successfully deployed as NGDS, officially launched at the White House Energy Datapalooza in May 2014. The USGIN Foundation has been established to ensure this technology continues to be accessible and available.
Big data in global health: improving health in low- and middle-income countries
Vaillancourt, Samuel; Perry, William; Mannava, Priya; Folaranmi, Temitope; Celi, Leo Anthony
2015-01-01
Over the last decade, a massive increase in data collection and analysis has occurred in many fields. In the health sector, however, there has been relatively little progress in data analysis and application despite a rapid rise in data production. Given adequate governance, improvements in the quality, quantity, storage and analysis of health data could lead to substantial improvements in many health outcomes. In low- and middle-income countries in particular, the creation of an information feedback mechanism can move health-care delivery towards results-based practice and improve the effective use of scarce resources. We review the evolving definition of big data and the possible advantages of – and problems in – using such data to improve health-care delivery in low- and middle-income countries. As mobile-phone-based services improve, the resulting growth in big data collection may mean that development phases required elsewhere can be skipped. However, poor infrastructure may prevent interoperability and the safe use of patient data. An appropriate governance framework must be developed and enforced to protect individuals and ensure that health-care delivery is tailored to the characteristics and values of the target communities. PMID:25767300
Bernal-Delgado, Enrique; Estupiñán-Romero, Francisco
2018-01-01
The integration of different administrative data sources from a number of European countries has been shown useful in the assessment of unwarranted variations in health care performance. This essay describes the procedures used to set up a data infrastructure (e.g., data access and exchange, definition of the minimum common wealth of data required, and the development of the relational logic data model) and the methods used to produce trustworthy healthcare performance measurements (e.g., ontologies standardisation and quality assurance analysis). The paper ends by providing some hints on how to use these lessons in an eventual European infrastructure for public health research and monitoring. Although the relational data infrastructure developed has been proven accurate, effective for comparing health system performance across different countries, and efficient enough to deal with hundreds of millions of episodes, the logic data model might not be responsive if the European infrastructure aims at including electronic health records and carrying out multi-cohort multi-intervention comparative effectiveness research. The deployment of a distributed infrastructure based on semantic interoperability, where individual data remain in-country and open-access scripts for data management and analysis travel around the hubs composing the infrastructure, might be a sensible way forward.
Skiing economy and efficiency in recreational and elite cross-country skiers.
Ainegren, Mats; Carlsson, Peter; Tinnsten, Mats; Laaksonen, Marko S
2013-05-01
The purpose of this study was to investigate and compare skiing economy and gross efficiency in cross-country skiers of different performance levels, ages and genders; male recreational skiers and elite senior and junior cross-country skiers of both genders. The skiers performed tests involving roller skiing on a treadmill using the gear 3 and diagonal stride techniques. The elite cross-country skiers were found to have better skiing economy and higher gross efficiency (5-18%) compared with the recreational skiers (p < 0.05) and the senior elite had better economy and higher efficiency (4-5%) than their junior counterparts (p < 0.05), whereas no differences could be found between the genders. Also, large ranges in economy and gross efficiency were found in all groups. It was concluded that, in addition to V̇O2peak, skiing economy and gross efficiency have a great influence on the differences in performance times between recreational and junior and senior elite cross-country skiers, as well as between individual skiers within the different categories. Thus, we recommend cross-country skiers at all performance levels to test not only V̇O2peak, but also skiing economy and efficiency.
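Gross efficiency as commonly defined in this literature is external work rate divided by metabolic rate. A minimal sketch, assuming the typical ~20.9 kJ per litre of oxygen energy equivalent (which in practice varies with the respiratory exchange ratio) and made-up numbers:

```python
# Gross efficiency = external work rate / metabolic rate, expressed in
# percent. The 20.9 kJ/L oxygen energy equivalent is a typical
# approximate value; the example inputs below are invented.

def gross_efficiency(power_w, vo2_l_per_min, kj_per_l_o2=20.9):
    """Return gross efficiency in percent."""
    # kJ/min -> J/s (watts): multiply by 1000, divide by 60.
    metabolic_rate_w = vo2_l_per_min * kj_per_l_o2 * 1000.0 / 60.0
    return power_w / metabolic_rate_w * 100.0

# Example: 200 W external power at 4.0 L/min oxygen uptake.
ge = gross_efficiency(200.0, 4.0)
```

Two skiers with the same V̇O2peak can thus differ substantially in performance if one converts oxygen uptake into external power more efficiently.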
Smuggling and cross border shopping of tobacco in Europe.
Joossens, L; Raw, M
1995-05-27
Governments have recently become concerned about cross border shopping and smuggling because it can decrease tax revenue. The tobacco industry predicted that, with the removal of border controls in the European Union, price differences between neighbouring countries would lead to a diversion of tobacco trade, legally and illegally, to countries with cheaper cigarettes. According to them this diversion would be through increased cross border shopping for personal consumption or through increased smuggling of cheap cigarettes from countries with low tax to countries with high tax, where cigarettes are more expensive. These arguments have been used to urge governments not to increase tax on tobacco products. The evidence suggests, however, that cross border shopping is not yet a problem in Europe and that smuggling is not of cheap cigarettes to expensive countries. Instead, more expensive "international" brands are smuggled into northern Europe and sold illegally on the streets of the cheaper countries of southern Europe.
Moving Beyond the 10,000 Ways That Don't Work
NASA Astrophysics Data System (ADS)
Bermudez, L. E.; Arctur, D. K.; Rueda, C.
2009-12-01
From his research developing light bulb filaments, Thomas Edison provides us with a good lesson for advancing any venture. He said, "I have not failed, I've just found 10,000 ways that won't work." Advancing data and access interoperability is one of those ventures that is difficult to achieve because of the differences among the participating communities. Even within the marine domain, different communities exist, and with them different technologies (formats and protocols) for publishing data and its descriptions, and different vocabularies for naming things (e.g. parameters, sensor types). Simplifying the heterogeneity of technologies is accomplished not only by adopting standards, but also by creating profiles and advancing tools that use those standards. In some cases, standards are advanced by building from existing tools. But what is the best strategy? Edison could give us a hint. Prototypes and test beds are essential to achieving interoperability among geospatial communities. The Open Geospatial Consortium (OGC) calls them interoperability experiments. The World Wide Web Consortium (W3C) calls them incubator projects. Prototypes help test and refine specifications. The Marine Metadata Interoperability (MMI) Initiative, which is advancing marine data integration and re-use by promoting community solutions, understood this strategy and started an interoperability demonstration with the SURA Coastal Ocean Observing and Prediction (SCOOP) program. This interoperability demonstration grew into the OGC Ocean Science Interoperability Experiment (Oceans IE). The Oceans IE brings together the ocean-observing community to advance interoperability of ocean observing systems by using OGC standards. The Oceans IE Phase I investigated the use of the OGC Web Feature Service (WFS) and OGC Sensor Observation Service (SOS) standards for representing and exchanging point data records from fixed in-situ marine platforms.
The Oceans IE Phase I produced an engineering best practices report, advanced reference implementations, and submitted various change requests that are now being considered by the OGC SOS working group. Building on Phase I, and with a focus on semantically-enabled services, Oceans IE Phase II will continue the use and improvement of OGC specifications in the marine community. We will present the lessons learned and in particular the strategy of experimenting with technologies to advance standards to publish data in marine communities, which could also help advance interoperability in other geospatial communities. We will also discuss the growing collaborations among ocean-observing standards organizations that will bring about the institutional acceptance needed for these technologies and practices to gain traction globally.
Insights into Broker - User interactions from the BCube Project
NASA Astrophysics Data System (ADS)
Santoro, M.; Nativi, S.; Pearlman, J.; Khalsa, S. J. S.; Fulweiler, R. W.
2015-12-01
Introducing a broad brokering capability for science interoperability and cross-disciplinary research has many challenges and perspectives. Developing a business model that is sustainable is one aspect. Engaging and supporting the science research community is a second. In working with this community, significant added value must be provided. Various facets of the broker capability, from discovery and access to data transformations and mapping, are elements that were examined and applied to science use cases. In this presentation, we look at these facets and their benefits and challenges for specific use cases in the areas of ocean, coastal and Arctic research. Specific recommendations for future implementations will be discussed.
Space Station Information System - Concepts and international issues
NASA Technical Reports Server (NTRS)
Williams, R. B.; Pruett, David; Hall, Dana L.
1987-01-01
The Space Station Information System (SSIS) is outlined in terms of its functions and probable physical facilities. The SSIS includes flight element systems as well as existing and planned institutional systems such as the NASA Communications System, the Tracking and Data Relay Satellite System, and the data and communications networks of the international partners. The SSIS strives to provide both a 'user friendly' environment and a software environment which will allow for software transportability and interoperability across the SSIS. International considerations are discussed as well as project management, software commonality, data communications standards, data security, documentation commonality, transaction management, data flow cross support, and key technologies.
NASA Astrophysics Data System (ADS)
2018-01-01
The large amount of data generated by modern space missions calls for a change of organization of data distribution and access procedures. Although long-term archives exist for telescopic and space-borne observations, high-level functions need to be developed on top of these repositories to make Planetary Science and Heliophysics data more accessible and to favor interoperability. Results of simulations and reference laboratory data also need to be integrated to support and interpret the observations. Interoperable software and interfaces have recently been developed in many scientific domains. The Virtual Observatory (VO) interoperable standards developed for Astronomy by the International Virtual Observatory Alliance (IVOA) can be adapted to Planetary Sciences, as demonstrated by the VESPA (Virtual European Solar and Planetary Access) team within the Europlanet-H2020-RI project. Other communities have developed their own standards: GIS (Geographic Information System) for Earth and planetary surface tools, SPASE (Space Physics Archive Search and Extract) for space plasma, PDS4 (NASA Planetary Data System, version 4) and IPDA (International Planetary Data Alliance) for planetary mission archives, etc., and an effort to make them all interoperable is starting, including automated workflows to process related data from different sources.
Interoperability of Neuroscience Modeling Software
Cannon, Robert C.; Gewaltig, Marc-Oliver; Gleeson, Padraig; Bhalla, Upinder S.; Cornelis, Hugo; Hines, Michael L.; Howell, Fredrick W.; Muller, Eilif; Stiles, Joel R.; Wils, Stefan; De Schutter, Erik
2009-01-01
Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to the duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible including the development of portable model description standards, the adoption of common simulation languages or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “Neuro-IT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19-20 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. PMID:17873374
A development framework for semantically interoperable health information systems.
Lopez, Diego M; Blobel, Bernd G M E
2009-02-01
Semantic interoperability is a basic challenge to be met for new generations of distributed, communicating and co-operating health information systems (HIS) enabling shared care and e-Health. Analysis, design, implementation and maintenance of such systems and their intrinsic architectures have to follow a unified development methodology. The Generic Component Model (GCM) is used as a framework for modeling any system, to evaluate and harmonize state-of-the-art architecture development approaches and standards for health information systems, and to derive a coherent architecture development framework for sustainable, semantically interoperable HIS and their components. The proposed methodology is based on the Rational Unified Process (RUP), taking advantage of its flexibility to be configured for integrating other architectural approaches such as Service-Oriented Architecture (SOA), Model-Driven Architecture (MDA), ISO 10746, and the HL7 Development Framework (HDF). Existing architectural approaches have been analyzed, compared and finally harmonized towards an architecture development framework for advanced health information systems. Starting with the requirements for semantic interoperability derived from paradigm changes for health information systems, and supported by formal software process engineering methods, an appropriate development framework for semantically interoperable HIS has been provided. The usability of the framework has been exemplified in a public health scenario.
A SOA-Based Platform to Support Clinical Data Sharing.
Gazzarata, R; Giannini, B; Giacomini, M
2017-01-01
The eSource Data Interchange Group, part of the Clinical Data Interchange Standards Consortium, proposed five scenarios to guide stakeholders in the development of solutions for the capture of eSource data. The fifth scenario was subdivided into four tiers to adapt the functionality of electronic health records to support clinical research. In order to develop a system belonging to the "Interoperable" Tier, the authors decided to adopt the service-oriented architecture paradigm to support technical interoperability, Health Level Seven Version 3 messages combined with the LOINC (Logical Observation Identifiers Names and Codes) vocabulary to ensure semantic interoperability, and Healthcare Services Specification Project standards to provide process interoperability. The developed architecture enhances the integration between patient-care practice and medical research, allowing clinical data sharing between two hospital information systems and four clinical data management systems/clinical registries. The core is formed by a set of standardized cloud services connected through standardized interfaces, involving client applications. The system was approved by the medical staff, since it reduces the workload for the management of clinical trials. Although this architecture can realize the "Interoperable" Tier, the current solution actually covers the "Connected" Tier, due to local hospital policy restrictions.
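The semantic-interoperability layer described here rests on binding each observation to a standard vocabulary code rather than a local label. A minimal sketch (the record structure is a simplified illustration, not an actual HL7 Version 3 message):

```python
# Sketch of vocabulary binding for semantic interoperability: every
# shared observation carries a LOINC code, so receiving systems can
# interpret it without knowing the sender's local terminology. The
# record layout is an invented simplification.

def make_observation(loinc_code, display, value, unit):
    """Build a vocabulary-bound observation record."""
    return {
        "code": {"system": "http://loinc.org",
                 "code": loinc_code,
                 "display": display},
        "value": value,
        "unit": unit,
    }

# 2339-0 is the LOINC code for blood glucose (mass/volume); the
# measured value is a made-up example.
obs = make_observation("2339-0", "Glucose [Mass/volume] in Blood",
                       95, "mg/dL")
```

A clinical registry receiving this record can match on the (system, code) pair instead of parsing free-text labels, which is what makes cross-system aggregation feasible.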
Expansion of Higher Education and Inequality of Opportunities: A Cross-National Analysis
ERIC Educational Resources Information Center
Liu, Ye; Green, Andy; Pensiero, Nicola
2016-01-01
This study extends the comparative model of country groups to analyse the cross-national trends in the higher education expansion and opportunities. We use descriptive data on characteristics and outcomes of higher education systems in different countries groups, including the liberal market countries, the social democratic countries, the…
Interoperable Archetypes With a Three Folded Terminology Governance.
Pederson, Rune; Ellingsen, Gunnar
2015-01-01
The use of openEHR archetypes increases the interoperability of clinical terminology, and in doing so improves the availability of clinical terminology for both primary and secondary purposes. Where clinical terminology is employed in the EPR system, research reports conflicting results for the use of structuring and standardization as measurements of success. In order to elucidate this concept, this paper focuses on the effort to establish a national repository for openEHR-based archetypes in Norway, where clinical terminology could be included with a three-folded benefit for interoperability.
CCP interoperability and system stability
NASA Astrophysics Data System (ADS)
Feng, Xiaobing; Hu, Haibo
2016-09-01
To control counterparty risk, financial regulations such as the Dodd-Frank Act are increasingly requiring standardized derivatives trades to be cleared by central counterparties (CCPs). It is anticipated that in the near future, CCPs across the world will be linked through interoperability agreements that facilitate risk sharing but also serve as a conduit for transmitting shocks. This paper theoretically studies a networked network of CCPs that are linked through interoperability arrangements. The major finding is that different configurations of the networked CCP network give rise to different properties of the cascading failures.
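The cascading-failure mechanism can be illustrated with a toy model: a failed CCP passes a fraction of its losses to every CCP it shares an interoperability link with. The topology, capital levels and loss fraction below are illustrative, not the paper's calibration:

```python
# Toy cascade model on a network of interoperable CCPs. A CCP fails
# when its capital goes negative; each failure transmits a fixed
# fraction of the shortfall to its linked neighbours. All parameters
# are invented for illustration.

def cascade(links, capital, initial_shock, loss_fraction=0.5):
    """Return the set of failed CCPs after the cascade settles."""
    capital = dict(capital)
    node, loss = initial_shock
    capital[node] -= loss
    failed, frontier = set(), [node] if capital[node] < 0 else []
    while frontier:
        ccp = frontier.pop()
        if ccp in failed:
            continue
        failed.add(ccp)
        for neighbour in links.get(ccp, ()):
            if neighbour in failed:
                continue
            # Transmit part of the failed CCP's shortfall.
            capital[neighbour] -= loss_fraction * abs(capital[ccp])
            if capital[neighbour] < 0:
                frontier.append(neighbour)
    return failed

# Three CCPs in a chain A - B - C; a 2.0 shock hits A.
links = {"A": ["B"], "B": ["A", "C"], "C": ["B"]}
failed = cascade(links, {"A": 1.0, "B": 0.2, "C": 5.0}, ("A", 2.0))
```

Here thinly capitalized B fails via its link to A, while well-capitalized C absorbs the transmitted loss, showing how topology and capital jointly shape the cascade.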
The Importance of State and Context in Safe Interoperable Medical Systems
Jaffe, Michael B.; Robkin, Michael; Rausch, Tracy; Arney, David; Goldman, Julian M.
2016-01-01
This paper describes why “device state” and “patient context” information are necessary components of device models for safe interoperability. This paper includes a discussion of the importance of describing the roles of devices with respect to interactions (including human user workflows involving devices, and device to device communication) within a system, particularly those intended for use at the point-of-care, and how this role information is communicated. In addition, it describes the importance of clinical scenarios in creating device models for interoperable devices. PMID:27730013
Integrating technology to improve medication administration.
Prusch, Amanda E; Suess, Tina M; Paoletti, Richard D; Olin, Stephen T; Watts, Starann D
2011-05-01
The development, implementation, and evaluation of an i.v. interoperability program to advance medication safety at the bedside are described. I.V. interoperability integrates intelligent infusion devices (IIDs), the bar-code-assisted medication administration system, and the electronic medication administration record system into a bar-code-driven workflow that populates provider-ordered, pharmacist-validated infusion parameters on IIDs. The purpose of this project was to improve medication safety through the integration of these technologies and decrease the potential for error during i.v. medication administration. Four key phases were essential to developing and implementing i.v. interoperability: (a) preparation, (b) i.v. interoperability pilot, (c) preliminary validation, and (d) expansion. The establishment of pharmacy involvement in i.v. interoperability resulted in two additional safety checks: pharmacist infusion rate oversight and nurse independent validation of the autoprogrammed rate. After instituting i.v. interoperability, monthly compliance to the telemetry drug library increased to a mean ± S.D. of 72.1% ± 2.1% from 56.5% ± 1.5%, and the medical-surgical nursing unit's drug library monthly compliance rate increased to 58.6% ± 2.9% from 34.1% ± 2.6% (p < 0.001 for both comparisons). The number of manual pump edits decreased with both telemetry and medical-surgical drug libraries, demonstrating a reduction from 56.9 ± 12.8 to 14.2 ± 3.9 and from 61.2 ± 15.4 to 14.7 ± 3.8, respectively (p < 0.001 for both comparisons). Through the integration and incorporation of pharmacist oversight for rate changes, the telemetry and medical-surgical patient care areas demonstrated a 32% reduction in reported monthly errors involving i.v. administration of heparin. By integrating two stand-alone technologies, i.v. interoperability was implemented to improve medication administration. 
Medication errors were reduced, nursing workflow was simplified, and pharmacists became involved in checking infusion rates of i.v. medications.
NASA Astrophysics Data System (ADS)
Glaves, H. M.
2015-12-01
In recent years marine research has become increasingly multidisciplinary in its approach, with a corresponding rise in the demand for large quantities of high quality interoperable data as a result. This requirement for easily discoverable and readily available marine data is currently being addressed by a number of regional initiatives, with projects such as SeaDataNet in Europe, Rolling Deck to Repository (R2R) in the USA and the Integrated Marine Observing System (IMOS) in Australia having implemented local infrastructures to facilitate the exchange of standardised marine datasets. However, each of these systems has been developed to address local requirements and created in isolation from those in other regions. Multidisciplinary marine research on a global scale necessitates a common framework for marine data management which is based on existing data systems. The Ocean Data Interoperability Platform project is seeking to address this requirement by bringing together selected regional marine e-infrastructures for the purposes of developing interoperability across them. By identifying the areas of commonality and incompatibility between these data infrastructures, and leveraging the development activities and expertise of these individual systems, three prototype interoperability solutions are being created which demonstrate the effective sharing of marine data and associated metadata across the participating regional data infrastructures, as well as with other target international systems such as GEO and COPERNICUS. These interoperability solutions, combined with agreed best practice and approved standards, form the basis of a common global approach to marine data management which can be adopted by the wider marine research community.
To encourage implementation of these interoperability solutions by other regional marine data infrastructures an impact assessment is being conducted to determine both the technical and financial implications of deploying them alongside existing services. The associated best practice and common standards are also being disseminated to the user community through relevant accreditation processes and related initiatives such as the Research Data Alliance and the Belmont Forum.
Parel, I; Cutti, A G; Fiumana, G; Porcellini, G; Verni, G; Accardo, A P
2012-04-01
To measure the scapulohumeral rhythm (SHR) in outpatient settings, the motion analysis protocol named ISEO (INAIL Shoulder and Elbow Outpatient protocol) was developed, based on inertial and magnetic sensors. To complete the sensor-to-segment calibration, ISEO requires the involvement of an operator for sensor placement and for positioning the patient's arm in a predefined posture. Since this can affect the measure, this study aimed at quantifying ISEO intra- and inter-operator agreement. Forty subjects were considered, together with two operators, A and B. Three measurement sessions were completed for each subject: two by A and one by B. In each session, the humerus and scapula rotations were measured during sagittal and scapular plane elevation movements. ISEO intra- and inter-operator agreement was assessed by computing, between sessions: (1) the similarity of the scapulohumeral patterns through the Coefficient of Multiple Correlation (CMC(2)), both considering and excluding the difference in the initial value of the scapula rotations between two sessions (inter-session offset); (2) the 95% Smallest Detectable Difference (SDD(95)) in scapula range of motion. Results for CMC(2) showed that the intra- and inter-operator agreement is acceptable (median ≥ 0.85, lower whisker ≥ 0.75) for most of the scapula rotations, independently of the movement and the inter-session offset. The only exception is the agreement for scapula protraction-retraction and for scapula medio-lateral rotation during abduction (inter-operator), which is acceptable only if the inter-session offset is removed. SDD(95) values ranged from 4.4° to 8.6° for the inter-operator and between 4.9° and 8.5° for the intra-operator agreement. In conclusion, ISEO presents high intra- and inter-operator agreement, particularly with the scapula inter-session offset removed. Copyright © 2011 Elsevier B.V. All rights reserved.
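One common estimator of the 95% Smallest Detectable Difference in agreement studies is 1.96 times the standard deviation of the test-retest differences; whether ISEO used exactly this estimator is an assumption, and the range-of-motion values below are made up:

```python
import math

# SDD95 sketch: the smallest between-session change that exceeds
# measurement noise at the 95% level, estimated here as
# 1.96 * SD of the paired test-retest differences. This is one common
# estimator; the study may have used a variant.

def sdd95(session1, session2):
    """Return the 95% smallest detectable difference for paired sessions."""
    diffs = [a - b for a, b in zip(session1, session2)]
    mean = sum(diffs) / len(diffs)
    # Sample standard deviation of the differences (n - 1 denominator).
    sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (len(diffs) - 1))
    return 1.96 * sd

# Invented scapula range-of-motion values (degrees) for two sessions.
rom_a = [30.0, 34.0, 28.0, 31.0]
rom_b = [32.0, 33.0, 27.0, 34.0]
```

Only a between-session change larger than this value (a few degrees, as in the reported 4.4°-8.6° range) can be attributed to the patient rather than to the measurement procedure.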
Positive train control interoperability and networking research : final report.
DOT National Transportation Integrated Search
2015-12-01
This document describes the initial development of an ITC PTC Shared Network (IPSN), a hosted : environment to support the distribution, configuration management, and IT governance of Interoperable : Train Control (ITC) Positive Train Control (PTC) s...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-01
... FEDERAL COMMUNICATIONS COMMISSION Federal Advisory Committee Act; Communications Security, Reliability, and Interoperability Council AGENCY: Federal Communications Commission. ACTION: Notice of public... persons that the Federal Communications Commission's (FCC or Commission) Communications Security...
CCSDS SM and C Mission Operations Interoperability Prototype
NASA Technical Reports Server (NTRS)
Lucord, Steven A.
2010-01-01
This slide presentation reviews the prototype of the Spacecraft Monitor and Control (SM&C) Operations for interoperability among other space agencies. This particular prototype uses the German Space Agency (DLR) to test the ideas for interagency coordination.
RESTFul based heterogeneous Geoprocessing workflow interoperation for Sensor Web Service
NASA Astrophysics Data System (ADS)
Yang, Chao; Chen, Nengcheng; Di, Liping
2012-10-01
Advanced sensors on board satellites offer detailed Earth observations. A workflow is one approach for designing, implementing and constructing a flexible and live link between these sensors' resources and users. It can coordinate, organize and aggregate the distributed sensor Web services to meet the requirement of a complex Earth observation scenario. A RESTFul based workflow interoperation method is proposed to integrate heterogeneous workflows into an interoperable unit. The Atom protocols are applied to describe and manage workflow resources. The XML Process Definition Language (XPDL) and Business Process Execution Language (BPEL) workflow standards are applied to structure a workflow that accesses sensor information and one that processes it separately. Then, a scenario for nitrogen dioxide (NO2) from a volcanic eruption is used to investigate the feasibility of the proposed method. The RESTFul based workflows interoperation system can describe, publish, discover, access and coordinate heterogeneous Geoprocessing workflows.
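As an illustration of how the Atom format can describe and manage a workflow resource as stated above, the sketch below builds a minimal Atom entry for a hypothetical NO2 workflow; the identifier and URL are invented, and the real system's Atom documents are certainly richer:

```python
import xml.etree.ElementTree as ET

ATOM = "http://www.w3.org/2005/Atom"

def workflow_entry(wf_id, title, href):
    """Serialize a minimal Atom entry describing one workflow resource."""
    ET.register_namespace("", ATOM)  # emit Atom as the default namespace
    entry = ET.Element(f"{{{ATOM}}}entry")
    ET.SubElement(entry, f"{{{ATOM}}}id").text = wf_id
    ET.SubElement(entry, f"{{{ATOM}}}title").text = title
    link = ET.SubElement(entry, f"{{{ATOM}}}link")
    link.set("rel", "alternate")
    link.set("href", href)
    return ET.tostring(entry, encoding="unicode")

# Hypothetical workflow resource for the NO2 eruption scenario
xml = workflow_entry("urn:wf:no2-eruption",
                     "NO2 volcanic eruption workflow",
                     "http://example.org/workflows/no2")
print(xml)
```

A RESTful service would expose such entries under standard HTTP verbs (GET to discover, PUT/POST to publish), which is what makes heterogeneous XPDL and BPEL workflows addressable as uniform resources.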
An Interoperability Framework and Capability Profiling for Manufacturing Software
NASA Astrophysics Data System (ADS)
Matsuda, M.; Arai, E.; Nakano, N.; Wakai, H.; Takeda, H.; Takata, M.; Sasaki, H.
ISO/TC184/SC5/WG4 is working on ISO16100: Manufacturing software capability profiling for interoperability. This paper reports on a manufacturing software interoperability framework and a capability profiling methodology which were proposed and developed through this international standardization activity. Within the context of manufacturing application, a manufacturing software unit is considered to be capable of performing a specific set of functions defined by a manufacturing software system architecture. A manufacturing software interoperability framework consists of a set of elements and rules for describing the capability of software units to support the requirements of a manufacturing application. The capability profiling methodology makes use of the domain-specific attributes and methods associated with each specific software unit to describe capability profiles in terms of unit name, manufacturing functions, and other needed class properties. In this methodology, manufacturing software requirements are expressed in terms of software unit capability profiles.
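A capability profile "in terms of unit name, manufacturing functions, and other needed class properties" can be sketched as a simple data structure; the class name, fields, and matching rule below are illustrative stand-ins, not the normative ISO 16100 schema:

```python
from dataclasses import dataclass, field

@dataclass
class CapabilityProfile:
    """Illustrative capability profile for one manufacturing software unit."""
    unit_name: str
    manufacturing_functions: list
    properties: dict = field(default_factory=dict)

    def supports(self, required_functions):
        """True if this unit covers every required manufacturing function."""
        return set(required_functions) <= set(self.manufacturing_functions)

# Hypothetical software unit and a requirements check against its profile
scheduler = CapabilityProfile(
    unit_name="CellScheduler",
    manufacturing_functions=["scheduling", "dispatching"],
    properties={"vendor": "ExampleCo"},
)
print(scheduler.supports(["scheduling"]))  # True
```

Matching application requirements against such profiles, rather than against vendor-specific interfaces, is what the framework uses to assess interoperability between software units.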
Metadata behind the Interoperability of Wireless Sensor Networks
Ballari, Daniela; Wachowicz, Monica; Callejo, Miguel Angel Manso
2009-01-01
Wireless Sensor Networks (WSNs) produce changes of status that are frequent, dynamic and unpredictable, and cannot be represented using a linear cause-effect approach. Consequently, a new approach is needed to handle these changes in order to support dynamic interoperability. Our approach is to introduce the notion of context as an explicit representation of changes of a WSN status inferred from metadata elements, which in turn, leads towards a decision-making process about how to maintain dynamic interoperability. This paper describes the developed context model to represent and reason over different WSN status based on four types of contexts, which have been identified as sensing, node, network and organisational contexts. The reasoning has been addressed by developing contextualising and bridges rules. As a result, we were able to demonstrate how contextualising rules have been used to reason on changes of WSN status as a first step towards maintaining dynamic interoperability. PMID:22412330
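Contextualising rules of the kind described can be sketched as predicates over a metadata snapshot. The four context types are from the paper, but the metadata fields and thresholds below are invented for illustration:

```python
# Hypothetical metadata fields and thresholds; the paper's actual rules differ.
CONTEXT_RULES = {
    "sensing": lambda m: m.get("reading") is None
        or not (m["min"] <= m["reading"] <= m["max"]),   # reading out of range
    "node": lambda m: m.get("battery", 1.0) < 0.2,        # low battery
    "network": lambda m: m.get("packet_loss", 0.0) > 0.3, # degraded link
    "organisational": lambda m: m.get("policy_version")
        != m.get("required_policy"),                      # stale policy
}

def contextualise(metadata):
    """Return the contexts whose rule fires for a node's metadata snapshot."""
    return [name for name, rule in CONTEXT_RULES.items() if rule(metadata)]

snapshot = {"reading": 75.0, "min": -10.0, "max": 60.0,
            "battery": 0.15, "packet_loss": 0.05,
            "policy_version": "v2", "required_policy": "v2"}
print(contextualise(snapshot))  # out-of-range reading and low battery fire
```

Inferred contexts would then feed the decision-making step, e.g. a bridge rule that reroutes traffic when the "network" context is active.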
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chao, Tian-Jy; Kim, Younghun
Automatically translating a building architecture file format (Industry Foundation Class) to a simulation file, in one aspect, may extract data and metadata used by a target simulation tool from a building architecture file. Interoperability data objects may be created and the extracted data is stored in the interoperability data objects. A model translation procedure may be prepared to identify a mapping from a Model View Definition to a translation and transformation function. The extracted data may be transformed using the data stored in the interoperability data objects, an input Model View Definition template, and the translation and transformation function to convert the extracted data to correct geometric values needed for a target simulation file format used by the target simulation tool. The simulation file in the target simulation file format may be generated.
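The extract/store/transform pipeline described above can be sketched as follows; the Model View Definition name, the object kind, and the translation function are hypothetical stand-ins for the mapping the abstract describes:

```python
from dataclasses import dataclass, field

@dataclass
class InteropObject:
    """Interoperability data object holding extracted data and metadata."""
    kind: str
    data: dict
    metadata: dict = field(default_factory=dict)

def wall_to_surface(obj):
    # Convert wall extents (m) into the surface area a simulator might expect.
    return {"surface_area_m2": obj.data["length"] * obj.data["height"]}

# Hypothetical mapping: Model View Definition name -> per-kind translation
# and transformation functions for the target simulation format.
MVD_TRANSLATIONS = {"CoordinationView": {"IfcWall": wall_to_surface}}

def translate(objects, mvd="CoordinationView"):
    """Apply the MVD-selected transformation to each extracted object."""
    table = MVD_TRANSLATIONS[mvd]
    return [table[o.kind](o) for o in objects if o.kind in table]

walls = [InteropObject("IfcWall", {"length": 4.0, "height": 2.5})]
print(translate(walls))  # [{'surface_area_m2': 10.0}]
```

Keeping the MVD-to-function mapping as data (rather than hardcoding it) is what lets the same extraction step feed different target simulation formats.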
Clinical data interoperability based on archetype transformation.
Costa, Catalina Martínez; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás
2011-10-01
The semantic interoperability between health information systems is a major challenge to improve the quality of clinical practice and patient safety. In recent years many projects have faced this problem and provided solutions based on specific standards and technologies in order to satisfy the needs of a particular scenario. Most of such solutions cannot be easily adapted to new scenarios, thus more global solutions are needed. In this work, we have focused on the semantic interoperability of electronic healthcare records standards based on the dual model architecture and we have developed a solution that has been applied to ISO 13606 and openEHR. The technological infrastructure combines reference models, archetypes and ontologies, with the support of Model-driven Engineering techniques. For this purpose, the interoperability infrastructure developed in previous work by our group has been reused and extended to cover the requirements of data transformation. Copyright © 2011 Elsevier Inc. All rights reserved.
Dynamic Business Networks: A Headache for Sustainable Systems Interoperability
NASA Astrophysics Data System (ADS)
Agostinho, Carlos; Jardim-Goncalves, Ricardo
Collaborative networked environments emerged with the spread of the internet, contributing to overcome past communication barriers, and identifying interoperability as an essential property. When achieved seamlessly, efficiency is increased in the entire product life cycle. Nowadays, most organizations try to attain interoperability by establishing peer-to-peer mappings with the different partners, or in optimized networks, by using international standard models as the core for information exchange. In current industrial practice, mappings are only defined once, and the morphisms that represent them are hardcoded in the enterprise systems. This solution has been effective for static environments, where enterprise and product models are valid for decades. However, with an increasingly complex and dynamic global market, models change frequently to meet new customer requirements. This paper draws concepts from the complex systems science and proposes a framework for sustainable systems interoperability in dynamic networks, enabling different organizations to evolve at their own rate.
NASA Technical Reports Server (NTRS)
Fischer, Daniel; Aguilar-Sanchez, Ignacio; Saba, Bruno; Moury, Gilles; Biggerstaff, Craig; Bailey, Brandon; Weiss, Howard; Pilgram, Martin; Richter, Dorothea
2015-01-01
The protection of data transmitted over the space-link is an issue of growing importance also for civilian space missions. Through the Consultative Committee for Space Data Systems (CCSDS), space agencies have reacted to this need by specifying the Space Data-Link Layer Security (SDLS) protocol which provides confidentiality and integrity services for the CCSDS Telemetry (TM), Telecommand (TC) and Advanced Orbiting Services (AOS) space data-link protocols. This paper describes the approach of the CCSDS SDLS working group to specify and execute the necessary interoperability tests. It first details the individual SDLS implementations that have been produced by ESA, NASA, and CNES and then the overall architecture that allows the interoperability tests between them. The paper reports on the results of the interoperability tests and identifies relevant aspects for the evolution of the test environment.
An Architecture for Semantically Interoperable Electronic Health Records.
Toffanello, André; Gonçalves, Ricardo; Kitajima, Adriana; Puttini, Ricardo; Aguiar, Atualpa
2017-01-01
Despite the increasing adoption of electronic health records, the challenge of semantic interoperability remains unsolved. The fact that different parties can exchange messages does not mean they can understand the underlying clinical meaning; semantic interoperability therefore cannot be assumed and must be treated as an explicit requirement. This work introduces an architecture designed to achieve semantic interoperability, in a way in which organizations that follow different policies may still share medical information through a common infrastructure comparable to an ecosystem, whose organisms are exemplified within the Brazilian scenario. Nonetheless, the proposed approach describes a service-oriented design with modules adaptable to different contexts. We also discuss the establishment of an enterprise service bus to mediate a health infrastructure defined on top of international standards, such as openEHR and IHE. Moreover, we argue that, in order to achieve truly semantic interoperability in a wide sense, a proper profile must be published and maintained.
Sharing and interoperation of Digital Dongying geospatial data
NASA Astrophysics Data System (ADS)
Zhao, Jun; Liu, Gaohuan; Han, Lit-tao; Zhang, Rui-ju; Wang, Zhi-an
2006-10-01
The Digital Dongying project was put forward by Dongying city, Shandong province, and authenticated by the Ministry of Information Industry, the Ministry of Science and Technology and the Ministry of Construction P.R.CHINA in 2002. After five years of construction, the informatization level of Dongying has reached an advanced degree. To advance the building of Digital Dongying and to realize geospatial data sharing, geographic information sharing standards were drawn up and put into practice. Secondly, the Digital Dongying Geographic Information Sharing Platform has been constructed and developed, which is a highly integrated platform of WebGIS, 3S (GIS, GPS, RS), object-oriented RDBMS, Internet, DCOM, etc. It provides an indispensable platform for the sharing and interoperation of Digital Dongying geospatial data. According to the standards, and based on the platform, sharing and interoperation of "Digital Dongying" geospatial data have come into practice and good results have been obtained. However, strong leadership remains necessary for data sharing and interoperation.
Operational Interoperability Challenges on the Example of GEOSS and WIS
NASA Astrophysics Data System (ADS)
Heene, M.; Buesselberg, T.; Schroeder, D.; Brotzer, A.; Nativi, S.
2015-12-01
The following poster highlights the operational interoperability challenges on the example of the Global Earth Observation System of Systems (GEOSS) and the World Meteorological Organization Information System (WIS). At the heart of both systems is a catalogue of earth observation data, products and services, but with different metadata management concepts. While WIS applies strong governance, with its own metadata profile for its hundreds of thousands of metadata records, GEOSS adopted a more open approach for its ten million records. Furthermore, the development of WIS - as an operational system - follows a roadmap with committed downward compatibility, while the GEOSS development process is more agile. The poster discusses how interoperability can be achieved across these different metadata management concepts and how a proxy concept helps to couple two systems that follow different development methodologies. Furthermore, the poster highlights the importance of monitoring and backup concepts as a verification method for operational interoperability.
Academic Research Library as Broker in Addressing Interoperability Challenges for the Geosciences
NASA Astrophysics Data System (ADS)
Smith, P., II
2015-12-01
Data capture is an important process in the research lifecycle. Complete descriptive and representative information of the data or database is necessary during data collection, whether in the field or in the research lab. The National Science Foundation's (NSF) Public Access Plan (2015) mandates the need for federally funded projects to make their research data more openly available. Developing, implementing, and integrating metadata workflows into the research process of the data lifecycle facilitates improved data access while also addressing interoperability challenges for the geosciences such as data description and representation. Lack of metadata or data curation can contribute to (1) semantic, (2) ontology, and (3) data integration issues within and across disciplinary domains and projects. Some researchers of EarthCube funded projects have identified these issues as gaps. These gaps can contribute to interoperability issues in data access, discovery, and integration between domain-specific and general data repositories. Academic Research Libraries have expertise in providing long-term discovery and access through the use of metadata standards and provision of access to research data, datasets, and publications via institutional repositories. Metadata crosswalks, open archival information systems (OAIS), trusted repositories, the Data Seal of Approval, persistent URLs, and the linking of data, objects, resources, and publications in institutional repositories and digital content management systems are common components in the library discipline. These components contribute to a library perspective on data access and discovery that can benefit the geosciences. The USGS Community for Data Integration (CDI) has developed the Science Support Framework (SSF) for data management and integration within its community of practice for contribution to improved understanding of the Earth's physical and biological systems.
The USGS CDI SSF can be used as a reference model to map to EarthCube Funded projects with academic research libraries facilitating the data and information assets components of the USGS CDI SSF via institutional repositories and/or digital content management. This session will explore the USGS CDI SSF for cross-discipline collaboration considerations from a library perspective.
NASA Astrophysics Data System (ADS)
Pulsifer, P. L.; Parsons, M. A.; Duerr, R. E.; Fox, P. A.; Khalsa, S. S.; McCusker, J. P.; McGuinness, D. L.
2012-12-01
To address interoperability, we first need to understand how human perspectives and worldviews influence the way people conceive of and describe geophysical phenomena. There is never a single, unambiguous description of a phenomenon - the terminology used is based on the relationship people have with it and what their interests are. So how can these perspectives be reconciled in a way that is not only clear to different people but also formally described so that information systems can interoperate? In this paper we explore conceptions of Arctic sea ice as a means of exploring these issues. We examine multiple conceptions of sea ice and related processes as fundamental components of the Earth system. Arctic sea ice is undergoing rapid and dramatic decline. This will have huge impact on climate and biological systems as well as on shipping, exploration, human culture, and geopolitics. Local hunters, operational shipping forecasters, global climate researchers, and others have critical needs for sea ice data and information, but they conceive of, and describe sea ice phenomena in very different ways. Our hypothesis is that formally representing these diverse conceptions in a suite of formal ontologies can help facilitate sharing of information across communities and enhance overall Arctic data interoperability. We present initial work to model operational, research, and Indigenous (Iñupiat and Yup'ik) concepts of sea ice phenomena and data. Our results illustrate important and surprising differences in how these communities describe and represent sea ice, and we describe our approach to resolving incongruities and inconsistencies. We begin by exploring an intriguing information artifact, the World Meteorological Organization "egg code". The egg code is a compact, information rich way of illustrating detailed ice conditions that has been used broadly for a century. 
There is much agreement on construction and content encoding, but there are important regional differences in its application. Furthermore, it is an analog encoding scheme whose meaning has evolved over time. By semantically modeling the egg code, its subtle variations, and how it connects to other data, we illustrate a mechanism for translating across data formats and representations. But there are limits to what semantically modeling the egg-code can achieve. The egg-code and common operational sea ice formats do not address community needs, notably the timing and processes of sea ice freeze-up and break-up which have profound impact on local hunting, shipping, oil exploration, and safety. We work with local experts from four very different Indigenous communities and scientific creators of sea ice forecasts to establish an understanding of concepts and terminology related to fall freeze-up and spring break up from the individually represented regions. This helps expand our conceptions of sea ice while also aiding in understanding across cultures and communities, and in passing knowledge to younger generations. This is an early step to expanding concepts of interoperability to very different ways of knowing to make data truly relevant and locally useful.
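To illustrate what semantically modeling the egg code might involve, the sketch below decodes a drastically simplified egg-code-like record into explicit values. The field names and the stage-of-development lookup are illustrative only and omit most of the real WMO tables:

```python
# Illustrative (incomplete) lookup: stage-of-development code -> nominal
# ice thickness range in cm. The real WMO tables are far more detailed.
STAGE_THICKNESS_CM = {
    "1": (0, 10),     # new ice
    "4": (30, 70),    # first-year ice (illustrative range)
    "7": (70, 120),   # thick first-year ice
}

def decode_egg(egg):
    """Translate an egg-code-like record into explicit key/value pairs."""
    total = int(egg["C"]) * 10  # total concentration: tenths -> percent
    stages = [STAGE_THICKNESS_CM.get(s) for s in egg["S"]]
    return {"total_concentration_pct": total,
            "thickness_ranges_cm": stages}

# Hypothetical report: 9/10 total concentration, two ice stages present
report = {"C": "9", "S": ["7", "1"]}
print(decode_egg(report))
```

Once the analog code is expressed as explicit, typed values like this, regional variations can be captured as alternative lookup tables and mapped onto a shared ontology rather than reinterpreted by each community.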
Chapter 18: Web-based Tools - NED VO Services
NASA Astrophysics Data System (ADS)
Mazzarella, J. M.; NED Team
The NASA/IPAC Extragalactic Database (NED) is a thematic, web-based research facility in widespread use by scientists, educators, space missions, and observatory operations for observation planning, data analysis, discovery, and publication of research about objects beyond our Milky Way galaxy. NED is a portal into a systematic fusion of data from hundreds of sky surveys and tens of thousands of research publications. The contents and services span the entire electromagnetic spectrum from gamma rays through radio frequencies, and are continuously updated to reflect the current literature and releases of large-scale sky survey catalogs. NED has been on the Internet since 1990, growing in content, automation and services with the evolution of information technology. NED is the world's largest database of cross-identified extragalactic objects. As of December 2006, the system contains approximately 10 million objects and 15 million multi-wavelength cross-IDs. Over 4 thousand catalogs and published lists covering the entire electromagnetic spectrum have had their objects cross-identified or associated, with fundamental data parameters federated for convenient queries and retrieval. This chapter describes the interoperability of NED services with other components of the Virtual Observatory (VO). Section 1 is a brief overview of the primary NED web services. Section 2 provides a tutorial for using NED services currently available through the NVO Registry. The "name resolver" provides VO portals and related internet services with celestial coordinates for objects specified by catalog identifier (name); any alias can be queried because this service is based on the source cross-IDs established by NED. All major services have been updated to provide output in VOTable (XML) format that can be accessed directly from the NED web interface or using the NVO registry.
These include access to images via SIAP, Cone Search queries, and services providing fundamental, multi-wavelength extragalactic data such as positions, redshifts, photometry and spectral energy distributions (SEDs), and sizes (all with references and uncertainties when available). Section 3 summarizes the advantages of accessing the NED "name resolver" and other NED services via the web to replace the legacy "server mode" custom data structure previously available through a function library provided only in the C programming language. Section 4 illustrates visualization via VOPlot of an SED and the spatial distribution of sources from a NED All-Sky (By Parameters) query. Section 5 describes the new NED Spectral Archive, illustrating how VOTables are being used to standardize the data and metadata as well as the physical units of spectra made available by authors of journal articles and producers of major survey archives; quick-look spectral analysis through convenient interoperability with the SpecView (STScI) Java applet is also shown. Section 6 closes with a summary of the capabilities described herein, which greatly simplify interoperability of NED with other components of the VO, enabling new opportunities for discovery, visualization, and analysis of multiwavelength data.
The International Planetary Data Alliance (IPDA)
NASA Astrophysics Data System (ADS)
Stein, Thomas; Gopala Krishna, Barla; Crichton, Daniel J.
2016-07-01
The International Planetary Data Alliance (IPDA) is a close association of partners with the aim of improving the quality of planetary science data and services to the end users of space based instrumentation. The specific mission of the IPDA is to facilitate global access to, and exchange of, high quality scientific data products managed across international boundaries. Ensuring proper capture, accessibility and availability of the data is the task of the individual member space agencies. The IPDA is focused on developing an international standard that allows discovery, query, access, and usage of such data across international planetary data archive systems. While trends in other areas of space science are concentrating on the sharing of science data from diverse standards and collection methods, the IPDA concentrates on promoting governing data standards that drive common methods for collecting and describing planetary science data across the international community. This approach better supports the long term goal of easing data sharing across system and agency boundaries. An initial starting point for developing such a standard will be internationalization of NASA's Planetary Data System's (PDS) PDS4 standard. The IPDA was formed in 2006 with the purpose of adopting standards and developing collaborations across agencies to ensure data is captured in common formats. It has grown to a dozen member agencies represented by a number of different groups through the IPDA Steering Committee. Member agencies include: Armenian Astronomical Society, China National Space Agency (CNSA), European Space Agency (ESA), German Aerospace Center (DLR), Indian Space Research Organization (ISRO), Italian Space Agency (ASI), Japanese Aerospace Exploration Agency (JAXA), National Aeronautics and Space Administration (NASA), National Centre for Space Studies (CNES), Space Research Institute (IKI), UAE Space Agency, and UK Space Agency.
The IPDA Steering Committee oversees the execution of projects and coordinates international collaboration. In executing its mission, the IPDA conducts a number of focused projects to enable interoperability, construction of compatible archives, and the operation of the IPDA as a whole. These projects have helped to establish the IPDA and to move the collaboration forward. A key project that is currently underway is the implementation of the PDS4 data standard. Given the international focus, it has been critical that the PDS and the IPDA collaborate on its development. Also, many other projects have been conducted successfully, including the IPDA Requirements Document, Data Dictionary Modelling, ESA Registry Integration, the Tools Registry, and several demonstrations of interoperability protocols applied to specific missions and data sets (PDS4/PDAP (Planetary Data Access Protocol), Venus Express Interoperability). The IPDA has grown significantly since its first meetings back in November 2006. The steering committee is composed today of 28 members from 24 countries or international organizations. In addition, a technical expert group composed of 20 members from participating countries provides supportive input on technical and compatibility issues. A number of IPDA projects are ongoing, including the creation of the Memorandum of Understanding (MOU) template for international missions; the investigation of IPDA interaction with the International Virtual Observatory Alliance (IVOA); the PDS4 implementation project; the development of international registries to enable registration and search of data, tools and services; and the Chandrayaan-1 interoperability project with PDAP. In addition, the IPDA continues with outreach activities, being present or represented at national and international levels and at meetings such as COSPAR, AGU, EPSC, and EGU. Further information on IPDA activities, standards, and tools are available at the web page http://www.planetarydata.org.
Tool and service developers are encouraged to register their products at the IPDA web site.
ERIC Educational Resources Information Center
Sanderson, Matthew
2010-01-01
Contemporary levels of international migration in less-developed countries are raising new and important questions regarding the consequences of immigration for human welfare and well-being. However, there is little systematic cross-national evidence of how international migration affects human development levels in migrant-receiving countries in…
Globalization and Contemporary Fertility Convergence.
Hendi, Arun S
2017-09-01
The rise of the global network of nation-states has precipitated social transformations throughout the world. This article examines the role of political and economic globalization in driving fertility convergence across countries between 1965 and 2009. While past research has typically conceptualized fertility change as a country-level process, this study instead employs a theoretical and methodological framework that examines differences in fertility between pairs of countries over time. Convergence in fertility between pairs of countries is hypothesized to result from increased cross-country connectedness and cross-national transmission of fertility-related schemas. I investigate the impact of various cross-country ties, including ties through bilateral trade, intergovernmental organizations, and regional trade blocs, on fertility convergence. I find that globalization acts as a form of social interaction to produce fertility convergence. There is significant heterogeneity in the effects of different cross-country ties. In particular, trade with rich model countries, joint participation in the UN and UNESCO, and joining a free trade agreement all contribute to fertility convergence between countries. Whereas the prevailing focus in fertility research has been on factors producing fertility declines, this analysis highlights specific mechanisms-trade and connectedness through organizations-leading to greater similarity in fertility across countries. Globalization is a process that propels the spread of culturally laden goods and schemas impinging on fertility, which in turn produces fertility convergence.
Gaia Data Release 1. Cross-match with external catalogues. Algorithm and results
NASA Astrophysics Data System (ADS)
Marrese, P. M.; Marinoni, S.; Fabrizio, M.; Giuffrida, G.
2017-11-01
Context. Although the Gaia catalogue on its own will be a very powerful tool, it is the combination of this highly accurate archive with other archives that will truly open up amazing possibilities for astronomical research. The advanced interoperation of archives is based on cross-matching, leaving the user with the feeling of working with one single data archive. The data retrieval should work not only across data archives, but also across wavelength domains. The first step for seamless data access is the computation of the cross-match between Gaia and external surveys. Aims: The matching of astronomical catalogues is a complex and challenging problem both scientifically and technologically (especially when matching large surveys like Gaia). We describe the cross-match algorithm used to pre-compute the match of Gaia Data Release 1 (DR1) with a selected list of large publicly available optical and IR surveys. Methods: The overall principles of the adopted cross-match algorithm are outlined. Details are given on the developed algorithm, including the methods used to account for position errors, proper motions, and environment; to define the neighbours; and to define the figure of merit used to select the most probable counterpart. Results: Statistics on the results are also given. The results of the cross-match are part of the official Gaia DR1 catalogue.
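The counterpart selection via a figure of merit can be illustrated with a toy nearest-counterpart routine. This is not the published DR1 algorithm (it ignores proper motions and environment, which the paper explicitly accounts for), and all coordinates and errors below are invented:

```python
import math

def angular_distance(ra1, dec1, ra2, dec2):
    """Great-circle separation in arcsec (inputs in degrees)."""
    ra1, dec1, ra2, dec2 = map(math.radians, (ra1, dec1, ra2, dec2))
    cos_d = (math.sin(dec1) * math.sin(dec2)
             + math.cos(dec1) * math.cos(dec2) * math.cos(ra1 - ra2))
    return math.degrees(math.acos(min(1.0, cos_d))) * 3600.0

def best_counterpart(gaia, neighbours, radius=2.0):
    """Pick the neighbour with the best (smallest) error-normalised
    separation within `radius` arcsec; neighbours are dicts with
    ra, dec (deg) and pos_err (arcsec)."""
    best, best_fom = None, float("inf")
    for n in neighbours:
        sep = angular_distance(gaia["ra"], gaia["dec"], n["ra"], n["dec"])
        if sep > radius:
            continue  # outside the neighbourhood
        sigma = math.hypot(gaia["pos_err"], n["pos_err"])
        fom = sep / sigma  # smaller = more probable counterpart
        if fom < best_fom:
            best, best_fom = n, fom
    return best

gaia_src = {"ra": 150.0, "dec": 2.0, "pos_err": 0.001}
ext = [{"ra": 150.0002, "dec": 2.0001, "pos_err": 0.1, "id": "A"},
       {"ra": 150.0010, "dec": 2.0010, "pos_err": 0.1, "id": "B"}]
print(best_counterpart(gaia_src, ext)["id"])  # A
```

Normalising the separation by the combined positional error is what lets a pre-computed match remain meaningful when the two catalogues have very different astrometric precision.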
NASA Astrophysics Data System (ADS)
Glaves, Helen
2015-04-01
Marine research is rapidly moving away from traditional discipline specific science to a wider ecosystem level approach. This more multidisciplinary approach to ocean science requires large amounts of good quality, interoperable data to be readily available for use in an increasing range of new and complex applications. Significant amounts of marine data and information are already available throughout the world as a result of e-infrastructures being established at a regional level to manage and deliver marine data to the end user. However, each of these initiatives has been developed to address specific regional requirements and independently of those in other regions. Establishing a common framework for marine data management on a global scale necessitates that there is interoperability across these existing data infrastructures and active collaboration between the organisations responsible for their management. The Ocean Data Interoperability Platform (ODIP) project is promoting co-ordination between a number of these existing regional e-infrastructures including SeaDataNet and Geo-Seas in Europe, the Integrated Marine Observing System (IMOS) in Australia, the Rolling Deck to Repository (R2R) in the USA and the international IODE initiative. To demonstrate this co-ordinated approach the ODIP project partners are currently working together to develop several prototypes to test and evaluate potential interoperability solutions for solving the incompatibilities between the individual regional marine data infrastructures. However, many of the issues being addressed by the Ocean Data Interoperability Platform are not specific to marine science. For this reason many of the outcomes of this international collaborative effort are equally relevant and transferable to other domains.
Landslide databases review in the Geological Surveys of Europe
NASA Astrophysics Data System (ADS)
Herrera, Gerardo
2017-04-01
Landslides are one of the most widespread geohazards in Europe, producing significant social and economic damage. Rapid population growth in urban areas throughout many European countries and extreme climatic scenarios could considerably increase landslide risk in the near future. However, many European countries do not include landslide risk in their legislation, and lack official methodological assessment guidelines and knowledge about landslide impacts. Although regional and national landslide databases exist in most countries, they are often not integrated because they are owned by different institutions. Hence, a European Landslides Directive, providing a common legal framework for dealing with landslides, is necessary. With this long-term goal in mind, we present a review of the landslide databases from the Geological Surveys of Europe focusing on their interoperability. The same landslide classification was applied to the 849,543 landslide records from the Geological Surveys, of which 36% are slides, 10% falls, 20% flows, 11% complex slides, and 24% remain either unclassified or correspond to another typology. A landslide density map was produced from the available records of the Geological Surveys of 17 countries, showing the variable distribution of landslides and 0.2 million km2 of landslide-prone areas. The comparison of this map with the European landslide susceptibility map ELSUS v1 was successful for 73% of the predictions, and permitted identification of 25% of susceptible areas where landslide records are not available from the Geological Surveys. Taking these results into account, the completeness of these landslide databases was evaluated, revealing different landslide hazard management approaches between surveys and countries.
When They Can't Talk, Lives Are Lost : What Public Officials Need to Know About Interoperability
DOT National Transportation Integrated Search
2003-02-01
This publication was developed to facilitate education and discussion among and between elected and appointed officials, their representative associations, and public safety representatives on public safety wireless communications interoperability. T...
Design and Implementation of a REST API for the Human Well Being Index (HWBI)
Interoperable software development uses principles of component reuse, systems integration, flexible data transfer, and standardized ontological documentation to promote access, reuse, and integration of code. While interoperability principles are increasingly considered technolo...
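A resource-oriented lookup of the kind a REST API for HWBI scores might expose can be sketched as below. The endpoint paths, the FIPS-keyed data model, and the field names are illustrative assumptions, not the actual EPA service:

```python
import json

# Hypothetical in-memory HWBI scores keyed by county FIPS code; the real
# service's data model and routes are assumptions for illustration only.
SCORES = {"12345": {"fips": "12345", "hwbi": 52.3},
          "67890": {"fips": "67890", "hwbi": 48.7}}

def handle_request(method, path):
    """Dispatch a request path such as '/hwbi/v1/scores/12345' and
    return (status, JSON body), mimicking a REST resource lookup."""
    parts = path.strip("/").split("/")
    if method == "GET" and parts[:3] == ["hwbi", "v1", "scores"]:
        if len(parts) == 3:                      # collection resource
            return 200, json.dumps(list(SCORES.values()))
        record = SCORES.get(parts[3])            # single resource by FIPS
        if record is not None:
            return 200, json.dumps(record)
        return 404, json.dumps({"error": "unknown FIPS code"})
    return 405, json.dumps({"error": "unsupported method or path"})
```

The same routing logic would sit behind any HTTP framework; returning plain JSON plus standard status codes is what makes the component reusable by other clients.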
Luedke, Lace E; Heiderscheit, Bryan C; Williams, D S Blaise; Rauh, Mitchell J
2015-11-01
High school cross country runners have a high incidence of overuse injuries, particularly to the knee and shin. As lower extremity strength is modifiable, identification of strength attributes that contribute to anterior knee pain (AKP) and shin injuries may influence prevention and management of these injuries. To determine if a relationship existed between isometric hip abductor, knee extensor and flexor strength and the incidence of AKP and shin injury in high school cross country runners. Sixty-eight high school cross country runners (47 girls, 21 boys) participated in the study. Isometric strength tests of hip abductors, knee extensors and flexors were performed with a handheld dynamometer. Runners were prospectively followed during the 2014 interscholastic cross country season for occurrences of AKP and shin injury. Bivariate logistic regression was used to examine risk relationships between strength values and occurrence of AKP and shin injury. During the season, three (4.4%) runners experienced AKP and 13 (19.1%) runners incurred a shin injury. Runners in the tertiles indicating weakest hip abductor (chi-square = 6.140; p=0.046), knee extensor (chi-square = 6.562; p=0.038), and knee flexor (chi-square = 6.140; p=0.046) muscle strength had a significantly higher incidence of AKP. Hip and knee muscle strength was not significantly associated with shin injury. High school cross country runners with weaker hip abductor, knee extensor and flexor muscle strength had a higher incidence of AKP. Increasing hip and knee muscle strength may reduce the likelihood of AKP in high school cross country runners. Level of evidence: 2b.
A Systematic Review of Mobile Health Technology Use in Developing Countries.
Alghamdi, Manal; Gashgari, Horeya; Househ, Mowafa
2015-01-01
In developing countries, patients are now more informed about their healthcare options as a result of their use of mobile health (mHealth) technologies. The purpose of this paper is to describe the opportunities and challenges in using mHealth technologies for developing countries. In April 2015, Google Scholar and PubMed were searched to identify articles discussing the types, advantages and disadvantages, effectiveness, evaluation of mHealth technologies, and examples of mHealth implementation in developing countries. A total number of 3,803 articles were retrieved from both databases. Articles reporting the benefits and risks, effectiveness, and evaluation of mHealth were included. Articles that were written in English and from developing countries were also included. We excluded papers that were published before 2005, not written in English, and that were technical in nature. After screening the articles using the inclusion and exclusion criteria, 27 articles were selected for inclusion in the study. Of the 27 papers included in the review, eight described opportunities and challenges relating to mHealth, four focused on smoking cessation, three focused on weight loss, and four papers focused on chronic diseases. We also identified four articles discussing mHealth evaluation and four discussing the use of mHealth as a health promotion tool. We conclude that mHealth can improve healthcare delivery for developing countries. Some of the advantages of mHealth include: patient education, health promotion, disease self-management, decrease in healthcare costs, and remote monitoring of patients. However, there are several limitations in using mHealth technologies for developing countries, which include: interoperability, lack of evaluation standards, and lack of a technology infrastructure.
76 FR 71048 - Assistance to Firefighters Grant Program
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-16
... the highest-scoring applications for awards. Applications that involve interoperable communications... categories: (1) Activities designed to reach high- risk target groups and mitigate incidences of death and... communications project is consistent with the Statewide Communications Interoperability Plan (SCIP). If the State...
Identification of port communication equipment needs for safety, security, and interoperability
DOT National Transportation Integrated Search
2007-12-01
Identification of port communication equipment needs for safety, security, and interoperability is a major concern for current and future needs. The data demonstrate that two-way radios should be the most effective method of communication in both rou...
Bravo, Carlos; Suarez, Carlos; González, Carolina; López, Diego; Blobel, Bernd
2014-01-01
Healthcare information is distributed through multiple heterogeneous and autonomous systems. Access to, and sharing of, distributed information sources are a challenging task. To contribute to meeting this challenge, this paper presents a formal, complete and semi-automatic transformation service from Relational Databases to Web Ontology Language. The proposed service makes use of an algorithm that allows the transformation of several data models of different domains, mainly by deploying inheritance rules. The paper emphasizes the relevance of integrating the proposed approach into an ontology-based interoperability service to achieve semantic interoperability.
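One common family of RDB-to-OWL transformation rules (table → class, column → datatype property, a primary key that is also a foreign key → subclass) can be sketched as follows. This is a generic illustration of such inheritance rules, not the specific semi-automatic service the abstract describes:

```python
def tables_to_owl(tables):
    """Emit a minimal OWL (Turtle) fragment from relational table
    definitions. A table whose primary key doubles as a foreign key is
    treated as a subclass of the referenced table (inheritance rule);
    ordinary columns become datatype properties."""
    lines = ["@prefix : <http://example.org/onto#> .",
             "@prefix owl: <http://www.w3.org/2002/07/owl#> .",
             "@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> ."]
    for name, spec in tables.items():
        lines.append(f":{name} a owl:Class .")
        parent = spec.get("pk_fk_to")  # PK is also an FK -> subclass axiom
        if parent:
            lines.append(f":{name} rdfs:subClassOf :{parent} .")
        for col in spec.get("columns", []):
            lines.append(f":{name}_{col} a owl:DatatypeProperty ; "
                         f"rdfs:domain :{name} .")
    return "\n".join(lines)
```

For example, an `Inpatient` table whose key references `Patient` would yield `:Inpatient rdfs:subClassOf :Patient .`, so relational specialization survives as OWL class inheritance.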
FLTSATCOM interoperability applications
NASA Astrophysics Data System (ADS)
Woolford, Lynn
A mobile Fleet Satellite Communications (FLTSATCOM) system called the Mobile Operational Control Center (MOCC) was developed that has demonstrated the ability to interoperate with many of the current FLTSATCOM command and control channels. This low-cost system is secure in all its communications, is lightweight, and provides a gateway for other communications formats. The major elements of this system are a personal computer, a protocol microprocessor, and off-the-shelf mobile communication components. It is concluded that, with both FLTSATCOM channel protocol and data format interoperability, the MOCC has the ability to provide vital information in or near real time, which significantly improves mission effectiveness.
Ryan, Amanda; Eklund, Peter
2008-01-01
Healthcare information is composed of many types of varying and heterogeneous data. Semantic interoperability in healthcare is especially important when all these different types of data need to interact. Presented in this paper is a solution to interoperability in healthcare based on a standards-based middleware software architecture used in enterprise solutions. This architecture has been translated into the healthcare domain using a messaging and modeling standard which upholds the ideals of the Semantic Web (HL7 V3) combined with a well-known standard terminology of clinical terms (SNOMED CT).
Scientific Digital Libraries, Interoperability, and Ontologies
NASA Technical Reports Server (NTRS)
Hughes, J. Steven; Crichton, Daniel J.; Mattmann, Chris A.
2009-01-01
Scientific digital libraries serve complex and evolving research communities. Justifications for the development of scientific digital libraries include the desire to preserve science data and the promises of information interconnectedness, correlative science, and system interoperability. Shared ontologies are fundamental to fulfilling these promises. We present a tool framework, some informal principles, and several case studies where shared ontologies are used to guide the implementation of scientific digital libraries. The tool framework, based on an ontology modeling tool, was configured to develop, manage, and keep shared ontologies relevant within changing domains and to promote the interoperability, interconnectedness, and correlation desired by scientists.
Current Barriers to Large-scale Interoperability of Traceability Technology in the Seafood Sector.
Hardt, Marah J; Flett, Keith; Howell, Colleen J
2017-08-01
Interoperability is a critical component of full-chain digital traceability, but is almost nonexistent in the seafood industry. Using both quantitative and qualitative methodology, this study explores the barriers impeding progress toward large-scale interoperability among digital traceability systems in the seafood sector from the perspectives of seafood companies, technology vendors, and supply chains as a whole. We highlight lessons from recent research and field work focused on implementing traceability across full supply chains and make some recommendations for next steps in terms of overcoming challenges and scaling current efforts. © 2017 Institute of Food Technologists®.
ERIC Educational Resources Information Center
Arms, William Y.; Hillmann, Diane; Lagoze, Carl; Krafft, Dean; Marisa, Richard; Saylor, John; Terizzi, Carol; Van de Sompel, Herbert; Gill, Tony; Miller, Paul; Kenney, Anne R.; McGovern, Nancy Y.; Botticelli, Peter; Entlich, Richard; Payette, Sandra; Berthon, Hilary; Thomas, Susan; Webb, Colin; Nelson, Michael L.; Allen, B. Danette; Bennett, Nuala A.; Sandore, Beth; Pianfetti, Evangeline S.
2002-01-01
Discusses digital libraries, including interoperability, metadata, and international standards; Web resource preservation efforts at Cornell University; digital preservation at the National Library of Australia; object persistence and availability; collaboration among libraries, museums and elementary schools; Asian digital libraries; and a Web…
Interoperability of Electronic Health Records: A Physician-Driven Redesign.
Miller, Holly; Johns, Lucy
2018-01-01
PURPOSE: Electronic health records (EHRs), now used by hundreds of thousands of providers and encouraged by federal policy, have the potential to improve quality and decrease costs in health care. But interoperability, although technically feasible among different EHR systems, is the weak link in the chain of logic. Interoperability is inhibited by poor understanding, by suboptimal implementation, and at times by a disinclination to dilute market share or patient base on the part of vendors or providers, respectively. The intent of this project has been to develop a series of practicable recommendations that, if followed by EHR vendors and users, can promote and enhance interoperability, helping EHRs reach their potential. METHODOLOGY: A group of 11 physicians, one nurse, and one health policy consultant, practicing from California to Massachusetts, has developed a document titled "Feature and Function Recommendations To Optimize Clinician Usability of Direct Interoperability To Enhance Patient Care" that offers recommendations from the clinician point of view. This report introduces some of these recommendations and suggests their implications for policy and the "virtualization" of EHRs. CONCLUSION: Widespread adoption of even a few of these recommendations by designers and vendors would enable a major advance toward the "Triple Aim" of improving the patient experience, improving the health of populations, and reducing per capita costs.
Translations on USSR Military Affairs, Number 1278
1977-05-19
plans for conducting them. Here special responsibility rests on the staff, as the command body. Precisely clear planning based upon a profound... swimming 100 meters freestyle, cross-country race of 3 kilometers or skiing 10 kilometers (for age group I), 1-kilometer cross-country race or... (Classification), swimming 100 meters freestyle, 3-kilometer cross-country race, figure driving of vehicle. [Table of category point scores illegible in source scan.]
NASA Astrophysics Data System (ADS)
Samors, R. J.; Allison, M. L.
2016-12-01
An e-infrastructure that supports data-intensive, multidisciplinary research is being organized under the auspices of the Belmont Forum consortium of national science funding agencies to accelerate the pace of science to address 21st century global change research challenges. The pace and breadth of change in information management across the data lifecycle means that no one country or institution can unilaterally provide the leadership and resources required to use data and information effectively, or needed to support a coordinated, global e-infrastructure. The five action themes adopted by the Belmont Forum: 1. Adopt and make enforceable Data Principles that establish a global, interoperable e-infrastructure. 2. Foster communication, collaboration and coordination between the wider research community and Belmont Forum and its projects through an e-Infrastructure Coordination, Communication, & Collaboration Office. 3. Promote effective data planning and stewardship in all Belmont Forum agency-funded research with a goal to make it enforceable. 4. Determine international and community best practice to inform Belmont Forum research e-infrastructure policy through identification and analysis of cross-disciplinary research case studies. 5. Support the development of a cross-disciplinary training curriculum to expand human capacity in technology and data-intensive analysis methods. The Belmont Forum is ideally poised to play a vital and transformative leadership role in establishing a sustained human and technical international data e-infrastructure to support global change research. In 2016, members of the 23-nation Belmont Forum began a collaborative implementation phase. Four multi-national teams are undertaking Action Themes based on the recommendations above. 
Tasks include mapping the landscape, identifying and documenting existing data management plans, and scheduling a series of workshops that analyse trans-disciplinary applications of existing Belmont Forum projects to identify best practices and critical gaps that may be uniquely or best addressed by the Belmont Forum funding model. Concurrent work will define challenges in conducting international and interdisciplinary data management implementation plans and identify sources of relevant expertise and knowledge.
Interoperability, Scaling, and the Digital Libraries Research Agenda.
ERIC Educational Resources Information Center
Lynch, Clifford; Garcia-Molina, Hector
1996-01-01
Summarizes reports and activities at the Information Infrastructure Technology and Applications workshop on digital libraries (Reston, Virginia, August 22, 1995). Defines digital library roles and identifies areas of needed research, including: interoperability; protocols for digital objects; collection management; interface design; human-computer…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-05
... Communication Interoperability Plan Implementation Report AGENCY: National Protection and Programs Directorate... Directorate/Cybersecurity and Communications/Office of Emergency Communications, has submitted the following... INFORMATION: The Office of Emergency Communications (OEC), formed under Title XVIII of the Homeland Security...
Easy research data handling with an OpenEarth DataLab for geo-monitoring research
NASA Astrophysics Data System (ADS)
Vanderfeesten, Maurice; van der Kuil, Annemiek; Prinčič, Alenka; den Heijer, Kees; Rombouts, Jeroen
2015-04-01
OpenEarth DataLab is an open source-based collaboration and processing platform to enable streamlined research data management from raw data ingest and transformation to interoperable distribution. It enables geo-scientists to easily synchronise, share, compute and visualise the dynamic and most up-to-date research data, scripts and models in multi-stakeholder geo-monitoring programs. This DataLab is developed by the Research Data Services team of TU Delft Library and 3TU.Datacentrum together with coastal engineers of Delft University of Technology and Deltares. Based on the OpenEarth software stack an environment has been developed to orchestrate numerous geo-related open source software components that can empower researchers and increase the overall research quality by managing research data; enabling automatic and interoperable data workflows between all the components with track & trace, hit & run data transformation processing in cloud infrastructure using MatLab and Python, synchronisation of data and scripts (SVN), and much more. Transformed interoperable data products (KML, NetCDF, PostGIS) can be used by ready-made OpenEarth tools for further analyses and visualisation, and can be distributed via interoperable channels such as THREDDS (OpenDAP) and GeoServer. An example of a successful application of OpenEarth DataLab is the Sand Motor, an innovative method for coastal protection in the Netherlands. The Sand Motor is a huge volume of sand that has been applied along the coast to be spread naturally by wind, waves and currents. Different research disciplines are involved concerned with: weather, waves and currents, sand distribution, water table and water quality, flora and fauna, recreation and management. Researchers share and transform their data in the OpenEarth DataLab, that makes it possible to combine their data and to see influence of different aspects of the coastal protection on their models. 
During the project the data are available only for the researchers involved. After the project a large part of the data and scripts will be published with DOI in the Data Archive of 3TU.Datacentrum for reuse in new research. For the 83 project members of the Sand Motor, the OpenEarth DataLab is available on www.zandmotordata.nl. The OpenEarth DataLab not only saves time and increases quality, but has the potential to open new frontiers for exploring cross-domain analysis and visualisations, revealing new scientific insights.
Small pixel cross-talk MTF and its impact on MWIR sensor performance
NASA Astrophysics Data System (ADS)
Goss, Tristan M.; Willers, Cornelius J.
2017-05-01
As pixel sizes shrink in the development of modern High Definition (HD) Mid Wave Infrared (MWIR) detectors, inter-pixel cross-talk becomes increasingly difficult to regulate. The diffusion lengths required to achieve the quantum efficiency and sensitivity of MWIR detectors are typically longer than the pixel pitch, and the probability of inter-pixel cross-talk increases as the pixel-pitch/diffusion-length ratio decreases. Inter-pixel cross-talk is most conveniently quantified by the focal plane array sampling Modulation Transfer Function (MTF). Cross-talk MTF reduces the ideal sinc-squared pixel MTF that is commonly used when modelling sensor performance. However, cross-talk MTF data are not always readily available from detector suppliers, and since the origins of inter-pixel cross-talk are uniquely device- and manufacturing-process specific, no generic MTF model appears to satisfy the needs of sensor designers and analysts. In this paper cross-talk MTF data are collected from recent publications and the development of a generic cross-talk MTF model to fit these data is investigated. The resulting cross-talk MTF model is then included in a MWIR sensor model and the impact on sensor performance is evaluated in terms of the National Imagery Interpretability Rating Scale (NIIRS) General Image Quality Equation (GIQE) metric for a range of f-number/detector-pitch (Fλ/d) configurations and operating environments. By applying non-linear boost transfer functions in the signal processing chain, the contrast losses due to cross-talk may be compensated for. Boost transfer functions, however, also reduce the signal-to-noise ratio of the sensor. In this paper boost function limits are investigated and included in the sensor performance assessments.
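A composite detector MTF of the kind discussed above can be sketched by multiplying the ideal one-dimensional pixel-aperture MTF by an assumed Gaussian cross-talk roll-off. The Gaussian form and its width parameter are illustrative stand-ins, not the model developed in the paper:

```python
import math

def pixel_mtf(f, pitch_um):
    """Ideal detector aperture MTF: |sinc(pi * f * d)| for spatial
    frequency f (cycles/mm) and pixel pitch d (converted from um to mm)."""
    x = math.pi * f * pitch_um * 1e-3
    return 1.0 if x == 0 else abs(math.sin(x) / x)

def crosstalk_mtf(f, diffusion_um):
    """Assumed Gaussian roll-off standing in for carrier-diffusion
    cross-talk; the width parameter is a proxy for diffusion length."""
    return math.exp(-2.0 * (math.pi * f * diffusion_um * 1e-3) ** 2)

def detector_mtf(f, pitch_um, diffusion_um):
    """Composite MTF: cross-talk always degrades the ideal pixel MTF."""
    return pixel_mtf(f, pitch_um) * crosstalk_mtf(f, diffusion_um)
```

At the Nyquist frequency of a 5 um pitch (100 cycles/mm) the ideal aperture MTF is 2/pi ≈ 0.64; any nonzero diffusion term pulls the composite value below that, which is the contrast loss a boost filter would then attempt to recover.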
EPA Scientific Knowledge Management Assessment and Needs
A series of activities have been conducted by a core group of EPA scientists from across the Agency. The activities were initiated in 2012 and the focus was to increase the reuse and interoperability of science software at EPA. The need for increased reuse and interoperability ...
Interoperability in Personalized Adaptive Learning
ERIC Educational Resources Information Center
Aroyo, Lora; Dolog, Peter; Houben, Geert-Jan; Kravcik, Milos; Naeve, Ambjorn; Nilsson, Mikael; Wild, Fridolin
2006-01-01
Personalized adaptive learning requires semantic-based and context-aware systems to manage the Web knowledge efficiently as well as to achieve semantic interoperability between heterogeneous information resources and services. The technological and conceptual differences can be bridged either by means of standards or via approaches based on the…
47 CFR 0.192 - Emergency Response Interoperability Center.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Organization Public Safety and Homeland Security Bureau § 0.192 Emergency Response Interoperability Center. (a... Public Safety and Homeland Security Bureau to develop, recommend, and administer policy goals, objectives... and procedures for the 700 MHz public safety broadband wireless network and other public safety...
47 CFR 0.192 - Emergency Response Interoperability Center.
Code of Federal Regulations, 2012 CFR
2012-10-01
... Organization Public Safety and Homeland Security Bureau § 0.192 Emergency Response Interoperability Center. (a... Public Safety and Homeland Security Bureau to develop, recommend, and administer policy goals, objectives... and procedures for the 700 MHz public safety broadband wireless network and other public safety...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-12
... Alerting, E9-1-1 Location Accuracy, Network Security Best Practices, DNSSEC Implementation Practices for... FEDERAL COMMUNICATIONS COMMISSION Federal Advisory Committee Act; Communications Security... Security, Reliability, and Interoperability Council (CSRIC) meeting that was scheduled for March 6, 2013 is...
Hosseini, Masoud; Ahmadi, Maryam; Dixon, Brian E.
2014-01-01
Clinical decision support (CDS) systems can support vaccine forecasting and immunization reminders; however, immunization decision-making requires data from fragmented, independent systems. Interoperability and accurate data exchange between immunization information systems (IIS) is an essential factor to utilize Immunization CDS systems. Service oriented architecture (SOA) and Health Level 7 (HL7) are dominant standards for web-based exchange of clinical information. We implemented a system based on SOA and HL7 v3 to support immunization CDS in Iran. We evaluated system performance by exchanging 1500 immunization records for roughly 400 infants between two IISs. System turnaround time is less than a minute for synchronous operation calls and the retrieved immunization history of infants were always identical in different systems. CDS generated reports were accordant to immunization guidelines and the calculations for next visit times were accurate. Interoperability is rare or nonexistent between IIS. Since inter-state data exchange is rare in United States, this approach could be a good prototype to achieve interoperability of immunization information. PMID:25954452
Archive interoperability in the Virtual Observatory
NASA Astrophysics Data System (ADS)
Genova, Françoise
2003-02-01
Main goals of Virtual Observatory projects are to build interoperability between astronomical on-line services, observatory archives, databases and results published in journals, and to develop tools permitting the best scientific use of the very large data sets stored in observatory archives and produced by large surveys. The different Virtual Observatory projects collaborate to define common exchange standards, which are the key to a truly International Virtual Observatory: for instance, their first common milestone has been a standard allowing the exchange of tabular data, called VOTable. The Interoperability Work Area of the European Astrophysical Virtual Observatory project aims at networking European archives, by building a prototype using the CDS VizieR and Aladin tools, and at defining basic rules to help archive providers implement interoperability. The prototype is accessible for scientific usage, to get user feedback (and science results!) at an early stage of the project. The ISO archive participates very actively in this endeavour, and more generally in information networking. The on-going inclusion of the ISO log in SIMBAD will allow higher-level links for users.
Achieving interoperability for metadata registries using comparative object modeling.
Park, Yu Rang; Kim, Ju Han
2010-01-01
Achieving data interoperability between organizations relies upon agreed meaning and representation (metadata) of data. For managing and registering metadata, many organizations have built metadata registries (MDRs) in various domains based on the international standard MDR framework, ISO/IEC 11179. Following this trend, two public MDRs have been created in the biomedical domain, the United States Health Information Knowledgebase (USHIK) and the cancer Data Standards Registry and Repository (caDSR), from the U.S. Department of Health & Human Services and the National Cancer Institute (NCI), respectively. Most MDRs are implemented with indiscriminate extensions to satisfy organization-specific needs and to work around semantic and structural limitations of ISO/IEC 11179. As a result, it is difficult to achieve interoperability among multiple MDRs. In this paper, we propose an integrated metadata object model for achieving interoperability among multiple MDRs. To evaluate this model, we developed an XML Schema Definition (XSD)-based metadata exchange format. We created an XSD-based metadata exporter supporting both the integrated metadata object model and organization-specific MDR formats.
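An XSD-style export of ISO/IEC 11179-flavoured data elements can be sketched as below. The element and attribute names here are illustrative only and do not reproduce the USHIK or caDSR schemas or the authors' exchange format:

```python
import xml.etree.ElementTree as ET

def export_metadata(elements):
    """Serialise ISO/IEC 11179-style data elements (name, definition,
    value domain) to a simple XML exchange document."""
    root = ET.Element("metadataRegistry")
    for el in elements:
        de = ET.SubElement(root, "dataElement", id=el["id"])
        ET.SubElement(de, "name").text = el["name"]
        ET.SubElement(de, "definition").text = el["definition"]
        vd = ET.SubElement(de, "valueDomain")   # constrains permitted values
        vd.set("datatype", el.get("datatype", "string"))
    return ET.tostring(root, encoding="unicode")
```

In practice the exporter would be validated against the agreed XSD before exchange, so that each registry can ingest the document regardless of its internal MDR extensions.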
NASA Technical Reports Server (NTRS)
Bradley, Arthur; Dubowsky, Steven; Quinn, Roger; Marzwell, Neville
2005-01-01
Robots that operate independently of one another will not be adequate to accomplish the future exploration tasks of long-distance autonomous navigation, habitat construction, resource discovery, and material handling. Such activities will require that systems widely share information, plan and divide complex tasks, share common resources, and physically cooperate to manipulate objects. Recognizing the need for interoperable robots to accomplish the new exploration initiative, NASA's Office of Exploration Systems Research & Technology recently funded the development of the Joint Technical Architecture for Robotic Systems (JTARS). The JTARS charter is to identify the interface standards necessary to achieve interoperability among space robots. A JTARS working group (JTARS-WG) has been established comprising recognized leaders in the field of space robotics, including representatives from seven NASA centers along with academia and private industry. The working group's early accomplishments include addressing key issues required for interoperability, defining which systems are within the project's scope, and framing the JTARS manuals around classes of robotic systems.
Interoperable and accessible census and survey data from IPUMS.
Kugler, Tracy A; Fitch, Catherine A
2018-02-27
The first version of the Integrated Public Use Microdata Series (IPUMS) was released to users in 1993, and since that time IPUMS has come to stand for interoperable and accessible census and survey data. Initially created to harmonize U.S. census microdata over time, IPUMS now includes microdata from the U.S. and international censuses and from surveys on health, employment, and other topics. IPUMS also provides geo-spatial data, aggregate population data, and environmental data. IPUMS supports ten data products, each disseminating an integrated data collection with a set of tools that make complex data easy to find, access, and use. Key features are record-level integration to create interoperable datasets, user-friendly interfaces, and comprehensive metadata and documentation. The IPUMS philosophy aligns closely with the FAIR principles of findability, accessibility, interoperability, and re-usability. IPUMS data have catalyzed knowledge generation across a wide range of social science and other disciplines, as evidenced by the large volume of publications and other products created by the vast IPUMS user community.
Myneni, Sahiti; Patel, Vimla L.
2009-01-01
Biomedical researchers often have to work on massive, detailed, and heterogeneous datasets that raise new challenges of information management. This study reports an investigation into the nature of the problems faced by the researchers in two bioscience test laboratories when dealing with their data management applications. Data were collected using ethnographic observations, questionnaires, and semi-structured interviews. The major problems identified in working with these systems were related to data organization, publications, and collaboration. The interoperability standards were analyzed using a C4I framework at the levels of connection, communication, consolidation, and collaboration. Such an analysis was found to be useful in judging the capabilities of data management systems at different levels of technological competency. While collaboration and system interoperability are the “must have” attributes of these biomedical scientific laboratory information management applications, usability and human interoperability are the other design concerns that must also be addressed for easy use and implementation. PMID:20351900
Multi-PON access network using a coarse AWG for smooth migration from TDM to WDM PON
NASA Astrophysics Data System (ADS)
Shachaf, Y.; Chang, C.-H.; Kourtessis, P.; Senior, J. M.
2007-06-01
An interoperable access network architecture based on a coarse array waveguide grating (AWG) is described, displaying dynamic wavelength assignment to manage the network load across multiple PONs. The multi-PON architecture utilizes coarse Gaussian channels of an AWG to facilitate scalability and a smooth migration path between TDM and WDM PONs. Network simulations of a cross-operational protocol platform confirmed successful routing of individual PON clusters through 7 nm-wide passband windows of the AWG. Furthermore, the polarization-dependent wavelength shift and phase errors of the device proved not to constrain the routing performance. Optical transmission tests at 2.5 Gbit/s for distances up to 20 km are demonstrated.
Gurcan, Metin N; Tomaszewski, John; Overton, James A; Doyle, Scott; Ruttenberg, Alan; Smith, Barry
2017-02-01
Interoperability across data sets is a key challenge for quantitative histopathological imaging. There is a need for an ontology that can support effective merging of pathological image data with associated clinical and demographic data. To foster organized, cross-disciplinary, information-driven collaborations in the pathological imaging field, we propose to develop an ontology to represent imaging data and methods used in pathological imaging and analysis, and call it Quantitative Histopathological Imaging Ontology - QHIO. We apply QHIO to breast cancer hot-spot detection with the goal of enhancing reliability of detection by promoting the sharing of data between image analysts. Copyright © 2016 Elsevier Inc. All rights reserved.
ScrapbookUSA: Writing 'Cross Grade, 'Cross Curriculum, 'Cross Country.
ERIC Educational Resources Information Center
Roth, Emery, II
1993-01-01
Describes the ScrapBookUSA Writing Project, a computer telecommunications project linking classrooms across the country, and its educational opportunities for the writing and multicultural studies curricula. Examples of Hello letters, student essays, and ScrapBook Chronicles are given to demonstrate the impact a wide audience and immediate…
Lin, M.C.; Vreeman, D.J.; Huff, S.M.
2012-01-01
Objectives We wanted to develop a method for evaluating the consistency and usefulness of LOINC code use across different institutions, and to evaluate the degree of interoperability that can be attained when using LOINC codes for laboratory data exchange. Our specific goals were to: 1) Determine if any contradictory knowledge exists in LOINC. 2) Determine how many LOINC codes were used in a truly interoperable fashion between systems. 3) Provide suggestions for improving the semantic interoperability of LOINC. Methods We collected Extensional Definitions (EDs) of LOINC usage from three institutions. The version space approach was used to divide LOINC codes into small sets, which made auditing of LOINC use across the institutions feasible. We then compared pairings of LOINC codes from the three institutions for consistency and usefulness. Results The numbers of LOINC codes evaluated were 1,917, 1,267 and 1,693 as obtained from ARUP, Intermountain and Regenstrief respectively. There were 2,022, 2,030, and 2,301 version spaces among ARUP & Intermountain, Intermountain & Regenstrief and ARUP & Regenstrief respectively. Using the EDs as the gold standard, there were 104, 109 and 112 pairs containing contradictory knowledge and there were 1,165, 765 and 1,121 semantically interoperable pairs. The interoperable pairs were classified into three levels: 1) Level I – No loss of meaning, complete information was exchanged by identical codes. 2) Level II – No loss of meaning, but processing of data was needed to make the data completely comparable. 3) Level III – Some loss of meaning. For example, tests with a specific ‘method’ could be rolled-up with tests that were ‘methodless’. Conclusions There are variations in the way LOINC is used for data exchange that result in some data not being truly interoperable across different enterprises.
To improve its semantic interoperability, we need to detect and correct any contradictory knowledge within LOINC and add computable relationships that can be used for making reliable inferences about the data. The LOINC committee should also provide detailed guidance on best practices for mapping from local codes to LOINC codes and for using LOINC codes in data exchange. PMID:22306382
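The cross-institution audit described above can be illustrated in miniature: compare how two sites map the same local concepts to LOINC codes, flagging identical mappings as interoperable (Level I) and divergent ones as candidates for review. This is a much-simplified sketch of the paper's version-space method, and the test names and codes below are invented for illustration:

```python
# Simplified cross-institution LOINC usage audit. Real auditing compares
# Extensional Definitions within version spaces; here we only compare
# local-concept -> LOINC mappings from two hypothetical sites.

site_a = {"serum sodium": "2951-2", "serum potassium": "2823-3"}
site_b = {"serum sodium": "2951-2", "serum potassium": "2824-1"}

def audit(a, b):
    """Split shared concepts into interoperable and contradictory pairs."""
    interoperable, contradictory = [], []
    for test in a.keys() & b.keys():           # concepts mapped at both sites
        (interoperable if a[test] == b[test] else contradictory).append(test)
    return sorted(interoperable), sorted(contradictory)

ok, review = audit(site_a, site_b)
assert ok == ["serum sodium"]        # Level I: identical codes exchanged
assert review == ["serum potassium"] # divergent mapping to audit
```

A real audit would additionally distinguish Level II/III pairs, where codes differ but meaning is recoverable with processing or rolled-up methods.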
Semantics-Based Interoperability Framework for the Geosciences
NASA Astrophysics Data System (ADS)
Sinha, A.; Malik, Z.; Raskin, R.; Barnes, C.; Fox, P.; McGuinness, D.; Lin, K.
2008-12-01
Interoperability between heterogeneous data, tools and services is required to transform data to knowledge. To meet geoscience-oriented societal challenges such as forcing of climate change induced by volcanic eruptions, we suggest the need to develop semantic interoperability for data, services, and processes. Because such scientific endeavors require integration of multiple databases associated with global enterprises, implicit semantic-based integration is impossible. Instead, explicit semantics are needed to facilitate interoperability and integration. Although different types of integration models are available (syntactic or semantic) we suggest that semantic interoperability is likely to be the most successful pathway. Clearly, the geoscience community would benefit from utilization of existing XML-based data models, such as GeoSciML, WaterML, etc. to rapidly advance semantic interoperability and integration. We recognize that such integration will require a "meanings-based search, reasoning and information brokering", which will be facilitated through inter-ontology relationships (ontologies defined for each discipline). We suggest that markup languages (MLs) and ontologies can be seen as "data integration facilitators", working at different abstraction levels. Therefore, we propose to use an ontology-based data registration and discovery approach to complement markup languages through semantic data enrichment. Ontologies allow the use of formal and descriptive logic statements, which permit expressive query capabilities for data integration through reasoning. We have developed domain ontologies (EPONT) to capture the concept behind data. EPONT ontologies are associated with existing ontologies such as SUMO, DOLCE and SWEET. Although significant efforts have gone into developing data (object) ontologies, we advance the idea of developing semantic frameworks for additional ontologies that deal with processes and services.
This evolutionary step will facilitate the integrative capabilities of scientists as we examine the relationships between data and external factors such as processes that may influence our understanding of "why" certain events happen. We emphasize the need to go from analysis of data to concepts related to scientific principles of thermodynamics, kinetics, heat flow, mass transfer, etc. Towards meeting these objectives, we report on a pair of related service engines: DIA (Discovery, integration and analysis), and SEDRE (Semantically-Enabled Data Registration Engine) that utilize ontologies for semantic interoperability and integration.
Ethier, J-F; Curcin, V; Barton, A; McGilchrist, M M; Bastiaens, H; Andreasson, A; Rossiter, J; Zhao, L; Arvanitis, T N; Taweel, A; Delaney, B C; Burgun, A
2015-01-01
This article is part of the Focus Theme of METHODS of Information in Medicine on "Managing Interoperability and Complexity in Health Systems". Primary care data is the single richest source of routine health care data. However, its use, both in research and clinical work, often requires data from multiple clinical sites, clinical trials databases and registries. Data integration and interoperability are therefore of utmost importance. TRANSFoRm's general approach relies on a unified interoperability framework, described in a previous paper. We developed a core ontology for an interoperability framework based on data mediation. This article presents how such an ontology, the Clinical Data Integration Model (CDIM), can be designed to support, in conjunction with appropriate terminologies, biomedical data federation within TRANSFoRm, an EU FP7 project that aims to develop the digital infrastructure for a learning healthcare system in European Primary Care. TRANSFoRm utilizes a unified structural/terminological interoperability framework, based on the local-as-view mediation paradigm. Such an approach mandates the global information model to describe the domain of interest independently of the data sources to be explored. Following a requirement analysis process, no ontology focusing on primary care research was identified, and thus we designed a realist ontology based on Basic Formal Ontology to support our framework in collaboration with various terminologies used in primary care. The resulting ontology has 549 classes and 82 object properties and is used to support data integration for TRANSFoRm's use cases. Concepts identified by researchers were successfully expressed in queries using CDIM and pertinent terminologies. As an example, we illustrate how, in TRANSFoRm, the Query Formulation Workbench can capture eligibility criteria in a computable representation, which is based on CDIM.
A unified mediation approach to semantic interoperability provides a flexible and extensible framework for all types of interaction between health record systems and research systems. CDIM, as core ontology of such an approach, enables simplicity and consistency of design across the heterogeneous software landscape and can support the specific needs of EHR-driven phenotyping research using primary care data.
GMPLS-based control plane for optical networks: early implementation experience
NASA Astrophysics Data System (ADS)
Liu, Hang; Pendarakis, Dimitrios; Komaee, Nooshin; Saha, Debanjan
2002-07-01
Generalized Multi-Protocol Label Switching (GMPLS) extends MPLS signaling and Internet routing protocols to provide a scalable, interoperable, distributed control plane, which is applicable to multiple network technologies such as optical cross connects (OXCs), photonic switches, IP routers, ATM switches, SONET and DWDM systems. It is intended to facilitate automatic service provisioning and dynamic neighbor and topology discovery across multi-vendor intelligent transport networks, as well as their clients. Efforts to standardize such a distributed common control plane have reached various stages in several bodies such as the IETF, ITU and OIF. This paper describes the design considerations and architecture of a GMPLS-based control plane that we have prototyped for core optical networks. Functional components of GMPLS signaling and routing are integrated in this architecture with an application layer controller module. Various requirements, including bandwidth, network protection and survivability, traffic engineering, and optimal utilization of network resources, are taken into consideration during path computation and provisioning. Initial experiments with our prototype demonstrate the feasibility and main benefits of GMPLS as a distributed control plane for core optical networks. In addition to such feasibility results, actual adoption and deployment of GMPLS as a common control plane for intelligent transport networks will depend on the successful completion of relevant standardization activities, extensive interoperability testing as well as the strengthening of appropriate business drivers.
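Constraint-based path computation of the kind such a control plane performs can be sketched simply: prune links that lack the requested bandwidth, then run a shortest-path search on what remains. The topology, costs and bandwidth figures below are invented, and real GMPLS path computation handles many more constraints (protection, wavelength continuity, shared risk groups):

```python
import heapq

# Bandwidth-constrained shortest path: a toy stand-in for the path
# computation step of a GMPLS-style control plane.

def route(links, src, dst, bw):
    """links: {node: [(neighbor, cost, available_bw), ...]}.
    Returns (total_cost, path) or None if no feasible path exists."""
    heap, seen = [(0, src, [src])], set()
    while heap:
        cost, node, path = heapq.heappop(heap)
        if node == dst:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nbr, c, avail in links.get(node, []):
            if avail >= bw and nbr not in seen:   # prune infeasible links
                heapq.heappush(heap, (cost + c, nbr, path + [nbr]))
    return None

net = {
    "A": [("B", 1, 10), ("C", 5, 40)],
    "B": [("D", 1, 10)],
    "C": [("D", 1, 40)],
}
assert route(net, "A", "D", 2) == (2, ["A", "B", "D"])   # cheap path suffices
assert route(net, "A", "D", 20) == (6, ["A", "C", "D"])  # must detour for 20 units
```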
The Java Image Science Toolkit (JIST) for rapid prototyping and publishing of neuroimaging software.
Lucas, Blake C; Bogovic, John A; Carass, Aaron; Bazin, Pierre-Louis; Prince, Jerry L; Pham, Dzung L; Landman, Bennett A
2010-03-01
Non-invasive neuroimaging techniques enable extraordinarily sensitive and specific in vivo study of the structure, functional response and connectivity of biological mechanisms. With these advanced methods comes a heavy reliance on computer-based processing, analysis and interpretation. While the neuroimaging community has produced many excellent academic and commercial tool packages, new tools are often required to interpret new modalities and paradigms. Developing custom tools and ensuring interoperability with existing tools is a significant hurdle. To address these limitations, we present a new framework for algorithm development that implicitly ensures tool interoperability, generates graphical user interfaces, provides advanced batch processing tools, and, most importantly, requires minimal additional programming or computational overhead. Java-based rapid prototyping with this system is an efficient and practical approach to evaluate new algorithms since the proposed system ensures that rapidly constructed prototypes are actually fully-functional processing modules with support for multiple GUI's, a broad range of file formats, and distributed computation. Herein, we demonstrate MRI image processing with the proposed system for cortical surface extraction in large cross-sectional cohorts, provide a system for fully automated diffusion tensor image analysis, and illustrate how the system can be used as a simulation framework for the development of a new image analysis method. The system is released as open source under the Lesser GNU Public License (LGPL) through the Neuroimaging Informatics Tools and Resources Clearinghouse (NITRC). PMID:20077162
NASA Astrophysics Data System (ADS)
Pang, Zhibo; Zheng, Lirong; Tian, Junzhe; Kao-Walter, Sharon; Dubrova, Elena; Chen, Qiang
2015-01-01
In-home health care services based on the Internet-of-Things are promising to resolve the challenges caused by the ageing of the population. But the existing research is rather scattered and shows a lack of interoperability. In this article, a business-technology co-design methodology is proposed for cross-boundary integration of in-home health care devices and services. In this framework, three key elements of a solution (business model, device and service integration architecture and information system integration architecture) are organically integrated and aligned. In particular, a cooperative Health-IoT ecosystem is formulated, and information systems of all stakeholders are integrated in a cooperative health cloud as well as extended to patients' homes through the in-home health care station (IHHS). Design principles of the IHHS include the reuse of the 3C platform, certification of the Health Extension, interoperability and extendibility, convenient and trusted software distribution, standardised and secure electronic health care record handling, effective service composition and efficient data fusion. These principles are applied to the design of an IHHS solution called iMedBox. A detailed device and service integration architecture and hardware and software architecture are presented and verified by an implemented prototype. The quantitative performance analysis and field trials have confirmed the feasibility of the proposed design methodology and solution.
Adaptation of interoperability standards for cross domain usage
NASA Astrophysics Data System (ADS)
Essendorfer, B.; Kerth, Christian; Zaschke, Christian
2017-05-01
As globalization affects most aspects of modern life, challenges of quick and flexible data sharing apply to many different domains. To protect a nation's security, for example, one has to look well beyond borders and understand economic, ecological, cultural as well as historical influences. Most of the time information is produced and stored digitally, and one of the biggest challenges is to retrieve relevant, readable information applicable to a specific problem from a large data stock at the right time. These challenges of enabling data sharing across national, organizational and system borders are known to other domains (e.g., ecology or medicine) as well. Solutions such as domain-specific standards have been developed for these problems. The question is: what can the different domains learn from each other, and do we have solutions when we need to interlink the information produced in these domains? A known problem is to make civil security data available to the military domain and vice versa in collaborative operations. But what happens if an environmental crisis leads to the need to quickly cooperate with civil or military security in order to save lives? How can we achieve interoperability in such complex scenarios? The paper introduces an approach to adapt standards from one domain to another and outlines the problems that have to be overcome and the limitations that may apply.
Decomposing cross-country differences in quality adjusted life expectancy: the impact of value sets.
Heijink, Richard; van Baal, Pieter; Oppe, Mark; Koolman, Xander; Westert, Gert
2011-06-23
The validity, reliability and cross-country comparability of summary measures of population health (SMPH) have been persistently debated. In this debate, the measurement and valuation of nonfatal health outcomes have been defined as key issues. Our goal was to quantify and decompose international differences in health expectancy based on health-related quality of life (HRQoL). We focused on the impact of value set choice on cross-country variation. We calculated Quality Adjusted Life Expectancy (QALE) at age 20 for 15 countries in which EQ-5D population surveys had been conducted. We applied the Sullivan approach to combine the EQ-5D based HRQoL data with life tables from the Human Mortality Database. Mean HRQoL by country-gender-age was estimated using a parametric model. We used nonparametric bootstrap techniques to compute confidence intervals. QALE was then compared across the six country-specific time trade-off value sets that were available. Finally, three counterfactual estimates were generated in order to assess the contribution of mortality, health states and health-state values to cross-country differences in QALE. QALE at age 20 ranged from 33 years in Armenia to almost 61 years in Japan, using the UK value set. The value sets of the other five countries generated different estimates, up to seven years higher. The relative impact of choosing a different value set differed across country-gender strata between 2% and 20%. In 50% of the country-gender strata the ranking changed by two or more positions across value sets. The decomposition demonstrated a varying impact of health states, health-state values, and mortality on QALE differences across countries. The choice of the value set in SMPH may seriously affect cross-country comparisons of health expectancy, even across populations of similar levels of wealth and education. In our opinion, it is essential to get more insight into the drivers of differences in health-state values across populations. 
This will enhance the usefulness of health-expectancy measures.
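The Sullivan approach used above can be shown with a worked miniature: weight the life-table person-years of each age band by its mean HRQoL, then divide by the number of survivors at the index age. The numbers below are invented for illustration, not taken from the study:

```python
# Minimal Sullivan-method sketch for quality adjusted life expectancy.
# person_years[i] plays the role of the life-table L_x for age band i,
# hrqol[i] is the mean EQ-5D-based score for that band under one value
# set, and survivors_at_start is l_x at the index age. All values invented.

def qale(person_years, hrqol, survivors_at_start):
    weighted = sum(L * q for L, q in zip(person_years, hrqol))
    return weighted / survivors_at_start

L_x = [480_000, 450_000, 300_000]   # person-years in three broad age bands
q_x = [0.95, 0.85, 0.70]            # mean HRQoL per band (one value set)

le = qale(L_x, [1.0, 1.0, 1.0], 100_000)  # unweighted = ordinary life expectancy
qa = qale(L_x, q_x, 100_000)              # quality adjusted life expectancy
assert qa < le   # HRQoL weights <= 1 can only lower the expectancy
```

Swapping in a different country's value set simply replaces `q_x`, which is exactly the sensitivity the study quantifies: the life table is unchanged, yet the resulting QALE can shift by several years.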
A First Look at the Upcoming SISO Space Reference FOM
NASA Technical Reports Server (NTRS)
Mueller, Bjorn; Crues, Edwin Z.; Dexter, Dan; Garro, Alfredo; Skuratovskiy, Anton; Vankov, Alexander
2016-01-01
Spaceflight is difficult, dangerous and expensive; human spaceflight even more so. In order to mitigate some of the danger and expense, professionals in the space domain have relied, and continue to rely, on computer simulation. Simulation is used at every level including concept, design, analysis, construction, testing, training and ultimately flight. As space systems have grown more complex, new simulation technologies have been developed, adopted and applied. Distributed simulation is one of those technologies. Distributed simulation provides a base technology for segmenting these complex space systems into smaller, and usually simpler, component systems or subsystems. This segmentation also supports the separation of responsibilities between participating organizations. This segmentation is particularly useful for complex space systems like the International Space Station (ISS), which is composed of many elements from many nations along with visiting vehicles from many nations. This is likely to be the case for future human space exploration activities. Over the years, a number of distributed simulations have been built within the space domain. While many use the High Level Architecture (HLA) to provide the infrastructure for interoperability, HLA without a Federation Object Model (FOM) is insufficient by itself to ensure interoperability. As a result, the Simulation Interoperability Standards Organization (SISO) is developing a Space Reference FOM. The Space Reference FOM Product Development Group is composed of members from several countries. They contribute experiences from projects within NASA, ESA and other organizations and represent government, academia and industry.
The initial version of the Space Reference FOM focuses on time and space and will provide the following: (i) a flexible positioning system using reference frames for arbitrary bodies in space, (ii) naming conventions for well-known reference frames, (iii) definitions of common time scales, (iv) federation agreements for common types of time management with a focus on time-stepped simulation, and (v) support for physical entities, such as space vehicles and astronauts. The Space Reference FOM is expected to make collaboration politically, contractually and technically easier. It is also expected to make collaboration easier to manage and extend.
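The reference-frame chaining in item (i) can be sketched as a tree of named frames, where a position is resolved into the root frame by walking parent links. Real frames involve rotations and time-varying ephemerides, and the frame names and offsets here are invented; only translations are modeled:

```python
# Hedged sketch of reference-frame chaining: each frame names a parent
# and a fixed offset (meters), and a position is resolved into the root
# frame by accumulating offsets up the chain. Frames/values are invented.

frames = {
    "EarthCentered": (None, (0.0, 0.0, 0.0)),                  # root frame
    "StationLocal": ("EarthCentered", (6_778_000.0, 0.0, 0.0)),
    "DockingPort": ("StationLocal", (12.0, 0.0, 3.0)),
}

def to_root(frame, pos):
    """Express a position given in `frame` in the root frame's coordinates."""
    x, y, z = pos
    while frame is not None:
        parent, (dx, dy, dz) = frames[frame]
        x, y, z = x + dx, y + dy, z + dz
        frame = parent
    return (x, y, z)

assert to_root("DockingPort", (1.0, 0.0, 0.0)) == (6_778_013.0, 0.0, 3.0)
```

The value of standardizing such a scheme in a FOM is that every federate resolves positions the same way, regardless of which body or vehicle its local frame is attached to.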
An Information System for European culture collections: the way forward.
Casaregola, Serge; Vasilenko, Alexander; Romano, Paolo; Robert, Vincent; Ozerskaya, Svetlana; Kopf, Anna; Glöckner, Frank O; Smith, David
2016-01-01
Culture collections contain indispensable information about the microorganisms preserved in their repositories, such as taxonomical descriptions, origins, physiological and biochemical characteristics, bibliographic references, etc. However, information currently accessible in databases rarely adheres to common standard protocols. The resultant heterogeneity between culture collections, in terms of both content and format, notably hampers microorganism-based research and development (R&D). The optimized exploitation of these resources thus requires standardized, and simplified, access to the associated information. To this end, and in the interest of supporting R&D in the fields of agriculture, health and biotechnology, a pan-European distributed research infrastructure, MIRRI, including over 40 public culture collections and research institutes from 19 European countries, was established. A prime objective of MIRRI is to unite and provide universal access to the fragmented, and untapped, resources, information and expertise available in European public collections of microorganisms; a key component of which is to develop a dynamic Information System. For the first time, both culture collection curators as well as their users have been consulted and their feedback, concerning the needs and requirements for collection databases and data accessibility, utilised. Users primarily noted that databases were not interoperable, thus rendering a global search of multiple databases impossible. Unreliable or out-of-date and, in particular, non-homogenous, taxonomic information was also considered to be a major obstacle to searching microbial data efficiently. Moreover, complex searches are rarely possible in online databases thus limiting the extent of search queries. 
Curators also consider that overall harmonization-including Standard Operating Procedures, data structure, and software tools-is necessary to facilitate their work and to make high-quality data easily accessible to their users. Clearly, the needs of culture collection curators coincide with those of users on the crucial point of database interoperability. In this regard, and in order to design an appropriate Information System, important aspects on which the culture collection community should focus include: the interoperability of data sets with the ontologies to be used; setting best practice in data management, and the definition of an appropriate data standard.
The Next Stage: Moving from Isolated Digital Collections to Interoperable Digital Libraries.
ERIC Educational Resources Information Center
Besser, Howard
2002-01-01
Presents a conceptual framework for digital library development and discusses how to move from isolated digital collections to interoperable digital libraries. Topics include a history of digital libraries; user-centered architecture; stages of technological development; standards, including metadata; and best practices. (Author/LRW)
Proceedings of the ITS Standards Program Review and Interoperability Workshop
DOT National Transportation Integrated Search
1997-12-17
An ITS Standards Program Review and Interoperability Workshop was held on Dec. 17-18, 1997 in Arlington, Va. It was sponsored by the U.S. DOT, ITS America, George Mason University (GMU) and the University of Michigan. The purpose was to review the US...
Putting the School Interoperability Framework to the Test
ERIC Educational Resources Information Center
Mercurius, Neil; Burton, Glenn; Hopkins, Bill; Larsen, Hans
2004-01-01
The Jurupa Unified School District in Southern California recently partnered with Microsoft, Dell and the Zone Integration Group for the implementation of a School Interoperability Framework (SIF) database repository model throughout the district (Magner 2002). A two-week project--the Integrated District Education Applications System, better known…
75 FR 66752 - Smart Grid Interoperability Standards; Notice of Technical Conference
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-29
... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. RM11-2-000] Smart Grid... adoption of Smart Grid Interoperability Standards (Standards) in their States. On October 6, 2010, the....m. Eastern time in conjunction with the NARUC/FERC Collaborative on Smart Response (Collaborative...
76 FR 4102 - Smart Grid Interoperability Standards; Supplemental Notice of Technical Conference
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-24
... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. RM11-2-000] Smart Grid... Federal Energy Regulatory Commission announced that a Technical Conference on Smart Grid Interoperability... National Institute of Standards and Technology are ready for Commission consideration in a rulemaking...
Cross-Country Differentials in Work Disability Reporting among Older Europeans
ERIC Educational Resources Information Center
Angelini, Viola; Cavapozzi, Danilo; Paccagnella, Omar
2012-01-01
Descriptive evidence shows that there is large cross-country variation in self-reported work disability rates of the elderly in Europe. In this paper we analyse whether these differences are genuine or they just reflect heterogeneity in reporting styles. To shed light on the determinants of work-disability differentials across countries, we…
Zhou, Yuan; Ancker, Jessica S; Upadhye, Mandar; McGeorge, Nicolette M; Guarrera, Theresa K; Hegde, Sudeep; Crane, Peter W; Fairbanks, Rollin J; Bisantz, Ann M; Kaushal, Rainu; Lin, Li
2013-01-01
The effect of health information technology (HIT) on efficiency and workload among clinical and nonclinical staff has been debated, with conflicting evidence about whether electronic health records (EHRs) increase or decrease effort. No study to date, however, has examined the effect of interoperability quantitatively using discrete event simulation techniques. Our objective was to estimate the impact of EHR systems with various levels of interoperability on the day-to-day tasks and operations of ambulatory physician offices. Interviews and observations were used to collect workflow data from 12 adult primary and specialty practices. A discrete event simulation model was constructed to represent patient flows and the clinical and administrative tasks of physicians and staff members. High levels of EHR interoperability were associated with reduced time spent by providers on four tasks: preparing lab reports, requesting lab orders, prescribing medications, and writing referrals. The implementation of an EHR was associated with less time spent by administrators but more time spent by physicians, compared with time spent at paper-based practices. In addition, the presence of EHRs and of interoperability did not significantly affect the time usage of registered nurses or the total visit time and waiting time of patients. This study suggests that the impact of using HIT on clinical and nonclinical staff work efficiency varies; overall, however, it appears to improve time efficiency more for administrators than for physicians and nurses.
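A discrete event simulation of this kind can be illustrated in miniature: patients queue for a single physician, and higher EHR interoperability shortens the documentation portion of each visit. All durations and the arrival pattern below are invented, and a real model would include many more task types and staff roles:

```python
import heapq

# Toy discrete event simulation of a one-physician clinic. Arrival times
# are events in a priority queue; each visit takes exam + documentation
# minutes, and interoperability is modeled as shorter documentation.

def clinic_makespan(arrivals, exam_min, doc_min):
    """Return the minute at which the last patient leaves the clinic."""
    events = list(arrivals)
    heapq.heapify(events)           # process arrivals in time order
    free_at = 0.0                   # when the physician is next free
    while events:
        t = heapq.heappop(events)
        start = max(t, free_at)     # patient may wait for the physician
        free_at = start + exam_min + doc_min
    return free_at

arrivals = [0, 5, 10, 30]
paper = clinic_makespan(arrivals, exam_min=15, doc_min=6)    # paper-based
interop = clinic_makespan(arrivals, exam_min=15, doc_min=2)  # interoperable EHR
assert paper == 84 and interop == 68   # clinic clears 16 minutes earlier
```

Even this toy model shows the study's core mechanism: per-task time savings compound through the queue, so interoperability effects are visible at the clinic level, not just per visit.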
2011-01-01
Background The practice and research of medicine generates considerable quantities of data and model resources (DMRs). Although in principle biomedical resources are re-usable, in practice few can currently be shared. In particular, the clinical communities in physiology and pharmacology research, as well as medical education, (i.e. PPME communities) are facing considerable operational and technical obstacles in sharing data and models. Findings We outline the efforts of the PPME communities to achieve automated semantic interoperability for clinical resource documentation in collaboration with the RICORDO project. Current community practices in resource documentation and knowledge management are overviewed. Furthermore, requirements and improvements sought by the PPME communities to current documentation practices are discussed. The RICORDO plan and effort in creating a representational framework and associated open software toolkit for the automated management of PPME metadata resources is also described. Conclusions RICORDO is providing the PPME community with tools to effect, share and reason over clinical resource annotations. This work is contributing to the semantic interoperability of DMRs through ontology-based annotation by (i) supporting more effective navigation and re-use of clinical DMRs, as well as (ii) sustaining interoperability operations based on the criterion of biological similarity. Operations facilitated by RICORDO will range from automated dataset matching to model merging and managing complex simulation workflows. In effect, RICORDO is contributing to community standards for resource sharing and interoperability. PMID:21878109
Interoperable web applications for sharing data and products of the International DORIS Service
NASA Astrophysics Data System (ADS)
Soudarin, L.; Ferrage, P.
2017-12-01
The International DORIS Service (IDS) was created in 2003 under the umbrella of the International Association of Geodesy (IAG) to foster scientific research related to the French satellite tracking system DORIS and to deliver scientific products, mostly related to the International Earth rotation and Reference systems Service (IERS). Since its start, the organization has continuously evolved, leading to additional and improved operational products from an expanded set of DORIS Analysis Centers. In addition, IDS has developed services for sharing data and products with the users. Metadata and interoperable web applications are proposed to explore, visualize and download the key products such as the position time series of the geodetic points materialized at the ground tracking stations. The Global Geodetic Observing System (GGOS) encourages the IAG Services to develop such interoperable facilities on their website. The objective for GGOS is to set up an interoperable portal through which the data and products produced by the IAG Services can be served to the user community. We present the web applications proposed by IDS to visualize time series of geodetic observables or to get information about the tracking ground stations and the tracked satellites. We discuss the future plans for IDS to meet the recommendations of GGOS. The presentation also addresses the needs for the IAG Services to adopt common metadata thesaurus to describe data and products, and interoperability standards to share them.
NASA Astrophysics Data System (ADS)
Mueller, Wolfgang; Mueller, Henning; Marchand-Maillet, Stephane; Pun, Thierry; Squire, David M.; Pecenovic, Zoran; Giess, Christoph; de Vries, Arjen P.
2000-10-01
While in the area of relational databases interoperability is ensured by common communication protocols (e.g. ODBC/JDBC using SQL), Content Based Image Retrieval Systems (CBIRSs) and other multimedia retrieval systems lack both a common query language and a common communication protocol. Besides its obvious short-term convenience, interoperability of systems is crucial for the exchange and analysis of user data. In this paper, we present and describe an extensible XML-based query markup language, called MRML (Multimedia Retrieval Markup Language). MRML is primarily designed to ensure interoperability between different content-based multimedia retrieval systems. Further, MRML allows researchers to preserve their freedom in extending their systems as needed. MRML encapsulates multimedia queries in a way that enables multimedia (MM) query languages, MM content descriptions, MM query engines, and MM user interfaces to grow independently from each other, reaching a maximum of interoperability while ensuring a maximum of freedom for the developer. To benefit from this, only a few simple design principles have to be respected when extending MRML for one's private needs. The design of extensions within the MRML framework is described in detail in the paper. MRML has been implemented and tested for the CBIRS Viper, using the user interface Snake Charmer. Both are part of the GNU project and can be downloaded at our site.
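As a rough illustration of how an XML query markup language can encapsulate a query-by-example independently of any engine's internals, the sketch below builds and parses a simplified message; the element and attribute names are illustrative only and do not reproduce the actual MRML schema.

```python
import xml.etree.ElementTree as ET

# Build a simplified query-by-example message in the spirit of MRML:
# the client marks an example image as relevant and asks for 10 results.
root = ET.Element("mrml", {"session-id": "demo-1"})
query = ET.SubElement(root, "query-step", {"result-size": "10"})
relevance = ET.SubElement(query, "relevance-element-list")
ET.SubElement(relevance, "relevance-element",
              {"image-location": "http://example.org/img/42.jpg",
               "user-relevance": "1"})  # positive example

message = ET.tostring(root, encoding="unicode")

# A receiving retrieval engine parses the message without knowing
# anything about the sender's user interface or feature extraction.
parsed = ET.fromstring(message)
examples = [(e.get("image-location"), float(e.get("user-relevance")))
            for e in parsed.iter("relevance-element")]
```

The point mirrors the abstract's design goal: the query document, not a shared API, is the interoperability contract between interface and engine.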
Essink-Bot, Marie-Louise; Pereira, Joaquin; Packer, Claire; Schwarzinger, Michael; Burstrom, Kristina
2002-01-01
OBJECTIVE: To investigate the sources of cross-national variation in disability-adjusted life-years (DALYs) in the European Disability Weights Project. METHODS: Disability weights for 15 disease stages were derived empirically in five countries by means of a standardized procedure and the cross-national differences in visual analogue scale (VAS) scores were analysed. For each country the burden of dementia in women, used as an illustrative example, was estimated in DALYs. An analysis was performed of the relative effects of cross-national variations in demography, epidemiology and disability weights on DALY estimates. FINDINGS: Cross-national comparison of VAS scores showed almost identical ranking orders. After standardization for population size and age structure of the populations, the DALY rates per 100000 women ranged from 1050 in France to 1404 in the Netherlands. Because of uncertainties in the epidemiological data, the extent to which these differences reflected true variation between countries was difficult to estimate. The use of European rather than country-specific disability weights did not lead to a significant change in the burden of disease estimates for dementia. CONCLUSIONS: Sound epidemiological data are the first requirement for burden of disease estimation and relevant between-countries comparisons. DALY estimates for dementia were relatively insensitive to differences in disability weights between European countries. PMID:12219156
Reunification: keeping families together in crisis.
Blake, Nancy; Stevenson, Kathleen
2009-08-01
A review of the literature shows that no family reunification plan has worked consistently during disasters. During Hurricane Katrina, there were children who were sent to a shelter in a different state than their parents. When children are involved, the issues become even more difficult, because some children who are preverbal cannot give their own names or their parents' names. Tracking systems have been developed but are not interoperable. No central repository has been developed. There are also issues related to transporting patients, psychosocial issues, as well as safety issues that are different when children are unaccompanied by an adult. Two national meetings were held with experts from all over the country who have expertise in the care of children. Six focus groups were identified: patient movement/transportation; technology/tracking; clinical issues; nonmedical issues; communication/regulatory issues; and pediatric psychosocial support. The second meeting was a consensus conference. Recommendations from each subgroup were presented and voted on. All recommendations were accepted. The issue of reunification of families in disaster is still a problem. Further work needs to be done on tracking systems that are interoperable before another large disaster strikes, on pediatric psychological issues after a disaster, on transporting patients, and on the care of the pediatric patient who is not accompanied by an adult. Once a system has been developed, it needs to be tested by large-scale drills that practice moving children across state lines and from one area to another.
Clinical knowledge governance: the international perspective.
Garde, Sebastian
2013-01-01
As a basis for semantic interoperability, a Clinical Knowledge Resource for a clinical concept should ideally be defined formally, and defined once, in a way that all clinical professions and all countries can agree on. Clinical Knowledge Governance is required to create high-quality, reusable Clinical Knowledge Resources and achieve this aim. Traditionally, this is a time-consuming and cumbersome process, relying heavily on face-to-face meetings and on being able to get sufficient input from clinicians. In a national or even international space, however, the processes involved in creating Clinical Knowledge Resources need to be streamlined. For this, a Web 2.0 tool that supports online collaboration of clinicians during the creation and publishing of Clinical Knowledge Resources has been developed. This tool is named the Clinical Knowledge Manager (CKM) and supports the development, review and publication of Clinical Knowledge Resources. Post-publication activities, such as adding terminology bindings, translating the Clinical Knowledge Resource into another language and republishing it, are also supported. The acceptance of Clinical Knowledge Resources depends on their quality and on being able to determine that quality; for example, it is important to know that a broad number of reviewers from various clinical disciplines have been involved in the development of the Clinical Knowledge Resource. We are still far from realizing the vision of a global repository of a great number of reusable, high-quality Clinical Knowledge Resources that can provide the basis for broad semantic interoperability between systems. However, progress towards this aim is being made around the world.
An ontological system for interoperable spatial generalisation in biodiversity monitoring
NASA Astrophysics Data System (ADS)
Nieland, Simon; Moran, Niklas; Kleinschmit, Birgit; Förster, Michael
2015-11-01
Semantic heterogeneity remains a barrier to data comparability and standardisation of results in different fields of spatial research. Because of its thematic complexity, differing acquisition methods and national nomenclatures, interoperability of biodiversity monitoring information is especially difficult. Since data collection methods and interpretation manuals vary broadly, there is a need for automated, objective methodologies for the generation of comparable data-sets. Ontology-based applications offer vast opportunities in data management and standardisation. This study examines two data-sets of protected heathlands in Germany and Belgium which are based on remote sensing image classification and semantically formalised in an OWL2 ontology. The proposed methodology uses semantic relations of the two data-sets, which are (semi-)automatically derived from remote sensing imagery, to generate objective and comparable information about the status of protected areas by utilising kernel-based spatial reclassification. This automated method offers a generalisation approach able to generate delineations of Special Areas of Conservation (SAC) of the European biodiversity Natura 2000 network. Furthermore, it is able to transfer generalisation rules between areas surveyed with varying acquisition methods in different countries by taking into account automated inference of the underlying semantics. The generalisation results were compared with the manual delineation of terrestrial monitoring. For the different habitats in the two sites an accuracy of above 70% was detected. However, it has to be highlighted that the delineation of the ground-truth data carries a high degree of uncertainty, which is discussed in this study.
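The kernel-based spatial reclassification idea can be sketched as a simple majority filter over a class grid; the window logic and habitat labels below are a toy stand-in and assume nothing about the study's actual implementation.

```python
def majority_reclassify(grid, radius=1):
    """Reclassify each cell to the most frequent class inside a
    (2*radius+1)^2 kernel window -- a toy stand-in for kernel-based
    spatial generalisation of a classified habitat map."""
    rows, cols = len(grid), len(grid[0])
    out = [row[:] for row in grid]
    for r in range(rows):
        for c in range(cols):
            counts = {}
            for dr in range(-radius, radius + 1):
                for dc in range(-radius, radius + 1):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < rows and 0 <= cc < cols:
                        counts[grid[rr][cc]] = counts.get(grid[rr][cc], 0) + 1
            out[r][c] = max(counts, key=counts.get)  # majority class wins
    return out

# A single-pixel "heath" misclassification inside a grassland patch
# is generalised away by its neighbourhood.
grid = [["grass"] * 3, ["grass", "heath", "grass"], ["grass"] * 3]
smoothed = majority_reclassify(grid)
```

Running the same filter over classifications produced by different national acquisition methods is one way such generalisation rules can be made transferable between data-sets.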
2010-12-01
This involves zeroing and recreating the interoperability arrays and other variables used in the simulation. Since the constants do not change from run... ...Using this algorithm, the process of encrypting/decrypting data requires very little computation, and the generation of the random pads can be...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-05
... FEDERAL COMMUNICATIONS COMMISSION 47 CFR Part 27 [WT Docket Nos. 12-69, 12-332; FCC 13-136] Promoting Interoperability in the 700 MHz Commercial Spectrum; Requests for Waiver and Extension of Lower 700 MHz Band Interim Construction Benchmark Deadlines AGENCY: Federal Communications Commission...
Generic Educational Knowledge Representation for Adaptive and Cognitive Systems
ERIC Educational Resources Information Center
Caravantes, Arturo; Galan, Ramon
2011-01-01
The interoperability of educational systems, encouraged by the development of specifications, standards and tools related to the Semantic Web is limited to the exchange of information in domain and student models. High system interoperability requires that a common framework be defined that represents the functional essence of educational systems.…
Global Interoperability of Broadband Networks (GIBN): Project Overview
NASA Technical Reports Server (NTRS)
DePaula, Ramon P.
1998-01-01
Various issues associated with the Global Interoperability of Broadband Networks (GIBN) are presented in viewgraph form. Specific topics include GIBN principles, objectives and goals, and background. GIBN/NASA status, the Transpacific High Definition Video experiment, GIBN experiment selection criteria, satellite industry involvement, and current experiments associated with GIBN are also discussed.
Exploring Interoperability as a Multidimensional Challenge for Effective Emergency Response
ERIC Educational Resources Information Center
Santisteban, Hiram
2010-01-01
Purpose. The purpose of this research was to further an understanding of how the federal government is addressing the challenges of interoperability for emergency response or crisis management (FEMA, 2009) by informing the development of standards through the review of current congressional law, commissions, studies, executive orders, and…
Interoperability Is the Foundation for Successful Internet Telephony.
ERIC Educational Resources Information Center
Fromm, Larry
1997-01-01
More than 40 leading computer and telephony companies have united to lead the charge toward open standards and universal interoperability for Internet telephony products. The voice of IP Forum (VoIP) is working to define technical guidelines for two-party, real-time communications over IP networks, including provisions for compatibility with…
An Access Control and Trust Management Framework for Loosely-Coupled Multidomain Environments
ERIC Educational Resources Information Center
Zhang, Yue
2010-01-01
Multidomain environments where multiple organizations interoperate with each other are becoming a reality as can be seen in emerging Internet-based enterprise applications. Access control to ensure secure interoperation in such an environment is a crucial challenge. A multidomain environment can be categorized as "tightly-coupled" and…
OpenICE medical device interoperability platform overview and requirement analysis.
Arney, David; Plourde, Jeffrey; Goldman, Julian M
2018-02-23
We give an overview of OpenICE, an open source implementation of the ASTM standard F2761 for the Integrated Clinical Environment (ICE) that leverages medical device interoperability, together with an analysis of the clinical and non-functional requirements and community process that inspired its design.
47 CFR 90.525 - Administration of interoperability channels.
Code of Federal Regulations, 2010 CFR
2010-10-01
... RADIO SERVICES PRIVATE LAND MOBILE RADIO SERVICES Regulations Governing the Licensing and Use of... meeting the requirements of § 90.523 may operate mobile or portable units on the Interoperability channels... Commission provided it holds a part 90 license. All persons operating mobile or portable units under this...
2006-09-30
coastal phenomena. OBJECTIVES SURA is creating a SCOOP “Grid” that extends the interoperability enabled by the World Wide Web. The coastal ... community faces special challenges with respect to achieving a level of interoperability that can leverage emerging Grid technologies. With that in mind
Waveform Diversity and Design for Interoperating Radar Systems
2013-01-01
Università di Pisa, Dipartimento di Ingegneria dell'Informazione (Elettronica, Informatica, Telecomunicazioni), Via Girolamo Caruso 16, 56122 Pisa, Italy. Report: Waveform Diversity and Design for Interoperating Radar Systems.
RuleML-Based Learning Object Interoperability on the Semantic Web
ERIC Educational Resources Information Center
Biletskiy, Yevgen; Boley, Harold; Ranganathan, Girish R.
2008-01-01
Purpose: The present paper aims to describe an approach for building the Semantic Web rules for interoperation between heterogeneous learning objects, namely course outlines from different universities, and one of the rule uses: identifying (in)compatibilities between course descriptions. Design/methodology/approach: As proof of concept, a rule…
75 FR 81605 - Smart Grid Interoperability Standards; Notice of Technical Conference
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-28
... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. RM11-2-000] Smart Grid Interoperability Standards; Notice of Technical Conference December 21, 2010. Take notice that the Federal Energy... National Institute of Standards and Technology and included in this proceeding are ready for Commission...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-30
...] Emission Mask Requirements for Digital Technologies on 800 MHz NPSPAC Channels; Analog FM Capability on Mutual Aid and Interoperability Channels AGENCY: Federal Communications Commission. ACTION: Proposed rule... interoperability channels. These proposals could help safeguard public safety licensees in the NPSPAC band from...
ERIC Educational Resources Information Center
Swahn, Monica H.
2012-01-01
The current special issue brings together intriguing and important cross-country comparisons of issues pertinent to early adolescence that can inform the design and implementation of broader and relevant public health prevention strategies. The findings illustrate the importance of cross-country analyses for better understanding a range of…
Measurement of Job Motivation in TEDS-M: Testing for Invariance across Countries and Cultures
ERIC Educational Resources Information Center
Laschke, Christin; Blömeke, Sigrid
2016-01-01
The paper presents the challenges of cross-country and cross-cultural research on the motivation to become a mathematics teacher based on data from the "Teacher Education and Development Study in Mathematics" ("TEDS-M"). Referring to studies from cross-cultural psychology, measurement invariance (MI) of constructs representing…
Huijts, Tim; Kraaykamp, Gerbert
2012-01-01
In this study, we examined origin, destination, and community effects on first- and second-generation immigrants' health in Europe. We used information from the European Social Surveys (2002–2008) on 19,210 immigrants from 123 countries of origin, living in 31 European countries. Cross-classified multilevel regression analyses reveal that political suppression in the origin country and living in countries with large numbers of immigrant peers have a detrimental influence on immigrants' health. Originating from predominantly Islamic countries and good average health among natives in the destination country appear to be beneficial. Additionally, the results point toward health selection mechanisms into migration.
BCube: Building a Geoscience Brokering Framework
NASA Astrophysics Data System (ADS)
Jodha Khalsa, Siri; Nativi, Stefano; Duerr, Ruth; Pearlman, Jay
2014-05-01
BCube is addressing the need for effective and efficient multi-disciplinary collaboration and interoperability through the advancement of brokering technologies. As a prototype "building block" for NSF's EarthCube cyberinfrastructure initiative, BCube is demonstrating how a broker can serve as an intermediary between information systems that implement well-defined interfaces, thereby providing a bridge between communities that employ different specifications. Building on the GEOSS Discovery and Access Broker (DAB), BCube will develop new modules and services including:
• Expanded semantic brokering capabilities
• Business model support for workflows
• Automated metadata generation
• Automated linking to services discovered via web crawling
• Credential passing for seamless access to data
• Ranking of search results from brokered catalogs
Because facilitating cross-discipline research involves cultural as well as technical challenges, BCube is also addressing the sociological and educational components of infrastructure development. We are working, initially, with four geoscience disciplines: hydrology, oceans, polar and weather, with an emphasis on connecting existing domain infrastructure elements to facilitate cross-domain communications.
A framework for secure and decentralized sharing of medical imaging data via blockchain consensus.
Patel, Vishal
2018-04-01
The electronic sharing of medical imaging data is an important element of modern healthcare systems, but current infrastructure for cross-site image transfer depends on trust in third-party intermediaries. In this work, we examine the blockchain concept, which enables parties to establish consensus without relying on a central authority. We develop a framework for cross-domain image sharing that uses a blockchain as a distributed data store to establish a ledger of radiological studies and patient-defined access permissions. The blockchain framework is shown to eliminate third-party access to protected health information, satisfy many criteria of an interoperable health system, and readily generalize to domains beyond medical imaging. Relative drawbacks of the framework include the complexity of the privacy and security models and an unclear regulatory environment. Ultimately, the large-scale feasibility of such an approach remains to be demonstrated and will depend on a number of factors, which we discuss in detail.
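The core ledger idea, each block committing to its predecessor's hash so that tampering with any earlier record is detectable, can be sketched in a few lines; the study identifier and site names are hypothetical, and a real deployment would add consensus, signatures and distribution, which this sketch omits.

```python
import hashlib
import json

def make_block(prev_hash, payload):
    """Append-only ledger entry: the hash commits to both the payload
    and the previous block's hash."""
    body = json.dumps({"prev": prev_hash, "payload": payload}, sort_keys=True)
    return {"prev": prev_hash, "payload": payload,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

def verify(chain):
    """Recompute every hash and check each block points at its parent."""
    for i, block in enumerate(chain):
        body = json.dumps({"prev": block["prev"], "payload": block["payload"]},
                          sort_keys=True)
        if hashlib.sha256(body.encode()).hexdigest() != block["hash"]:
            return False
        if i and block["prev"] != chain[i - 1]["hash"]:
            return False
    return True

# A patient grants, then revokes, a site's access to a study reference.
chain = [make_block("0" * 64, {"study": "1.2.840.99", "grant": "site-A"})]
chain.append(make_block(chain[-1]["hash"],
                        {"study": "1.2.840.99", "revoke": "site-A"}))
ok = verify(chain)

chain[0]["payload"]["grant"] = "site-B"   # tampering with history...
tampered = verify(chain)                  # ...breaks verification
```

Note that only study references and permissions go on the ledger; the imaging data itself stays off-chain, consistent with the framework's goal of keeping protected health information away from third parties.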
Ferraretti, Anna Pia; Pennings, Guido; Gianaroli, Luca; Natali, Francesca; Magli, M Cristina
2010-02-01
Cross-border reproductive care, also called reproductive tourism, refers to the travelling of citizens from their country of residence to another country in order to receive fertility treatment through assisted reproductive technology. Several reasons account for cross-border reproductive care: (i) a certain kind of treatment is forbidden by law in the couple's own country or is inaccessible to the couple because of their demographic or social characteristics; (ii) foreign centres report higher success rates compared with those of the centres in the country of residence; (iii) a specific treatment may be locally unavailable because of a lack of expertise or because the treatment is considered experimental or insufficiently safe; and (iv) limited access to the treatment in the couple's home country because of long waiting lists, excessive distance from a centre or high costs. Although cross-border reproductive care can be viewed as a safety valve, the phenomenon is often associated with a high risk of health dangers, frustration and disparities. Solutions to these problematic effects need to be considered in the light of the fact that cross-border reproductive care is a growing phenomenon.
Coalition readiness management system preliminary interoperability experiment (CReaMS PIE)
NASA Astrophysics Data System (ADS)
Clark, Peter; Ryan, Peter; Zalcman, Lucien; Robbie, Andrew
2003-09-01
The United States Navy (USN) has initiated the Coalition Readiness Management System (CReaMS) Initiative to enhance coalition warfighting readiness through advancing development of a team interoperability training and combined mission rehearsal capability. It integrates evolving cognitive team learning principles and processes with advanced technology innovations to produce an effective and efficient team learning environment. The JOint Air Navy Networking Environment (JOANNE) forms the Australian component of CReaMS. The ultimate goal is to link Australian Defence simulation systems with the USN Battle Force Tactical Training (BFTT) system to demonstrate and achieve coalition level warfare training in a synthetic battlespace. This paper discusses the initial Preliminary Interoperability Experiment (PIE) involving USN and Australian Defence establishments.
A Shovel-Ready Solution to Fill the Nursing Data Gap in the Interdisciplinary Clinical Picture.
Keenan, Gail M; Lopez, Karen Dunn; Sousa, Vanessa E C; Stifter, Janet; Macieira, Tamara G R; Boyd, Andrew D; Yao, Yingwei; Herdman, T Heather; Moorhead, Sue; McDaniel, Anna; Wilkie, Diana J
2018-01-01
To critically evaluate the 2014 American Academy of Nursing (AAN) call-to-action plan for generating interoperable nursing data. Healthcare literature. AAN's plan will not generate the nursing data needed to participate in big data science initiatives in the short term because Logical Observation Identifiers Names and Codes and Systematized Nomenclature of Medicine - Clinical Terms are not yet ripe for generating interoperable data. Well-tested, viable alternatives exist. The authors present recommendations for revisions to AAN's plan and an evidence-based alternative for generating interoperable nursing data in the near term. These revisions can ultimately lead to the proposed terminology goals of AAN's plan in the long term.
NASA Astrophysics Data System (ADS)
Agrawal, Arun; Koff, David; Bak, Peter; Bender, Duane; Castelli, Jane
2015-03-01
The deployment of regional and national Electronic Health Record solutions has been a focus of many countries throughout the past decade. A major challenge for these deployments has been support for ubiquitous image viewing. More specifically, these deployments require an imaging solution that can work over the Internet; leverage any point-of-service device: desktop, tablet, phone; and access imaging data from any source seamlessly. Whereas standards exist to enable ubiquitous image viewing, few if any solutions exist that leverage these standards and meet the challenge. Rather, most of the currently available web-based DI viewing solutions are either proprietary or require special plugins. We developed a true zero-footprint browser-based DI viewing solution based on the Web Access to DICOM Objects (WADO) and Cross-enterprise Document Sharing for Imaging (XDS-I.b) standards to a) demonstrate that a truly ubiquitous image viewer can be deployed; b) identify the gaps in the current standards and the design challenges in developing such a solution. The objective was to develop a viewer that works on all modern browsers on both desktop and mobile devices. The implementation allows basic viewing functionalities of scroll, zoom, pan and (limited) window leveling. The major gap identified in the current DICOM WADO standard is the lack of support for any kind of 3D reconstruction or MPR views. Other design challenges explored include considerations related to optimizing the solution for response time and a low memory footprint.
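A browser-based viewer of this kind retrieves rendered images through WADO-URI requests (DICOM PS3.18); the sketch below assembles such a URL. The endpoint is hypothetical, and only a few of the standard's optional parameters are shown.

```python
from urllib.parse import urlencode

def wado_uri(base, study_uid, series_uid, object_uid,
             content_type="image/jpeg", **extra):
    """Build a WADO-URI retrieval URL. Required parameters per the
    standard: requestType, studyUID, seriesUID, objectUID. Extra
    keywords map to optional parameters such as rows or columns."""
    params = {"requestType": "WADO",
              "studyUID": study_uid,
              "seriesUID": series_uid,
              "objectUID": object_uid,
              "contentType": content_type}
    params.update(extra)
    return base + "?" + urlencode(params)

# Hypothetical PACS endpoint and example UIDs.
url = wado_uri("https://pacs.example.org/wado",
               "1.2.3", "1.2.3.4", "1.2.3.4.5", rows=512)
```

Because the server returns an ordinary JPEG over HTTP, any browser `<img>` tag can display the result, which is what makes a zero-footprint (no plugin) viewer possible.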
A Cross-National Study of Secondary Science Classroom Environments in Australia and Indonesia
ERIC Educational Resources Information Center
Fraser, Barry J.; Aldridge, Jill M.; Adolphe, F. S. Gerard
2010-01-01
This article reports a cross-national study of classroom environments in Australia and Indonesia. A modified version of the What Is Happening In this Class? (WIHIC) questionnaire was used simultaneously in these two countries to: 1) cross validate the modified WIHIC; 2) investigate differences between countries and sexes in perceptions of…
49 CFR 232.603 - Design, interoperability, and configuration management requirements.
Code of Federal Regulations, 2010 CFR
2010-10-01
...: 1999; Revised 2002, 2007); (3) AAR S-4220, “ECP Cable-Based Brake DC Power Supply—Performance...; Revised: 2004); (7) AAR S-4260, “ECP Brake and Wire Distributed Power Interoperability Test Procedures...) Approval. A freight train or freight car equipped with an ECP brake system and equipment covered by the AAR...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-24
... Docket 07-100; FCC 11-6] Implementing a Nationwide, Broadband, Interoperable Public Safety Network in the... framework for the nationwide public safety broadband network. This document considers and proposes... broadband networks operating in the 700 MHz band. This document addresses public safety broadband network...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-07
... FEDERAL COMMUNICATIONS COMMISSION 47 CFR Chapter I [PS Docket No. 06-229; WT Docket 06-150; WP Docket 07-100; FCC 11-113] Implementing a Nationwide, Broadband, Interoperable Public Safety Network in the 700 MHz Band AGENCY: Federal Communications Commission. ACTION: Final rule. SUMMARY: In this...
Interoperability Gap Challenges for Learning Object Repositories & Learning Management Systems
ERIC Educational Resources Information Center
Mason, Robert T.
2011-01-01
An interoperability gap exists between Learning Management Systems (LMSs) and Learning Object Repositories (LORs). Learning Objects (LOs) and the associated Learning Object Metadata (LOM) that is stored within LORs adhere to a variety of LOM standards. A common LOM standard found in LORs is the Sharable Content Object Reference Model (SCORM)…
Promoting Interoperability: The Case for Discipline-Specific PSAPS
2014-12-01
Naval Postgraduate School, Monterey, California. Thesis (approved for public release; distribution is unlimited): Promoting Interoperability: The Case for Discipline-Specific PSAPs, by Thomas Michael Walsh, December 2014. Thesis advisor: Fathali Moghaddam. Excerpt: "...incidents for two reasons: first, numerous steel and concrete floors that affected signal penetration; and second, so many different companies were..."
ERIC Educational Resources Information Center
Aburto, Rafael
2014-01-01
This qualitative study examined efforts by the military organizations and federal agencies to improve information sharing, interoperability, and systems integration in all business practices. More specifically, a survey instrument with six open-ended and eight demographic questions was used to measure the perceived progress, issues, challenges of…
Look who's talking. A guide to interoperability groups and resources.
2011-06-01
There are huge challenges in getting medical devices to communicate with other devices and to information systems. Fortunately, a number of groups have emerged to help hospitals cope. Here's a description of the most prominent ones, including useful web links for each. We also discuss the latest and most pertinent interoperability standards.
USDA-ARS?s Scientific Manuscript database
Environmental modeling framework (EMF) design goals are multi-dimensional and often include many aspects of general software framework development. Many functional capabilities offered by current EMFs are closely related to interoperability and reuse aspects. For example, an EMF needs to support dev...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-14
... Broadband Networks AGENCY: Federal Communications Commission. ACTION: Final rule; waiver. SUMMARY: In this... Homeland Security Bureau (Bureau) approved an initial set of technical requirements for public safety... file an interoperability showing a renewed opportunity to do so and to proceed with network deployment...
2010-06-11
maintenance funds when the purpose of the training is to enhance interoperability, familiarization, and safety training. While technically this type of...funding may not be used for SFA, significant gray area exists when determining what training is interoperability, familiarization, and safety and what
NASA Astrophysics Data System (ADS)
Kutsch, W. L.; Zhao, Z.; Hardisty, A.; Hellström, M.; Chin, Y.; Magagna, B.; Asmi, A.; Papale, D.; Pfeil, B.; Atkinson, M.
2017-12-01
Environmental Research Infrastructures (ENVRIs) are expected to become important pillars not only for supporting their own scientific communities, but also a) for inter-disciplinary research and b) for the European Earth Observation Program Copernicus as a contribution to the Global Earth Observation System of Systems (GEOSS) or global thematic data networks. As such, it is very important that data-related activities of the ENVRIs will be well integrated. This requires common policies, models and e-infrastructure to optimise technological implementation, define workflows, and ensure coordination, harmonisation, integration and interoperability of data, applications and other services. The key is interoperating common metadata systems (utilising a richer metadata model as the 'switchboard' for interoperation with formal syntax and declared semantics). The metadata characterises data, services, users and ICT resources (including sensors and detectors). The European Cluster Project ENVRIplus has developed a reference model (ENVRI RM) for common data infrastructure architecture to promote interoperability among ENVRIs. The presentation will provide an overview of recent progress and give examples for the integration of ENVRI data in global integration networks.
On the feasibility of interoperable schemes in hand biometrics.
Morales, Aythami; González, Ester; Ferrer, Miguel A
2012-01-01
Personal recognition through hand-based biometrics has attracted the interest of many researchers in the last twenty years. A significant number of proposals based on different procedures and acquisition devices have been published in the literature. However, comparisons between devices and their interoperability have not been thoroughly studied. This paper tries to fill this gap by proposing procedures to improve the interoperability among different hand biometric schemes. The experiments were conducted on a database made up of 8,320 hand images acquired from six different hand biometric schemes, including a flat scanner, webcams at different wavelengths, high quality cameras, and contactless devices. Acquisitions on both sides of the hand were included. Our experiment includes four feature extraction methods which determine the best performance among the different scenarios for two of the most popular hand biometrics: hand shape and palm print. We propose smoothing techniques at the image and feature levels to reduce interdevice variability. Results suggest that comparative hand shape offers better performance in terms of interoperability than palm prints, but palm prints can be more effective when using similar sensors.
Relevance of eHealth standards for big data interoperability in radiology and beyond.
Marcheschi, Paolo
2017-06-01
The aim of this paper is to report on the implementation of radiology and related information technology standards to feed big data repositories, and so to create a solid substrate on which analysis software can operate. Digital Imaging and Communications in Medicine (DICOM) and Health Level 7 (HL7) are the major standards for radiology and medical information technology. They define formats and protocols to transmit medical images, signals, and patient data inside and outside hospital facilities. These standards can be implemented, but big data expectations are stimulating a new approach that simplifies data collection and interoperability and seeks to reduce the time to full implementation inside health organizations. Virtual Medical Record, DICOM Structured Reporting and HL7 Fast Healthcare Interoperability Resources (FHIR) are changing the way medical data are shared among organizations, and they will be the keys to big data interoperability. Until we find simple and comprehensive methods to store and disseminate detailed information on a patient's health, we will not be able to get optimal results from the analysis of those data.
The e-MapScholar project—an example of interoperability in GIScience education
NASA Astrophysics Data System (ADS)
Purves, R. S.; Medyckyj-Scott, D. J.; Mackaness, W. A.
2005-03-01
The proliferation of the use of digital spatial data in learning and teaching provides a set of opportunities and challenges for the development of e-learning materials suitable for use by a broad spectrum of disciplines in Higher Education. Effective e-learning materials must both provide engaging materials with which the learner can interact and be relevant to the learners' disciplinary and background knowledge. Interoperability aims to allow sharing of data and materials through the use of common agreements and specifications. Shared learning materials can take advantage of interoperable components to provide customisable components, and must consider issues in sharing data across institutional borders. The e-MapScholar project delivers teaching materials related to spatial data, which are customisable with respect to both context and location. Issues in the provision of such interoperable materials are discussed, including suitable levels of granularity of materials, the provision of tools to facilitate customisation and mechanisms to deliver multiple data sets and the metadata issues related to such materials. The examples shown make extensive use of the OpenGIS consortium specifications in the delivery of spatial data.
Weininger, Sandy; Jaffe, Michael B; Goldman, Julian M
2017-01-01
Medical device and health information technology systems are increasingly interdependent with users demanding increased interoperability. Related safety standards must be developed taking into account these systems' perspective. In this article, we describe the current development of medical device standards and the need for these standards to address medical device informatics. Medical device information should be gathered from a broad range of clinical scenarios to lay the foundation for safe medical device interoperability. Five clinical examples show how medical device informatics principles, if applied in the development of medical device standards, could help facilitate the development of safe interoperable medical device systems. These examples illustrate the clinical implications of the failure to capture important signals and device attributes. We provide recommendations relating to the coordination between historically separate standards development groups, some of which focus on safety and effectiveness and others focus on health informatics. We identify the need for a shared understanding among stakeholders and describe organizational structures to promote cooperation such that device-to-device interactions and related safety information are considered during standards development.
Application-Level Interoperability Across Grids and Clouds
NASA Astrophysics Data System (ADS)
Jha, Shantenu; Luckow, Andre; Merzky, Andre; Erdely, Miklos; Sehgal, Saurabh
Application-level interoperability is defined as the ability of an application to utilize multiple distributed heterogeneous resources. Such interoperability is becoming increasingly important with growing volumes of data, multiple sources of data, and multiple resource types. The primary aim of this chapter is to understand the different ways in which application-level interoperability can be provided across distributed infrastructure. We achieve this by (i) using the canonical wordcount application, based on an enhanced version of MapReduce that scales out across clusters, clouds, and HPC resources, (ii) establishing how SAGA enables the execution of the wordcount application using MapReduce and other programming models, such as Sphere, concurrently, and (iii) demonstrating the scale-out of ensemble-based biomolecular simulations across multiple resources. We show user-level control of the relative placement of compute and data, and also provide simple performance measures and analysis of SAGA-MapReduce when using multiple, different, heterogeneous infrastructures concurrently for the same problem instance. Finally, we discuss Azure and some of the system-level abstractions that it provides, and show how it is used to support ensemble-based biomolecular simulations.
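The canonical wordcount example lends itself to a compact illustration of the MapReduce style the chapter builds on. The following single-process Python sketch is only illustrative: SAGA-MapReduce distributes the same map/shuffle/reduce phases across clusters, clouds, and HPC resources, and none of the function names below come from the SAGA API.

```python
from collections import defaultdict

def map_phase(document):
    # Emit a (word, 1) pair for every word in one input chunk.
    return [(word, 1) for word in document.split()]

def shuffle(pairs):
    # Group intermediate pairs by key, as the framework would between phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Sum the counts emitted for each word.
    return {word: sum(counts) for word, counts in groups.items()}

def wordcount(documents):
    pairs = [pair for doc in documents for pair in map_phase(doc)]
    return reduce_phase(shuffle(pairs))

counts = wordcount(["to be or not to be", "be quick"])
# counts["be"] == 3
```

In a distributed setting, each map and reduce invocation would run on a different resource; only the shuffle step requires coordination.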
Extravehicular activity space suit interoperability.
Skoog, A I; McBarron, J W 2nd; Severin, G I
1995-10-01
The European Space Agency (ESA) and the Russian Space Agency (RKA) are jointly developing a new space suit system for improved extravehicular activity (EVA) capabilities in support of the MIR Space Station Programme, the EVA Suit 2000. Recent national policy agreements between the U.S. and Russia on planned cooperation in manned spaceflight also include joint extravehicular activity (EVA). With an increased number of space suit systems and a higher operational frequency towards the end of this century, improved interoperability for both routine and emergency operations is of paramount importance. It is thus timely to report the current status of ongoing work on international EVA interoperability conducted by the Committee on EVA Protocols and Operations of the International Academy of Astronautics, initiated in 1991. This paper summarises the current EVA interoperability issues to be harmonised and presents quantified vehicle interface requirements for the current U.S. Shuttle EMU, the Russian MIR Orlan DMA, and the new European/Russian EVA Suit 2000 extravehicular systems. Major critical/incompatible interfaces for suit/mother-craft combinations are discussed, and recommendations for standardisation are given.
Standard-compliant real-time transmission of ECGs: harmonization of ISO/IEEE 11073-PHD and SCP-ECG.
Trigo, Jesús D; Chiarugi, Franco; Alesanco, Alvaro; Martínez-Espronceda, Miguel; Chronaki, Catherine E; Escayola, Javier; Martínez, Ignacio; García, José
2009-01-01
Ambient assisted living and integrated care in an aging society are based on the vision of a lifelong Electronic Health Record, calling for interoperability between healthcare information systems and medical devices. For medical devices, this aim can be achieved by the consistent implementation of harmonized international interoperability standards. The ISO/IEEE 11073 (x73) family of standards is a reference standard for medical device interoperability. Its Personal Health Device (PHD) version covers several devices, but an ECG device specialization is not yet available. On the other hand, the SCP-ECG standard for short-term diagnostic ECGs (EN 1064) has recently been approved as the international standard ISO/IEEE 11073-91064:2009. In this paper, the relationships between a proposed x73-PHD model for an ECG device and the fields of the SCP-ECG standard are investigated. A proof-of-concept implementation of the proposed x73-PHD ECG model is also presented, identifying open issues to be addressed by standards development for the wider adoption of the x73-PHD interoperability standards.
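The kind of mapping the paper investigates, from attributes of an x73-PHD device model to sections of an SCP-ECG record, can be sketched as a simple lookup table. All identifiers below are hypothetical placeholders, not the nomenclature codes or section numbers defined by the actual standards:

```python
# Illustrative mapping between invented x73-PHD attribute names and
# invented SCP-ECG sections; the real standards define their own codes.
X73_TO_SCP = {
    "MDC_ECG_LEAD_I":       ("Section 6", "rhythm data"),
    "MDC_ECG_HEART_RATE":   ("Section 7", "global measurements"),
    "patient-demographics": ("Section 1", "patient/acquisition metadata"),
}

def scp_target(x73_attribute):
    """Return the SCP-ECG section that would carry a given x73-PHD attribute."""
    section, description = X73_TO_SCP[x73_attribute]
    return f"{section} ({description})"
```

A real harmonization effort would also have to reconcile units, encodings, and mandatory/optional status on each side, which a flat table like this cannot capture.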
EUnetHTA information management system: development and lessons learned.
Chalon, Patrice X; Kraemer, Peter
2014-11-01
The aim of this study was to describe the techniques used in achieving consensus on common standards to be implemented in the EUnetHTA Information Management System (IMS), and to describe how interoperability between tools was explored. Three face-to-face meetings were organized to identify and agree on common standards for the development of online tools. Two tools were created to demonstrate the added value of implementing interoperability standards at local levels. Developers of tools outside EUnetHTA were identified and contacted. Four common standards were agreed on by consensus, and consequently all EUnetHTA tools have been modified or designed accordingly. RDF Site Summary (RSS) has demonstrated good potential to support rapid dissemination of HTA information. Contacts outside EUnetHTA resulted in direct collaboration (HTA glossary, HTAi Vortal), evaluation of options for interoperability between tools (CRD HTA database), or a formal framework to prepare cooperation on concrete projects (INAHTA projects database). Although entitled a project on IT infrastructure, the work program was also about people: when having to agree on complex topics, fostering a cohesive group dynamic and hosting face-to-face meetings adds value and enhances understanding between partners. The adoption of widespread standards enhanced the homogeneity of the EUnetHTA tools and should thus contribute to their wider use and, therefore, to the general objective of EUnetHTA. The initiatives on interoperability of systems need to be developed further to support a general interoperable information system that could benefit the whole HTA community.
Tapuria, Archana; Kalra, Dipak; Kobayashi, Shinji
2013-12-01
The objective is to introduce the 'clinical archetype', a formal and agreed way of representing clinical information to ensure interoperability across and within Electronic Health Records (EHRs). The paper also presents the challenges of building quality-labelled clinical archetypes and of achieving semantic interoperability between EHRs. Twenty years of international research, various European healthcare informatics projects and the pioneering work of the openEHR Foundation have led to the following results. The requirements for EHR information architectures have been consolidated within ISO 18308 and adopted within the ISO 13606 EHR interoperability standard. However, a generic EHR architecture cannot ensure that the clinical meaning of information from heterogeneous sources can be reliably interpreted by receiving systems and services. Therefore, clinical models called 'clinical archetypes' are required to formalize the representation of clinical information within the EHR. Part 2 of ISO 13606 defines how archetypes should be formally represented. The current challenge is to grow clinical communities to build a library of clinical archetypes, to identify how evidence of best practice and multi-professional clinical consensus should best be combined to define archetypes at the optimal level of granularity and specificity, and to quality-label them for wide adoption. Standardizing clinical terms within EHRs using a clinical terminology such as the Systematized Nomenclature of Medicine Clinical Terms is also a challenge. Clinical archetypes will play an important role in achieving semantic interoperability within EHRs. Attempts are being made to explore the design and adoption challenges for clinical archetypes.
A Cross-Cultural Test of the Work-Family Interface in 48 Countries
ERIC Educational Resources Information Center
Jeffrey Hill, E.; Yang, Chongming; Hawkins, Alan J.; Ferris, Maria
2004-01-01
This study tests a cross-cultural model of the work-family interface. Using multigroup structural equation modeling with IBM survey responses from 48 countries (N= 25,380), results show that the same work-family interface model that fits the data globally also fits the data in a four-group model composed of culturally related groups of countries,…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stewart, John; Halbgewachs, Ron; Chavez, Adrian
The manner in which control systems are designed and operated in the energy sector is undergoing some of the most significant changes in its history due to the evolution of technology and the increasing number of interconnections to other systems. With these changes, however, come two significant challenges that the energy sector must face: (1) cyber security is more important than ever before, and (2) cyber security is more complicated than ever before. A key requirement in helping utilities and vendors alike to meet these challenges is interoperability. While interoperability has been present in much of the discussion relating to technology utilized within the energy sector, and especially the Smart Grid, it has been absent in the context of cyber security. The Lemnos project addresses these challenges by focusing on the interoperability of devices utilized within utility control systems which support critical cyber security functions. In theory, interoperability is possible with many of the cyber security solutions available to utilities today. The reality is that the effort required to achieve cyber security interoperability is often a barrier for utilities. For example, consider IPSec, a widely used Internet protocol suite for defining Virtual Private Networks, or 'tunnels', to communicate securely through untrusted public and private networks. The IPSec protocol suite has a significant number of configuration options and encryption parameters to choose from, which must be agreed upon and adopted by both parties establishing the tunnel. The exercise of getting software or devices from different vendors to interoperate is labor intensive and requires a significant amount of security expertise from the end user. Scale this effort to a significant number of devices operating over a large geographical area and the challenge becomes so overwhelming that it often leads utilities to pursue solutions from a single vendor. These single-vendor solutions may inadvertently lock utilities into proprietary and closed systems.
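The negotiation problem described for IPSec, where both endpoints must agree on every configuration and encryption parameter before a tunnel can come up, can be sketched as a set-intersection check. The parameter names and values below are illustrative only and are not drawn from the Lemnos interoperability profile:

```python
# Each peer advertises the proposal options it supports; a tunnel can only
# be negotiated if the two sets intersect on every parameter.
peer_a = {
    "encryption": {"aes128", "aes256"},
    "integrity":  {"sha256", "sha1"},
    "dh_group":   {"modp2048"},
}
peer_b = {
    "encryption": {"aes256", "3des"},
    "integrity":  {"sha256"},
    "dh_group":   {"modp2048", "modp1024"},
}

def negotiate(a, b):
    """Return one agreed value per parameter, or None if any set is disjoint."""
    agreed = {}
    for param in a:
        common = a[param] & b[param]
        if not common:
            return None  # negotiation fails: no shared option for this parameter
        agreed[param] = sorted(common)[0]  # deterministic pick for the sketch
    return agreed

result = negotiate(peer_a, peer_b)
```

The point of an interoperability profile is precisely to shrink these option sets in advance, so that products from different vendors intersect by construction rather than by expert tuning.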
Requirements Development for Interoperability Simulation Capability for Law Enforcement
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holter, Gregory M.
2004-05-19
The National Counterdrug Center (NCC) was initially authorized by Congress in FY 1999 appropriations to create a simulation-based counterdrug interoperability training capability. As the lead organization for Research and Analysis in support of the NCC, the Pacific Northwest National Laboratory (PNNL) was responsible for developing the requirements for this interoperability simulation capability. These requirements were structured to address the hardware and software components of the system, as well as its deployment and use. The original set of requirements was developed through a user-based survey of requirements for the simulation capability, coupled with an analysis of similar development efforts. The user-based approach ensured that existing concerns with respect to interoperability within the law enforcement community would be addressed. Law enforcement agencies within the designated pilot area of Cochise County, Arizona, were surveyed using interviews and ride-alongs during actual operations. The results of this survey were then accumulated, organized, and validated with the agencies to ensure their accuracy. These requirements were then supplemented by adapting operational requirements from existing systems to ensure system reliability and operability. The NCC adopted a development approach providing incremental capability through the fielding of a phased series of progressively more capable versions of the system. This allowed feedback from system users to be incorporated into subsequent revisions of the system requirements, and also allowed the addition of new elements as needed to adapt the system to broader geographic and geopolitical areas, including areas along the southwest and northwest U.S. borders.
This paper addresses the processes used to develop and refine requirements for the NCC interoperability simulation capability, as well as the response of the law enforcement community to the use of the NCC system. The paper also addresses the applicability of such an interoperability simulation capability to a broader set of law enforcement, border protection, site/facility security, and first-responder needs.
Molinari, Francesco; Pirronti, Tommaso; Sverzellati, Nicola; Diciotti, Stefano; Amato, Michele; Paolantonio, Guglielmo; Gentile, Luigia; Parapatt, George K; D'Argento, Francesco; Kuhnigk, Jan-Martin
2013-01-01
We aimed to compare the intra- and interoperator variability of lobar volumetry and emphysema scores obtained by semi-automated and manual segmentation techniques in lung emphysema patients. In two sessions held three months apart, two operators performed lobar volumetry of unenhanced chest computed tomography examinations of 47 consecutive patients with chronic obstructive pulmonary disease and lung emphysema. Both operators used the manual and semi-automated segmentation techniques. The intra- and interoperator variability of the volumes and emphysema scores obtained by semi-automated segmentation was compared with the variability obtained by manual segmentation of the five pulmonary lobes. The intra- and interoperator variability of the lobar volumes decreased when using semi-automated lobe segmentation (coefficients of repeatability for the first operator: right upper lobe, 147 vs. 96.3; right middle lobe, 137.7 vs. 73.4; right lower lobe, 89.2 vs. 42.4; left upper lobe, 262.2 vs. 54.8; and left lower lobe, 260.5 vs. 56.5; coefficients of repeatability for the second operator: right upper lobe, 61.4 vs. 48.1; right middle lobe, 56 vs. 46.4; right lower lobe, 26.9 vs. 16.7; left upper lobe, 61.4 vs. 27; and left lower lobe, 63.6 vs. 27.5; coefficients of reproducibility in the interoperator analysis: right upper lobe, 191.3 vs. 102.9; right middle lobe, 219.8 vs. 126.5; right lower lobe, 122.6 vs. 90.1; left upper lobe, 166.9 vs. 68.7; and left lower lobe, 168.7 vs. 71.6). The coefficients of repeatability and reproducibility of emphysema scores also decreased when using semi-automated segmentation and had ranges that varied depending on the target lobe and selected threshold of emphysema. Semi-automated segmentation reduces the intra- and interoperator variability of lobar volumetry and provides a more objective tool than manual technique for quantifying lung volumes and severity of emphysema.
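The coefficients of repeatability reported above are conventionally derived from the paired differences between the two measurement sessions (the Bland-Altman approach: 1.96 times the standard deviation of the differences). A minimal sketch assuming that convention, with invented lobar volumes in millilitres:

```python
import math

def coefficient_of_repeatability(session1, session2):
    """Bland-Altman coefficient of repeatability: 1.96 x the sample standard
    deviation of the paired differences between two measurement sessions."""
    diffs = [a - b for a, b in zip(session1, session2)]
    mean = sum(diffs) / len(diffs)
    sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (len(diffs) - 1))
    return 1.96 * sd

# Invented repeated measurements of one lobe's volume (mL) by one operator:
cr = coefficient_of_repeatability([500, 510, 490, 505], [498, 512, 488, 509])
```

Smaller coefficients mean the two sessions (or two operators, for reproducibility) agree more closely, which is why the lower values for semi-automated segmentation indicate reduced variability.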
NASA Astrophysics Data System (ADS)
Lucido, J. M.; Booth, N.
2014-12-01
Interoperable sharing of groundwater data across international borders is essential for the proper management of global water resources. However, the storage and management of groundwater data are often distributed across many agencies or organizations, and these data may be represented in disparate proprietary formats, posing a significant challenge for integration. For this reason, standard data models are required to achieve interoperability across geographical and political boundaries. The GroundWater Markup Language 1.0 (GWML1) was developed in 2010 as an extension of the Geography Markup Language (GML) in order to support groundwater data exchange within Spatial Data Infrastructures (SDI). In 2013, development of GWML2 was initiated under the sponsorship of the Open Geospatial Consortium (OGC) for intended adoption by the international community as the authoritative standard for the transfer of groundwater feature data, including data about water wells, aquifers, and related entities. GWML2 harmonizes GWML1 and the EU's INSPIRE models related to geology and hydrogeology. Additionally, an interoperability experiment was initiated to test the model for commercial, technical, scientific, and policy use cases. The scientific use case focuses on the delivery of data required as input to computational flow modeling software used to determine the flow of groundwater within a particular aquifer system. It involves the delivery of properties associated with hydrogeologic units, observations related to those units, and information about the related aquifers. To test this use case, web services are being implemented using GWML2 and WaterML2, the authoritative standard for water time-series observations, in order to serve USGS water well and hydrogeologic data via standard OGC protocols. Furthermore, integration of these data into a computational groundwater flow model will be tested.
This submission will present the GWML2 information model and results of an interoperability experiment with a particular emphasis on the scientific use case.
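A GML-based exchange format such as GWML2 encodes each groundwater feature as namespaced XML. The fragment below sketches that general shape only; the namespace URI and element names are invented for illustration and do not match the real GWML2 schema:

```python
import xml.etree.ElementTree as ET

# Invented namespace sketching a GML-style groundwater feature document;
# the actual GWML2 schema defines different namespaces and elements.
GWML = "http://example.org/gwml-sketch"

well = ET.Element(f"{{{GWML}}}WaterWell")
ET.SubElement(well, f"{{{GWML}}}wellDepth", units="m").text = "120.5"
ET.SubElement(well, f"{{{GWML}}}aquifer").text = "Example Sandstone Aquifer"

xml_bytes = ET.tostring(well)
```

Because every element is qualified by a namespace, a client can mix groundwater features with observations encoded in another standard (such as WaterML2) in one document without name collisions.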
Bullinger, Monika; Quitmann, Julia; Silva, Neuza; Rohenkohl, Anja; Chaplin, John E; DeBusk, Kendra; Mimoun, Emmanuelle; Feigerlova, Eva; Herdman, Michael; Sanz, Dolores; Wollmann, Hartmut; Pleil, Andreas; Power, Michael
2014-01-01
Testing the cross-cultural equivalence of patient-reported outcomes requires sufficiently large samples per country, which is difficult to achieve in rare endocrine paediatric conditions. We describe a novel approach to cross-cultural testing of the Quality of Life in Short Stature Youth (QoLISSY) questionnaire in five countries by sequentially taking one country out (TOCO) of the total sample and iteratively comparing the resulting psychometric performance. Development of the QoLISSY proceeded from focus group discussions through pilot testing to field testing in 268 short-statured patients and their parents. To explore cross-cultural equivalence, the iterative TOCO technique was used to examine and compare the validity, reliability, and convergence of patient and parent responses on the QoLISSY in the field-test dataset, and to predict QoLISSY scores from clinical, socio-demographic and psychosocial variables. Validity and reliability indicators were satisfactory for each sample after iteratively omitting one country. Comparisons with the total sample revealed cross-cultural equivalence in internal consistency and construct validity for patients and parents, high inter-rater agreement, and a substantial proportion of QoLISSY variance explained by predictors. The TOCO technique is a powerful method for overcoming the problems of country-specific testing of patient-reported outcome instruments. It provides empirical support for QoLISSY's cross-cultural equivalence and is recommended for future research.
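The TOCO procedure itself is simple to express: omit each country in turn and recompute the psychometric indicators on the pooled remainder. The sketch below uses Cronbach's alpha as the indicator, with invented toy scores; the QoLISSY analyses also compared validity and convergence, which are omitted here:

```python
def cronbach_alpha(item_scores):
    """item_scores: list of per-item score lists (same respondents per item)."""
    k = len(item_scores)
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    totals = [sum(items) for items in zip(*item_scores)]
    return k / (k - 1) * (1 - sum(var(xs) for xs in item_scores) / var(totals))

def take_one_country_out(data):
    """data: {country: per-item score lists}. Recompute alpha with each
    country omitted in turn, mirroring the TOCO procedure."""
    results = {}
    for left_out in data:
        merged = None
        for country, items in data.items():
            if country == left_out:
                continue
            # Pool respondents item by item across the remaining countries.
            merged = items if merged is None else [a + b for a, b in zip(merged, items)]
        results[left_out] = cronbach_alpha(merged)
    return results

# Invented scores: 3 countries, a 2-item scale, 3 respondents each.
toy = {
    "A": [[1, 2, 3], [1, 2, 3]],
    "B": [[2, 3, 4], [2, 3, 4]],
    "C": [[3, 4, 5], [3, 4, 5]],
}
alphas = take_one_country_out(toy)
```

If the alphas stay close to the full-sample value no matter which country is dropped, no single country is driving the instrument's reliability.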
NASA Astrophysics Data System (ADS)
Glaves, Helen; Schaap, Dick
2016-04-01
The increasingly ocean basin level approach to marine research has led to a corresponding rise in the demand for large quantities of high quality interoperable data. This requirement for easily discoverable and readily available marine data is currently being addressed by initiatives such as SeaDataNet in Europe, Rolling Deck to Repository (R2R) in the USA and the Australian Ocean Data Network (AODN) with each having implemented an e-infrastructure to facilitate the discovery and re-use of standardised multidisciplinary marine datasets available from a network of distributed repositories, data centres etc. within their own region. However, these regional data systems have been developed in response to the specific requirements of their users and in line with the priorities of the funding agency. They have also been created independently of the marine data infrastructures in other regions often using different standards, data formats, technologies etc. that make integration of marine data from these regional systems for the purposes of basin level research difficult. Marine research at the ocean basin level requires a common global framework for marine data management which is based on existing regional marine data systems but provides an integrated solution for delivering interoperable marine data to the user. The Ocean Data Interoperability Platform (ODIP/ODIP II) project brings together those responsible for the management of the selected marine data systems and other relevant technical experts with the objective of developing interoperability across the regional e-infrastructures. The commonalities and incompatibilities between the individual data infrastructures are identified and then used as the foundation for the specification of prototype interoperability solutions which demonstrate the feasibility of sharing marine data across the regional systems and also with relevant larger global data services such as GEO, COPERNICUS, IODE, POGO etc. 
The potential impact for the individual regional data infrastructures of implementing these prototype interoperability solutions is also being evaluated to determine both the technical and financial implications of their integration within existing systems. These impact assessments form part of the strategy to encourage wider adoption of the ODIP solutions and approach beyond the current scope of the project which is focussed on regional marine data systems in Europe, Australia, the USA and, more recently, Canada.
Kupper, Nina; Pedersen, Susanne S; Höfer, Stefan; Saner, Hugo; Oldridge, Neil; Denollet, Johan
2013-06-20
Type D (distressed) personality, the conjoint effect of negative affectivity (NA) and social inhibition (SI), predicts adverse cardiovascular outcomes and is assessed with the 14-item Type D Scale (DS14). However, potential cross-cultural differences in Type D have not yet been examined in a direct comparison of countries. Our aim was to examine the cross-cultural validity of the Type D construct and its relation to cardiovascular risk factors, cardiac symptom severity, and depression/anxiety. In 22 countries, 6222 patients with ischemic heart disease (angina, 33%; myocardial infarction, 37%; or heart failure, 30%) completed the DS14 as part of the International HeartQoL Project. Type D personality was assessed reliably across countries (Cronbach's α for NA > .80 and for SI > .74; except in Russia, which was excluded from further analysis). Cross-cultural measurement equivalence was established for Type D personality at all measurement levels, as the factor-item configuration, factor loadings, and error structure did not differ across countries (fit: CFI = .91; NFI = .88; RMSEA = .018), or across gender and diagnostic subgroups. Type D personality was more prevalent in Southern (37%) and Eastern (35%) European countries compared to Northern (24%) and Western European and English-speaking (both 27%) countries (p < .001). Type D was not confounded by cardiac symptom severity, but was associated with a higher prevalence of hypertension, smoking, sedentary lifestyle, and depression. Cross-cultural measurement equivalence was demonstrated for the Type D scale in 21 countries. There is a pan-cultural relationship between Type D personality and some cardiovascular risk factors, supporting the role of Type D personality across countries and cardiac conditions. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
Gomes de Matos, Elena; Kraus, Ludwig; Hannemann, Tessa-Virginia; Soellner, Renate; Piontek, Daniela
2017-11-01
This study estimates cross-country variation in socioeconomic disparities in adolescent alcohol use and identifies country-level characteristics associated with these disparities. The association between socioeconomic status (family wealth and parental education) and alcohol use (lifetime use and episodic heavy drinking) of 15- to 16-year-olds from 32 European countries was investigated. Country-level characteristics were national income, income inequality and per capita alcohol consumption. Multilevel modelling was applied. Across countries, lifetime use was lower in wealthy than in less wealthy families (odds ratio [OR] (girls) = 0.95, OR (boys) = 0.94). The risk of episodic heavy drinking, in contrast, was higher for children from wealthier families (OR (girls) = 1.04, OR (boys) = 1.08) and lower when parents were highly educated (ORs = 0.95-0.98). Socioeconomic disparities varied substantially between countries. National wealth and income inequality were associated with cross-country variation of disparities in lifetime use in a few comparisons, such that among girls, the (negative) effect of family wealth was greatest in countries with unequally distributed income (OR = 0.86). Among boys, the (negative) effect of family wealth was greatest in low-income countries (OR = 1.00), and the (positive) effect of mothers' education was greatest in countries with high income inequality (OR = 1.11). Socioeconomic disparities in adolescent alcohol use vary across European countries. Broad country-level indicators can explain this variation only to a limited extent, but results point towards slightly greater socioeconomic disparities in drinking in countries of low national income and countries with a high income inequality. [Gomes de Matos E, Kraus L, Hannemann T-V, Soellner R, Piontek D. Cross-cultural variation in the association between family's socioeconomic status and adolescent alcohol use. © 2017 Australasian Professional Society on Alcohol and other Drugs.
Building gold standard corpora for medical natural language processing tasks.
Deleger, Louise; Li, Qi; Lingren, Todd; Kaiser, Megan; Molnar, Katalin; Stoutenborough, Laura; Kouril, Michal; Marsolo, Keith; Solti, Imre
2012-01-01
We present the construction of three annotated corpora to serve as gold standards for medical natural language processing (NLP) tasks. Clinical notes from the medical record, clinical trial announcements, and FDA drug labels are annotated. We report high inter-annotator agreement (overall F-measures between 0.8467 and 0.9176) for the annotation of Personal Health Information (PHI) elements for a de-identification task and of medications, diseases/disorders, and signs/symptoms for an information extraction (IE) task. The annotated corpora of clinical trials and FDA labels will be publicly released, and, to facilitate translational NLP tasks that require cross-corpora interoperability (e.g. clinical trial eligibility screening), their annotation schemas are aligned with a large-scale, NIH-funded clinical text annotation project.
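The agreement figures above are pairwise F-measures. A minimal sketch of how such a score can be computed over exact-match annotation spans (the span offsets below are invented for illustration):

```python
# Treat one annotator's spans as gold and the other's as predictions,
# count exact matches, and combine precision and recall into F1.
def f_measure(gold, pred):
    gold, pred = set(gold), set(pred)
    tp = len(gold & pred)          # spans both annotators marked
    if tp == 0:
        return 0.0
    precision = tp / len(pred)
    recall = tp / len(gold)
    return 2 * precision * recall / (precision + recall)

# Hypothetical (start, end) character offsets from two annotators.
a1 = {(0, 5), (10, 14), (20, 27)}
a2 = {(0, 5), (10, 14)}
print(f_measure(a1, a2))  # 0.8
```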
A General Quality Classification System for eIDs and e-Signatures
NASA Astrophysics Data System (ADS)
Ølnes, Jon; Buene, Leif; Andresen, Anette; Grindheim, Håvard; Apitzsch, Jörg; Rossi, Adriano
The PEPPOL (Pan-European Public Procurement On-Line) project is a large-scale pilot under the CIP programme of the EU, exploring electronic public procurement in a unified European market. Interoperability of electronic signatures across borders is identified as a major obstacle to cross-border procurement. PEPPOL suggests specifying signature acceptance criteria in the form of signature policies that must be transparent and non-discriminatory. Validation solutions must then not only assess signature correctness but also signature policy adherence. This paper addresses perhaps the most important topic of a signature policy: the quality of eIDs and e-signatures. Discrete levels are suggested for eID quality, the assurance level for this quality, and the cryptographic quality of signatures.
Brokerage services for Earth Science data: the EuroGEOSS legacy (Invited)
NASA Astrophysics Data System (ADS)
Nativi, S.; Craglia, M.; Pearlman, J.
2013-12-01
Global sustainability research requires an integrated multidisciplinary effort underpinned by a collaborative environment for discovering and accessing heterogeneous data across disciplines. Traditionally, interoperability has been achieved by implementing federations of systems. The federating approach entails the adoption of a set of common technologies and standards. This presentation argues that for complex (and uncontrolled) environments (such as global, multidisciplinary, and voluntary-based infrastructures) federated solutions must be complemented and enhanced by a brokering approach, making available a set of brokerage services. In fact, brokerage services allow a cyber-infrastructure to lower entry barriers (for both data producers and users) and to better address the different domain specificities. The brokering interoperability approach was successfully demonstrated by the EuroGEOSS project, funded by the European Commission in the FP7 framework (see http://www.eurogeoss.eu). The EuroGEOSS Brokering framework provided the EuroGEOSS Capacity with multidisciplinary interoperability functionalities. This platform was developed applying several of the principles/requirements that characterize the System of Systems (SoS) approach and the Internet of Services (IoS) philosophy. The framework consists of three main brokers (middleware components implementing intermediation and harmonization services): a basic Discovery Broker, an advanced Semantic Discovery Broker, and an Access Broker. They are empowered by a suite of tools developed by the ESSI-lab of the CNR-IIA: GI-cat, GI-sem, and GI-axe. The EuroGEOSS brokering framework was considered and successfully adopted by cross-disciplinary initiatives (notably GEOSS: the Global Earth Observation System of Systems). The brokerage services have since been advanced and extended; the new brokering framework is called the GEO DAB (Discovery and Access Broker).
New brokerage services have been developed in the framework of other European Commission funded projects (e.g. GeoViQua). More recently, the NSF EarthCube initiative decided to fund a project dealing with brokerage services. In the framework of the GEO AIP-6 (Architecture Implementation Pilot -phase 6), the presented brokerage platform has been used by the Water Working Group to carry out improved data access for parameterization and model development.
Leverage and Delegation in Developing an Information Model for Geology
NASA Astrophysics Data System (ADS)
Cox, S. J.
2007-12-01
GeoSciML is an information model and XML encoding developed by a group of primarily geologic survey organizations under the auspices of the IUGS CGI. The scope of the core model broadly corresponds with information traditionally portrayed on a geologic map, viz. interpreted geology, some observations, the map legend and accompanying memoir. The development of GeoSciML has followed the methodology specified for an Application Schema defined by OGC and ISO 19100 series standards. This requires agreement within a community concerning their domain model, its formal representation using UML, documentation as a Feature Type Catalogue, with an XML Schema implementation generated from the model by applying a rule-based transformation. The framework and technology supports a modular governance process. Standard datatypes and GI components (geometry, the feature and coverage metamodels, metadata) are imported from the ISO framework. The observation and sampling model (including boreholes) is imported from OGC. The scale used for most scalar literal values (terms, codes, measures) allows for localization where necessary. Wildcards and abstract base-classes provide explicit extensibility points. Link attributes appear in a regular way in the encodings, allowing reference to external resources using URIs. The encoding is compatible with generic GI data-service interfaces (WFS, WMS, SOS). For maximum interoperability within a community, the interfaces may be specialised through domain-specified constraints (e.g. feature-types, scale and vocabulary bindings, query-models). Formalization using UML and XML allows use of standard validation and processing tools. Use of upper-level elements defined for generic GI application reduces the development effort and governance responsibility, while maximising cross-domain interoperability.
On the other hand, enabling specialization to be delegated in a controlled manner is essential to adoption across a range of subdisciplines and jurisdictions. The GeoSciML design team is responsible only for the part of the model that is unique to geology but for which general agreement can be reached within the domain. This paper is presented on behalf of the Interoperability Working Group of the IUGS Commission for Geoscience Information (CGI) - follow web-link for details of the membership.
A Dynamic Approach to Make CDS/ISIS Databases Interoperable over the Internet Using the OAI Protocol
ERIC Educational Resources Information Center
Jayakanth, F.; Maly, K.; Zubair, M.; Aswath, L.
2006-01-01
Purpose: A dynamic approach to making legacy databases, like CDS/ISIS, interoperable with OAI-compliant digital libraries (DLs). Design/methodology/approach: There are many bibliographic databases that are being maintained using legacy database systems. CDS/ISIS is one such legacy database system. It was designed and developed specifically for…
Joint Command and Control: Integration Not Interoperability
2013-03-01
separate computer and communication equipment. Besides having to engineer interoperability, the Services also must determine the level of...effects. Determines force responsiveness and allocates resources. This thesis argues Joint military operations will never be fully integrated as...processes and systems. Secondly, the limited depth of discussion risks implying (or the reader inferring) the solution is more straightforward than
Increasing Interoperability of E-Learning Content in Moodle within a Franco-Arabo Educative Context
ERIC Educational Resources Information Center
El Harrassi, Souad; Labour, Michel
2010-01-01
This article examines how Moodle, as an open-source Learning Management System, can be made more interoperable. The authors tested two software standards, LAMS and RELOAD, compatible with socio-constructivist norms. The analysis showed that pedagogic activities created with the LAMS-IMS Learning Design Level A format are usable with Moodle but…
NASA Technical Reports Server (NTRS)
Yin, J.; Oyaki, A.; Hwang, C.; Hung, C.
2000-01-01
The purpose of this research and study paper is to provide a summary description and results of rapid development accomplishments at NASA/JPL in the area of advanced distributed computing technology, using a Commercial-Off-The-Shelf (COTS)-based, object-oriented component approach to open, interoperable software development and software reuse.
An Approach to Semantic Interoperability for Improved Capability Exchanges in Federations of Systems
ERIC Educational Resources Information Center
Moschoglou, Georgios
2013-01-01
This study seeks an affirmative answer to the question whether a knowledge-based approach to system of systems interoperation using semantic web standards and technologies can provide the centralized control of the capability for exchanging data and services lacking in a federation of systems. Given the need to collect and share real-time…
Toward interoperable bioscience data
Sansone, Susanna-Assunta; Rocca-Serra, Philippe; Field, Dawn; Maguire, Eamonn; Taylor, Chris; Hofmann, Oliver; Fang, Hong; Neumann, Steffen; Tong, Weida; Amaral-Zettler, Linda; Begley, Kimberly; Booth, Tim; Bougueleret, Lydie; Burns, Gully; Chapman, Brad; Clark, Tim; Coleman, Lee-Ann; Copeland, Jay; Das, Sudeshna; de Daruvar, Antoine; de Matos, Paula; Dix, Ian; Edmunds, Scott; Evelo, Chris T; Forster, Mark J; Gaudet, Pascale; Gilbert, Jack; Goble, Carole; Griffin, Julian L; Jacob, Daniel; Kleinjans, Jos; Harland, Lee; Haug, Kenneth; Hermjakob, Henning; Ho Sui, Shannan J; Laederach, Alain; Liang, Shaoguang; Marshall, Stephen; McGrath, Annette; Merrill, Emily; Reilly, Dorothy; Roux, Magali; Shamu, Caroline E; Shang, Catherine A; Steinbeck, Christoph; Trefethen, Anne; Williams-Jones, Bryn; Wolstencroft, Katherine; Xenarios, Ioannis; Hide, Winston
2012-01-01
To make full use of research data, the bioscience community needs to adopt technologies and reward mechanisms that support interoperability and promote the growth of an open ‘data commoning’ culture. Here we describe the prerequisites for data commoning and present an established and growing ecosystem of solutions using the shared ‘Investigation-Study-Assay’ framework to support that vision. PMID:22281772
ERIC Educational Resources Information Center
Akpabio, Akpabio Enebong Ema
2013-01-01
Despite huge growth in hospital technology systems, there remains a dearth of literature examining health care administrators' perceptions of the efficacy of interoperable EHR systems. A qualitative research methodology was used in this multiple-case study to investigate the application of diffusion of innovations theory and the technology…
Interoperability And Value Added To Earth Observation Data
NASA Astrophysics Data System (ADS)
Gasperi, J.
2012-04-01
Geospatial web services technology has provided a new means for geospatial data interoperability. Open Geospatial Consortium (OGC) services such as Web Map Service (WMS) to request maps on the Internet, Web Feature Service (WFS) to exchange vectors or Catalog Service for the Web (CSW) to search for geospatialized data have been widely adopted in the Geosciences community in general and in the remote sensing community in particular. These services make Earth Observation data available to a wider range of public users than ever before. The mapshup web client offers an innovative and efficient user interface that takes advantage of the power of interoperability. This presentation will demonstrate how mapshup can be effectively used in the context of natural disasters management.
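A WMS map request of the kind described above is just an HTTP GET with standard key-value parameters. A minimal sketch of building a GetMap URL, assuming a hypothetical endpoint and layer name (neither is from the abstract):

```python
from urllib.parse import urlencode

# Assemble an OGC WMS 1.3.0 GetMap request as a key-value-pair URL.
def wms_getmap_url(base, layer, bbox, width=512, height=512):
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": "EPSG:4326",
        # In WMS 1.3.0 the BBOX follows the CRS axis order; for
        # EPSG:4326 that is minlat,minlon,maxlat,maxlon.
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return base + "?" + urlencode(params)

# Hypothetical server and layer, for illustration only.
url = wms_getmap_url("https://example.org/wms", "disasters:floods",
                     (35.0, -10.0, 45.0, 5.0))
```

The same key-value pattern underlies WFS GetFeature and CSW GetRecords requests, which is much of what makes these services easy to interoperate with.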
Security and privacy of EHR systems--ethical, social and legal requirements.
Kluge, Eike-Henner W
2003-01-01
This paper addresses social, ethical and legal concerns about security and privacy that arise in the development of international interoperable health information systems. The paper deals with these concerns under four rubrics: the ethical status of electronic health records, the social and legal embedding of interoperable health information systems, the overall information requirements of healthcare as such, and the role of health information professionals as facilitators. It argues that the concerns that arise can be met if the development of interoperability protocols is guided by the seven basic principles of information ethics that have been enunciated in the IMIA Code of Ethics for Health Information Professionals and that are central to the ethical treatment of electronic health records.
NASA Astrophysics Data System (ADS)
Danobeitia, J.; Oscar, G.; Bartolomé, R.; Sorribas, J.; Del Rio, J.; Cadena, J.; Toma, D. M.; Bghiel, I.; Martinez, E.; Bardaji, R.; Piera, J.; Favali, P.; Beranzoli, L.; Rolin, J. F.; Moreau, B.; Andriani, P.; Lykousis, V.; Hernandez Brito, J.; Ruhl, H.; Gillooly, M.; Terrinha, P.; Radulescu, V.; O'Neill, N.; Best, M.; Marinaro, G.
2016-12-01
European Multidisciplinary Seafloor and water column Observatory DEVelopment (EMSODEV) is a Horizon 2020 EU project whose overall objective is the operationalization of eleven marine observatories and four test sites distributed throughout Europe, from the Arctic to the Atlantic and from the Mediterranean to the Black Sea. The whole infrastructure is managed by the European consortium EMSO-ERIC (European Research Infrastructure Consortium) with the participation of 8 European countries and other partner countries. We are now implementing a Generic Sensor Module (EGIM) within the EMSO ERIC distributed marine research infrastructure. Our involvement is mainly in developing standard-compliant generic software for Sensor Web Enablement (SWE) on the EGIM device. The main goal of this development is to support sensor data acquisition on a new interoperable EGIM system. The EGIM software structure is made up of an acquisition layer located between the data recorded at the EGIM module and the data-management services. Two main interfaces are therefore implemented: the first handles EGIM hardware acquisition, and the second allows data to be pushed to and pulled from the data-management layer (Sensor Web Enablement standard compliant). All software components used are open-source licensed and have been configured to manage different roles on the whole system (52°North SOS Server, Zabbix Monitoring System). The data acquisition module has been implemented with the aim of joining all components for EGIM data acquisition and serving an SOS-standards-compliant interface. The system is complete and awaiting the first laboratory bench test and a shallow-water test connection to the OBSEA node, offshore Vilanova i la Geltrú (Barcelona, Spain).
The EGIM module will record a wide range of ocean parameters in a long-term consistent, accurate and comparable manner from disciplines such as biology, geology, chemistry, physics, engineering, and computer science, from polar to subtropical environments, through the water column down to the deep sea. The measurements recorded along EMSO NODES are critical to respond accurately to the social and scientific challenges such as climate change, changes in marine ecosystems, and marine hazards.
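The acquisition layer described above sits between the EGIM hardware readout and the data-management services, with one interface toward each side. A minimal sketch of that push/pull pattern (class and method names are invented for illustration, not the project's API):

```python
from collections import deque

# One buffer between the hardware readout and the data-management
# services: services can pull buffered observations (SOS-style
# request/response) or register a callback to have new observations
# pushed to them as they arrive.
class AcquisitionLayer:
    def __init__(self):
        self._buffer = deque()
        self._subscribers = []

    def record(self, observation):
        """Called from the hardware-acquisition side."""
        self._buffer.append(observation)
        for callback in self._subscribers:   # push interface
            callback(observation)

    def pull(self):
        """Pull interface: return all buffered observations."""
        return list(self._buffer)

    def subscribe(self, callback):
        """Register a consumer for the push interface."""
        self._subscribers.append(callback)
```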
Physiological responses of low-time private pilots to cross-country flying.
DOT National Transportation Integrated Search
1971-04-01
Various physiological, biochemical, and psychophysiological measurements were made on low-time private pilots who each flew three cross-country flights. The round-trip flights were 320, 520, and 960 NM in length. Heart rate was recorded continuously ...
Direct and indirect effects of unilateral divorce law on marital stability.
Kneip, Thorsten; Bauer, Gerrit; Reinhold, Steffen
2014-12-01
Previous research examining the impact of unilateral divorce law (UDL) on the prevalence of divorce has provided mixed results. Studies based on cross-sectional cross-country/cross-state survey data have received criticism for disregarding unobserved heterogeneity across countries, as have studies using country-level panel data for failing to account for possible mediating mechanisms at the micro level. We seek to overcome both shortcomings by using individual-level event-history data from 11 European countries (SHARELIFE) and controlling for unobserved heterogeneity over countries and cohorts. We find that UDL in total increased the incidence of marital breakdown by about 20 %. This finding, however, neglects potential selection effects into marriage. Accordingly, the estimated effect of unilateral divorce laws becomes much larger when we control for age at marriage, which is used as indicator for match quality. Moreover, we find that UDL particularly affects marital stability in the presence of children.
EuroGEOSS/GENESIS ``e-Habitat'' AIP-3 Use Scenario
NASA Astrophysics Data System (ADS)
Mazzetti, P.; Dubois, G.; Santoro, M.; Peedell, S.; de Longueville, B.; Nativi, S.; Craglia, M.
2010-12-01
Natural ecosystems are in rapid decline. Major habitats are disappearing at a speed never observed before. The current rate of species extinction is several orders of magnitude higher than the background rate from the fossil record. Protected Areas (PAs) and Protected Area Systems are designed to conserve natural and cultural resources, to maintain biodiversity (ecosystems, species, genes) and ecosystem services. The scientific challenge of understanding how environmental and climatological factors impact on ecosystems and habitats requires the use of information from different scientific domains. Thus, multidisciplinary interoperability is a crucial requirement for a framework aiming to support scientists. The Group on Earth Observations (or GEO) is coordinating international efforts to build a Global Earth Observation System of Systems (GEOSS). This emerging public infrastructure is interconnecting a diverse and growing array of instruments and systems for monitoring and forecasting changes in the global environment. This “system of systems” supports multidisciplinary and cross-disciplinary scientific research. The presented GEOSS-based interoperability framework facilitates the discovery and exploitation of datasets and models from heterogeneous scientific domains and Information Technology services (data sources). The GEO Architecture and Data Committee (ADC) launched the Architecture Implementation Pilot (AIP) Initiative to develop and deploy new processes and infrastructure components for the GEOSS Common Infrastructure (GCI) and the broader GEOSS architecture. The current AIP Phase 3 (AIP-3) aims to increase GEOSS capacity to support several strategic Societal Benefit Areas (SBAs) including: Disaster Management, Health/Air Quality, Biodiversity, Energy, Health/Disease and Water. As to Biodiversity, the EC-funded EuroGEOSS (http://www.eurogeoss.eu) and GENESIS (http://www.genesis-fp7.eu) projects have developed a use scenario called “e-Habitat”.
This scenario demonstrates how a GEOSS-based interoperability infrastructure can aid decision makers to assess and possibly forecast the irreplaceability of a given protected area, an essential indicator for assessing the criticality of threats this protected area is exposed to. Based on the previous AIP-Phase2 experience, the EuroGEOSS and GENESIS projects enhanced the successfully experimented interoperability infrastructure with: a) a discovery broker service which underpins semantics enabled queries: the EuroGEOSS/GENESIS Discovery Augmentation Component (DAC); b) environmental modeling components (i.e. OGC WPS instances) implementing algorithms to predict evolution of PAs ecosystems; c) a workflow engine to: i) browse semantic repositories; ii) retrieve concepts of interest; iii) search for resources (i.e. datasets and models) related to such concepts; iv) execute WPS instances. This presentation introduces the enhanced infrastructure developed by the EuroGEOSS/GENESIS AIP-3 Pilot to implement the “e-Habitat” use scenario. The presented infrastructure is accessible through the GEO Portal and is going to be used for demonstrating the “e-Habitat” model at the GEO Ministerial Meeting - Beijing, November 2010.
Semantics Enabled Queries in EuroGEOSS: a Discovery Augmentation Approach
NASA Astrophysics Data System (ADS)
Santoro, M.; Mazzetti, P.; Fugazza, C.; Nativi, S.; Craglia, M.
2010-12-01
One of the main challenges in Earth Science Informatics is to build interoperability frameworks which allow users to discover, evaluate, and use information from different scientific domains. This needs to address multidisciplinary interoperability challenges concerning both technological and scientific aspects. From the technological point of view, it is necessary to provide a set of special interoperability arrangements in order to develop flexible frameworks that allow a variety of loosely-coupled services to interact with each other. From a scientific point of view, it is necessary to document clearly the theoretical and methodological assumptions underpinning applications in different scientific domains, and develop cross-domain ontologies to facilitate interdisciplinary dialogue and understanding. In this presentation we discuss a brokering approach that extends the traditional Service Oriented Architecture (SOA) adopted by most Spatial Data Infrastructures (SDIs) to provide the necessary special interoperability arrangements. In the EC-funded EuroGEOSS (A European approach to GEOSS) project, we distinguish among three possible functional brokering components: discovery, access and semantics brokers. This presentation focuses on the semantics broker, the Discovery Augmentation Component (DAC), which was specifically developed to address the three thematic areas covered by the EuroGEOSS project: biodiversity, forestry and drought. The EuroGEOSS DAC federates both semantic repositories (e.g. SKOS repositories) and ISO-compliant geospatial catalog services. The DAC can be queried using common geospatial constraints (i.e. what, where, when, etc.). Two different augmented discovery styles are supported: a) automatic query expansion; b) user-assisted query expansion. In the first case, the main discovery steps are: i. the query keywords (the what constraint) are “expanded” with related concepts/terms retrieved from the set of federated semantic services.
A default expansion covers the multilinguality relationship; ii. The resulting queries are submitted to the federated catalog services; iii. The DAC performs a “smart” aggregation of the query results and provides them back to the client. In the second case, the main discovery steps are: i. the user browses the federated semantic repositories and selects the concepts/terms of interest; ii. The DAC creates the set of geospatial queries based on the selected concepts/terms and submits them to the federated catalog services; iii. The DAC performs a “smart” aggregation of the query results and provides them back to the client. A Graphical User Interface (GUI) was also developed for testing and interacting with the DAC. The entire brokering framework is deployed in the context of the EuroGEOSS infrastructure and is used in a couple of GEOSS AIP-3 use scenarios: the “e-Habitat Use Scenario” for the Biodiversity and Climate Change topic, and the “Comprehensive Drought Index Use Scenario” for the Water/Drought topic.
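The automatic query-expansion style can be pictured as a keyword-to-related-concepts lookup performed before the catalogue queries are issued. A sketch with a toy in-memory thesaurus standing in for the federated SKOS repositories (the terms below are illustrative, not from EuroGEOSS):

```python
# Toy stand-in for the federated semantic services: each keyword maps
# to related concepts, including multilingual labels (the default
# multilinguality expansion).
THESAURUS = {
    "drought": {"aridity", "water scarcity", "sécheresse"},
    "forest": {"woodland", "forêt"},
}

def expand_query(keywords):
    """Expand the 'what' constraint before querying each catalogue."""
    expanded = set(keywords)
    for kw in keywords:
        expanded |= THESAURUS.get(kw.lower(), set())
    return sorted(expanded)

print(expand_query(["drought"]))
```

Each expanded keyword set would then be turned into one geospatial query per federated catalogue, with the results aggregated before being returned to the client.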
Scherer, Ronny; Jansen, Malte; Nilsen, Trude; Areepattamannil, Shaljan; Marsh, Herbert W.
2016-01-01
Teachers’ self-efficacy is an important motivational construct that is positively related to a variety of outcomes for both the teachers and their students. This study addresses challenges associated with the commonly used ‘Teachers’ Sense of Self-Efficacy (TSES)’ measure across countries and provides a synergism between substantive research on teachers’ self-efficacy and the novel methodological approach of exploratory structural equation modeling (ESEM). These challenges include adequately representing the conceptual overlap between the facets of self-efficacy in a measurement model (cross-loadings) and comparing means and factor structures across countries (measurement invariance). On the basis of the OECD Teaching and Learning International Survey (TALIS) 2013 data set comprising 32 countries (N = 164,687), we investigate the effects of cross-loadings in the TSES measurement model on the results of measurement invariance testing and the estimation of relations to external constructs (i.e., working experience, job satisfaction). To further test the robustness of our results, we replicate the 32-countries analyses for three selected sub-groups of countries (i.e., Nordic, East and South-East Asian, and Anglo-Saxon country clusters). For each of the TALIS 2013 participating countries, we found that the factor structure of the self-efficacy measure is better represented by ESEM than by confirmatory factor analysis (CFA) models that do not allow for cross-loadings. For both ESEM and CFA, only metric invariance could be achieved. Nevertheless, invariance levels beyond metric invariance are better achieved with ESEM within selected country clusters. Moreover, the existence of cross-loadings did not affect the relations between the dimensions of teachers’ self-efficacy and external constructs. Overall, this study shows that a conceptual overlap between the facets of self-efficacy exists and can be well-represented by ESEM. 
We further argue for the cross-cultural generalizability of the corresponding measurement model. PMID:26959236
Kalra, Dipak; Kobayashi, Shinji
2013-01-01
Objectives The objective is to introduce the 'clinical archetype', which is a formal and agreed way of representing clinical information to ensure interoperability across and within Electronic Health Records (EHRs). The paper also aims to present the challenges of building quality-labeled clinical archetypes and of achieving semantic interoperability between EHRs. Methods Twenty years of international research, various European healthcare informatics projects and the pioneering work of the openEHR Foundation have led to the following results. Results The requirements for EHR information architectures have been consolidated within ISO 18308 and adopted within the ISO 13606 EHR interoperability standard. However, a generic EHR architecture cannot ensure that the clinical meaning of information from heterogeneous sources can be reliably interpreted by receiving systems and services. Therefore, clinical models called 'clinical archetypes' are required to formalize the representation of clinical information within the EHR. Part 2 of ISO 13606 defines how archetypes should be formally represented. The current challenge is to grow clinical communities to build a library of clinical archetypes and to identify how evidence of best practice and multi-professional clinical consensus should best be combined to define archetypes at the optimal level of granularity and specificity, and to quality-label them for wide adoption. Standardizing clinical terms within EHRs using a clinical terminology such as the Systematized Nomenclature of Medicine Clinical Terms is also a challenge. Conclusions Clinical archetypes would play an important role in achieving semantic interoperability within EHRs. Attempts are being made to explore the design and adoption challenges for clinical archetypes. PMID:24523993
WMS and WFS Standards Implementation of Weather Data
NASA Astrophysics Data System (ADS)
Armstrong, M.
2005-12-01
CustomWeather is a private weather company that delivers global weather data products. CustomWeather has built a mapping platform according to OGC standards. Currently, both a Web Map Service (WMS) and a Web Feature Service (WFS) are supported by CustomWeather. Supporting open geospatial standards has led to a number of positive changes internally to the processes of CustomWeather, along with those of the clients accessing the data. Quite a number of challenges surfaced during this process, particularly with respect to combining a wide variety of raw modeling and sensor data into a single delivery platform. Open standards have, however, made the delivery of very different data products rather seamless. The discussion will address the issues faced in building an OGC-based mapping platform, along with the limitations encountered. While the availability of these data products through open standards is still very young, there have already been many adopters in the utility and navigation industries. The discussion will take a closer look at the different approaches taken by these two industries as they utilize interoperability standards with existing data. Insight will be given regarding applications already taking advantage of this new technology and how this is affecting decision-making processes. CustomWeather has observed considerable interest in, and potential benefit from, this technology in developing countries. Weather data is a key element in disaster management. Interoperability is literally opening up a world of data and has the potential to quickly enable functionality that would otherwise take considerable time to implement. The discussion will briefly touch on our experience.
Tsuchida, Satoshi; Thome, Kurtis
2017-01-01
Radiometric cross-calibration between the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) and the Terra Moderate Resolution Imaging Spectroradiometer (MODIS) has been partially used to derive the ASTER radiometric calibration coefficient (RCC) curve as a function of date for the visible to near-infrared bands. However, cross-calibration is not sufficiently accurate, since the effects of the differences in the sensors' spectral and spatial responses are not fully mitigated. The present study attempts to evaluate radiometric consistency across the two sensors using an improved cross-calibration algorithm to address the spectral and spatial effects and derive cross-calibration-based RCCs, which increases the ASTER calibration accuracy. Overall, radiances measured with ASTER bands 1 and 2 are on average 3.9% and 3.6% greater than the ones measured on the same scene with their MODIS counterparts, and ASTER band 3N (nadir) is 0.6% smaller than its MODIS counterpart in current radiance/reflectance products. The percentage root mean squared errors (%RMSEs) between the radiances of the two sensors are 3.7, 4.2, and 2.3 for ASTER bands 1, 2, and 3N, respectively, which are slightly greater or smaller than the required ASTER radiometric calibration accuracy (4%). The uncertainty of the cross-calibration is analyzed by elaborating the error budget table to evaluate the International System of Units (SI) traceability of the results. The use of the derived RCCs will allow further reduction of errors in ASTER radiometric calibration and subsequently improve interoperability across sensors for synergistic applications. PMID:28777329
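The %RMSE statistic quoted above compares matched radiances from the two sensors, normalized by the mean of the reference series. A minimal sketch of that computation (the radiance values below are made up for illustration, not from the study):

```python
import math

# Percentage root mean squared error between matched radiance pairs,
# expressed relative to the mean of the reference series.
def percent_rmse(ref, test):
    n = len(ref)
    mse = sum((t - r) ** 2 for r, t in zip(ref, test)) / n
    return 100.0 * math.sqrt(mse) / (sum(ref) / n)

# Hypothetical matched-scene radiances for one band.
aster = [101.2, 98.7, 105.4]
modis = [97.5, 95.1, 101.8]
print(round(percent_rmse(modis, aster), 1))
```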
Interoperability, Data Control and Battlespace Visualization using XML, XSLT and X3D
2003-09-01
26 Rosenthal, Arnon; Seligman, Len; and Costello, Roger, XML, Databases, and Interoperability, Federal Database Colloquium, AFCEA, San Diego…79 Rosenthal, Arnon; Seligman, Len; and Costello, Roger, "XML, Databases, and Interoperability", Federal Database Colloquium, AFCEA, San Diego, 1999…Linda, Mastering XML, Premium Edition, SYBEX, 2001; Wooldridge, Michael, An Introduction to MultiAgent Systems, Wiley, 2002. PAPERS: Abernathy, M…
Message Received How to Bridge the Communication Gap and Save Lives
2004-03-01
…safety during an emergency depend on the ability of first responders to talk via radio, directly, without dispatch and in real time. Many technologies are…Key words: interoperability; Coast Guard; first responders; procedures; interagency communications; policies…communication interoperability for public safety first responders entails far more than finding and emplacing a technology and training the operators. The…
EVA safety: Space suit system interoperability
NASA Technical Reports Server (NTRS)
Skoog, A. I.; McBarron, J. W.; Abramov, L. P.; Zvezda, A. O.
1995-01-01
The results and the recommendations of the International Academy of Astronautics extravehicular activities (IAA EVA) Committee work are presented. The IAA EVA protocols and operations were analyzed for harmonization procedures and for the standardization of safety-critical and operationally important interfaces. The key role of EVA, and how to improve the situation based on the identified EVA space suit system interoperability deficiencies, were considered.
ERIC Educational Resources Information Center
Rocker, JoAnne; Roncaglia, George J.; Heimerl, Lynn N.; Nelson, Michael L.
Interoperability and data exchange are critical for the survival of government information management programs. E-government initiatives are transforming the way the government interacts with the public. More information is to be made available through Web-enabled technologies. Programs such as NASA's Scientific and Technical Information (STI)…
ERIC Educational Resources Information Center
Data Research Associates, Inc., St. Louis, MO.
The topic of open systems as it relates to the needs of libraries to establish interoperability between dissimilar computer systems can be clarified by an understanding of the background and evolution of the issue. The International Standards Organization developed a model to link dissimilar computers, and this model has evolved into consensus…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-02
… this document. FOR FURTHER INFORMATION CONTACT: Brenda Boykin, Wireless Telecommunications Bureau, (202…power levels of up to 1000 kW. The Lower A Block is also adjacent to the unpaired Lower 700 MHz E Block, where licensees (along with Lower 700 MHz D Block licensees) may operate at power levels up to 50…
Saleh, Kutaiba; Stucke, Stephan; Uciteli, Alexandr; Faulbrück-Röhr, Sebastian; Neumann, Juliane; Tahar, Kais; Ammon, Danny; Schmidt, Tobias; Neumuth, Thomas; Besting, Andreas; Portheine, Frank; Herre, Heinrich; Kaeding, André; Specht, Martin
2017-01-01
With the growing strain on medical staff and the complexity of patient care, the risk of medical errors increases. In this work we present the use of Fast Healthcare Interoperability Resources (FHIR) as a communication standard for the integration of an ontology- and agent-based system to identify risks across medical processes in a clinical environment.
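For illustration, a risk flagged by such a system could travel as a FHIR R4 DetectedIssue resource serialized to JSON. The field values below are invented, and the abstract does not specify which FHIR profile or resource the authors actually use:

```python
import json

# Minimal sketch of a FHIR R4 DetectedIssue resource, the kind of payload an
# agent-based risk-identification service might emit. All values illustrative.
detected_issue = {
    "resourceType": "DetectedIssue",
    "status": "preliminary",
    "severity": "high",
    "code": {"text": "Potential medication interaction"},
    "detail": "Flagged by ontology-based analysis of the care process",
    "patient": {"reference": "Patient/example-123"},
}

# FHIR resources are exchanged as JSON (or XML) over a REST interface.
payload = json.dumps(detected_issue, indent=2)
print(payload)
```

In a real deployment this JSON body would be POSTed to a FHIR server endpoint such as `[base]/DetectedIssue` and validated against the resource's structure definition.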
Achieving control and interoperability through unified model-based systems and software engineering
NASA Technical Reports Server (NTRS)
Rasmussen, Robert; Ingham, Michel; Dvorak, Daniel
2005-01-01
Control and interoperation of complex systems is one of the most difficult challenges facing NASA's Exploration Systems Mission Directorate. An integrated but diverse array of vehicles, habitats, and supporting facilities, evolving over the long course of the enterprise, must perform ever more complex tasks while moving steadily away from the sphere of ground support and intervention.
Gray, John
2017-01-01
Machine-to-machine (M2M) communication is a key enabling technology for industrial internet of things (IIoT)-empowered industrial networks, where machines communicate with one another for collaborative automation and intelligent optimisation. This new industrial computing paradigm features high-quality connectivity, ubiquitous messaging, and interoperable interactions between machines. However, manufacturing IIoT applications have specificities that distinguish them from many other internet of things (IoT) scenarios in machine communications. By highlighting the key requirements and the major technical gaps of M2M in industrial applications, this article describes a collaboration-oriented M2M (CoM2M) messaging mechanism focusing on flexible connectivity and discovery, ubiquitous messaging, and semantic interoperability that are well suited for the production line-scale interoperability of manufacturing applications. The designs toward machine collaboration and data interoperability at both the communication and semantic level are presented. Then, the application scenarios of the presented methods are illustrated with a proof-of-concept implementation in the PicknPack food packaging line. Eventually, the advantages and some potential issues are discussed based on the PicknPack practice. PMID:29165347
Pyke, Christopher R; Madan, Isaac
2013-08-01
The real estate industry routinely uses specialized information systems for functions including design, construction, facilities management, brokerage, tax assessment, and utilities. These systems are mature and effective within vertically integrated market segments. However, new questions are reaching across these traditional information silos. For example, buyers may be interested in evaluating the design, energy efficiency characteristics, and operational performance of a commercial building. This requires the integration of information across multiple databases held by different institutions. Today, this type of data integration is difficult to automate and prone to errors due, in part, to the lack of generally accepted building and space identifiers. Moving forward, the real estate industry needs a new mechanism to assign identifiers to whole buildings and interior spaces for the purposes of interoperability, data exchange, and integration. This paper describes a systematic process to identify activities occurring at a building or within interior spaces to provide a foundation for exchange and interoperability. We demonstrate the application of the approach with a prototype Web application. This concept and demonstration illustrate the elements of a practical interoperability framework that can increase productivity, create new business opportunities, and reduce errors, waste, and redundancy. © 2013 New York Academy of Sciences.
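One common way to mint the kind of shared identifier the paper calls for is a deterministic, namespaced UUID, so that independent databases derive the same ID from the same canonical descriptor. This is a hypothetical scheme for illustration, not the paper's actual proposal:

```python
import uuid

# Hypothetical identifier scheme: derive a stable, shareable ID for a building
# or interior space from a namespace UUID plus a canonical descriptor string.
BUILDING_NS = uuid.uuid5(uuid.NAMESPACE_DNS, "buildings.example.org")

def space_id(address: str, space_path: str = "") -> str:
    """Normalize the inputs, then derive a deterministic UUIDv5."""
    canonical = f"{address.strip().lower()}|{space_path.strip().lower()}"
    return str(uuid.uuid5(BUILDING_NS, canonical))

whole_building = space_id("101 Main St, Springfield")
floor_2_suite = space_id("101 Main St, Springfield", "floor-2/suite-210")

# The same inputs always yield the same ID, so independent databases agree
# without a central registry; distinct spaces get distinct IDs.
assert whole_building == space_id("  101 MAIN ST, SPRINGFIELD  ")
assert whole_building != floor_2_suite
```

The weak point of any such scheme is address canonicalization; real systems would normalize against an authoritative address or parcel database before hashing.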
Tyndall, Timothy; Tyndall, Ayami
2018-01-01
Healthcare directories are vital for interoperability among healthcare providers, researchers and patients. Past efforts at directory services have not provided the tools to allow integration of the diverse data sources. Many are overly strict, incompatible with legacy databases, and do not provide Data Provenance. A more architecture-independent system is needed to enable secure, GDPR-compatible (8) service discovery across organizational boundaries. We review our development of a portable Data Provenance Toolkit supporting provenance within Health Information Exchange (HIE) systems. The Toolkit has been integrated with client software and successfully leveraged in clinical data integration. The Toolkit validates provenance stored in a Blockchain or Directory record and creates provenance signatures, providing standardized provenance that moves with the data. This healthcare directory suite implements discovery of healthcare data by HIE and EHR systems via FHIR. Shortcomings of past directory efforts include the ability to map complex datasets and enabling interoperability via exchange endpoint discovery. By delivering data without dictating how it is stored we improve exchange and facilitate discovery on a multi-national level through open source, fully interoperable tools. With the development of Data Provenance resources we enhance exchange and improve security and usability throughout the health data continuum.
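A provenance signature of the kind described can be sketched as a keyed digest over a canonically serialized record, so that the signature travels with the data and can be re-verified later. This is an illustrative stand-in, not the Toolkit's actual implementation; key management is omitted:

```python
import hashlib
import hmac
import json

# Demo key only; a real system would use per-party keys or asymmetric signing.
SECRET_KEY = b"demo-key-not-for-production"

def sign_record(record: dict) -> str:
    # Canonical serialization (sorted keys, no whitespace) so that logically
    # equal records always produce the same signature.
    canonical = json.dumps(record, sort_keys=True, separators=(",", ":"))
    return hmac.new(SECRET_KEY, canonical.encode(), hashlib.sha256).hexdigest()

def verify_record(record: dict, signature: str) -> bool:
    return hmac.compare_digest(sign_record(record), signature)

record = {"source": "hospital-a", "resource": "Observation/42", "ts": "2018-01-01"}
sig = sign_record(record)
assert verify_record(record, sig)                              # intact record
assert not verify_record({**record, "source": "tampered"}, sig)  # any edit breaks it
```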
Rollout Strategy to Implement Interoperable Traceability in the Seafood Industry.
Gooch, Martin; Dent, Benjamin; Sylvia, Gilbert; Cusack, Christopher
2017-08-01
Verifying the accuracy and rigor of data exchanged within and between businesses for the purposes of traceability rests on the existence of effective and efficient interoperable information systems that meet users' needs. Interoperability, particularly given the complexities intrinsic to the seafood industry, requires that the systems used by businesses operating along the supply chain share a common technology architecture that is robust, resilient, and evolves as industry needs change. Technology architectures are developed through engaging industry stakeholders in understanding why an architecture is required, the benefits provided to the industry and individual businesses and supply chains, and how the architecture will translate into practical results. This article begins by reiterating the benefits that the global seafood industry can capture by implementing interoperable chain-length traceability and the reason for basing the architecture on a peer-to-peer networked database concept versus more traditional centralized or linear approaches. A summary of capabilities that already exist within the seafood industry that the proposed architecture uses is discussed; and a strategy for implementing the architecture is presented. The 6-step strategy is presented in the form of a critical path. © 2017 Institute of Food Technologists®.
An overview of the model integration process: From pre ...
Integration of models requires linking models which can be developed using different tools, methodologies, and assumptions. We performed a literature review with the aim of improving our understanding of the model integration process and presenting better strategies for building integrated modeling systems. We identified five different phases that characterize the integration process: pre-integration assessment, preparation of models for integration, orchestration of models during simulation, data interoperability, and testing. Commonly, there is little reuse of existing frameworks beyond the development teams and not much sharing of science components across frameworks. We believe this must change to enable researchers and assessors to form complex workflows that leverage the current environmental science available. In this paper, we characterize the model integration process and compare the integration practices of different groups. We highlight key strategies, features, standards, and practices that can be employed by developers to increase reuse and interoperability of science software components and systems. The paper provides a review of the literature regarding techniques and methods employed by various modeling system developers to facilitate science software interoperability. The intent of the paper is to illustrate the wide variation in methods and the limiting effect the variation has on inter-framework reuse and interoperability. A series of recommendations…
A review on digital ECG formats and the relationships between them.
Trigo, Jesús Daniel; Alesanco, Alvaro; Martínez, Ignacio; García, José
2012-05-01
A plethora of digital ECG formats have been proposed and implemented. This heterogeneity hinders the design and development of interoperable systems and entails critical integration issues for healthcare information systems. This paper aims to provide a comprehensive overview of the current state of affairs of the interoperable exchange of digital ECG signals. This includes 1) a review of existing digital ECG formats, 2) a collection of applications and cardiology settings using such formats, 3) a compilation of the relationships between such formats, and 4) a reflection on the current situation and foreseeable future of the interoperable exchange of digital ECG signals. The objectives have been approached by completing and updating previous reviews on the topic through appropriate database mining. 39 digital ECG formats, 56 applications, tools or implementation experiences, 47 mappings/converters, and 6 relationships between such formats have been found in the literature. The creation and generalization of a single standardized ECG format is a desirable goal. However, this unification requires political commitment and international cooperation among different standardization bodies. Ongoing ontology-based approaches covering the ECG domain have recently emerged as a promising alternative for reaching fully fledged ECG interoperability in the near future.
Bousquet, J; Hellings, P W; Agache, I; Bedbrook, A; Bachert, C; Bergmann, K C; Bewick, M; Bindslev-Jensen, C; Bosnic-Anticevitch, S; Bucca, C; Caimmi, D P; Camargos, P A M; Canonica, G W; Casale, T; Chavannes, N H; Cruz, A A; De Carlo, G; Dahl, R; Demoly, P; Devillier, P; Fonseca, J; Fokkens, W J; Guldemond, N A; Haahtela, T; Illario, M; Just, J; Keil, T; Klimek, L; Kuna, P; Larenas-Linnemann, D; Morais-Almeida, M; Mullol, J; Murray, R; Naclerio, R; O'Hehir, R E; Papadopoulos, N G; Pawankar, R; Potter, P; Ryan, D; Samolinski, B; Schunemann, H J; Sheikh, A; Simons, F E R; Stellato, C; Todo-Bom, A; Tomazic, P V; Valiulis, A; Valovirta, E; Ventura, M T; Wickman, M; Young, I; Yorgancioglu, A; Zuberbier, T; Aberer, W; Akdis, C A; Akdis, M; Annesi-Maesano, I; Ankri, J; Ansotegui, I J; Anto, J M; Arnavielhe, S; Asarnoj, A; Arshad, H; Avolio, F; Baiardini, I; Barbara, C; Barbagallo, M; Bateman, E D; Beghé, B; Bel, E H; Bennoor, K S; Benson, M; Białoszewski, A Z; Bieber, T; Bjermer, L; Blain, H; Blasi, F; Boner, A L; Bonini, M; Bonini, S; Bosse, I; Bouchard, J; Boulet, L P; Bourret, R; Bousquet, P J; Braido, F; Briggs, A H; Brightling, C E; Brozek, J; Buhl, R; Bunu, C; Burte, E; Bush, A; Caballero-Fonseca, F; Calderon, M A; Camuzat, T; Cardona, V; Carreiro-Martins, P; Carriazo, A M; Carlsen, K H; Carr, W; Cepeda Sarabia, A M; Cesari, M; Chatzi, L; Chiron, R; Chivato, T; Chkhartishvili, E; Chuchalin, A G; Chung, K F; Ciprandi, G; de Sousa, J Correia; Cox, L; Crooks, G; Custovic, A; Dahlen, S E; Darsow, U; Dedeu, T; Deleanu, D; Denburg, J A; De Vries, G; Didier, A; Dinh-Xuan, A T; Dokic, D; Douagui, H; Dray, G; Dubakiene, R; Durham, S R; Du Toit, G; Dykewicz, M S; Eklund, P; El-Gamal, Y; Ellers, E; Emuzyte, R; Farrell, J; Fink Wagner, A; Fiocchi, A; Fletcher, M; Forastiere, F; Gaga, M; Gamkrelidze, A; Gemicioğlu, B; Gereda, J E; van Wick, R Gerth; González Diaz, S; Grisle, I; Grouse, L; Gutter, Z; Guzmán, M A; Hellquist-Dahl, B; Heinrich, J; Horak, F; Hourihane, J O' B; 
Humbert, M; Hyland, M; Iaccarino, G; Jares, E J; Jeandel, C; Johnston, S L; Joos, G; Jonquet, O; Jung, K S; Jutel, M; Kaidashev, I; Khaitov, M; Kalayci, O; Kalyoncu, A F; Kardas, P; Keith, P K; Kerkhof, M; Kerstjens, H A M; Khaltaev, N; Kogevinas, M; Kolek, V; Koppelman, G H; Kowalski, M L; Kuitunen, M; Kull, I; Kvedariene, V; Lambrecht, B; Lau, S; Laune, D; Le, L T T; Lieberman, P; Lipworth, B; Li, J; Lodrup Carlsen, K C; Louis, R; Lupinek, C; MacNee, W; Magar, Y; Magnan, A; Mahboub, B; Maier, D; Majer, I; Malva, J; Manning, P; De Manuel Keenoy, E; Marshall, G D; Masjedi, M R; Mathieu-Dupas, E; Maurer, M; Mavale-Manuel, S; Melén, E; Melo-Gomes, E; Meltzer, E O; Mercier, J; Merk, H; Miculinic, N; Mihaltan, F; Milenkovic, B; Millot-Keurinck, J; Mohammad, Y; Momas, I; Mösges, R; Muraro, A; Namazova-Baranova, L; Nadif, R; Neffen, H; Nekam, K; Nieto, A; Niggemann, B; Nogueira-Silva, L; Nogues, M; Nyembue, T D; Ohta, K; Okamoto, Y; Okubo, K; Olive-Elias, M; Ouedraogo, S; Paggiaro, P; Pali-Schöll, I; Palkonen, S; Panzner, P; Papi, A; Park, H S; Passalacqua, G; Pedersen, S; Pereira, A M; Pfaar, O; Picard, R; Pigearias, B; Pin, I; Plavec, D; Pohl, W; Popov, T A; Portejoie, F; Postma, D; Poulsen, L K; Price, D; Rabe, K F; Raciborski, F; Roberts, G; Robalo-Cordeiro, C; Rodenas, F; Rodriguez-Mañas, L; Rolland, C; Roman Rodriguez, M; Romano, A; Rosado-Pinto, J; Rosario, N; Rottem, M; Sanchez-Borges, M; Sastre-Dominguez, J; Scadding, G K; Scichilone, N; Schmid-Grendelmeier, P; Serrano, E; Shields, M; Siroux, V; Sisul, J C; Skrindo, I; Smit, H A; Solé, D; Sooronbaev, T; Spranger, O; Stelmach, R; Sterk, P J; Strandberg, T; Sunyer, J; Thijs, C; Triggiani, M; Valenta, R; Valero, A; van Eerd, M; van Ganse, E; van Hague, M; Vandenplas, O; Varona, L L; Vellas, B; Vezzani, G; Vazankari, T; Viegi, G; Vontetsianos, T; Wagenmann, M; Walker, S; Wang, D Y; Wahn, U; Werfel, T; Whalley, B; Williams, D M; Williams, S; Wilson, N; Wright, J; Yawn, B P; Yiallouros, P K; Yusuf, O M; Zaidi, A; Zar, 
H J; Zernotti, M E; Zhang, L; Zhong, N; Zidarn, M
2016-01-01
The Allergic Rhinitis and its Impact on Asthma (ARIA) initiative commenced during a World Health Organization workshop in 1999. The initial goals were (1) to propose a new allergic rhinitis classification, (2) to promote the concept of multi-morbidity in asthma and rhinitis and (3) to develop guidelines with all stakeholders that could be used globally for all countries and populations. ARIA, disseminated and implemented in over 70 countries globally, is now focusing on the implementation of emerging technologies for individualized and predictive medicine. MASK [MACVIA (Contre les MAladies Chroniques pour un VIeillissement Actif)-ARIA Sentinel NetworK] uses mobile technology to develop care pathways for the management of rhinitis and asthma by a multi-disciplinary group and by patients themselves. An app (Android and iOS) is available in 20 countries and 15 languages. It uses a visual analogue scale to assess symptom control and work productivity as well as a clinical decision support system. It is associated with an inter-operable tablet for physicians and other health care professionals. The scaling-up strategy uses the recommendations of the European Innovation Partnership on Active and Healthy Ageing. The aim of the novel ARIA approach is to provide an active and healthy life to rhinitis sufferers, whatever their age, sex or socio-economic status, in order to reduce the health and social inequalities incurred by the disease.
Yorgancıoğlu, Ayşe Arzu; Kalaycı, Ömer; Cingi, Cemal; Gemicioğlu, Bilun; Kalyoncu, Ali Fuat; Agache, Ioana; Bachert, Claus; Bedbrook, Anna; Canonica, George Walter; Casale, Thomas; Cruz, Alvaro; Fokkens, Wytske J; Hellings, Peter; Samolinski, Boleslaw; Bousquet, Jean
2017-03-01
The Allergic Rhinitis and its Impact on Asthma (ARIA) initiative commenced during a World Health Organization (WHO) workshop in 1999. The initial goals were (i) to propose a new allergic rhinitis classification, (ii) to promote the concept of multi-morbidity in asthma and rhinitis and (iii) to develop guidelines with all stakeholders for global use in all countries and populations. ARIA, disseminated and implemented in over 70 countries globally, is now focusing on the implementation of emerging technologies for individualized and predictive medicine. MASK [MACVIA (Contre les MAladies Chroniques pour un VIeillissement Actif)-ARIA Sentinel NetworK] uses mobile technology to develop care pathways in order to enable the management of rhinitis and asthma by a multi-disciplinary group or by patients themselves. An app (Android and iOS) is available in 20 countries and 15 languages. It uses a visual analogue scale to assess symptom control and work productivity as well as a clinical decision support system. It is associated with an inter-operable tablet for physicians and other health care professionals. The scaling-up strategy uses the recommendations of the European Innovation Partnership on Active and Healthy Ageing. The aim of the novel ARIA approach is to provide an active and healthy life to rhinitis sufferers, whatever their age, sex or socio-economic status, in order to reduce the health and social inequalities incurred by the disease.
Modelling and approaching pragmatic interoperability of distributed geoscience data
NASA Astrophysics Data System (ADS)
Ma, Xiaogang
2010-05-01
Interoperability of geodata, which is essential for sharing information and discovering insights within a cyberinfrastructure, is receiving increasing attention. A key requirement of interoperability in the context of geodata sharing is that data provided by local sources can be accessed, decoded, understood and appropriately used by external users. Various researchers have identified four levels of data interoperability issues: system, syntax, schematics and semantics, which relate respectively to the platform, encoding, structure and meaning of geodata. Ontology-driven approaches addressing the schematic and semantic interoperability of geodata have been studied extensively in the last decade. Ontologies come in different types (e.g. top-level ontologies, domain ontologies and application ontologies) and display forms (e.g. glossaries, thesauri, conceptual schemas and logical theories). Many geodata providers maintain their own local application ontologies in order to drive standardization in local databases. However, semantic heterogeneities often exist between these local ontologies, even when they are derived from equivalent disciplines. In parallel, common ontologies are being developed in different geoscience disciplines (e.g., NADM, SWEET) as a standardization procedure to coordinate diverse local ontologies. Semantic mediation, e.g. mapping between local ontologies, or mapping local ontologies to common ontologies, has been studied as an effective way of achieving semantic interoperability between local ontologies and thus reconciling semantic heterogeneities in multi-source geodata. Nevertheless, confusion still exists in the research field of semantic interoperability. One problem is caused by eliminating elements of local pragmatic contexts in semantic mediation.
Compared to the context-independent character of a common domain ontology, local application ontologies are closely related to elements (e.g., people, time, location, intention, procedure, consequence) of local pragmatic contexts and are thus context-dependent. Eliminating these elements inevitably leads to information loss in semantic mediation between local ontologies. Correspondingly, the understanding and effect of exchanged data in a new context may differ from those in its original context. Another problem is the dilemma of finding a balance between flexibility and standardization of local ontologies, because ontologies are not fixed but continuously evolving. It is commonly recognized that we cannot use a single unified ontology to replace all local ontologies, because they are context-dependent and need flexibility. However, without the coordination of standards, freely developed local ontologies and databases would require enormous mediation work between them. Finding a balance between standardization and flexibility for evolving ontologies, in a practical sense, requires negotiations (i.e. conversations, agreements and collaborations) between different local pragmatic contexts. The purpose of this work is to set up a computer-friendly model representing local pragmatic contexts (i.e. geodata sources), and to propose a practical semantic negotiation procedure for approaching pragmatic interoperability between local pragmatic contexts. Information agents, objective facts and subjective dimensions are reviewed as elements of a conceptual model for representing pragmatic contexts. The author uses them to develop a practical semantic negotiation procedure for approaching pragmatic interoperability of distributed geodata.
The proposed conceptual model and semantic negotiation procedure were encoded with Description Logic, and then applied to analyze and manipulate semantic negotiations between different local ontologies within the National Mineral Resources Assessment (NMRA) project of China, which involves multi-source and multi-subject geodata sharing.
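At its core, the semantic mediation discussed in this abstract maps local terms onto shared concepts. A deliberately simplified sketch follows; real systems encode such mappings in ontology languages (OWL, Description Logic) rather than flat dictionaries, and every name here is invented:

```python
# Toy illustration of semantic mediation: two local vocabularies are mapped
# onto a shared (common-ontology) concept, so records from either source can
# be merged under one term despite different local naming.
local_a_to_common = {"granite_intrusion": "IgneousIntrusion"}
local_b_to_common = {"intrusive_body": "IgneousIntrusion"}

def mediate(term: str, mapping: dict) -> str:
    """Resolve a local term to its common-ontology concept, if mapped."""
    return mapping.get(term, "UNMAPPED")

# Two differently named local terms resolve to the same shared concept:
assert mediate("granite_intrusion", local_a_to_common) == \
       mediate("intrusive_body", local_b_to_common)
```

What this flat lookup cannot express is exactly the paper's point: the local pragmatic context (who recorded the term, when, where, for what purpose) is lost in the mapping, which motivates the proposed negotiation procedure.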
Networking Cyberinfrastructure Resources to Support Global, Cross-disciplinary Science
NASA Astrophysics Data System (ADS)
Lehnert, K.; Ramamurthy, M. K.
2016-12-01
Geosciences are globally connected by nature and the grand challenge problems like climate change, ocean circulations, seasonal predictions, impact of volcanic eruptions, etc. all transcend both disciplinary and geographic boundaries, requiring cross-disciplinary and international partnerships. Cross-disciplinary and international collaborations are also needed to unleash the power of cyber- (or e-) infrastructure (CI) by networking globally distributed, multi-disciplinary data, software, and computing resources to accelerate new scientific insights and discoveries. While the promises of a global and cross-disciplinary CI are exhilarating and real, a range of technical, organizational, and social challenges needs to be overcome in order to achieve alignment and linking of operational data systems, software tools, and computing facilities. New modes of collaboration require agreement on and governance of technical standards and best practices, and funding for necessary modifications. This presentation will contribute the perspective of domain-specific data facilities to the discussion of cross-disciplinary and international collaboration in CI development and deployment, in particular those of IEDA (Interdisciplinary Earth Data Alliance) serving the solid Earth sciences and Unidata serving atmospheric sciences. Both facilities are closely involved with the US NSF EarthCube program that aims to network and augment existing Geoscience CI capabilities "to make disciplinary boundaries permeable, nurture and facilitate knowledge sharing, …, and enhance collaborative pursuit of cross-disciplinary research" (EarthCube Strategic Vision), while also collaborating internationally to network domain-specific and cross-disciplinary CI resources. 
These collaborations are driven by the substantial benefits to the science community, but create challenges, when operational and funding constraints need to be balanced with adjustments to new joint data curation practices and interoperability standards.
Cross-Cultural Consistency of the Demand/Withdraw Interaction Pattern in Couples
ERIC Educational Resources Information Center
Christensen, Andrew; Eldridge, Kathleen; Catta-Preta, Adriana Bokel; Lim, Veronica R.; Santagata, Rossella
2006-01-01
In order to examine the cross-cultural consistency of several patterns of couple communication, 363 participants from four different countries (Brazil, Italy, Taiwan, and the United States) completed self-report measures about communication and satisfaction in their romantic relationships. Across countries, constructive communication was…
Maximum Power Training and Plyometrics for Cross-Country Running.
ERIC Educational Resources Information Center
Ebben, William P.
2001-01-01
Provides a rationale for maximum power training and plyometrics as conditioning strategies for cross-country runners, examining: an evaluation of training methods (strength training and maximum power training and plyometrics); biomechanic and velocity specificity (role in preventing injury); and practical application of maximum power training and…
Cross-Cultural Comparison of Maternal Sleep
Mindell, Jodi A.; Sadeh, Avi; Kwon, Robert; Goh, Daniel Y. T.
2013-01-01
Background: To characterize cross-cultural sleep patterns and sleep problems in a large sample of mothers of children (ages birth to 6 years) in multiple predominantly Asian and predominantly Caucasian countries. Methods: Mothers of 10,085 young children (predominantly Asian countries/regions: China, Hong Kong, India, Korea, Japan, Malaysia, Philippines, Singapore, Thailand; predominantly Caucasian countries: Australia, Canada, New Zealand, United Kingdom, United States) completed an internet-based expanded version of the Pittsburgh Sleep Quality Index. Results: Mothers in predominantly Asian countries/regions had later bedtimes, decreased number and duration of night wakings, more nighttime sleep, and more total sleep than mothers from predominantly Caucasian countries, P < 0.001. More than half (54.7%) of mothers reported having poor sleep, ranging from 50.9% of mothers in Malaysia to 77.8% of mothers in Japan. Sleep disturbance symptoms were quite common, especially symptoms related to insomnia, and were more likely to be reported by mothers in predominantly Caucasian countries. However, psychosocial factors, including having children of a younger age, being unemployed, and having a lower education level were the best predictors of poor sleep, whereas culture was not a significant predictor. Conclusions: Overall, mothers in predominantly Asian countries/regions reported later bedtimes but sleeping better and longer than mothers from predominantly Caucasian countries, which is dissimilar to cross-cultural findings of young children. Psychosocial factors were found to be the best predictors of poor sleep, irrespective of culture. Further studies are needed to understand the impact of these findings. Citation: Mindell JA; Sadeh A; Kwon R; Goh DYT. Cross-cultural comparison of maternal sleep. SLEEP 2013;36(11):1699-1706. PMID:24179304
Medical informatics in morocco.
Bouhaddou, O; Bennani Othmani, M; Diouny, S
2013-01-01
Informatics is an essential tool for helping to transform healthcare from a paper-based to a digital sector. This article explores the state of the art of health informatics in Morocco. Specifically, it aims to give a general overview of the Moroccan healthcare system, the challenges it is facing, and the efforts undertaken by the informatics community and the Moroccan government in terms of education, research and practice to reform the country's health sector. Through the experience of establishing Medical Informatics as a medical specialty in 2008, creating a Moroccan Medical Informatics Association in 2010 and holding a first national congress in April 2012, the authors present their assessment of some important priorities for health informatics in Morocco. These Moroccan initiatives are facilitating collaboration in education, research, and the implementation of clinical information systems. In particular, the stakeholders have recognized the need for a national coordinator office and the development of a national framework for standards and interoperability. For developing countries like Morocco, new health IT approaches such as mobile health and trans-media health advertising could help optimize scarce resources, improve access to rural areas and focus on the most prevalent health problems, optimizing health care access, quality, and cost for the Moroccan population.
Medical Device Plug-and-Play Interoperability Standards & Technology Leadership
2011-10-01
…official Department of the Army position, policy or decision unless so designated by other documentation…biomedical engineering students completed their senior design project on the X-Ray / Ventilator Use Case. We worked closely with the students to…"Supporting Medical Device Adverse Event Analysis in an Interoperable Clinical Environment: Design of a Data Logging and Playback System," publication in…
Sensor Web Interoperability Testbed Results Incorporating Earth Observation Satellites
NASA Technical Reports Server (NTRS)
Frye, Stuart; Mandl, Daniel J.; Alameh, Nadine; Bambacus, Myra; Cappelaere, Pat; Falke, Stefan; Derezinski, Linda; Zhao, Piesheng
2007-01-01
This paper describes an Earth Observation Sensor Web scenario based on the Open Geospatial Consortium's Sensor Web Enablement and Web Services interoperability standards. The scenario demonstrates the application of standards in describing, discovering, accessing and tasking satellites and ground-based sensor installations in a sequence of analysis activities that deliver information required by decision makers in response to national, regional or local emergencies.
A distributed component framework for science data product interoperability
NASA Technical Reports Server (NTRS)
Crichton, D.; Hughes, S.; Kelly, S.; Hardman, S.
2000-01-01
Correlation of science results from multi-disciplinary communities is a difficult task. Traditionally data from science missions is archived in proprietary data systems that are not interoperable. The Object Oriented Data Technology (OODT) task at the Jet Propulsion Laboratory is working on building a distributed product server as part of a distributed component framework to allow heterogeneous data systems to communicate and share scientific results.
Rafael Moreno-Sanchez
2006-01-01
The aim of this paper is to provide a conceptual framework for the session: "The role of web-based Geographic Information Systems in supporting sustainable management." The concepts of sustainability, sustainable forest management, Web Services, Distributed Geographic Information Systems, interoperability, Open Specifications, and Open Source Software are defined...
Interoperable Acquisition for Systems of Systems: The Challenges
2006-09-01
Interoperable Acquisition for Systems of Systems: The Challenges. James D. Smith II and D. Mike Phillips, September 2006, Technical Note CMU/SEI-2006-TN-034, Software Engineering Institute. Abstract: Large, complex systems development has always been challenging, even when the...
2014-01-01
termed the Galileo-GPS Time Offset (GGTO), and it will be Type 35 in the GPS CNAV message. Knowledge of the GGTO makes it possible for a properly... U.S. Naval Observatory (USNO) [1]. Interoperability with Galileo, and perhaps someday with other Global Navigation Satellite Systems (GNSS), is to be established through transmission of the GGTO.
Difficulties with True Interoperability in Modeling & Simulation
2011-12-01
Each program that develops a model or simulation has a specific purpose, set of requirements and limited funding. These programs cannot afford to coordinate with... implementation. The program offices should budget for and plan for coordination across domain projects within a limited scope to improve interoperability with...
KAPSE Interface Team (KIT) Public Report. Volume 7
1989-10-01
E&V STATUS: Ray Szymanski (Wright-Patterson) was announced as the new chairman of the Evaluation and Validation Team. Two RFPs are coming, for... There are language interoperability problems (implementations using multiple languages where interoperability problems are experienced, such as transferring...). Visitors were introduced. Ray Szymanski, the Evaluation & Validation Team Leader, is replacing Jinny Castor from Wright-Patterson Air Force Base. Dr...
NASA Astrophysics Data System (ADS)
Glaves, H. M.; Schaap, D.
2014-12-01
As marine research becomes increasingly multidisciplinary in its approach there has been a corresponding rise in the demand for large quantities of high quality interoperable data. A number of regional initiatives are already addressing this requirement through the establishment of e-infrastructures to improve the discovery and access of marine data. Projects such as Geo-Seas and SeaDataNet in Europe, Rolling Deck to Repository (R2R) in the USA and IMOS in Australia have implemented local infrastructures to facilitate the exchange of standardised marine datasets. However, each of these regional initiatives has been developed to address its own requirements and independently of other regions. To establish a common framework for marine data management on a global scale, there is a need to develop interoperability solutions that can be implemented across these initiatives. Through a series of workshops attended by the relevant domain specialists, the Ocean Data Interoperability Platform (ODIP) project has identified areas of commonality between the regional infrastructures and used these as the foundation for the development of three prototype interoperability solutions addressing: the use of brokering services to provide access to the data available in the regional data discovery and access services, including via the GEOSS portal; the development of interoperability between cruise summary reporting systems in Europe, the USA and Australia for routine harvesting of cruise data for delivery via the Partnership for Observation of Global Oceans (POGO) portal; and the establishment of a Sensor Observation Service (SOS) for selected sensors installed on vessels and in real-time monitoring systems using sensor web enablement (SWE). These prototypes will be used to underpin the development of a common global approach to the management of marine data which can be promoted to the wider marine research community.
ODIP is a community-led project that is currently focussed on regional initiatives in Europe, the USA and Australia, but which is seeking to expand this framework to include other regional marine data infrastructures.
NASA Astrophysics Data System (ADS)
Schaap, D.
2015-12-01
Europe, the USA, and Australia are making significant progress in facilitating the discovery, access and long term stewardship of ocean and marine data through the development, implementation, population and operation of national, regional or international distributed ocean and marine observing and data management infrastructures such as SeaDataNet, EMODnet, IOOS, R2R, and IMOS. All of these developments are resulting in the development of standards and services implemented and used by their regional communities. The Ocean Data Interoperability Platform (ODIP) project is supported by the EU FP7 Research Infrastructures programme, the National Science Foundation (USA) and the Australian government, and was initiated on 1 October 2012. The project has recently been continued as ODIP 2 for another 3 years with EU HORIZON 2020 funding. ODIP includes all the major organisations engaged in ocean data management in the EU, US, and Australia. ODIP is also supported by the IOC-IODE, closely linking this activity with its Ocean Data Portal (ODP) and Ocean Data Standards Best Practices (ODSBP) projects. The ODIP platform aims to ease interoperability between the regional marine data management infrastructures. It therefore facilitates an organised dialogue between the key infrastructure representatives by publishing best practice, organising a series of international workshops and fostering the development of common standards and interoperability solutions. These are evaluated and tested by means of prototype projects.
The presentation will give further background on the ODIP projects and the latest information on the progress of three prototype projects addressing: establishing interoperability between the regional EU, USA and Australia data discovery and access services (SeaDataNet CDI, US NODC, and IMOS MCP) and contributing to the global GEOSS and IODE-ODP portals; establishing interoperability between cruise summary reporting systems in Europe, the USA and Australia for routine harvesting of cruise data for delivery via the Partnership for Observation of Global Oceans (POGO) global portal; establishing common standards for a Sensor Observation Service (SOS) for selected sensors installed on vessels and in real-time monitoring systems using sensor web enablement (SWE)
NASA Astrophysics Data System (ADS)
Schaap, Dick M. A.; Glaves, Helen
2016-04-01
Europe, the USA, and Australia are making significant progress in facilitating the discovery, access and long term stewardship of ocean and marine data through the development, implementation, population and operation of national, regional or international distributed ocean and marine observing and data management infrastructures such as SeaDataNet, EMODnet, IOOS, R2R, and IMOS. All of these developments are resulting in the development of standards and services implemented and used by their regional communities. The Ocean Data Interoperability Platform (ODIP) project is supported by the EU FP7 Research Infrastructures programme, the National Science Foundation (USA) and the Australian government, and was initiated on 1 October 2012. The project has recently been continued as ODIP II for another 3 years with EU HORIZON 2020 funding. ODIP includes all the major organisations engaged in ocean data management in the EU, US, and Australia. ODIP is also supported by the IOC-IODE, closely linking this activity with its Ocean Data Portal (ODP) and Ocean Data Standards Best Practices (ODSBP) projects. The ODIP platform aims to ease interoperability between the regional marine data management infrastructures. It therefore facilitates an organised dialogue between the key infrastructure representatives by publishing best practice, organising a series of international workshops and fostering the development of common standards and interoperability solutions. These are evaluated and tested by means of prototype projects. The presentation will give further background on the ODIP projects and the latest information on the progress of three prototype projects addressing: 1. establishing interoperability between the regional EU, USA and Australia data discovery and access services (SeaDataNet CDI, US NODC, and IMOS MCP) and contributing to the global GEOSS and IODE-ODP portals; 2. establishing interoperability between cruise summary reporting systems in Europe, the USA and Australia for routine harvesting of cruise data for delivery via the Partnership for Observation of Global Oceans (POGO) global portal; 3. establishing common standards for a Sensor Observation Service (SOS) for selected sensors installed on vessels and in real-time monitoring systems using sensor web enablement (SWE).
Extending the GI Brokering Suite to Support New Interoperability Specifications
NASA Astrophysics Data System (ADS)
Boldrini, E.; Papeschi, F.; Santoro, M.; Nativi, S.
2014-12-01
The GI brokering suite provides the discovery, access, and semantic brokers (i.e. GI-cat, GI-axe, GI-sem) that empower a brokering framework for multi-disciplinary and multi-organizational interoperability. The GI suite has been successfully deployed in the framework of several programmes and initiatives, such as European Union funded projects, NSF BCube, and the intergovernmental coordinated effort Global Earth Observation System of Systems (GEOSS). Each GI suite broker facilitates interoperability for a particular functionality (i.e. discovery, access, semantic extension) among a set of brokered resources published by autonomous providers (e.g. data repositories, web services, semantic assets) and a set of heterogeneous consumers (e.g. client applications, portals, apps). A wide set of data models, encoding formats, and service protocols are already supported by the GI suite, such as those defined by international standardizing organizations like OGC and ISO (e.g. WxS, CSW, SWE, GML, netCDF) and by community specifications (e.g. THREDDS, OpenSearch, OPeNDAP, ESRI APIs). Using the GI suite, resources published by a particular community or organization through their specific technology (e.g. OPeNDAP/netCDF) can be transparently discovered, accessed, and used by different communities utilizing their preferred tools (e.g. a GIS visualizing WMS layers). Since information technology is a moving target, new standards and technologies continuously emerge and are adopted in the Earth Science context too. The GI brokering suite was therefore conceived to be flexible and to accommodate new interoperability protocols and data models. For example, the GI suite has recently added support for widely used specifications introduced to implement Linked Data, the Semantic Web and specific community needs. Among others, these include: DCAT, an RDF vocabulary designed to facilitate interoperability between Web data catalogs; CKAN, a data management system for data distribution, particularly used by public administrations; CERIF, used by CRIS (Current Research Information System) instances; and the Hyrax server, a scientific dataset publishing component. This presentation will discuss these and other recent GI suite extensions implemented to support new interoperability protocols in use by the Earth Science communities.
NASA Astrophysics Data System (ADS)
Fulker, D. W.; Gallagher, J. H. R.
2015-12-01
OPeNDAP's Hyrax data server is an open-source framework fostering interoperability via easily-deployed Web services. Compatible with solutions listed in the (PA001) session description—federation, rigid standards and brokering/mediation—the framework can support tight or loose coupling, even with dependence on community-contributed software. Hyrax is a Web-services framework with a middleware-like design and a handler-style architecture that together reduce the interoperability challenge (for N datatypes and M user contexts) to an O(N+M) problem, similar to brokering. Combined with an open-source ethos, this reduction makes Hyrax a community tool for gaining interoperability. E.g., in its response to the Big Earth Data Initiative (BEDI), NASA references OPeNDAP-based interoperability. Assuming its suitability, the question becomes: how sustainable is OPeNDAP, a small not-for-profit that produces open-source software, i.e., has no software-sales? In other words, if geoscience interoperability depends on OPeNDAP and similar organizations, are those entities in turn sustainable? Jim Collins (in Good to Great) highlights three questions that successful companies can answer (paraphrased here): What is your passion? Where is your world-class excellence? What drives your economic engine? We attempt to shed light on OPeNDAP sustainability by examining these. Passion: OPeNDAP has a focused passion for improving the effectiveness of scientific data sharing and use, as deeply-cooperative community endeavors. Excellence: OPeNDAP has few peers in remote, scientific data access. Skills include computer science with experience in data science, (operational, secure) Web services, and software design (for servers and clients, where the latter vary from Web pages to standalone apps and end-user programs). Economic Engine: OPeNDAP is an engineering services organization more than a product company, despite software being key to OPeNDAP's reputation. 
In essence, provision of engineering expertise, via contracts and grants, is the economic engine. Hence sustainability, as needed to address global grand challenges in geoscience, depends on agencies' and others' abilities and willingness to offer grants and let contracts for continually upgrading open-source software from OPeNDAP and others.
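The handler-style, O(N+M) design credited to Hyrax in the abstract above can be sketched with a toy broker: N reader handlers decode native formats into one internal model, and M writer handlers encode that model for clients, so each new format costs one handler rather than a converter for every pairing. All names here (`reader`, `writer`, `serve`) are illustrative assumptions, not OPeNDAP's actual API.

```python
import json

# Registries: N readers (native format -> internal model) and M writers
# (internal model -> client representation). Supporting all pairings then
# needs N + M handlers instead of N * M point-to-point converters.
readers = {}
writers = {}

def reader(fmt):
    """Register a decoder for a native data format (illustrative name)."""
    def register(fn):
        readers[fmt] = fn
        return fn
    return register

def writer(fmt):
    """Register an encoder for a client-facing representation."""
    def register(fn):
        writers[fmt] = fn
        return fn
    return register

@reader("csv")
def read_csv(text):
    # Decode a tiny CSV payload into the shared internal model
    rows = [line.split(",") for line in text.strip().splitlines()]
    return {"columns": rows[0], "rows": rows[1:]}

@writer("json")
def write_json(model):
    # Encode the internal model for a JSON-speaking client
    return json.dumps(model)

def serve(data, src_fmt, dst_fmt):
    """Broker a request: one reader decodes, one writer encodes."""
    return writers[dst_fmt](readers[src_fmt](data))
```

Adding, say, a netCDF reader would make every registered writer immediately available for that format, which is the essence of the brokering-style reduction the abstract describes.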
Oil price and exchange rate co-movements in Asian countries: Detrended cross-correlation approach
NASA Astrophysics Data System (ADS)
Hussain, Muntazir; Zebende, Gilney Figueira; Bashir, Usman; Donghong, Ding
2017-01-01
Most empirical literature investigates the relation between oil prices and exchange rates through different models. These models measure the relationship on two time scales (long and short term) and often fail to observe the co-movement of these variables at other time scales. We apply a detrended cross-correlation approach (DCCA) to investigate the co-movements of the oil price and exchange rate in 12 Asian countries. This model determines the co-movements of oil price and exchange rate at different time scales. The exchange rate and oil price time series exhibit unit-root problems, which makes their correlation and cross-correlation difficult to measure; results become spurious when a periodic trend or unit root is present in these series. The DCCA approach measures the cross-correlation at different time scales while controlling for the unit-root problem. Our empirical results support the co-movement of oil prices and exchange rates, indicating a weak negative cross-correlation between oil price and exchange rate for most Asian countries in our sample. The results have important monetary, fiscal, inflationary, and trade policy implications for these countries.
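The detrended cross-correlation approach in the abstract above can be sketched as follows. This is a minimal implementation of the DCCA cross-correlation coefficient under common textbook conventions (profile integration, overlapping boxes, linear detrending); it is not the authors' code, and `dcca_coefficient` is a name chosen here for illustration.

```python
import numpy as np

def dcca_coefficient(x, y, n):
    """Detrended cross-correlation coefficient rho_DCCA at window size n.

    Integrates both series into profiles, detrends each overlapping box of
    length n+1 with a linear fit, then takes the ratio of the detrended
    covariance to the product of the detrended standard deviations.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    # Step 1: integrated (profile) series
    X = np.cumsum(x - x.mean())
    Y = np.cumsum(y - y.mean())
    N = len(X)
    t = np.arange(n + 1)
    f2_xy, f2_xx, f2_yy = [], [], []
    # Step 2: slide an overlapping box of length n+1 along the profiles
    for i in range(N - n):
        xb, yb = X[i:i + n + 1], Y[i:i + n + 1]
        # Step 3: remove the local linear trend in each box
        xr = xb - np.polyval(np.polyfit(t, xb, 1), t)
        yr = yb - np.polyval(np.polyfit(t, yb, 1), t)
        f2_xy.append(np.mean(xr * yr))
        f2_xx.append(np.mean(xr * xr))
        f2_yy.append(np.mean(yr * yr))
    # Step 4: rho = detrended covariance / (detrended fluctuations)
    return np.mean(f2_xy) / np.sqrt(np.mean(f2_xx) * np.mean(f2_yy))
```

Computing the coefficient over a range of `n` values gives the scale-by-scale co-movement picture the abstract emphasises; values near -1, 0, and +1 indicate strong negative, absent, and strong positive cross-correlation at that scale.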
Rahman, Syed Abidur; Taghizadeh, Seyedeh Khadijeh; Ramayah, T; Ahmad, Noor Hazlina
2015-01-01
Service innovation management practice is currently being widely scrutinized, mainly in the developed countries where it originated. The current study attempts to propose a framework and to empirically validate and explain the service innovation practices for successful performance in the telecommunications industry of two developing countries, Malaysia and Bangladesh. The research framework proposes relationships among organisational culture, operating core (innovation process, cross-functional organisation, and implementation of tools/technology), competition-informed pricing, and performance. A total of 176 usable responses from both countries were analysed. The findings show that organisational culture tends to be more influential on the innovation process and cross-functional organisation in the Malaysian telecommunications industry. In contrast, implementation of tools/technology plays a more instrumental role in competition-informed pricing practices in Bangladesh. This study revealed a few differences in innovation management practices between the two developing countries. The findings have strategic implications for the service sectors in both developing countries regarding the implementation of innovative enterprises, especially in Bangladesh, where innovation is the basis for survival. Testing innovation management practices in developing countries is perhaps unique in the field of innovation management.
Dahlin, Johanna; Härkönen, Juho
2013-12-01
Multiple studies have found that women report being in worse health despite living longer. Gender gaps vary cross-nationally, but relatively little is known about the causes of comparative differences. Existing literature is inconclusive as to whether gender gaps in health are smaller in more gender-equal societies. We analyze gender gaps in self-rated health (SRH) and limiting longstanding illness (LLI) with five waves of European Social Survey data for 191,104 respondents from 28 countries. We use means, odds ratios, logistic regressions, and multilevel random-slopes logistic regressions. Gender gaps in subjective health vary visibly across Europe. In many countries (especially in Eastern and Southern Europe), women report distinctly worse health, while in others (such as Estonia, Finland, and Great Britain) there are small or no differences. Logistic regressions run separately for each country revealed that individual-level socioeconomic and demographic variables explain a majority of these gaps in some countries, but contribute little to their understanding in most countries. In yet other countries, men had worse health when these variables were controlled for. Cross-national variation in the gender gaps exists after accounting for individual-level factors. Against expectations, the remaining gaps are not systematically related to societal-level gender inequality in the multilevel analyses. Our findings stress persistent cross-national variability in gender gaps in health and call for further analysis. Copyright © 2013 Elsevier Ltd. All rights reserved.
An Updated Global Picture of Cigarette Smoking Persistence among Adults
Troost, Jonathan P.; Barondess, David A.; Storr, Carla L.; Wells, J. Elisabeth; Al-Hamzawi, Ali Obaid; Andrade, Laura Helena; Bromet, Evelyn; Bruffaerts, Ronny; Florescu, Silvia; de Girolamo, Giovanni; de Graaf, Ron; Gureje, Oye; Haro, Josep Maria; Hu, Chiyi; Huang, Yueqin; Karam, Aimee N.; Kessler, Ronald C.; Lepine, Jean-Pierre; Matschinger, Herbert; Medina-Mora, Maria Elena; O'Neill, Siobhan; Posada-Villa, Jose; Sagar, Rajesh; Takeshima, Tadashi; Tomov, Toma; Williams, David R.; Anthony, James C.
2012-01-01
Background Cross-national variance in smoking prevalence is relatively well documented. The aim of this study is to estimate levels of smoking persistence across 21 countries, with a hypothesized inverse relationship between country income level and smoking persistence. Methods Data from the World Health Organization World Mental Health Survey Initiative were used to estimate cross-national differences in smoking persistence, the proportion of adults who started to smoke and persisted in smoking by the date of the survey. Results There is large variation in smoking persistence, from 25% (Nigeria) to 85% (China), with a random-effects meta-analytic summary estimate of 55% and considerable cross-national variation (Cochran's heterogeneity Q statistic = 6,845; p < 0.001). Meta-regressions indicated that the observed differences are not attributable to differences in country income level, the age distribution of smokers, or how recently the onset of smoking began within each country. Conclusion While smoking should remain an important public health issue in any country where smokers are present, this report identifies several countries with higher levels of smoking persistence (namely, China and India). PMID:23626929
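A "random-effects meta-analytic summary estimate" together with Cochran's Q, as reported in the abstract above, can be sketched with DerSimonian-Laird pooling. The logit transform, the large-sample variance approximation, and the function name are assumptions made here for illustration; the authors' exact model may differ.

```python
import math

def dersimonian_laird(props, ns):
    """DerSimonian-Laird random-effects pooling of proportions on the
    logit scale. Returns (pooled proportion, Cochran's Q).

    Illustrative sketch: within-study variances use the large-sample
    binomial approximation 1 / (n * p * (1 - p)).
    """
    thetas = [math.log(p / (1.0 - p)) for p in props]            # logits
    vs = [1.0 / (n * p * (1.0 - p)) for p, n in zip(props, ns)]  # within-study variances
    w = [1.0 / v for v in vs]                                    # fixed-effect weights
    fixed = sum(wi * t for wi, t in zip(w, thetas)) / sum(w)
    # Cochran's heterogeneity statistic Q and between-study variance tau^2
    Q = sum(wi * (t - fixed) ** 2 for wi, t in zip(w, thetas))
    c = sum(w) - sum(wi * wi for wi in w) / sum(w)
    tau2 = max(0.0, (Q - (len(props) - 1)) / c)
    # Random-effects weights incorporate tau^2; back-transform pooled logit
    wr = [1.0 / (v + tau2) for v in vs]
    pooled = sum(wi * t for wi, t in zip(wr, thetas)) / sum(wr)
    return 1.0 / (1.0 + math.exp(-pooled)), Q
```

A very large Q relative to its degrees of freedom (here, number of countries minus one) is what the abstract's "considerable cross-national variation" refers to: the country estimates differ by far more than their sampling error alone would allow.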
Ndosi, Mwidimi; Alcacer-Pitarch, Begonya; Allanore, Yannick; Del Galdo, Francesco; Frerix, Marc; García-Díaz, Sílvia; Hesselstrand, Roger; Kendall, Christine; Matucci-Cerinic, Marco; Mueller-Ladner, Ulf; Sandqvist, Gunnel; Torrente-Segarra, Vicenç; Schmeiser, Tim; Sierakowska, Matylda; Sierakowska, Justyna; Sierakowski, Stanslaw; Redmond, Anthony
2018-02-20
The aim of this study was to adapt the Systemic Sclerosis Quality of Life Questionnaire (SScQoL) into six European cultures and validate it as a common measure of quality of life in systemic sclerosis (SSc). This was a seven-country (Germany, France, Italy, Poland, Spain, Sweden and UK) cross-sectional study. A forward-backward translation process was used to adapt the English SScQoL into the target languages. The SScQoL was completed by patients with SSc, then data were validated against the Rasch model. To correct local response dependency, items were grouped into the following subscales: function, emotion, sleep, social and pain, and reanalysed for fit to the model, unidimensionality and cross-cultural equivalence. The adaptation of the SScQoL was seamless in all countries except Germany. Cross-cultural validation included 1080 patients with a mean age of 58.0 years (SD 13.9); 87% were women. Local dependency was evident in individual country data. Grouping items into testlets corrected the local dependency in most country-specific data. Fit to the model, reliability and unidimensionality were achieved in the six-country data after cross-cultural adjustment for Italy in the social subscale. The SScQoL was then calibrated into an interval-level scale. The individual SScQoL items have translated well into five languages and overall the scale maintained its construct validity, working well as a five-subscale questionnaire. Measures of quality of life in SSc can be directly compared across five countries (France, Poland, Spain, Sweden and UK). Data from Italy are also comparable with the other five countries, although they require an adjustment. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Mapping cross-border collaboration and communication in cardiovascular research from 1992 to 2012
Gal, Diane; Glänzel, Wolfgang; Sipido, Karin R.
2017-01-01
Aims The growing burden of cardiovascular disease requires growth in research and innovation. We examine world-wide participation and citation impact across the cardiovascular research landscape from 1992 to 2012; we investigate cross-fertilization between countries and examine whether cross-border collaboration affects impact. Methods and Results State-of-the-art bibliometric methods and indicators are used to identify cardiovascular publications from the Web of Science, and to map trends over time in output, citation impact, and collaboration. The publication output in cardiovascular research has grown steadily from 1992 to 2012 with increased participation worldwide. China has the highest growth as relative share. The USA share initially predominated yet has reduced steadily. Over time, the EU-27 supra-national region has increased its participation above the USA, though on average it has not had greater citation impact than the USA. However, a number of European countries, as well as Australia and Canada, have improved their absolute and relative citation impact above that of the USA by 2006–2012. Europe is a hub of cross-fertilization with strengthening collaborations and strong citation links; the UK, Germany, and France remain central in this network. The USA has the highest number of strong citation links with other countries. All countries, but especially smaller, highly collaborative countries, have higher citation impact for their internationally collaborative research when compared with their domestic publications. Conclusion Participation in cardiovascular research is growing but growth and impact show wide variability between countries. Cross-border collaboration is increasing, in particular within the EU, and is associated with greater citation impact. PMID:27997881
Towards global environmental information and data management
NASA Astrophysics Data System (ADS)
Gurney, Robert; Allison, Lee; Cesar, Roberto; Cossu, Roberto; Dietz, Volkmar; Gemeinholzer, Birgit; Koike, Toshio; Mokrane, Mustapha; Peters, Dale; Thaller-Honold, Svetlana; Treloar, Andrew; Vilotte, Jean-Pierre; Waldmann, Christoph
2014-05-01
The Belmont Forum, a coalition of national science agencies from 13 countries, is supporting an 18-month effort to implement a 'Knowledge Hub' community-building and strategy development program as a first step to coordinate and streamline international efforts on community governance, interoperability and system architectures so that environmental data and information can be exchanged internationally and across subject domains easily and efficiently. This initiative represents a first step in collaboratively building an international capacity and e-infrastructure framework to address societally relevant global environmental change challenges. The project will deliver a community-owned strategy and implementation plan, which will prioritize international funding opportunities for Belmont Forum members to build pilots and exemplars in order to accelerate delivery of end-to-end global change decision support systems. In 2012, the Belmont Forum held a series of public town hall meetings and a two-day scoping meeting of scientists and program officers, which concluded that transformative approaches and innovative technologies are needed for heterogeneous data/information to be integrated and made interoperable for researchers in disparate fields and for myriad uses across international, institutional, disciplinary, spatial and temporal boundaries. Pooling Belmont Forum members' resources to bring communities together for further integration, cooperation, and leveraging of existing initiatives and resources has the potential to develop the e-infrastructure framework necessary to solve pressing environmental problems, and to support the aims of many international data sharing initiatives. The plan is expected to serve as the foundation of future Belmont Forum calls for proposals for e-Infrastructures and Data Management.
The Belmont Forum is uniquely able to align resources of major national funders to support global environmental change research on specific technical and governance challenges, and the development of focused pilot systems that could be complementary to other initiatives such as GEOSS, ICSU World Data System, and Global Framework for Climate Services (GFCS). The development of this Belmont Forum Knowledge Hub represents an extraordinary effort to bring together international leaders in interoperability, governance and other fields pertinent to decision-support systems in global environmental change research. It is also addressing related issues such as ensuring a cohort of environmental scientists who can use up-to-date computing techniques for data and information management, and investigating which legal issues need common international attention.
Education, Gender, and Economic Development: A Cross-National Study.
ERIC Educational Resources Information Center
Benavot, Aaron
1989-01-01
Examines the effects of gender differences in educational expansion on national economic growth. Using cross-national data from 96 countries, the authors found that in less-developed countries, educational expansion among primary school-age girls had a stronger impact on long-term economic prosperity than did educational expansion among primary…
Physiological Profiles of High School Female Cross Country Runners.
ERIC Educational Resources Information Center
Butts, Nancy Kay
1982-01-01
Percentage of body fat, ratings of perceived exertion, and maximal oxygen consumption were obtained during a continuous running treadmill test on 127 high school female cross country runners. The relatively low relationships between the variables tested and running performance indicated that other factors may be more important determinants of…
Measuring Youth Development: A Nonparametric Cross-Country "Youth Welfare Index"
ERIC Educational Resources Information Center
Chaaban, Jad M.
2009-01-01
This paper develops an empirical methodology for the construction of a synthetic multi-dimensional cross-country comparison of the performance of governments around the world in improving the livelihood of their younger population. The devised "Youth Welfare Index" is based on the nonparametric Data Envelopment Analysis (DEA) methodology and…
Literacy Gaps by Educational Attainment: A Cross-National Analysis.
Park, Hyunjoon; Kyei, Pearl
2011-03-01
Existing cross-national research on educational attainment does not fully address whether the same level of educational attainment generates the same level of literacy skills in different countries. We analyze literacy skills data for young adults from 19 countries in the 1994-1998 International Adult Literacy Survey and find that in all countries, individuals with a higher level of educational attainment tend to have greater literacy skills. However, there is substantial variation across countries in the size of literacy gaps by levels of educational attainment. In particular, young adults in the United States show the largest literacy gaps. Using two-level hierarchical linear models, we find that cross-national differences in the literacy gap between more- and less-educated individuals are systematically linked to the degree of between-school inequality in school resources (instructional materials, class size, teachers' experience and certification).
Ethier, Jean-François; Dameron, Olivier; Curcin, Vasa; McGilchrist, Mark M; Verheij, Robert A; Arvanitis, Theodoros N; Taweel, Adel; Delaney, Brendan C; Burgun, Anita
2013-01-01
Biomedical research increasingly relies on the integration of information from multiple heterogeneous data sources. Despite the fact that structural and terminological aspects of interoperability are interdependent and rely on a common set of requirements, current efforts typically address them in isolation. We propose a unified ontology-based knowledge framework to facilitate interoperability between heterogeneous sources, and investigate if using the LexEVS terminology server is a viable implementation method. We developed a framework based on an ontology, the general information model (GIM), to unify structural models and terminologies, together with relevant mapping sets. This allowed a uniform access to these resources within LexEVS to facilitate interoperability by various components and data sources from implementing architectures. Our unified framework has been tested in the context of the EU Framework Program 7 TRANSFoRm project, where it was used to achieve data integration in a retrospective diabetes cohort study. The GIM was successfully instantiated in TRANSFoRm as the clinical data integration model, and necessary mappings were created to support effective information retrieval for software tools in the project. We present a novel, unifying approach to address interoperability challenges in heterogeneous data sources, by representing structural and semantic models in one framework. Systems using this architecture can rely solely on the GIM that abstracts over both the structure and coding. Information models, terminologies and mappings are all stored in LexEVS and can be accessed in a uniform manner (implementing the HL7 CTS2 service functional model). The system is flexible and should reduce the effort needed from data sources personnel for implementing and managing the integration.
Kamimura, Emi; Tanaka, Shinpei; Takaba, Masayuki; Tachi, Keita; Baba, Kazuyoshi
2017-01-01
The aim of this study was to evaluate and compare the inter-operator reproducibility of three-dimensional (3D) images of teeth captured by a digital impression technique with that of a conventional impression technique in vivo. Twelve participants with complete natural dentition were included in this study. A digital impression of the mandibular molars of these participants was made by two operators with different levels of clinical experience (3 and 16 years) using an intra-oral scanner (Lava COS, 3M ESPE). A silicone impression was also made by the same operators using the double mix impression technique (Imprint3, 3M ESPE). Stereolithography (STL) data were directly exported from the Lava COS system, while STL data of a plaster model made from the silicone impression were captured by a 3D laboratory scanner (D810, 3Shape). The STL datasets recorded by the two operators were compared using 3D evaluation software and superimposed using a best-fit algorithm (least-squares method; PolyWorks, InnovMetric Software) for each impression technique. Inter-operator reproducibility, evaluated as the average discrepancy between corresponding 3D data, was compared between the two techniques (Wilcoxon signed-rank test). Visual inspection of the superimposed datasets revealed that discrepancies between repeated digital impressions were smaller than those observed with silicone impressions. Statistical analysis confirmed this, revealing significantly smaller average inter-operator discrepancies with the digital impression technique (0.014 ± 0.02 mm) than with the conventional impression technique (0.023 ± 0.01 mm). The results of this in vivo study suggest that inter-operator reproducibility with a digital impression technique may be better than that of a conventional impression technique and is independent of the clinical experience of the operator.
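The best-fit (least-squares) superimposition step described above can be sketched in a few lines. The snippet below uses the Kabsch algorithm via NumPy's SVD to rigidly align one point set onto another and report the mean residual distance; it assumes point correspondences are already known, whereas real STL-comparison software (such as the PolyWorks workflow in the study) must also establish correspondence, e.g. via iterative closest point.

```python
# Minimal sketch of least-squares (best-fit) rigid superimposition of two
# point sets, as used to compare repeated scans. Correspondence between
# points is assumed; the data below are synthetic, not from the study.
import numpy as np

def best_fit_discrepancy(p, q):
    """Rigidly align point set q onto p by least squares (Kabsch algorithm)
    and return the mean point-to-point distance after alignment."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    pc, qc = p - p.mean(axis=0), q - q.mean(axis=0)   # centre both sets
    u, _, vt = np.linalg.svd(qc.T @ pc)               # SVD of cross-covariance
    d = np.sign(np.linalg.det(u @ vt))                # guard against reflection
    rot = (u * [1.0, 1.0, d]) @ vt                    # optimal rotation
    aligned = qc @ rot + p.mean(axis=0)
    return float(np.linalg.norm(aligned - p, axis=1).mean())

# Two "scans" of the same surface differing only by a rigid movement:
rng = np.random.default_rng(0)
scan_a = rng.normal(size=(100, 3))
angle = 0.4
turn = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                 [np.sin(angle),  np.cos(angle), 0.0],
                 [0.0, 0.0, 1.0]])
scan_b = scan_a @ turn + np.array([5.0, 0.0, -2.0])
print(best_fit_discrepancy(scan_a, scan_b))  # ~0 for a purely rigid difference
```

After alignment, the residual mean distance is exactly the kind of per-pair discrepancy value (in mm for real scans) that the study averaged and compared with the Wilcoxon signed-rank test.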
A future-proof architecture for telemedicine using loose-coupled modules and HL7 FHIR.
Gøeg, Kirstine Rosenbeck; Rasmussen, Rune Kongsgaard; Jensen, Lasse; Wollesen, Christian Møller; Larsen, Søren; Pape-Haugaard, Louise Bilenberg
2018-07-01
Most telemedicine solutions are proprietary and disease-specific, which results in a heterogeneous and silo-oriented system landscape with limited interoperability. Solving the interoperability problem would require a strong focus on data integration and standardization in telemedicine infrastructures. Our objective was to suggest a future-proof architecture consisting of small loose-coupled modules, to allow flexible integration with new and existing services, and the use of international standards, to allow high re-usability of modules and interoperability in the health IT landscape. We identified the core features of our future-proof architecture as the following: (1) To provide extended functionality, the system should be designed as a core with modules. Database handling and the implementation of security protocols are modules, to improve flexibility compared to other frameworks. (2) To ensure loosely coupled modules, the system should implement an inversion-of-control mechanism. (3) A focus on ease of implementation requires that the system use HL7 FHIR (Fast Healthcare Interoperability Resources) as the primary standard, because it is based on web technologies. We evaluated the feasibility of our architecture by developing an open source implementation of the system called ORDS. ORDS is written in TypeScript and makes use of the Express framework and HL7 FHIR DSTU2. The code is distributed on GitHub. All modules have been unit tested, but end-to-end testing awaits our first clinical example implementations. Our study showed that highly adaptable and yet interoperable core frameworks for telemedicine can be designed and implemented. Future work includes the implementation of a clinical use case and evaluation. Copyright © 2018 Elsevier B.V. All rights reserved.
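The two architectural ideas above, a core that receives loosely coupled modules through inversion of control and modules that exchange data as HL7 FHIR resources, can be sketched as follows. ORDS itself is written in TypeScript; this is a language-agnostic illustration in Python, and the module names and Observation content are invented for the example rather than taken from ORDS.

```python
# Sketch of inversion of control: the core depends only on module
# interfaces, and concrete modules (storage, security) are injected at
# construction time. Module classes and the Observation payload are
# hypothetical illustrations, not ORDS code.

class Core:
    """The core knows only module interfaces; concrete modules are injected."""

    def __init__(self, storage, security):
        self.storage = storage      # e.g. a database-handling module
        self.security = security    # e.g. a security-protocol module

    def submit(self, resource):
        if not self.security.authorize(resource):
            raise PermissionError("rejected by security module")
        return self.storage.save(resource)


class InMemoryStorage:
    """A swappable storage module; a real one might wrap a database."""
    def __init__(self):
        self.resources = []

    def save(self, resource):
        self.resources.append(resource)
        return len(self.resources) - 1  # a hypothetical resource id


class AllowAll:
    """A placeholder security module that accepts everything."""
    def authorize(self, resource):
        return True


# A minimal FHIR-style Observation (heart rate), as modules would exchange it:
observation = {
    "resourceType": "Observation",
    "status": "final",
    "code": {"coding": [{"system": "http://loinc.org", "code": "8867-4"}]},
    "valueQuantity": {"value": 72, "unit": "beats/minute"},
}

core = Core(storage=InMemoryStorage(), security=AllowAll())
print(core.submit(observation))
```

Because the core never instantiates its own modules, swapping the in-memory storage for a database-backed module, or the permissive security stub for a real protocol, requires no change to the core, which is the flexibility argument the abstract makes.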
NASA Astrophysics Data System (ADS)
The CHAIN-REDS Project is organising a workshop on "e-Infrastructures for e-Sciences" focusing on Cloud Computing and Data Repositories under the aegis of the European Commission and in co-location with the International Conference on e-Science 2013 (IEEE2013) that will be held in Beijing, P.R. of China on October 17-22, 2013. The core objective of the CHAIN-REDS project is to promote, coordinate and support the effort of a critical mass of non-European e-Infrastructures for Research and Education to collaborate with Europe addressing interoperability and interoperation of Grids and other Distributed Computing Infrastructures (DCI). From this perspective, CHAIN-REDS will optimise the interoperation of European infrastructures with those present in 6 other regions of the world, both from a development and use point of view, and catering to different communities. Overall, CHAIN-REDS will provide input for future strategies and decision-making regarding collaboration with other regions on e-Infrastructure deployment and availability of related data; it will raise the visibility of e-Infrastructures towards intercontinental audiences, covering most of the world, and will provide support to establish globally connected and interoperable infrastructures, in particular between the EU and the developing regions. Organised by IHEP, INFN and Sigma Orionis with the support of all project partners, this workshop will aim at:
- Presenting the state of the art of Cloud computing in Europe and in China and discussing the opportunities offered by having interoperable and federated e-Infrastructures;
- Exploring the existing initiatives on Data Infrastructures in Europe and China, and highlighting the Data Repositories of interest for the Virtual Research Communities in several domains such as Health, Agriculture, Climate, etc.