Putting the School Interoperability Framework to the Test
ERIC Educational Resources Information Center
Mercurius, Neil; Burton, Glenn; Hopkins, Bill; Larsen, Hans
2004-01-01
The Jurupa Unified School District in Southern California recently partnered with Microsoft, Dell and the Zone Integration Group for the implementation of a School Interoperability Framework (SIF) database repository model throughout the district (Magner 2002). A two-week project--the Integrated District Education Applications System, better known…
The Future of Library Automation in Schools.
ERIC Educational Resources Information Center
Anderson, Elaine
2000-01-01
Addresses the future of library automation programs for schools. Discusses requirements of emerging OPACs and circulation systems; the Schools Interoperability Framework (SIF), an industry initiative to develop an open specification for ensuring that K-12 instructional and administrative software applications work together more effectively; home…
ERIC Educational Resources Information Center
Waters, John K.
2008-01-01
Data integration is one of the single most challenging tasks any district can face. Fortunately for school districts throughout the country with data scattered in disparate systems, an open specification known as the Schools Interoperability Framework (SIF) is mitigating that challenge. SIF has emerged as a cornerstone of K-12 data warehousing,…
Making Network Markets in Education: The Development of Data Infrastructure in Australian Schooling
ERIC Educational Resources Information Center
Sellar, Sam
2017-01-01
This paper examines the development of data infrastructure in Australian schooling with a specific focus on interoperability standards that help to make new markets for education data. The conceptual framework combines insights from studies of infrastructure, economic markets and digital data. The case of the Australian National Schools…
ERIC Educational Resources Information Center
Levine, Elliott
2002-01-01
Describes how to build a data warehouse, using the Schools Interoperability Framework (www.sifinfo.org), that supports data-driven decision making and complies with the Freedom of Information Act. Provides several suggestions for building and maintaining a data warehouse. (PKP)
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-25
...-01] NIST Framework and Roadmap for Smart Grid Interoperability Standards, Release 2.0 (Draft... draft version of the NIST Framework and Roadmap for Smart Grid Interoperability Standards, Release 2.0... Roadmap for Smart Grid Interoperability Standards, Release 2.0 (Release 2.0) (Draft) for public review and...
A step-by-step methodology for enterprise interoperability projects
NASA Astrophysics Data System (ADS)
Chalmeta, Ricardo; Pazos, Verónica
2015-05-01
Enterprise interoperability is one of the key factors for enhancing enterprise competitiveness. Achieving enterprise interoperability is an extremely complex process which involves different technological, human and organisational elements. In this paper we present a framework to help achieve enterprise interoperability. The framework has been developed taking into account the three domains of interoperability: Enterprise Modelling, Architecture and Platform, and Ontologies. The main novelty of the framework in comparison to existing ones is that it includes a step-by-step methodology that explains how to carry out an enterprise interoperability project taking into account different interoperability views, such as business, process, human resources, technology, knowledge and semantics.
PACS/information systems interoperability using Enterprise Communication Framework.
alSafadi, Y; Lord, W P; Mankovich, N J
1998-06-01
Interoperability among healthcare applications goes beyond connectivity to allow components to exchange structured information and work together in a predictable, coordinated fashion. To facilitate building an interoperability infrastructure, an Enterprise Communication Framework (ECF) was developed by the members of the Andover Working Group for Healthcare Interoperability (AWG-OHI). The ECF consists of four models: 1) Use Case Model, 2) Domain Information Model (DIM), 3) Interaction Model, and 4) Message Model. To realize this framework, a software component called the Enterprise Communicator (EC) is used. In this paper, we will demonstrate the use of the framework in interoperating a picture archiving and communication system (PACS) with a radiology information system (RIS).
ERIC Educational Resources Information Center
Waters, John K.
2009-01-01
This article introduces Schools Interoperability Framework (SIF), a specification for data sharing among educational software applications that has grown to 10 disparate software applications. This new version (code name Columbus) is likely to give districts more vendors to choose from--maybe a lot more--because it will be arriving with a profound…
System and methods of resource usage using an interoperable management framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heileman, Gregory L.; Jamkhedkar, Pramod A.; Lamb, Christopher C.
A generic rights expression language allows interoperability across different computing environments, including resource usage by different applications. A formal framework for usage management provides scaffolding upon which interoperable usage management systems can be built. Certain features of the framework, such as the operational semantics, are standardized, while other areas are deliberately left free of standards, allowing the choice and innovation needed to balance flexibility and usability for interoperability in usage management systems.
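As a rough illustration of the idea in the abstract above, the following sketch evaluates a tiny, hypothetical usage license against a requested action. It is not the authors' rights expression language; the rule fields and class names are invented for illustration, and the "operational semantics" here is simply that a use is permitted only if some matching rule has no violated constraint.

```python
# Minimal sketch of a usage-management check. Hypothetical structure, not the
# authors' language: a License holds UsageRules, and permits() applies a
# simple, fixed operational semantics over them.
from dataclasses import dataclass, field
from datetime import datetime
from typing import Dict, List, Optional


@dataclass
class UsageRule:
    action: str                          # e.g. "view", "copy"
    max_uses: Optional[int] = None       # None means unlimited
    expires: Optional[datetime] = None   # None means no expiry


@dataclass
class License:
    resource: str
    rules: List[UsageRule] = field(default_factory=list)
    use_counts: Dict[str, int] = field(default_factory=dict)

    def permits(self, action: str, when: datetime) -> bool:
        """A use is allowed only if some rule for the action has no violated constraint."""
        for rule in self.rules:
            if rule.action != action:
                continue
            if rule.expires is not None and when > rule.expires:
                continue
            if rule.max_uses is not None and self.use_counts.get(action, 0) >= rule.max_uses:
                continue
            return True
        return False

    def record_use(self, action: str) -> None:
        self.use_counts[action] = self.use_counts.get(action, 0) + 1


lic = License("doc-42", rules=[UsageRule("view", max_uses=3)])
print(lic.permits("view", datetime.now()))  # True
lic.record_use("view")
```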
NASA Astrophysics Data System (ADS)
Tomas, Robert; Harrison, Matthew; Barredo, José I.; Thomas, Florian; Llorente Isidro, Miguel; Cerba, Otakar; Pfeiffer, Manuela
2014-05-01
The vast amount of information and data necessary for comprehensive hazard and risk assessment presents many challenges regarding the lack of accessibility, comparability, quality, organisation and dissemination of natural hazards spatial data. In order to mitigate these limitations, an interoperable framework has been developed within the development of the legally binding Implementing Rules of the EU INSPIRE Directive [1], which aims at establishing the European Spatial Data Infrastructure. The interoperability framework is described in the Data Specification on Natural Risk Zones - Technical Guidelines (DS) document [2], which was finalized and published on 10 December 2013. This framework provides means for facilitating access, integration, harmonisation and dissemination of natural hazard data from different domains and sources. The objective of this paper is twofold. Firstly, the paper demonstrates the applicability of the interoperable framework developed in the DS and highlights the key aspects of interoperability for the various natural hazards communities. Secondly, the paper "translates" into common language the main features and potential of the interoperable framework of the DS for a wider audience of scientists and practitioners in the natural hazards domain. Further in this paper, the five main aspects of the interoperable framework will be presented. First, the issue of a common terminology for the natural hazards domain will be addressed. Second, a common data model to facilitate cross-domain data integration will follow. Third, the common methodology developed to provide qualitative or quantitative assessments of natural hazards will be presented. Fourth, the extensible classification schema for natural hazards, developed from a literature review and key reference documents from the contributing community of practice, will be shown. Finally, the applicability of the interoperable framework for the various stakeholder groups will also be presented. This paper closes by discussing open issues and next steps regarding the sustainability and evolution of the interoperable framework and missing aspects such as multi-hazard and multi-risk. [1] INSPIRE - Infrastructure for Spatial Information in Europe, http://inspire.ec.europa.eu [2] http://inspire.jrc.ec.europa.eu/documents/Data_Specifications/INSPIRE_DataSpecification_NZ_v3.0.pdf
Approaching semantic interoperability in Health Level Seven
Alschuler, Liora
2010-01-01
‘Semantic Interoperability’ is a driving objective behind many of Health Level Seven's standards. The objective in this paper is to take a step back, and consider what semantic interoperability means, assess whether or not it has been achieved, and, if not, determine what concrete next steps can be taken to get closer. A framework for measuring semantic interoperability is proposed, using a technique called the ‘Single Logical Information Model’ framework, which relies on an operational definition of semantic interoperability and an understanding that interoperability improves incrementally. Whether semantic interoperability tomorrow will enable one computer to talk to another, much as one person can talk to another person, is a matter for speculation. It is assumed, however, that what gets measured gets improved, and in that spirit this framework is offered as a means to improvement. PMID:21106995
A development framework for semantically interoperable health information systems.
Lopez, Diego M; Blobel, Bernd G M E
2009-02-01
Semantic interoperability is a basic challenge to be met for new generations of distributed, communicating and co-operating health information systems (HIS) enabling shared care and e-Health. Analysis, design, implementation and maintenance of such systems and their intrinsic architectures have to follow a unified development methodology. The Generic Component Model (GCM) is used as a framework for modeling any system to evaluate and harmonize state-of-the-art architecture development approaches and standards for health information systems, as well as to derive a coherent architecture development framework for sustainable, semantically interoperable HIS and their components. The proposed methodology is based on the Rational Unified Process (RUP), taking advantage of its flexibility to be configured for integrating other architectural approaches such as Service-Oriented Architecture (SOA), Model-Driven Architecture (MDA), ISO 10746, and the HL7 Development Framework (HDF). Existing architectural approaches have been analyzed, compared and finally harmonized towards an architecture development framework for advanced health information systems. Starting with the requirements for semantic interoperability derived from paradigm changes for health information systems, and supported by formal software process engineering methods, an appropriate development framework for semantically interoperable HIS has been provided. The usability of the framework has been exemplified in a public health scenario.
Sinaci, A Anil; Laleci Erturkmen, Gokce B
2013-10-01
In order to enable secondary use of Electronic Health Records (EHRs) by bridging the interoperability gap between the clinical care and research domains, this paper introduces a unified methodology and supporting framework that brings together the power of metadata registries (MDR) and semantic web technologies. We introduce a federated semantic metadata registry framework by extending the ISO/IEC 11179 standard, and enable integration of data element registries through Linked Open Data (LOD) principles, where each Common Data Element (CDE) can be uniquely referenced, queried and processed to enable syntactic and semantic interoperability. Each CDE and its components are maintained as LOD resources enabling semantic links with other CDEs, terminology systems and implementation-dependent content models, hence facilitating semantic search, more effective reuse and semantic interoperability across different application domains. There are several important efforts addressing semantic interoperability in the healthcare domain, such as the IHE DEX profile proposal, CDISC SHARE and CDISC2RDF. Our architecture complements these by providing a framework to interlink existing data element registries and repositories, multiplying their potential for semantic interoperability to a greater extent. The open-source implementation of the federated semantic MDR framework presented in this paper is the core of the semantic interoperability layer of the SALUS project, which enables the execution of post-marketing safety analysis studies on top of existing EHR systems. Copyright © 2013 Elsevier Inc. All rights reserved.
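To make the Linked Open Data idea above concrete, the sketch below publishes one Common Data Element as an RDF resource using rdflib. The namespace, property names and URIs are invented stand-ins, not the SALUS or ISO/IEC 11179 vocabulary; only the LOINC code shown is a real terminology reference.

```python
# Sketch: a Common Data Element (CDE) expressed as a Linked Open Data resource,
# with links to a terminology system and to an equivalent element in another
# (hypothetical) registry. URIs and property names are illustrative only.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF

MDR = Namespace("http://example.org/mdr/")            # hypothetical registry namespace
CDE = URIRef("http://example.org/mdr/cde/systolic-bp")

g = Graph()
g.bind("mdr", MDR)
g.add((CDE, RDF.type, MDR.DataElement))
g.add((CDE, MDR.definition, Literal("Systolic blood pressure, sitting, mmHg")))
# Semantic link to a terminology system (LOINC code for systolic blood pressure).
g.add((CDE, MDR.representedBy, URIRef("http://loinc.org/8480-6")))
# Link to another registry's equivalent element, enabling federation.
g.add((CDE, MDR.sameConceptAs, URIRef("http://example.org/other-mdr/cde/sbp")))

print(g.serialize(format="turtle"))
```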
USDA-ARS?s Scientific Manuscript database
Environmental modeling framework (EMF) design goals are multi-dimensional and often include many aspects of general software framework development. Many functional capabilities offered by current EMFs are closely related to interoperability and reuse aspects. For example, an EMF needs to support dev...
75 FR 63462 - Smart Grid Interoperability Standards; Notice of Docket Designation for Smart Grid...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-15
... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. RM11-2-000] Smart Grid Interoperability Standards; Notice of Docket Designation for Smart Grid Interoperability Standards October 7, 2010... directs the development of a framework to achieve interoperability of smart grid devices and systems...
Towards technical interoperability in telemedicine.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Craft, Richard Layne, II
2004-05-01
For telemedicine to realize the vision of anywhere, anytime access to care, the question of how to create a fully interoperable technical infrastructure must be addressed. After briefly discussing how 'technical interoperability' compares with other types of interoperability being addressed in the telemedicine community today, this paper describes reasons for pursuing technical interoperability, presents a proposed framework for realizing technical interoperability, identifies key issues that will need to be addressed if technical interoperability is to be achieved, and suggests a course of action that the telemedicine community might follow to accomplish this goal.
An Interoperability Framework and Capability Profiling for Manufacturing Software
NASA Astrophysics Data System (ADS)
Matsuda, M.; Arai, E.; Nakano, N.; Wakai, H.; Takeda, H.; Takata, M.; Sasaki, H.
ISO/TC184/SC5/WG4 is working on ISO 16100: Manufacturing software capability profiling for interoperability. This paper reports on a manufacturing software interoperability framework and a capability profiling methodology which were proposed and developed through this international standardization activity. Within the context of a manufacturing application, a manufacturing software unit is considered to be capable of performing a specific set of functions defined by a manufacturing software system architecture. A manufacturing software interoperability framework consists of a set of elements and rules for describing the capability of software units to support the requirements of a manufacturing application. The capability profiling methodology makes use of the domain-specific attributes and methods associated with each specific software unit to describe capability profiles in terms of unit name, manufacturing functions, and other needed class properties. In this methodology, manufacturing software requirements are expressed in terms of software unit capability profiles.
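The following sketch illustrates the capability-profiling idea in the abstract above: each software unit carries a profile of the manufacturing functions it supports, and a requirement is matched against registered profiles. The class and field names are invented for illustration and do not follow the ISO 16100 schema.

```python
# Sketch of capability profiling and matching. A unit's profile lists its
# manufacturing functions; a requirement is satisfied if the profile covers
# every required function. Names are illustrative, not the ISO 16100 model.
from dataclasses import dataclass
from typing import FrozenSet, List, Set, Tuple


@dataclass(frozen=True)
class CapabilityProfile:
    unit_name: str
    manufacturing_functions: FrozenSet[str]   # e.g. {"scheduling", "dispatching"}
    properties: Tuple[str, ...] = ()          # other needed class properties


def find_capable_units(requirement: Set[str],
                       profiles: List[CapabilityProfile]) -> List[str]:
    """Return the units whose profiles cover every required function."""
    return [p.unit_name for p in profiles
            if requirement <= p.manufacturing_functions]


profiles = [
    CapabilityProfile("CellControllerA", frozenset({"scheduling", "dispatching"})),
    CapabilityProfile("PlannerB", frozenset({"scheduling"})),
]
print(find_capable_units({"scheduling", "dispatching"}, profiles))  # ['CellControllerA']
```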
The interoperability force in the ERP field
NASA Astrophysics Data System (ADS)
Boza, Andrés; Cuenca, Llanos; Poler, Raúl; Michaelides, Zenon
2015-04-01
Enterprise resource planning (ERP) systems participate in interoperability projects and this participation sometimes leads to new proposals for the ERP field. The aim of this paper is to identify the role that interoperability plays in the evolution of ERP systems. To go about this, ERP systems have been first identified within interoperability frameworks. Second, the initiatives in the ERP field driven by interoperability requirements have been identified from two perspectives: technological and business. The ERP field is evolving from classical ERP as information system integrators to a new generation of fully interoperable ERP. Interoperability is changing the way of running business, and ERP systems are changing to adapt to the current stream of interoperability.
NASA Astrophysics Data System (ADS)
Graves, S. J.; Keiser, K.; Law, E.; Yang, C. P.; Djorgovski, S. G.
2016-12-01
ECITE (EarthCube Integration and Testing Environment) is providing both cloud-based computational testing resources and an Assessment Framework for Technology Interoperability and Integration. NSF's EarthCube program is funding the development of cyberinfrastructure building block components as technologies to address Earth science research problems. These EarthCube building blocks need to support integration and interoperability objectives to work towards a coherent cyberinfrastructure architecture for the program. ECITE is being developed to provide capabilities to test and assess interoperability and integration across funded EarthCube technology projects. EarthCube-defined criteria for interoperability and integration are applied to use cases coordinating science problems with technology solutions. The Assessment Framework facilitates planning, execution and documentation of the technology assessments for review by the EarthCube community. This presentation will describe the components of ECITE and examine the methodology of crosswalking between science and technology use cases.
A Framework for Seamless Interoperation of Heterogeneous Distributed Software Components
2005-05-01
interoperability, b) distributed resource discovery, and c) validation of quality requirements. Principles and prototypical systems were created to demonstrate the successful completion of the research.
Information Management Challenges in Achieving Coalition Interoperability
2001-12-01
…by J. Dyer. Session I: Architectures and Standards: Fundamental Issues (Chairman: Dr I. White, UK); "Planning for Interoperability" by W.M. Gentleman. …framework - a crucial step toward achieving coalition C4I interoperability. Topics to be covered: 1) maintaining secure interoperability; 2) command system interfaces…
Ethier, Jean-François; Dameron, Olivier; Curcin, Vasa; McGilchrist, Mark M; Verheij, Robert A; Arvanitis, Theodoros N; Taweel, Adel; Delaney, Brendan C; Burgun, Anita
2013-01-01
Objective: Biomedical research increasingly relies on the integration of information from multiple heterogeneous data sources. Despite the fact that structural and terminological aspects of interoperability are interdependent and rely on a common set of requirements, current efforts typically address them in isolation. We propose a unified ontology-based knowledge framework to facilitate interoperability between heterogeneous sources, and investigate if using the LexEVS terminology server is a viable implementation method. Materials and methods: We developed a framework based on an ontology, the general information model (GIM), to unify structural models and terminologies, together with relevant mapping sets. This allowed a uniform access to these resources within LexEVS to facilitate interoperability by various components and data sources from implementing architectures. Results: Our unified framework has been tested in the context of the EU Framework Program 7 TRANSFoRm project, where it was used to achieve data integration in a retrospective diabetes cohort study. The GIM was successfully instantiated in TRANSFoRm as the clinical data integration model, and necessary mappings were created to support effective information retrieval for software tools in the project. Conclusions: We present a novel, unifying approach to address interoperability challenges in heterogeneous data sources, by representing structural and semantic models in one framework. Systems using this architecture can rely solely on the GIM that abstracts over both the structure and coding. Information models, terminologies and mappings are all stored in LexEVS and can be accessed in a uniform manner (implementing the HL7 CTS2 service functional model). The system is flexible and should reduce the effort needed from data sources personnel for implementing and managing the integration. PMID:23571850
Scientific Digital Libraries, Interoperability, and Ontologies
NASA Technical Reports Server (NTRS)
Hughes, J. Steven; Crichton, Daniel J.; Mattmann, Chris A.
2009-01-01
Scientific digital libraries serve complex and evolving research communities. Justifications for the development of scientific digital libraries include the desire to preserve science data and the promises of information interconnectedness, correlative science, and system interoperability. Shared ontologies are fundamental to fulfilling these promises. We present a tool framework, some informal principles, and several case studies where shared ontologies are used to guide the implementation of scientific digital libraries. The tool framework, based on an ontology modeling tool, was configured to develop, manage, and keep shared ontologies relevant within changing domains and to promote the interoperability, interconnectedness, and correlation desired by scientists.
Lemnos interoperable security project.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Halbgewachs, Ronald D.
2010-03-01
With the Lemnos framework, interoperability of control security equipment is straightforward. To obtain interoperability between proprietary security appliance units, one or both vendors must now write cumbersome 'translation code.' If one party changes something, the translation code 'breaks.' The Lemnos project is developing and testing a framework that uses widely available security functions and protocols like IPsec - to form a secure communications channel - and Syslog, to exchange security log messages. Using this model, security appliances from two or more different vendors can clearly and securely exchange information, helping to better protect the total system. Simplify regulatory compliance in a complicated security environment by leveraging the Lemnos framework. As an electric utility, are you struggling to implement the NERC CIP standards and other regulations? Are you weighing the misery of multiple management interfaces against committing to a ubiquitous single-vendor solution? When vendors build their security appliances to interoperate using the Lemnos framework, it becomes practical to match best-of-breed offerings from an assortment of vendors to your specific control systems needs. The Lemnos project is developing and testing a framework that uses widely available open-source security functions and protocols like IPsec and Syslog to create a secure communications channel between appliances in order to exchange security data.
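The Syslog side of such an exchange can be sketched with the Python standard library alone. The example below assumes a syslog daemon is listening on localhost:514/UDP; the key/value payload is an illustrative convention for keeping events machine-readable across vendors, not the Lemnos message profile itself.

```python
# Sketch: a security appliance emitting a structured security event over
# syslog, so a peer from another vendor can parse it. Assumes a local syslog
# daemon on UDP port 514; the event fields are illustrative only.
import logging
import logging.handlers

logger = logging.getLogger("security-appliance")
logger.setLevel(logging.INFO)
handler = logging.handlers.SysLogHandler(
    address=("localhost", 514),
    facility=logging.handlers.SysLogHandler.LOG_AUTH,
)
logger.addHandler(handler)

# Key/value pairs keep the event machine-readable across vendors.
logger.info("event=ipsec_tunnel_up peer=192.0.2.10 policy=control-net")
```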
Ethier, J-F; Curcin, V; Barton, A; McGilchrist, M M; Bastiaens, H; Andreasson, A; Rossiter, J; Zhao, L; Arvanitis, T N; Taweel, A; Delaney, B C; Burgun, A
2015-01-01
This article is part of the Focus Theme of METHODS of Information in Medicine on "Managing Interoperability and Complexity in Health Systems". Primary care data is the single richest source of routine health care data. However its use, both in research and clinical work, often requires data from multiple clinical sites, clinical trials databases and registries. Data integration and interoperability are therefore of utmost importance. TRANSFoRm's general approach relies on a unified interoperability framework, described in a previous paper. We developed a core ontology for an interoperability framework based on data mediation. This article presents how such an ontology, the Clinical Data Integration Model (CDIM), can be designed to support, in conjunction with appropriate terminologies, biomedical data federation within TRANSFoRm, an EU FP7 project that aims to develop the digital infrastructure for a learning healthcare system in European Primary Care. TRANSFoRm utilizes a unified structural / terminological interoperability framework, based on the local-as-view mediation paradigm. Such an approach mandates the global information model to describe the domain of interest independently of the data sources to be explored. Following a requirement analysis process, no ontology focusing on primary care research was identified and, thus we designed a realist ontology based on Basic Formal Ontology to support our framework in collaboration with various terminologies used in primary care. The resulting ontology has 549 classes and 82 object properties and is used to support data integration for TRANSFoRm's use cases. Concepts identified by researchers were successfully expressed in queries using CDIM and pertinent terminologies. As an example, we illustrate how, in TRANSFoRm, the Query Formulation Workbench can capture eligibility criteria in a computable representation, which is based on CDIM. A unified mediation approach to semantic interoperability provides a flexible and extensible framework for all types of interaction between health record systems and research systems. CDIM, as core ontology of such an approach, enables simplicity and consistency of design across the heterogeneous software landscape and can support the specific needs of EHR-driven phenotyping research using primary care data.
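The abstract above mentions capturing eligibility criteria in a computable representation. The sketch below shows what such a representation might look like and how it could be evaluated against a flat patient record; the structure and the evaluator are invented for illustration and are not CDIM or the TRANSFoRm Query Formulation Workbench format, although the SNOMED CT and LOINC codes shown are real.

```python
# Sketch of a computable eligibility criterion and a tiny evaluator.
# The dictionary structure is hypothetical; only the terminology codes are real.
CRITERION = {
    "description": "Adults with type 2 diabetes and HbA1c above 7%",
    "all_of": [
        {"kind": "condition", "present": True,
         "concept": "http://snomed.info/id/44054006"},   # type 2 diabetes mellitus
        {"kind": "observation", "operator": ">", "value": 7.0, "unit": "%",
         "concept": "http://loinc.org/4548-4"},           # HbA1c
        {"kind": "age", "operator": ">=", "value": 18, "unit": "years"},
    ],
}


def patient_matches(patient: dict, criterion: dict) -> bool:
    """Evaluate every clause of the criterion against a flat patient record."""
    ops = {">": lambda a, b: a > b, ">=": lambda a, b: a >= b}
    for clause in criterion["all_of"]:
        if clause["kind"] == "condition":
            if (clause["concept"] in patient.get("conditions", [])) != clause["present"]:
                return False
        elif clause["kind"] == "observation":
            value = patient.get("observations", {}).get(clause["concept"])
            if value is None or not ops[clause["operator"]](value, clause["value"]):
                return False
        elif clause["kind"] == "age":
            if not ops[clause["operator"]](patient.get("age", 0), clause["value"]):
                return False
    return True


patient = {"age": 54,
           "conditions": ["http://snomed.info/id/44054006"],
           "observations": {"http://loinc.org/4548-4": 8.2}}
print(patient_matches(patient, CRITERION))  # True
```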
NASA Astrophysics Data System (ADS)
Fox, P. A.; Diviacco, P.; Busato, A.
2016-12-01
Geo-scientific research collaboration commonly faces complex systems where multiple skills and competences are needed at the same time. The efficacy of such collaboration among researchers then becomes of paramount importance. Multidisciplinary studies draw from domains that are far from each other. Researchers also need to understand how to extract the data they need and eventually produce something that can be used by others. The management of information and knowledge in this perspective is non-trivial. Interoperability is frequently sought in computer-to-computer environments, so as to overcome mismatches in vocabulary, data formats, coordinate reference systems and so on. Successful researcher collaboration also relies on interoperability of the people. Smaller, synchronous and face-to-face settings for researchers are known to enhance people interoperability. However, changing settings, whether geographically, temporally, or by increasing team size, diversity, and expertise, requires people-computer-people-computer (...) interoperability. To date, knowledge representation frameworks have been proposed but not proven necessary and sufficient to achieve multi-way interoperability. In this contribution, we address the epistemology and sociology of science, advocating for a fluid perspective where science is mostly a social construct conditioned by cognitive issues, especially cognitive bias. Bias cannot be obliterated; on the contrary, it must be carefully taken into consideration. Information-centric interfaces built from different perspectives and ways of thinking by actors with different points of view, approaches and aims are proposed as a means for enhancing people interoperability in computer-based settings. The contribution will provide details on the approach of augmenting knowledge representation frameworks and interfacing them to the cognitive-conceptual frameworks for people that are needed to meet and exceed collaborative research goals in the 21st century. A web-based collaborative portal that integrates both approaches has been developed and will be presented, along with reports on initial tests that show encouraging results.
A distributed component framework for science data product interoperability
NASA Technical Reports Server (NTRS)
Crichton, D.; Hughes, S.; Kelly, S.; Hardman, S.
2000-01-01
Correlation of science results from multi-disciplinary communities is a difficult task. Traditionally data from science missions is archived in proprietary data systems that are not interoperable. The Object Oriented Data Technology (OODT) task at the Jet Propulsion Laboratory is working on building a distributed product server as part of a distributed component framework to allow heterogeneous data systems to communicate and share scientific results.
The Health Service Bus: an architecture and case study in achieving interoperability in healthcare.
Ryan, Amanda; Eklund, Peter
2010-01-01
Interoperability in healthcare is a requirement for effective communication between entities, to ensure timely access to up-to-date patient information and medical knowledge, and thus to facilitate consistent patient care. An interoperability framework called the Health Service Bus (HSB), based on the Enterprise Service Bus (ESB) middleware software architecture, is presented here as a solution to all three levels of interoperability as defined by the HL7 EHR Interoperability Work Group in their definitive white paper "Coming to Terms". A prototype HSB system was implemented based on the Mule open-source ESB and is outlined and discussed, followed by a clinically based example.
The role of architecture and ontology for interoperability.
Blobel, Bernd; González, Carolina; Oemig, Frank; Lopéz, Diego; Nykänen, Pirkko; Ruotsalainen, Pekka
2010-01-01
Turning from organization-centric to process-controlled or even to personalized approaches, advanced healthcare settings have to meet special interoperability challenges. eHealth and pHealth solutions must assure interoperability between actors cooperating to achieve common business objectives. The interoperability chain here includes not only individually tailored technical systems, but also sensors and actuators. For enabling corresponding pervasive computing and even autonomic computing, individualized systems have to be based on an architecture framework covering many domains, scientifically managed by specialized disciplines using their specific ontologies in a formalized way. Therefore, interoperability has to advance from a communication protocol to an architecture-centric approach mastering ontology coordination challenges.
Zhao, Lei; Lim Choi Keung, Sarah N; Taweel, Adel; Tyler, Edward; Ogunsina, Ire; Rossiter, James; Delaney, Brendan C; Peterson, Kevin A; Hobbs, F D Richard; Arvanitis, Theodoros N
2012-01-01
Heterogeneous data models and coding schemes for electronic health records present challenges for automated search across distributed data sources. This paper describes a loosely coupled software framework based on the terminology controlled approach to enable the interoperation between the search interface and heterogeneous data sources. Software components interoperate via common terminology service and abstract criteria model so as to promote component reuse and incremental system evolution.
Open data models for smart health interconnected applications: the example of openEHR.
Demski, Hans; Garde, Sebastian; Hildebrand, Claudia
2016-10-22
Smart Health is known as a concept that enhances networking, intelligent data processing and combining patient data with other parameters. Open data models can play an important role in creating a framework for providing interoperable data services that support the development of innovative Smart Health applications profiting from data fusion and sharing. This article describes a model-driven engineering approach based on standardized clinical information models and explores its application for the development of interoperable electronic health record systems. The following possible model-driven procedures were considered: provision of data schemes for data exchange, automated generation of artefacts for application development and native platforms that directly execute the models. The applicability of the approach in practice was examined using the openEHR framework as an example. A comprehensive infrastructure for model-driven engineering of electronic health records is presented using the example of the openEHR framework. It is shown that data schema definitions to be used in common practice software development processes can be derived from domain models. The capabilities for automatic creation of implementation artefacts (e.g., data entry forms) are demonstrated. Complementary programming libraries and frameworks that foster the use of open data models are introduced. Several compatible health data platforms are listed. They provide standard based interfaces for interconnecting with further applications. Open data models help build a framework for interoperable data services that support the development of innovative Smart Health applications. Related tools for model-driven application development foster semantic interoperability and interconnected innovative applications.
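The model-driven idea described above, deriving application artefacts such as data-entry forms from clinical information models, can be illustrated with a very small sketch. The model format below is invented for illustration; it is not an openEHR archetype or template serialization, and the widget descriptions are hypothetical.

```python
# Sketch of model-driven artefact generation: a simplified clinical model is
# turned into data-entry form widgets automatically, so the form follows the
# model rather than being hand-coded. Model format and widget names are invented.
SIMPLIFIED_MODEL = {
    "concept": "Blood pressure",
    "items": [
        {"name": "systolic",  "type": "quantity", "unit": "mmHg", "range": (0, 300)},
        {"name": "diastolic", "type": "quantity", "unit": "mmHg", "range": (0, 200)},
        {"name": "position",  "type": "coded",    "codes": ["sitting", "standing", "lying"]},
    ],
}


def generate_form(model: dict) -> list:
    """Derive one widget description per model item."""
    widgets = []
    for item in model["items"]:
        if item["type"] == "quantity":
            widgets.append({"widget": "number_field", "label": item["name"],
                            "unit": item["unit"],
                            "min": item["range"][0], "max": item["range"][1]})
        elif item["type"] == "coded":
            widgets.append({"widget": "dropdown", "label": item["name"],
                            "options": item["codes"]})
    return widgets


for widget in generate_form(SIMPLIFIED_MODEL):
    print(widget)
```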
An overview of the model integration process: From pre ...
Integration of models requires linking models which can be developed using different tools, methodologies, and assumptions. We performed a literature review with the aim of improving our understanding of the model integration process, and also presenting better strategies for building integrated modeling systems. We identified five different phases to characterize the integration process: pre-integration assessment, preparation of models for integration, orchestration of models during simulation, data interoperability, and testing. Commonly, there is little reuse of existing frameworks beyond the development teams and not much sharing of science components across frameworks. We believe this must change to enable researchers and assessors to form complex workflows that leverage the current environmental science available. In this paper, we characterize the model integration process and compare integration practices of different groups. We highlight key strategies, features, standards, and practices that can be employed by developers to increase reuse and interoperability of science software components and systems. The paper provides a review of the literature regarding techniques and methods employed by various modeling system developers to facilitate science software interoperability. The intent of the paper is to illustrate the wide variation in methods and the limiting effect the variation has on inter-framework reuse and interoperability. A series of recommendations…
ERIC Educational Resources Information Center
Arms, William Y.; Hillmann, Diane; Lagoze, Carl; Krafft, Dean; Marisa, Richard; Saylor, John; Terizzi, Carol; Van de Sompel, Herbert; Gill, Tony; Miller, Paul; Kenney, Anne R.; McGovern, Nancy Y.; Botticelli, Peter; Entlich, Richard; Payette, Sandra; Berthon, Hilary; Thomas, Susan; Webb, Colin; Nelson, Michael L.; Allen, B. Danette; Bennett, Nuala A.; Sandore, Beth; Pianfetti, Evangeline S.
2002-01-01
Discusses digital libraries, including interoperability, metadata, and international standards; Web resource preservation efforts at Cornell University; digital preservation at the National Library of Australia; object persistence and availability; collaboration among libraries, museums and elementary schools; Asian digital libraries; and a Web…
NASA Astrophysics Data System (ADS)
Arney, David; Goldman, Julian M.; Whitehead, Susan F.; Lee, Insup
When an x-ray image is needed during surgery, clinicians may stop the anesthesia machine ventilator while the exposure is made. If the ventilator is not restarted promptly, the patient may experience severe complications. This paper explores the interconnection of a ventilator and simulated x-ray into a prototype plug-and-play medical device system. This work assists ongoing interoperability framework development standards efforts to develop functional and non-functional requirements, and illustrates the potential patient safety benefits of interoperable medical device systems by implementing a solution to a clinical use case requiring interoperability.
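The clinical use case above can be sketched as a supervisory safety interlock: the ventilator is paused only for the duration of the exposure and is restarted automatically if the pause exceeds a safety limit. The devices below are simulated and all names and the timeout value are invented for illustration; this is not the authors' prototype code.

```python
# Sketch of a plug-and-play safety interlock between a ventilator and an x-ray.
# Simulated devices only; the watchdog guarantees the ventilator restarts even
# if the exposure step fails or overruns.
import threading
import time

MAX_PAUSE_SECONDS = 10.0  # illustrative safety limit


class SimulatedVentilator:
    def __init__(self):
        self.running = True

    def pause(self):
        self.running = False
        print("ventilator paused")

    def resume(self):
        self.running = True
        print("ventilator resumed")


def exposure_with_interlock(ventilator, take_exposure):
    """Pause ventilation, take the exposure, and guarantee a restart."""
    watchdog = threading.Timer(MAX_PAUSE_SECONDS, ventilator.resume)
    ventilator.pause()
    watchdog.start()
    try:
        take_exposure()
    finally:
        watchdog.cancel()
        if not ventilator.running:
            ventilator.resume()


if __name__ == "__main__":
    vent = SimulatedVentilator()
    exposure_with_interlock(vent, lambda: time.sleep(0.5))  # simulated x-ray exposure
```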
The Next Stage: Moving from Isolated Digital Collections to Interoperable Digital Libraries.
ERIC Educational Resources Information Center
Besser, Howard
2002-01-01
Presents a conceptual framework for digital library development and discusses how to move from isolated digital collections to interoperable digital libraries. Topics include a history of digital libraries; user-centered architecture; stages of technological development; standards, including metadata; and best practices. (Author/LRW)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cintuglu, Mehmet Hazar; Youssef, Tarek; Mohammed, Osama A.
2016-08-10
This article presents the development and application of a real-time testbed for multiagent system interoperability. As utility independent private microgrids are installed constantly, standardized interoperability frameworks are required to define behavioral models of the individual agents for expandability and plug-and-play operation. In this paper, we propose a comprehensive hybrid agent framework combining the foundation for intelligent physical agents (FIPA), IEC 61850, and data distribution service (DDS) standards. The IEC 61850 logical node concept is extended using FIPA based agent communication language (ACL) with application specific attributes and deliberative behavior modeling capability. The DDS middleware is adopted to enable a real-time publisher-subscriber interoperability mechanism between platforms. The proposed multi-agent framework was validated in a laboratory based testbed involving developed intelligent electronic device (IED) prototypes and actual microgrid setups. Experimental results were demonstrated for both decentralized and distributed control approaches. Secondary and tertiary control levels of a microgrid were demonstrated for decentralized hierarchical control case study. A consensus-based economic dispatch case study was demonstrated as a distributed control example. Finally, it was shown that the developed agent platform is industrially applicable for actual smart grid field deployment.
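The two ideas in the abstract above, FIPA-ACL-style agent messages and DDS-style publish/subscribe transport, can be sketched together as follows. The message fields loosely follow the ACL structure; the in-process bus is purely illustrative and is not a DDS implementation, and the topic and content names are invented.

```python
# Sketch: an ACL-like agent message carried over a minimal topic-based
# publish/subscribe bus standing in for DDS middleware. Illustrative only.
from collections import defaultdict
from dataclasses import dataclass


@dataclass
class AclMessage:
    performative: str   # e.g. "inform", "request"
    sender: str
    receiver: str
    ontology: str       # e.g. a reference to an IEC 61850 logical node
    content: dict


class MiniBus:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        for callback in self.subscribers[topic]:
            callback(message)


bus = MiniBus()
bus.subscribe("microgrid/dispatch",
              lambda m: print(f"{m.receiver} received {m.performative}: {m.content}"))
bus.publish("microgrid/dispatch",
            AclMessage("request", "tertiary-controller", "der-agent-1",
                       "IEC61850/MMXU", {"setpoint_kw": 25.0}))
```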
Employing Semantic Technologies for the Orchestration of Government Services
NASA Astrophysics Data System (ADS)
Sabol, Tomáš; Furdík, Karol; Mach, Marián
The main aim of eGovernment is to provide efficient, secure, inclusive services for citizens and businesses. The necessity to integrate services and information resources, to increase accessibility, and to reduce the administrative burden on citizens and enterprises - these are only a few reasons why the eGovernment paradigm has shifted from a supply-driven approach toward connected governance, emphasizing the concept of interoperability (Archmann and Nielsen 2008). On the EU level, interoperability is explicitly addressed as one of the four main challenges, including in the i2010 strategy (i2010 2005). The Commission's Communication (Interoperability for Pan-European eGovernment Services 2006) strongly emphasizes the necessity of interoperable eGovernment services, based on standards, open specifications, and open interfaces. The Pan-European interoperability initiatives, such as the European Interoperability Framework (2004) and IDABC, as well as many projects supported by the European Commission within the IST Program and the Competitiveness and Innovation Program (CIP), illustrate the importance of interoperability on the EU level.
Generic Educational Knowledge Representation for Adaptive and Cognitive Systems
ERIC Educational Resources Information Center
Caravantes, Arturo; Galan, Ramon
2011-01-01
The interoperability of educational systems, encouraged by the development of specifications, standards and tools related to the Semantic Web, is limited to the exchange of information in domain and student models. High system interoperability requires that a common framework be defined that represents the functional essence of educational systems.…
An Access Control and Trust Management Framework for Loosely-Coupled Multidomain Environments
ERIC Educational Resources Information Center
Zhang, Yue
2010-01-01
Multidomain environments where multiple organizations interoperate with each other are becoming a reality as can be seen in emerging Internet-based enterprise applications. Access control to ensure secure interoperation in such an environment is a crucial challenge. A multidomain environment can be categorized as "tightly-coupled" and…
Interoperability and information discovery
Christian, E.
2001-01-01
In the context of information systems, there is interoperability when the distinctions between separate information systems are not a barrier to accomplishing a task that spans those systems. Interoperability so defined implies that there are commonalities among the systems involved and that one can exploit such commonalities to achieve interoperability. The challenge of a particular interoperability task is to identify relevant commonalities among the systems involved and to devise mechanisms that exploit those commonalities. The present paper focuses on the particular interoperability task of information discovery. The Global Information Locator Service (GILS) is described as a policy, standards, and technology framework for addressing interoperable information discovery on a global and long-term basis. While there are many mechanisms for people to discover and use all manner of data and information resources, GILS initiatives exploit certain key commonalities that seem to be sufficient to realize useful information discovery interoperability at a global, long-term scale. This paper describes ten of the specific commonalities that are key to GILS initiatives. It presents some of the practical implications for organizations in various roles: content provider, system engineer, intermediary, and searcher. The paper also provides examples of interoperable information discovery as deployed using GILS in four types of information communities: bibliographic, geographic, environmental, and government.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-24
... Docket 07-100; FCC 11-6] Implementing a Nationwide, Broadband, Interoperable Public Safety Network in the... framework for the nationwide public safety broadband network. This document considers and proposes... broadband networks operating in the 700 MHz band. This document addresses public safety broadband network...
Jian, Wen-Shan; Hsu, Chien-Yeh; Hao, Te-Hui; Wen, Hsyien-Chia; Hsu, Min-Huei; Lee, Yen-Liang; Li, Yu-Chuan; Chang, Polun
2007-11-01
Traditional electronic health record (EHR) data are produced by various hospital information systems. Such data could not exist independently of an information system until the advent of XML technology. The interoperability of a healthcare system can be divided into two dimensions: functional interoperability and semantic interoperability. Currently, no single EHR standard exists that provides complete EHR interoperability. In order to establish a national EHR standard, we developed a set of local EHR templates. The Taiwan Electronic Medical Record Template (TMT) is a standard that aims to achieve semantic interoperability in EHR exchanges nationally. The TMT architecture is basically composed of forms, components, sections, and elements. Data are stored in the elements, which can be referenced by the code set, data type, and narrative block. The TMT was established with the following requirements in mind: (1) transformable to international standards; (2) minimal impact on the existing healthcare system; (3) easy to implement and deploy; and (4) compliant with Taiwan's current laws and regulations. The TMT provides a basis for building a portable, interoperable information infrastructure for EHR exchange in Taiwan.
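The form/component/section/element layering described for the TMT can be sketched with the standard library's ElementTree. The tag names, attributes and codes below are simplified stand-ins, not the actual TMT schema.

```python
# Sketch of an XML document following a TMT-like form > component > section >
# element layering. Illustrative structure only, not the real TMT template.
import xml.etree.ElementTree as ET

form = ET.Element("Form", id="discharge-summary")
component = ET.SubElement(form, "Component", name="diagnosis")
section = ET.SubElement(component, "Section", name="primary")
element = ET.SubElement(section, "Element", code="ICD-10:E11.9", dataType="CD")
element.text = "Type 2 diabetes mellitus without complications"

print(ET.tostring(form, encoding="unicode"))
```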
Toward interoperable bioscience data
Sansone, Susanna-Assunta; Rocca-Serra, Philippe; Field, Dawn; Maguire, Eamonn; Taylor, Chris; Hofmann, Oliver; Fang, Hong; Neumann, Steffen; Tong, Weida; Amaral-Zettler, Linda; Begley, Kimberly; Booth, Tim; Bougueleret, Lydie; Burns, Gully; Chapman, Brad; Clark, Tim; Coleman, Lee-Ann; Copeland, Jay; Das, Sudeshna; de Daruvar, Antoine; de Matos, Paula; Dix, Ian; Edmunds, Scott; Evelo, Chris T; Forster, Mark J; Gaudet, Pascale; Gilbert, Jack; Goble, Carole; Griffin, Julian L; Jacob, Daniel; Kleinjans, Jos; Harland, Lee; Haug, Kenneth; Hermjakob, Henning; Ho Sui, Shannan J; Laederach, Alain; Liang, Shaoguang; Marshall, Stephen; McGrath, Annette; Merrill, Emily; Reilly, Dorothy; Roux, Magali; Shamu, Caroline E; Shang, Catherine A; Steinbeck, Christoph; Trefethen, Anne; Williams-Jones, Bryn; Wolstencroft, Katherine; Xenarios, Ioannis; Hide, Winston
2012-01-01
To make full use of research data, the bioscience community needs to adopt technologies and reward mechanisms that support interoperability and promote the growth of an open ‘data commoning’ culture. Here we describe the prerequisites for data commoning and present an established and growing ecosystem of solutions using the shared ‘Investigation-Study-Assay’ framework to support that vision. PMID:22281772
A future-proof architecture for telemedicine using loose-coupled modules and HL7 FHIR.
Gøeg, Kirstine Rosenbeck; Rasmussen, Rune Kongsgaard; Jensen, Lasse; Wollesen, Christian Møller; Larsen, Søren; Pape-Haugaard, Louise Bilenberg
2018-07-01
Most telemedicine solutions are proprietary and disease-specific, which causes a heterogeneous and silo-oriented system landscape with limited interoperability. Solving the interoperability problem would require a strong focus on data integration and standardization in telemedicine infrastructures. Our objective was to suggest a future-proof architecture consisting of small, loosely coupled modules to allow flexible integration with new and existing services, and the use of international standards to allow high re-usability of modules and interoperability in the health IT landscape. We identified the core features of our future-proof architecture as the following: (1) To provide extended functionality, the system should be designed as a core with modules. Database handling and implementation of security protocols are modules, to improve flexibility compared to other frameworks. (2) To ensure loosely coupled modules, the system should implement an inversion-of-control mechanism. (3) A focus on ease of implementation requires that the system use HL7 FHIR (Fast Healthcare Interoperability Resources) as the primary standard, because it is based on web technologies. We evaluated the feasibility of our architecture by developing an open-source implementation of the system called ORDS. ORDS is written in TypeScript and makes use of the Express framework and HL7 FHIR DSTU2. The code is distributed on GitHub. All modules have been unit tested, but end-to-end testing awaits our first clinical example implementations. Our study showed that highly adaptable and yet interoperable core frameworks for telemedicine can be designed and implemented. Future work includes implementation of a clinical use case and evaluation. Copyright © 2018 Elsevier B.V. All rights reserved.
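Two of the ideas above, an inversion-of-control registry for swappable modules and HL7 FHIR as the exchange format, can be sketched briefly (the authors' ORDS system is in TypeScript; the Python below is only an illustration of the concepts). The FHIR Observation fragment uses real core fields but is a simplified example rather than a validated DSTU2 resource, and the module names are invented.

```python
# Sketch: (1) a minimal inversion-of-control registry so modules such as
# storage are injected rather than hard-wired; (2) an HL7 FHIR Observation
# resource as JSON. Simplified, illustrative code only.
import json


class Registry:
    """Minimal IoC container: providers are registered and resolved by name."""
    def __init__(self):
        self._providers = {}

    def register(self, name, provider):
        self._providers[name] = provider

    def resolve(self, name):
        return self._providers[name]()


registry = Registry()
registry.register("storage", lambda: dict())  # stand-in for a database module

observation = {
    "resourceType": "Observation",
    "status": "final",
    "code": {"coding": [{"system": "http://loinc.org", "code": "8867-4",
                         "display": "Heart rate"}]},
    "subject": {"reference": "Patient/example"},
    "valueQuantity": {"value": 72, "unit": "beats/minute"},
}

storage = registry.resolve("storage")
storage["obs-1"] = json.dumps(observation)
print(storage["obs-1"])
```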
NASA Astrophysics Data System (ADS)
Glaves, Helen
2015-04-01
Marine research is rapidly moving away from traditional discipline specific science to a wider ecosystem level approach. This more multidisciplinary approach to ocean science requires large amounts of good quality, interoperable data to be readily available for use in an increasing range of new and complex applications. Significant amounts of marine data and information are already available throughout the world as a result of e-infrastructures being established at a regional level to manage and deliver marine data to the end user. However, each of these initiatives has been developed to address specific regional requirements and independently of those in other regions. Establishing a common framework for marine data management on a global scale necessitates that there is interoperability across these existing data infrastructures and active collaboration between the organisations responsible for their management. The Ocean Data Interoperability Platform (ODIP) project is promoting co-ordination between a number of these existing regional e-infrastructures including SeaDataNet and Geo-Seas in Europe, the Integrated Marine Observing System (IMOS) in Australia, the Rolling Deck to Repository (R2R) in the USA and the international IODE initiative. To demonstrate this co-ordinated approach the ODIP project partners are currently working together to develop several prototypes to test and evaluate potential interoperability solutions for solving the incompatibilities between the individual regional marine data infrastructures. However, many of the issues being addressed by the Ocean Data Interoperability Platform are not specific to marine science. For this reason many of the outcomes of this international collaborative effort are equally relevant and transferable to other domains.
Bravo, Carlos; Suarez, Carlos; González, Carolina; López, Diego; Blobel, Bernd
2014-01-01
Healthcare information is distributed through multiple heterogeneous and autonomous systems. Access to, and sharing of, distributed information sources are challenging tasks. To contribute to meeting this challenge, this paper presents a formal, complete and semi-automatic transformation service from relational databases to the Web Ontology Language (OWL). The proposed service makes use of an algorithm that can transform several data models of different domains by deploying mainly inheritance rules. The paper emphasizes the relevance of integrating the proposed approach into an ontology-based interoperability service to achieve semantic interoperability.
Ryan, Amanda; Eklund, Peter
2008-01-01
Healthcare information is composed of many types of varying and heterogeneous data. Semantic interoperability in healthcare is especially important when all these different types of data need to interact. Presented in this paper is a solution to interoperability in healthcare based on a standards-based middleware software architecture used in enterprise solutions. This architecture has been translated into the healthcare domain using a messaging and modeling standard which upholds the ideals of the Semantic Web (HL7 V3) combined with a well-known standard terminology of clinical terms (SNOMED CT).
System architecture of communication infrastructures for PPDR organisations
NASA Astrophysics Data System (ADS)
Müller, Wilmuth
2017-04-01
The growing number of events affecting public safety and security (PS and S) on a regional scale, with the potential to grow into large-scale cross-border disasters, puts increased pressure on organizations responsible for PS and S. In order to respond to such events in a timely and adequate manner, Public Protection and Disaster Relief (PPDR) organizations need to cooperate, align their procedures and activities, share the needed information and be interoperable. Existing PPDR/PMR technologies do not provide broadband capability, which is a major limitation in supporting new services and hence new information flows, and they currently have no successor. There is also no known standard that addresses interoperability of these technologies. The paper at hand provides an approach to tackle the above-mentioned aspects by defining an Enterprise Architecture (EA) of PPDR organizations and a System Architecture of next-generation PPDR communication networks for a variety of applications and services on broadband networks, including the ability to perform inter-system, inter-agency and cross-border operations. The Open Safety and Security Architecture Framework (OSSAF) provides a framework and approach to coordinate the perspectives of different types of stakeholders within a PS and S organization. It aims at bridging the silos in the chain of command and at leveraging interoperability between PPDR organizations. The framework incorporates concepts of several mature enterprise architecture frameworks, including the NATO Architecture Framework (NAF). However, OSSAF does not provide details on how NAF should be used for describing the OSSAF perspectives and views. In this contribution, a mapping of the NAF elements to the OSSAF views is provided. Based on this mapping, an EA of PPDR organizations with a focus on communication-infrastructure-related capabilities is presented. Following the capability modeling, a system architecture for secure and interoperable communication infrastructures for PPDR organizations is presented. This architecture was implemented within a project sponsored by the European Union and successfully demonstrated in a live validation exercise in June 2016.
Smart Grid Interoperability Maturity Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Widergren, Steven E.; Levinson, Alex; Mater, J.
2010-04-28
The integration of automation associated with electricity resources (including transmission and distribution automation and demand-side resources operated by end-users) is key to supporting greater efficiencies and incorporating variable renewable resources and electric vehicles into the power system. The integration problems faced by this community are analogous to those faced in the health industry, emergency services, and other complex communities with many stakeholders. To highlight this issue and encourage communication and the development of a smart grid interoperability community, the GridWise Architecture Council (GWAC) created an Interoperability Context-Setting Framework. This "conceptual model" has been helpful to explain the importance of organizational alignment in addition to technical and informational interface specifications for "smart grid" devices and systems. As a next step to building a community sensitive to interoperability, the GWAC is investigating an interoperability maturity model (IMM) based on work done by others to address similar circumstances. The objective is to create a tool or set of tools that encourages a culture of interoperability in this emerging community. The tools would measure status and progress, analyze gaps, and prioritize efforts to improve the situation.
NASA Astrophysics Data System (ADS)
Orellana, Diego A.; Salas, Alberto A.; Solarz, Pablo F.; Medina Ruiz, Luis; Rotger, Viviana I.
2016-04-01
The production of clinical information about each patient is constantly increasing, and this information is created in different formats and at diverse points of care, resulting in fragmented, incomplete, inaccurate and isolated health information. The use of health information technology has been promoted as having a decisive impact on improving the efficiency, cost-effectiveness, quality and safety of medical care delivery. However, in developing countries the utilization of health information technology is insufficient and lacks standards, among other problems. In the present work we evaluate the EHRGen framework, based on the openEHR standard, as a means to achieve the generation and availability of patient-centered information. The framework has been evaluated through the tools it provides for end users, that is, without the intervention of computer experts. It makes it easier to adopt the openEHR ideas and provides an open-source basis with a set of services, although some limitations in its current state work against interoperability and usability. Despite the described limitations with respect to usability and semantic interoperability, EHRGen is, at least regionally, a considerable step toward EHR adoption and interoperability, and it should therefore be supported by academic and administrative institutions.
Interoperable cross-domain semantic and geospatial framework for automatic change detection
NASA Astrophysics Data System (ADS)
Kuo, Chiao-Ling; Hong, Jung-Hong
2016-01-01
With the increasingly diverse types of geospatial data established over the last few decades, semantic interoperability in integrated applications has attracted much interest in the field of Geographic Information Systems (GIS). This paper proposes a new strategy and framework to process cross-domain geodata at the semantic level. This framework leverages the semantic equivalence of concepts between domains through a bridge ontology and facilitates the integrated use of different domain data, which has long been considered an essential strength of GIS but is impeded by the lack of understanding of the semantics implicitly hidden in the data. We choose the task of change detection to demonstrate how the introduction of ontology concepts can effectively make this integration possible. We analyze the common properties of geodata and change detection factors, then construct rules and summarize possible change scenarios for making final decisions. The use of topographic map data to detect changes in land use shows promising success, as far as the improvement of efficiency and level of automation is concerned. We believe the ontology-oriented approach will enable a new way of integrating data across different domains from the perspective of semantic interoperability, and even open a new dimension for future GIS.
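As a rough sketch of how a bridge ontology can mediate change detection, the example below aligns concepts from two domains and flags a change when they disagree for the same parcel; the class names and alignments are hypothetical and do not reproduce the rule set developed in the paper.

```python
# Minimal sketch of ontology-mediated change detection: concepts from two
# domains are aligned through a (hypothetical) bridge ontology, and a change
# is flagged when the aligned concepts for the same parcel differ.
BRIDGE_EQUIVALENCE = {
    # topographic-map concept -> land-use concept (illustrative alignments)
    "Building": "BuiltUpArea",
    "Paddy":    "AgriculturalLand",
    "Forest":   "ForestLand",
}

def detect_change(parcel_id, topo_concept, landuse_concept):
    """Return a change report for one parcel, or None if no change is inferred."""
    expected = BRIDGE_EQUIVALENCE.get(topo_concept)
    if expected is None:
        return {"parcel": parcel_id, "status": "unknown concept", "topo": topo_concept}
    if expected != landuse_concept:
        return {"parcel": parcel_id, "status": "change detected",
                "from": landuse_concept, "to": expected}
    return None

# Example: a newer topographic map shows a building on a parcel registered as farmland.
print(detect_change("P-001", "Building", "AgriculturalLand"))
```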
Rafael Moreno-Sanchez
2006-01-01
The aim of this paper is to provide a conceptual framework for the session: "The role of web-based Geographic Information Systems in supporting sustainable management." The concepts of sustainability, sustainable forest management, Web Services, Distributed Geographic Information Systems, interoperability, Open Specifications, and Open Source Software are defined...
NASA Astrophysics Data System (ADS)
Glaves, H. M.; Schaap, D.
2014-12-01
As marine research becomes increasingly multidisciplinary in its approach there has been a corresponding rise in the demand for large quantities of high quality interoperable data. A number of regional initiatives are already addressing this requirement through the establishment of e-infrastructures to improve the discovery and access of marine data. Projects such as Geo-Seas and SeaDataNet in Europe, Rolling Deck to Repository (R2R) in the USA and IMOS in Australia have implemented local infrastructures to facilitate the exchange of standardised marine datasets. However, each of these regional initiatives has been developed to address its own requirements and independently of other regions. To establish a common framework for marine data management on a global scale there is a need to develop interoperability solutions that can be implemented across these initiatives. Through a series of workshops attended by the relevant domain specialists, the Ocean Data Interoperability Platform (ODIP) project has identified areas of commonality between the regional infrastructures and used these as the foundation for the development of three prototype interoperability solutions addressing: (1) the use of brokering services to provide access to the data available in the regional data discovery and access services, including via the GEOSS portal; (2) the development of interoperability between cruise summary reporting systems in Europe, the USA and Australia for routine harvesting of cruise data for delivery via the Partnership for Observation of Global Oceans (POGO) portal; and (3) the establishment of a Sensor Observation Service (SOS) for selected sensors installed on vessels and in real-time monitoring systems using sensor web enablement (SWE). These prototypes will be used to underpin the development of a common global approach to the management of marine data which can be promoted to the wider marine research community. ODIP is a community-led project that is currently focussed on regional initiatives in Europe, the USA and Australia but which is seeking to expand this framework to include other regional marine data infrastructures.
NASA Astrophysics Data System (ADS)
Fulker, D. W.; Gallagher, J. H. R.
2015-12-01
OPeNDAP's Hyrax data server is an open-source framework fostering interoperability via easily-deployed Web services. Compatible with solutions listed in the (PA001) session description—federation, rigid standards and brokering/mediation—the framework can support tight or loose coupling, even with dependence on community-contributed software. Hyrax is a Web-services framework with a middleware-like design and a handler-style architecture that together reduce the interoperability challenge (for N datatypes and M user contexts) to an O(N+M) problem, similar to brokering. Combined with an open-source ethos, this reduction makes Hyrax a community tool for gaining interoperability. E.g., in its response to the Big Earth Data Initiative (BEDI), NASA references OPeNDAP-based interoperability. Assuming its suitability, the question becomes: how sustainable is OPeNDAP, a small not-for-profit that produces open-source software, i.e., has no software-sales? In other words, if geoscience interoperability depends on OPeNDAP and similar organizations, are those entities in turn sustainable? Jim Collins (in Good to Great) highlights three questions that successful companies can answer (paraphrased here): What is your passion? Where is your world-class excellence? What drives your economic engine? We attempt to shed light on OPeNDAP sustainability by examining these. Passion: OPeNDAP has a focused passion for improving the effectiveness of scientific data sharing and use, as deeply-cooperative community endeavors. Excellence: OPeNDAP has few peers in remote, scientific data access. Skills include computer science with experience in data science, (operational, secure) Web services, and software design (for servers and clients, where the latter vary from Web pages to standalone apps and end-user programs). Economic Engine: OPeNDAP is an engineering services organization more than a product company, despite software being key to OPeNDAP's reputation. In essence, provision of engineering expertise, via contracts and grants, is the economic engine. Hence sustainability, as needed to address global grand challenges in geoscience, depends on agencies' and others' abilities and willingness to offer grants and let contracts for continually upgrading open-source software from OPeNDAP and others.
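For readers unfamiliar with the access pattern, a minimal client-side sketch using the pydap library shows the style of interaction with an OPeNDAP/Hyrax endpoint; the dataset URL and variable name are hypothetical placeholders. Only the requested slice crosses the network, which is what makes the server-side subsetting model attractive for large datasets.

```python
# Illustrative client-side access to a Hyrax/OPeNDAP endpoint using pydap.
# The URL and variable name are hypothetical placeholders.
from pydap.client import open_url

dataset = open_url("http://example.org/opendap/sst.nc")  # hypothetical endpoint
sst = dataset["sea_surface_temperature"]                 # lazy remote variable
subset = sst[0, 10:20, 10:20]                            # only this slice is transferred
print(subset.shape)
```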
NASA Astrophysics Data System (ADS)
Glaves, H. M.
2015-12-01
In recent years marine research has become increasingly multidisciplinary in its approach, with a corresponding rise in the demand for large quantities of high quality interoperable data. This requirement for easily discoverable and readily available marine data is currently being addressed by a number of regional initiatives, with projects such as SeaDataNet in Europe, Rolling Deck to Repository (R2R) in the USA and the Integrated Marine Observing System (IMOS) in Australia having implemented local infrastructures to facilitate the exchange of standardised marine datasets. However, each of these systems has been developed to address local requirements and created in isolation from those in other regions. Multidisciplinary marine research on a global scale necessitates a common framework for marine data management which is based on existing data systems. The Ocean Data Interoperability Platform project is seeking to address this requirement by bringing together selected regional marine e-infrastructures for the purposes of developing interoperability across them. By identifying the areas of commonality and incompatibility between these data infrastructures, and leveraging the development activities and expertise of these individual systems, three prototype interoperability solutions are being created which demonstrate the effective sharing of marine data and associated metadata across the participating regional data infrastructures as well as with other target international systems such as GEO, COPERNICUS etc. These interoperability solutions, combined with agreed best practice and approved standards, form the basis of a common global approach to marine data management which can be adopted by the wider marine research community. To encourage implementation of these interoperability solutions by other regional marine data infrastructures an impact assessment is being conducted to determine both the technical and financial implications of deploying them alongside existing services. The associated best practice and common standards are also being disseminated to the user community through relevant accreditation processes and related initiatives such as the Research Data Alliance and the Belmont Forum.
Software design and implementation concepts for an interoperable medical communication framework.
Besting, Andreas; Bürger, Sebastian; Kasparick, Martin; Strathen, Benjamin; Portheine, Frank
2018-02-23
The new IEEE 11073 service-oriented device connectivity (SDC) standard proposals for networked point-of-care and surgical devices constitute the basis for improved interoperability due to their vendor independence. To accelerate the distribution of the standard, a reference implementation is indispensable. However, the implementation of such a framework has to overcome several non-trivial challenges. First, the high level of complexity of the underlying standard must be reflected in the software design. An efficient implementation has to consider the limited resources of the underlying hardware. Moreover, the framework's purpose of realizing a distributed system demands a high degree of reliability of the framework itself and its internal mechanisms. Additionally, a framework must provide an easy-to-use and fail-safe application programming interface (API). In this work, we address these challenges by discussing suitable software engineering principles and practical coding guidelines. A descriptive model is developed that identifies key strategies. General feasibility is shown by outlining environments in which our implementation has been utilized.
Promoting Interoperability: The Case for Discipline-Specific PSAPS
2014-12-01
incidents for two reasons: first, numerous steel and concrete floors that affected signal penetration; and second, so many different companies were... Naval Postgraduate School, Monterey, California. Thesis, approved for public release; distribution is unlimited. Promoting Interoperability: The Case for Discipline-Specific PSAPs, by Thomas Michael Walsh, December 2014. Thesis Advisor: Fathali Moghaddam; Second Reader...
Solving Identity Management and Interoperability Problems at Pan-European Level
NASA Astrophysics Data System (ADS)
Sánchez García, Sergio; Gómez Oliva, Ana
In a globalized digital world, it is essential for persons and entities to have a recognized and unambiguous electronic identity that allows them to communicate with one another. The management of this identity by public administrations is an important challenge that becomes even more crucial when interoperability among public administrations of different countries becomes necessary, as persons and entities have different credentials depending on their own national legal frameworks. More specifically, different credentials and legal frameworks cause interoperability problems that prevent reliable access to public services in cross-border scenarios like today's European Union. Work in this doctoral thesis tries to analyze the problem in detail by studying existing proposals (basically in Europe), proposing improvements to the defined architectures and performing practical work to test the viability of the solutions. Moreover, this thesis will also address the long-standing security problem of identity delegation, which is especially important in complex and heterogeneous service delivery environments like those mentioned above. This is a position paper.
CEOS WGISS Common Data Framework for WGISS Connected Data Assets
NASA Technical Reports Server (NTRS)
Enloe, Yonsook; Mitchell, Andrew; Albani, Mirko; Yapur, Martin
2016-01-01
This session will explore the benefits of having such a policy framework and future steps both domestically and internationally. Speakers can highlight current work being done to improve data interoperability, how the Common Framework is relevant for other data types, other countries and multinational organizations, and considerations for data management that have yet to be addressed in the Common Framework.
Semantically Interoperable XML Data
Vergara-Niedermayr, Cristobal; Wang, Fusheng; Pan, Tony; Kurc, Tahsin; Saltz, Joel
2013-01-01
XML is ubiquitously used as an information exchange platform for web-based applications in healthcare, life sciences, and many other domains. Proliferating XML data are now managed through the latest native XML database technologies. XML data sources conforming to common XML schemas can be shared and integrated with syntactic interoperability. Semantic interoperability can be achieved through semantic annotations of data models using common data elements linked to concepts from ontologies. In this paper, we present a framework and software system to support the development of semantically interoperable XML-based data sources that can be shared through a Grid infrastructure. We also present our work on supporting semantically validated XML data through semantic annotations for XML Schema, semantic validation and semantic authoring of XML data. We demonstrate the use of the system for a biomedical database of medical image annotations and markups. PMID:25298789
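A minimal sketch of the kind of semantic validation described here, assuming hypothetical element names and concept codes rather than the system's actual schema annotations: each data element is required to carry an annotation pointing at a concept from a shared ontology, and elements with missing or unknown concepts are reported.

```python
# Minimal sketch of semantic validation: each data element must be annotated
# with a concept identifier from a shared ontology (identifiers are hypothetical).
import xml.etree.ElementTree as ET

ALLOWED_CONCEPTS = {"C25207", "C16423"}   # hypothetical ontology concept codes

DOC = """<record xmlns:sem="http://example.org/semantics">
  <height sem:concept="C25207">172</height>
  <eyeColor sem:concept="C99999">brown</eyeColor>
</record>"""

SEM = "{http://example.org/semantics}concept"

def semantic_errors(xml_text):
    errors = []
    for elem in ET.fromstring(xml_text):
        concept = elem.get(SEM)
        if concept is None:
            errors.append(f"<{elem.tag}> has no semantic annotation")
        elif concept not in ALLOWED_CONCEPTS:
            errors.append(f"<{elem.tag}> uses unknown concept {concept}")
    return errors

print(semantic_errors(DOC))   # flags the element annotated with an unknown concept
```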
A Problem-Solving Environment for Biological Network Informatics: Bio-Spice
2007-06-01
user an environment to access software tools. The Dashboard is built upon the NetBeans Integrated Development Environment (IDE), an open source Java...based integration platform was demonstrated. During the subsequent six-month development cycle, the first version of the NetBeans-based Bio-SPICE...frameworks (OAA, NetBeans, and Systems Biology Workbench (SBW) [15]), it becomes possible for Bio-SPICE tools to truly interoperate. This interoperation
An HL7/CDA Framework for the Design and Deployment of Telemedicine Services
2001-10-25
schemes and prescription databases. Furthermore, interoperability with the Electronic Health Record (EHR) facilitates automatic retrieval of relevant...local EHR system or the integrated electronic health record (I-EHR) [9], which indexes all medical contacts of a patient in the regional network...suspected medical problem. Interoperability with middleware services of the HII and other data sources such as the local EHR system affects
a Virtual Hub Brokering Approach for Integration of Historical and Modern Maps
NASA Astrophysics Data System (ADS)
Bruno, N.; Previtali, M.; Barazzetti, L.; Brumana, R.; Roncella, R.
2016-06-01
Geospatial data are today more and more widespread. Many different institutions, such as Geographical Institutes, Public Administrations, collaborative communities (e.g., OSM) and web companies, nowadays make available a large number of maps. Besides this cartography, projects for digitizing, georeferencing and web publishing of historical maps have increasingly spread in recent years. In spite of this variety and availability of data, information overload makes their discovery and management difficult: without knowing the specific repository where the data are stored, it is difficult to find the information required, and problems of interconnection between different data sources and their restricted interoperability limit a wide utilization of the available geo-data. This paper aims to describe some actions performed to assure interoperability between data, in particular spatial and geographic data, gathered from different data providers, with different features and referring to different historical periods. The article summarizes and exemplifies how, starting from projects of historical map digitizing and Historical GIS implementation, respectively for Lombardy and for the city of Parma, interoperability is possible within the framework of the ENERGIC OD project. The European project ENERGIC OD, thanks to a specific component - the virtual hub - based on a brokering framework, copes with the previously listed problems and enables interoperability between different data sources.
Managing Interoperability for GEOSS - A Report from the SIF
NASA Astrophysics Data System (ADS)
Khalsa, S. J.; Actur, D.; Nativi, S.; Browdy, S.; Eglitis, P.
2009-04-01
The Global Earth Observation System of Systems (GEOSS) is a coordinating and integrating framework for Earth observing and information systems, which are contributed on a voluntary basis by Members and Participating Organizations of the intergovernmental Group on Earth Observations (GEO). GEOSS exists to support informed decision making for the benefit of society, including the implementation of international environmental treaty obligations. GEO Members and Participating organizations use the GEOSS Common Infrastructure (GCI) to register their Earth observation resources, thereby making them discoverable and consumable by both humans and client applications. Essential to meeting GEO user needs is a process for supporting interoperability of observing, processing, modeling and dissemination capabilities. The GEO Standards and Interoperability Forum (SIF) was created to develop, implement and oversee this process. The SIF supports GEO organizations contributing resources to the GEOSS by helping them understand and work with the GEOSS interoperability guidelines and encouraging them to register their "interoperability arrangements" (standards or other ad hoc arrangements for interoperability) in the GEOSS standards registry, which is part of the GCI. These registered interoperability arrangements support the actual services used to achieve interoperability of systems. By making information about these interoperability arrangements available to users of the GEOSS the SIF enhances the understanding and utility of contributed resources. We describe the procedures that the SIF has enacted to carry out its work. To operate effectively the SIF uses a workflow system and is establishing a set of regional teams and domain experts. In the near term our work has focused on population and review of the GEOSS Standards Registry, but we are also developing approaches to achieving progressive convergence on, and uptake of, an optimal set of interoperability arrangements for all of GEOSS.
Metadata mapping and reuse in caBIG.
Kunz, Isaac; Lin, Ming-Chin; Frey, Lewis
2009-02-05
This paper proposes that interoperability across biomedical databases can be improved by utilizing a repository of Common Data Elements (CDEs), UML model class-attributes and simple lexical algorithms to facilitate the building of domain models. This is examined in the context of an existing system, the National Cancer Institute (NCI)'s cancer Biomedical Informatics Grid (caBIG). The goal is to demonstrate the deployment of open source tools that can be used to effectively map models and enable the reuse of existing information objects and CDEs in the development of new models for translational research applications. This effort is intended to help developers reuse appropriate CDEs to enable interoperability of their systems when developing within the caBIG framework or other frameworks that use metadata repositories. The Dice (di-grams) and Dynamic algorithms are compared, and both algorithms have similar performance in matching UML model class-attributes to CDE class object-property pairs. With the algorithms used, the baselines for automatically finding the matches are reasonable for the data models examined. This suggests that automatic mapping of UML models and CDEs is feasible within the caBIG framework and potentially any framework that uses a metadata repository. This work opens up the possibility of using mapping algorithms to reduce the cost and time required to map local data models to a reference data model such as those used within caBIG. This effort contributes to facilitating the development of interoperable systems within caBIG as well as other metadata frameworks. Such efforts are critical to address the need to develop systems to handle enormous amounts of diverse data that can be leveraged from new biomedical methodologies.
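For concreteness, here is a small sketch of Dice similarity over character bigrams, the style of lexical matching the paper compares; the attribute and CDE names below are illustrative, not taken from caDSR.

```python
# Dice similarity over character bigrams, used as a simple lexical matcher between
# UML class-attribute names and CDE object-property pairs (names are illustrative).
def bigrams(text):
    text = text.lower().replace("_", " ")
    return {text[i:i + 2] for i in range(len(text) - 1)}

def dice(a, b):
    ba, bb = bigrams(a), bigrams(b)
    if not ba or not bb:
        return 0.0
    return 2 * len(ba & bb) / (len(ba) + len(bb))

uml_attribute = "Patient.birthDate"
cde_candidates = ["Person Birth Date", "Patient Gender Code", "Specimen Collection Date"]

best = max(cde_candidates, key=lambda cde: dice(uml_attribute, cde))
print(best, round(dice(uml_attribute, best), 2))
```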
Achieving Interoperability Through Base Registries for Governmental Services and Document Management
NASA Astrophysics Data System (ADS)
Charalabidis, Yannis; Lampathaki, Fenareti; Askounis, Dimitris
As digital infrastructures increase their presence worldwide, following the efforts of governments to provide citizens and businesses with high-quality one-stop services, there is a growing need for the systematic management of those newly defined and constantly transforming processes and electronic documents. E-government Interoperability Frameworks usually cater to the technical standards of e-government systems interconnection, but do not address service composition and use by citizens, businesses, or other administrations.
Assessing Quality of Data Standards: Framework and Illustration Using XBRL GAAP Taxonomy
NASA Astrophysics Data System (ADS)
Zhu, Hongwei; Wu, Harris
The primary purpose of data standards or metadata schemas is to improve the interoperability of data created by multiple standard users. Given the high cost of developing data standards, it is desirable to assess the quality of data standards. We develop a set of metrics and a framework for assessing data standard quality. The metrics include completeness and relevancy. Standard quality can also be indirectly measured by assessing interoperability of data instances. We evaluate the framework using data from the financial sector: the XBRL (eXtensible Business Reporting Language) GAAP (Generally Accepted Accounting Principles) taxonomy and US Securities and Exchange Commission (SEC) filings produced using the taxonomy by approximately 500 companies. The results show that the framework is useful and effective. Our analysis also reveals quality issues of the GAAP taxonomy and provides useful feedback to taxonomy users. The SEC has mandated that all publicly listed companies must submit their filings using XBRL. Our findings are timely and have practical implications that will ultimately help improve the quality of financial data.
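A toy sketch of completeness- and relevancy-style metrics over a standard and a set of filings follows; the element sets are invented for illustration and are not the actual GAAP taxonomy or SEC data.

```python
# Illustrative completeness/relevancy-style metrics for a data standard:
# completeness ~ share of concepts users need that the standard defines;
# relevancy    ~ share of defined elements actually used in filings.
# The element sets below are hypothetical, not the real GAAP taxonomy.
standard_elements = {"Assets", "Liabilities", "Revenues", "NetIncome", "Goodwill"}
needed_concepts   = {"Assets", "Liabilities", "Revenues", "NetIncome", "EmployeeCount"}
filings = [
    {"Assets", "Liabilities", "Revenues"},
    {"Assets", "Revenues", "NetIncome"},
]

completeness = len(standard_elements & needed_concepts) / len(needed_concepts)
used = set().union(*filings)
relevancy = len(used & standard_elements) / len(standard_elements)

print(f"completeness={completeness:.2f} relevancy={relevancy:.2f}")
```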
A Framework for a Decision Support System in a Hierarchical Extended Enterprise Decision Context
NASA Astrophysics Data System (ADS)
Boza, Andrés; Ortiz, Angel; Vicens, Eduardo; Poler, Raul
Decision Support System (DSS) tools provide useful information to decision makers. In an Extended Enterprise, a new goal, changes in the current objectives or small changes in the extended enterprise configuration produce a necessary adjustment in its decision system. A DSS in this context must be flexible and agile to enable easy and quick adaptation to the new context. This paper proposes to extend the Hierarchical Production Planning (HPP) structure to an Extended Enterprise decision-making context. In this way, a framework for DSS in an Extended Enterprise context is defined using components of HPP. Interoperability details have been reviewed to identify their impact on this framework. The proposed framework allows overcoming some interoperability barriers, identifying and organizing components for a DSS in an Extended Enterprise context, and working towards the definition of an architecture to be used in the design process of a flexible DSS in an Extended Enterprise context which can reuse components for future Extended Enterprise configurations.
Dynamic Business Networks: A Headache for Sustainable Systems Interoperability
NASA Astrophysics Data System (ADS)
Agostinho, Carlos; Jardim-Goncalves, Ricardo
Collaborative networked environments emerged with the spread of the internet, contributing to overcoming past communication barriers and identifying interoperability as an essential property. When achieved seamlessly, interoperability increases efficiency across the entire product life cycle. Nowadays, most organizations try to attain interoperability by establishing peer-to-peer mappings with the different partners, or, in optimized networks, by using international standard models as the core for information exchange. In current industrial practice, mappings are only defined once, and the morphisms that represent them are hardcoded in the enterprise systems. This solution has been effective for static environments, where enterprise and product models are valid for decades. However, with an increasingly complex and dynamic global market, models change frequently to answer new customer requirements. This paper draws concepts from the complex systems science and proposes a framework for sustainable systems interoperability in dynamic networks, enabling different organizations to evolve at their own rate.
Semantics-Based Interoperability Framework for the Geosciences
NASA Astrophysics Data System (ADS)
Sinha, A.; Malik, Z.; Raskin, R.; Barnes, C.; Fox, P.; McGuinness, D.; Lin, K.
2008-12-01
Interoperability between heterogeneous data, tools and services is required to transform data to knowledge. To meet geoscience-oriented societal challenges such as the forcing of climate change induced by volcanic eruptions, we suggest the need to develop semantic interoperability for data, services, and processes. Because such scientific endeavors require integration of multiple databases associated with global enterprises, implicit semantic-based integration is impossible. Instead, explicit semantics are needed to facilitate interoperability and integration. Although different types of integration models are available (syntactic or semantic), we suggest that semantic interoperability is likely to be the most successful pathway. Clearly, the geoscience community would benefit from utilization of existing XML-based data models, such as GeoSciML, WaterML, etc., to rapidly advance semantic interoperability and integration. We recognize that such integration will require a "meanings-based search, reasoning and information brokering", which will be facilitated through inter-ontology relationships (ontologies defined for each discipline). We suggest that markup languages (MLs) and ontologies can be seen as "data integration facilitators", working at different abstraction levels. Therefore, we propose to use an ontology-based data registration and discovery approach to complement markup languages through semantic data enrichment. Ontologies allow the use of formal and descriptive logic statements, which permits expressive query capabilities for data integration through reasoning. We have developed domain ontologies (EPONT) to capture the concepts behind data. EPONT ontologies are associated with existing ontologies such as SUMO, DOLCE and SWEET. Although significant efforts have gone into developing data (object) ontologies, we advance the idea of developing semantic frameworks for additional ontologies that deal with processes and services. This evolutionary step will facilitate the integrative capabilities of scientists as we examine the relationships between data and external factors such as processes that may influence our understanding of "why" certain events happen. We emphasize the need to go from analysis of data to concepts related to scientific principles of thermodynamics, kinetics, heat flow, mass transfer, etc. Towards meeting these objectives, we report on a pair of related service engines: DIA (Discovery, Integration and Analysis) and SEDRE (Semantically-Enabled Data Registration Engine), which utilize ontologies for semantic interoperability and integration.
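As a minimal sketch of ontology-based registration and discovery in this spirit (using rdflib and hypothetical concept and dataset names rather than EPONT), a dataset is registered against a domain concept and then discovered through a query that exploits the subclass relation:

```python
# Minimal sketch of ontology-based registration and discovery: datasets are
# registered against (hypothetical) domain concepts and discovered via SPARQL.
from rdflib import Graph, Namespace, RDF, RDFS, Literal

GEO = Namespace("http://example.org/geo#")   # hypothetical ontology namespace
g = Graph()
g.add((GEO.VolcanicEruption, RDFS.subClassOf, GEO.GeologicProcess))
g.add((GEO.pinatubo_1991, RDF.type, GEO.VolcanicEruption))
g.add((GEO.pinatubo_1991, GEO.hasDataset, Literal("aerosol_optical_depth_1991.nc")))

# Discovery: find datasets for any geologic process, using the subclass relation.
query = """
PREFIX geo: <http://example.org/geo#>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
SELECT ?event ?dataset WHERE {
  ?cls rdfs:subClassOf geo:GeologicProcess .
  ?event a ?cls ; geo:hasDataset ?dataset .
}"""
for row in g.query(query):
    print(row.event, row.dataset)
```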
ERIC Educational Resources Information Center
Blanchi, Christophe; Petrone, Jason; Pinfield, Stephen; Suleman, Hussein; Fox, Edward A.; Bauer, Charly; Roddy, Carol Lynn
2001-01-01
Includes four articles that discuss a distributed architecture for managing metadata that promotes interoperability between digital libraries; the use of electronic print (e-print) by physicists; the development of digital libraries; and a collaborative project between two library consortia in Ohio to provide digital versions of Sanborn Fire…
Friedman, Charles P; Iakovidis, Ilias; Debenedetti, Laurent; Lorenzi, Nancy M
2009-11-01
Countries on both sides of the Atlantic Ocean have invested in health information and communication technologies. Since eHealth challenges cross borders, a European Union-United States of America conference on public policies relating to health IT and eHealth was held October 20-21, 2008 in Paris, France. The conference was organized around four themes: (1) privacy and security, (2) health IT interoperability, (3) deployment and adoption of health IT, and (4) Public Private Collaborative Governance. The four key themes framed the discussion over the two days of plenary sessions and workshops. Key findings of the conference were organized along the four themes. (1) Privacy and security: Patients' access to their own data and key elements of a patient identification management framework were discussed. (2) Health IT interoperability: Three significant and common interoperability challenges emerged: (a) the need to establish common or compatible standards and clear guidelines for their implementation, (b) the desirability of shared certification criteria and (c) the need for greater awareness of the importance of interoperability. (3) Deployment and adoption of health IT: Three major areas of need emerged: (a) a shared knowledge base and assessment framework, (b) public-private collaboration and (c) effective organizational change strategies. (4) Public Private Collaborative Governance: Sharing and communication are central to success in this area. Nations can learn from one another about ways to develop harmonious, effective partnerships. Three areas that were identified as highest priority for collaboration included: (1) health data security, (2) developing effective strategies to ensure healthcare professionals' acceptance of health IT tools, and (3) interoperability.
Metadata mapping and reuse in caBIG™
Kunz, Isaac; Lin, Ming-Chin; Frey, Lewis
2009-01-01
Background This paper proposes that interoperability across biomedical databases can be improved by utilizing a repository of Common Data Elements (CDEs), UML model class-attributes and simple lexical algorithms to facilitate the building of domain models. This is examined in the context of an existing system, the National Cancer Institute (NCI)'s cancer Biomedical Informatics Grid (caBIG™). The goal is to demonstrate the deployment of open source tools that can be used to effectively map models and enable the reuse of existing information objects and CDEs in the development of new models for translational research applications. This effort is intended to help developers reuse appropriate CDEs to enable interoperability of their systems when developing within the caBIG™ framework or other frameworks that use metadata repositories. Results The Dice (di-grams) and Dynamic algorithms are compared, and both algorithms have similar performance in matching UML model class-attributes to CDE class object-property pairs. With the algorithms used, the baselines for automatically finding the matches are reasonable for the data models examined. This suggests that automatic mapping of UML models and CDEs is feasible within the caBIG™ framework and potentially any framework that uses a metadata repository. Conclusion This work opens up the possibility of using mapping algorithms to reduce the cost and time required to map local data models to a reference data model such as those used within caBIG™. This effort contributes to facilitating the development of interoperable systems within caBIG™ as well as other metadata frameworks. Such efforts are critical to address the need to develop systems to handle enormous amounts of diverse data that can be leveraged from new biomedical methodologies. PMID:19208192
NASA Astrophysics Data System (ADS)
López García, Álvaro; Fernández del Castillo, Enol; Orviz Fernández, Pablo
In this document we present an implementation of the Open Grid Forum's Open Cloud Computing Interface (OCCI) for OpenStack, namely ooi (Openstack occi interface, 2015) [1]. OCCI is an open standard for management tasks over cloud resources, focused on interoperability, portability and integration. ooi aims to implement this open interface for the OpenStack cloud middleware, promoting interoperability with other OCCI-enabled cloud management frameworks and infrastructures. ooi focuses on being non-invasive with a vanilla OpenStack installation, not tied to a particular OpenStack release version.
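For illustration, an OCCI discovery request in the text/occi rendering might look like the sketch below; the endpoint URL, port and authentication token are hypothetical placeholders rather than ooi defaults.

```python
# Illustrative OCCI discovery call against an ooi-style endpoint using the
# text/occi rendering; the endpoint URL and token are hypothetical placeholders.
import requests

OCCI_ENDPOINT = "https://cloud.example.org:8787/occi1.1"    # hypothetical
HEADERS = {
    "Accept": "text/occi",
    "X-Auth-Token": "REPLACE_WITH_KEYSTONE_TOKEN",           # hypothetical
}

# The query interface ("/-/") lists the categories (kinds, mixins) the server supports.
response = requests.get(f"{OCCI_ENDPOINT}/-/", headers=HEADERS)
print(response.status_code)
print(response.headers.get("Category", "")[:200])
```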
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hardin, Dave; Stephan, Eric G.; Wang, Weimin
Through its Building Technologies Office (BTO), the United States Department of Energy’s Office of Energy Efficiency and Renewable Energy (DOE-EERE) is sponsoring an effort to advance interoperability for the integration of intelligent buildings equipment and automation systems, understanding the importance of integration frameworks and product ecosystems to this cause. This is important to BTO’s mission to enhance energy efficiency and save energy for economic and environmental purposes. For connected buildings ecosystems of products and services from various manufacturers to flourish, the ICT aspects of the equipment need to integrate and operate simply and reliably. Within the concepts of interoperability lie the specification, development, and certification of equipment with standards-based interfaces that connect and work. Beyond this, a healthy community of stakeholders that contribute to and use interoperability work products must be developed. On May 1, 2014, the DOE convened a technical meeting to take stock of the current state of interoperability of connected equipment and systems in buildings. Several insights from that meeting helped facilitate a draft description of the landscape of interoperability for connected buildings, which focuses mainly on small and medium commercial buildings. This document revises the February 2015 landscape document to address reviewer comments, incorporate important insights from the Buildings Interoperability Vision technical meeting, and capture thoughts from that meeting about the topics to be addressed in a buildings interoperability vision. In particular, greater attention is paid to the state of information modeling in buildings and the great potential for near-term benefits in this area from progress and community alignment.
Digital Rights Management Implemented by RDF Graph Approach
ERIC Educational Resources Information Center
Yang, Jin Tan; Horng, Huai-Chien
2006-01-01
This paper proposes a design framework for constructing Digital Rights Management (DRM) that enables the legal usage of learning objects. The central theme of this framework is that any design of a DRM must have theories as its foundation to make maintenance, extension and interoperability easy. While a learning objective consists of learning…
Achieving interoperability for metadata registries using comparative object modeling.
Park, Yu Rang; Kim, Ju Han
2010-01-01
Achieving data interoperability between organizations relies upon agreed meaning and representation (metadata) of data. For managing and registering metadata, many organizations have built metadata registries (MDRs) in various domains based on the international standard for the MDR framework, ISO/IEC 11179. Following this trend, two public MDRs in the biomedical domain have been created, the United States Health Information Knowledgebase (USHIK) and the cancer Data Standards Registry and Repository (caDSR), from the U.S. Department of Health & Human Services and the National Cancer Institute (NCI), respectively. Most MDRs are implemented with indiscriminate extensions to satisfy organization-specific needs and to overcome the semantic and structural limitations of ISO/IEC 11179. As a result it is difficult to address interoperability among multiple MDRs. In this paper, we propose an integrated metadata object model for achieving interoperability among multiple MDRs. To evaluate this model, we developed an XML Schema Definition (XSD)-based metadata exchange format. We created an XSD-based metadata exporter, supporting both the integrated metadata object model and organization-specific MDR formats.
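A toy sketch of exporting an ISO/IEC 11179-style data element description to XML follows; the element and attribute names are invented for illustration and do not reproduce the authors' XSD or exchange format.

```python
# Toy export of an ISO/IEC 11179-style data element description to XML.
# Element and attribute names are illustrative, not the authors' actual XSD.
import xml.etree.ElementTree as ET

data_element = {
    "identifier": "DE-0001",
    "name": "Patient Birth Date",
    "definition": "The date on which the patient was born.",
    "value_domain": {"datatype": "date", "format": "YYYY-MM-DD"},
}

root = ET.Element("DataElement", identifier=data_element["identifier"])
ET.SubElement(root, "Name").text = data_element["name"]
ET.SubElement(root, "Definition").text = data_element["definition"]
vd = ET.SubElement(root, "ValueDomain", datatype=data_element["value_domain"]["datatype"])
vd.set("format", data_element["value_domain"]["format"])

print(ET.tostring(root, encoding="unicode"))
```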
Myneni, Sahiti; Patel, Vimla L.
2009-01-01
Biomedical researchers often have to work on massive, detailed, and heterogeneous datasets that raise new challenges of information management. This study reports an investigation into the nature of the problems faced by the researchers in two bioscience test laboratories when dealing with their data management applications. Data were collected using ethnographic observations, questionnaires, and semi-structured interviews. The major problems identified in working with these systems were related to data organization, publications, and collaboration. The interoperability standards were analyzed using a C4I framework at the level of connection, communication, consolidation, and collaboration. Such an analysis was found to be useful in judging the capabilities of data management systems at different levels of technological competency. While collaboration and system interoperability are the “must have” attributes of these biomedical scientific laboratory information management applications, usability and human interoperability are the other design concerns that must also be addressed for easy use and implementation. PMID:20351900
Borlawsky, Tara B.; Dhaval, Rakesh; Hastings, Shannon L.; Payne, Philip R. O.
2009-01-01
In October 2006, the National Institutes of Health launched a new national consortium, funded through Clinical and Translational Science Awards (CTSA), with the primary objective of improving the conduct and efficiency of the inherently multi-disciplinary field of translational research. To help meet this goal, the Ohio State University Center for Clinical and Translational Science has launched a knowledge management initiative that is focused on facilitating widespread semantic interoperability among administrative, basic science, clinical and research computing systems, both internally and among the translational research community at-large, through the integration of domain-specific standard terminologies and ontologies with local annotations. This manuscript describes an agile framework that builds upon prevailing knowledge engineering and semantic interoperability methods, and will be implemented as part of this initiative. PMID:21347164
Borlawsky, Tara B; Dhaval, Rakesh; Hastings, Shannon L; Payne, Philip R O
2009-03-01
In October 2006, the National Institutes of Health launched a new national consortium, funded through Clinical and Translational Science Awards (CTSA), with the primary objective of improving the conduct and efficiency of the inherently multi-disciplinary field of translational research. To help meet this goal, the Ohio State University Center for Clinical and Translational Science has launched a knowledge management initiative that is focused on facilitating widespread semantic interoperability among administrative, basic science, clinical and research computing systems, both internally and among the translational research community at-large, through the integration of domain-specific standard terminologies and ontologies with local annotations. This manuscript describes an agile framework that builds upon prevailing knowledge engineering and semantic interoperability methods, and will be implemented as part of this initiative.
Interoperable Data Sharing for Diverse Scientific Disciplines
NASA Astrophysics Data System (ADS)
Hughes, John S.; Crichton, Daniel; Martinez, Santa; Law, Emily; Hardman, Sean
2016-04-01
For diverse scientific disciplines to interoperate they must be able to exchange information based on a shared understanding. To capture this shared understanding, we have developed a knowledge representation framework using ontologies and ISO-level archive and metadata registry reference models. This framework provides multi-level governance, evolves independently of implementation technologies, and promotes agile development, namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. The knowledge representation framework is populated through knowledge acquisition from discipline experts. It is also extended to meet specific discipline requirements. The result is a formalized and rigorous knowledge base that addresses data representation, integrity, provenance, context, quantity, and their relationships within the community. The contents of the knowledge base are translated and written to files in appropriate formats to configure system software and services, provide user documentation, validate ingested data, and support data analytics. This presentation will provide an overview of the framework, present the Planetary Data System's PDS4 as a use case that has been adopted by the international planetary science community, describe how the framework is being applied to other disciplines, and share some important lessons learned.
Evolving Frameworks for Different Communities of Scientists and End Users
NASA Astrophysics Data System (ADS)
Graves, S. J.; Keiser, K.
2016-12-01
Two evolving frameworks for interdisciplinary science will be described in the context of the Common Data Framework for Earth-Observation Data and the importance of standards and protocols. The Event Data Driven Delivery (ED3) Framework, funded by NASA Applied Sciences, provides the delivery of data based on predetermined subscriptions and associated workflows to various communities of end users. ED3's capabilities are used by scientists, as well as policy and resource managers, when event alerts are triggered to respond to their needs. The EarthCube Integration and Testing Environment (ECITE) Assessment Framework for Technology Interoperability and Integration is being developed to facilitate the EarthCube community's assessment of NSF-funded technologies addressing Earth science problems. ECITE is addressing the translation of geoscience researchers' use cases into technology use cases that apply EarthCube-funded building block technologies (and other existing technologies) to solving science problems. EarthCube criteria for technology assessment include the use of data, metadata and service standards to improve interoperability and integration across program components. The long-range benefit will be the growth of a cyberinfrastructure with technology components that have been shown to work together to solve known science objectives.
Building Future Transatlantic Interoperability Around a Robust NATO Response Force
2012-10-01
than already traveled. However, this accrued wealth of interoperable capability may be at its apogee, soon to decline as the result of two looming...and Bydgoszcz, Poland, as well as major national training centers such as the bilateral U.S.-Romanian Joint Task Force–East at Kogalniceanu...operations. Increase U.S. and Allied Exchange Students at National and NATO military schools. Austerity measures may eventually affect the investment
NASA Astrophysics Data System (ADS)
Li, Ni; Huai, Wenqing; Wang, Shaodan
2017-08-01
C2 (command and control) has been understood to be a critical military component in meeting an increasing demand for rapid information gathering and real-time decision-making in a dynamically changing battlefield environment. In this article, to improve a C2 behaviour model's reusability and interoperability, a behaviour modelling framework was proposed to specify a C2 model's internal modules and a set of interoperability interfaces based on the C-BML (coalition battle management language). WTA (weapon target assignment) is a typical C2 autonomous decision-making behaviour modelling problem. Different from most WTA problem descriptions, here sensors were considered to be available detection resources and the relationship constraints between weapons and sensors were also taken into account, which brought the formulation much closer to actual application. A modified differential evolution (MDE) algorithm was developed to solve this high-dimensional optimisation problem and obtained an optimal assignment plan with high efficiency. In a case study, we built a simulation system to validate the proposed C2 modelling framework and interoperability interface specification. Also, the new optimisation solution was used to solve the WTA problem efficiently and successfully.
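To make the optimisation setting concrete, here is a compact sketch of plain differential evolution on a simplified WTA objective; it is not the authors' modified DE, and the kill probabilities, target values and problem size are invented for illustration.

```python
# Simplified weapon-target assignment solved with plain differential evolution
# (not the authors' modified DE). Kill probabilities and target values are made up.
import numpy as np

rng = np.random.default_rng(0)
n_weapons, n_targets = 6, 3
value = np.array([10.0, 6.0, 8.0])                 # target values
p_kill = rng.uniform(0.3, 0.9, (n_weapons, n_targets))

def cost(x):
    """Expected surviving target value for a continuous encoding of assignments."""
    assign = np.clip(x, 0, n_targets - 1e-9).astype(int)   # weapon i -> target assign[i]
    survive = np.ones(n_targets)
    for w, t in enumerate(assign):
        survive[t] *= 1.0 - p_kill[w, t]
    return float(np.dot(value, survive))

def differential_evolution(pop_size=30, gens=200, F=0.6, CR=0.9):
    pop = rng.uniform(0, n_targets, (pop_size, n_weapons))
    fitness = np.array([cost(ind) for ind in pop])
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = pop[rng.choice(pop_size, 3, replace=False)]
            mutant = a + F * (b - c)                # DE/rand/1 mutation
            cross = rng.random(n_weapons) < CR      # binomial crossover
            trial = np.where(cross, mutant, pop[i])
            f = cost(trial)
            if f < fitness[i]:                      # greedy selection
                pop[i], fitness[i] = trial, f
    best = pop[np.argmin(fitness)]
    return np.clip(best, 0, n_targets - 1e-9).astype(int), fitness.min()

plan, surviving_value = differential_evolution()
print("assignment:", plan, "expected surviving value:", round(surviving_value, 2))
```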
NASA Astrophysics Data System (ADS)
Glaves, Helen; Schaap, Dick
2016-04-01
The increasing adoption of an ocean-basin-level approach to marine research has led to a corresponding rise in the demand for large quantities of high quality interoperable data. This requirement for easily discoverable and readily available marine data is currently being addressed by initiatives such as SeaDataNet in Europe, Rolling Deck to Repository (R2R) in the USA and the Australian Ocean Data Network (AODN), with each having implemented an e-infrastructure to facilitate the discovery and re-use of standardised multidisciplinary marine datasets available from a network of distributed repositories, data centres etc. within their own region. However, these regional data systems have been developed in response to the specific requirements of their users and in line with the priorities of the funding agency. They have also been created independently of the marine data infrastructures in other regions, often using different standards, data formats, technologies etc., which makes the integration of marine data from these regional systems for the purposes of basin-level research difficult. Marine research at the ocean basin level requires a common global framework for marine data management which is based on existing regional marine data systems but provides an integrated solution for delivering interoperable marine data to the user. The Ocean Data Interoperability Platform (ODIP/ODIP II) project brings together those responsible for the management of the selected marine data systems and other relevant technical experts with the objective of developing interoperability across the regional e-infrastructures. The commonalities and incompatibilities between the individual data infrastructures are identified and then used as the foundation for the specification of prototype interoperability solutions which demonstrate the feasibility of sharing marine data across the regional systems and also with relevant larger global data services such as GEO, COPERNICUS, IODE, POGO etc. The potential impact for the individual regional data infrastructures of implementing these prototype interoperability solutions is also being evaluated to determine both the technical and financial implications of their integration within existing systems. These impact assessments form part of the strategy to encourage wider adoption of the ODIP solutions and approach beyond the current scope of the project, which is focussed on regional marine data systems in Europe, Australia, the USA and, more recently, Canada.
Collaborative development of predictive toxicology applications
2010-01-01
OpenTox provides an interoperable, standards-based Framework for the support of predictive toxicology data management, algorithms, modelling, validation and reporting. It is relevant to satisfying the chemical safety assessment requirements of the REACH legislation as it supports access to experimental data, (Quantitative) Structure-Activity Relationship models, and toxicological information through an integrating platform that adheres to regulatory requirements and OECD validation principles. Initial research defined the essential components of the Framework including the approach to data access, schema and management, use of controlled vocabularies and ontologies, architecture, web service and communications protocols, and selection and integration of algorithms for predictive modelling. OpenTox provides end-user oriented tools to non-computational specialists, risk assessors, and toxicological experts in addition to Application Programming Interfaces (APIs) for developers of new applications. OpenTox actively supports public standards for data representation, interfaces, vocabularies and ontologies, Open Source approaches to core platform components, and community-based collaboration approaches, so as to progress system interoperability goals. The OpenTox Framework includes APIs and services for compounds, datasets, features, algorithms, models, ontologies, tasks, validation, and reporting which may be combined into multiple applications satisfying a variety of different user needs. OpenTox applications are based on a set of distributed, interoperable OpenTox API-compliant REST web services. The OpenTox approach to ontology allows for efficient mapping of complementary data coming from different datasets into a unifying structure having a shared terminology and representation. Two initial OpenTox applications are presented as an illustration of the potential impact of OpenTox for high-quality and consistent structure-activity relationship modelling of REACH-relevant endpoints: ToxPredict which predicts and reports on toxicities for endpoints for an input chemical structure, and ToxCreate which builds and validates a predictive toxicity model based on an input toxicology dataset. Because of the extensible nature of the standardised Framework design, barriers of interoperability between applications and content are removed, as the user may combine data, models and validation from multiple sources in a dependable and time-effective way. PMID:20807436
Collaborative development of predictive toxicology applications.
Hardy, Barry; Douglas, Nicki; Helma, Christoph; Rautenberg, Micha; Jeliazkova, Nina; Jeliazkov, Vedrin; Nikolova, Ivelina; Benigni, Romualdo; Tcheremenskaia, Olga; Kramer, Stefan; Girschick, Tobias; Buchwald, Fabian; Wicker, Joerg; Karwath, Andreas; Gütlein, Martin; Maunz, Andreas; Sarimveis, Haralambos; Melagraki, Georgia; Afantitis, Antreas; Sopasakis, Pantelis; Gallagher, David; Poroikov, Vladimir; Filimonov, Dmitry; Zakharov, Alexey; Lagunin, Alexey; Gloriozova, Tatyana; Novikov, Sergey; Skvortsova, Natalia; Druzhilovsky, Dmitry; Chawla, Sunil; Ghosh, Indira; Ray, Surajit; Patel, Hitesh; Escher, Sylvia
2010-08-31
OpenTox provides an interoperable, standards-based Framework for the support of predictive toxicology data management, algorithms, modelling, validation and reporting. It is relevant to satisfying the chemical safety assessment requirements of the REACH legislation as it supports access to experimental data, (Quantitative) Structure-Activity Relationship models, and toxicological information through an integrating platform that adheres to regulatory requirements and OECD validation principles. Initial research defined the essential components of the Framework including the approach to data access, schema and management, use of controlled vocabularies and ontologies, architecture, web service and communications protocols, and selection and integration of algorithms for predictive modelling. OpenTox provides end-user oriented tools to non-computational specialists, risk assessors, and toxicological experts in addition to Application Programming Interfaces (APIs) for developers of new applications. OpenTox actively supports public standards for data representation, interfaces, vocabularies and ontologies, Open Source approaches to core platform components, and community-based collaboration approaches, so as to progress system interoperability goals. The OpenTox Framework includes APIs and services for compounds, datasets, features, algorithms, models, ontologies, tasks, validation, and reporting which may be combined into multiple applications satisfying a variety of different user needs. OpenTox applications are based on a set of distributed, interoperable OpenTox API-compliant REST web services. The OpenTox approach to ontology allows for efficient mapping of complementary data coming from different datasets into a unifying structure having a shared terminology and representation. Two initial OpenTox applications are presented as an illustration of the potential impact of OpenTox for high-quality and consistent structure-activity relationship modelling of REACH-relevant endpoints: ToxPredict which predicts and reports on toxicities for endpoints for an input chemical structure, and ToxCreate which builds and validates a predictive toxicity model based on an input toxicology dataset. Because of the extensible nature of the standardised Framework design, barriers of interoperability between applications and content are removed, as the user may combine data, models and validation from multiple sources in a dependable and time-effective way.
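As an illustration of the interaction style with OpenTox-like REST services (the URIs, parameter names and response handling below are hypothetical placeholders, not the actual ToxPredict or ToxCreate endpoints), a client typically POSTs to a model resource and receives a URI to poll for the result:

```python
# Illustrative call pattern for an OpenTox-style REST model service.
# The service URI, model URI and parameter names are hypothetical placeholders.
import requests

MODEL_URI = "https://opentox.example.org/model/42"             # hypothetical model resource
COMPOUND_URI = "https://opentox.example.org/compound/benzene"  # hypothetical compound resource

# Ask the model service to apply the model to the compound; OpenTox-style services
# typically return a task or dataset URI rather than the result inline.
response = requests.post(
    MODEL_URI,
    data={"compound_uri": COMPOUND_URI},
    headers={"Accept": "text/uri-list"},
)
print(response.status_code)
print(response.text)   # URI of the prediction task/dataset to poll
```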
An Interoperable Architecture for Air Pollution Early Warning System Based on Sensor Web
NASA Astrophysics Data System (ADS)
Samadzadegan, F.; Zahmatkesh, H.; Saber, M.; Ghazi khanlou, H. J.
2013-09-01
Environmental monitoring systems deal with time-sensitive issues which require quick responses in emergency situations. Handling sensor observations in near real time and obtaining valuable information from them are challenging issues in these systems from both a technical and a scientific point of view. Ever-increasing population growth in urban areas has caused certain problems in developing countries which have a direct or indirect impact on human life. One applicable solution for controlling and managing air quality in mega cities is to use real-time, up-to-date air quality information gathered by spatially distributed sensors, together with sensor web technology, to develop monitoring and early warning systems. Urban air quality monitoring systems use the functionalities of geospatial information systems as a platform for analysing, processing, and visualizing data, in combination with the Sensor Web, to support decision support systems in disaster management and emergency situations. This system uses the Sensor Web Enablement (SWE) framework of the Open Geospatial Consortium (OGC), which offers a standard framework that allows the integration of sensors and sensor data into spatial data infrastructures. The SWE framework introduces standards for services to access sensor data and discover events from sensor data streams, as well as a set of standards for the description of sensors and the encoding of measurements. The presented system provides capabilities to collect, transfer, share and process air quality sensor data and to disseminate the air quality status in real time. Interoperability challenges can be overcome by using this standard framework. In a routine scenario, air quality data measured by in-situ sensors are communicated to a central station, where the data are analysed and processed. The extracted air quality status is processed to discover emergency situations, and if necessary air quality reports are sent to the authorities. This research proposes an architecture that shows how to integrate air quality sensor data streams into a geospatial data infrastructure in order to present an interoperable air quality monitoring system that supports disaster management systems with real-time information. The developed system was tested on Tehran air pollution sensors, calculating the Air Quality Index (AQI) for the CO pollutant and subsequently notifying registered users in emergency cases by sending warning e-mails. An air quality monitoring portal is used to retrieve and visualize sensor observations through the interoperable framework. This system provides capabilities to retrieve SOS observations using WPS in a cascaded service chaining pattern for monitoring trends in timely sensor observations.
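A minimal sketch of the piecewise-linear AQI calculation for CO follows, using US EPA-style 8-hour breakpoints for illustration; the index definition and warning thresholds actually deployed for Tehran may differ.

```python
# Piecewise-linear AQI computation for CO (8-hour average, ppm), using
# US-EPA-style breakpoints for illustration; the deployed system's index
# and thresholds may differ.
CO_BREAKPOINTS = [
    # (conc_low, conc_high, aqi_low, aqi_high)
    (0.0,   4.4,   0,  50),
    (4.5,   9.4,  51, 100),
    (9.5,  12.4, 101, 150),
    (12.5, 15.4, 151, 200),
    (15.5, 30.4, 201, 300),
    (30.5, 40.4, 301, 400),
    (40.5, 50.4, 401, 500),
]

def co_aqi(concentration_ppm):
    for c_lo, c_hi, i_lo, i_hi in CO_BREAKPOINTS:
        if c_lo <= concentration_ppm <= c_hi:
            return round((i_hi - i_lo) / (c_hi - c_lo) * (concentration_ppm - c_lo) + i_lo)
    raise ValueError("concentration out of range")

reading = 11.2   # ppm, hypothetical 8-hour average from an in-situ sensor
aqi = co_aqi(reading)
print(aqi, "send warning e-mail" if aqi > 100 else "normal")   # threshold is illustrative
```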
Converting ODM Metadata to FHIR Questionnaire Resources.
Doods, Justin; Neuhaus, Philipp; Dugas, Martin
2016-01-01
Interoperability between systems and data sharing between domains is becoming more and more important. The portal medical-data-models.org offers more than 5,300 UMLS-annotated forms in CDISC ODM format in order to support interoperability, and several additional export formats are available. CDISC's ODM and the Questionnaire resource of HL7's FHIR framework were analyzed, a mapping between their elements was created, and a converter was implemented. The developed converter was integrated into the portal with FHIR Questionnaire XML or JSON download options. New FHIR applications can now use this large library of forms.
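A converter of this kind essentially walks the ODM item definitions and emits FHIR Questionnaire items. The sketch below is a heavily simplified illustration (namespaces, code lists, and most ODM/FHIR attributes are omitted, and element names are reduced to a bare minimum); it is not the portal's actual converter.

```python
import json
import xml.etree.ElementTree as ET

# Simplified, namespace-free ODM fragment (illustrative only).
ODM_SNIPPET = """
<ODM>
  <ItemDef OID="I.HEIGHT" Name="Body height" DataType="float">
    <Question><TranslatedText>Body height (cm)</TranslatedText></Question>
  </ItemDef>
  <ItemDef OID="I.SMOKER" Name="Smoker" DataType="boolean">
    <Question><TranslatedText>Current smoker?</TranslatedText></Question>
  </ItemDef>
</ODM>
"""

# Rough mapping of ODM data types onto FHIR Questionnaire item types (assumed).
TYPE_MAP = {"text": "string", "integer": "integer", "float": "decimal", "boolean": "boolean"}

def odm_to_questionnaire(odm_xml: str) -> dict:
    root = ET.fromstring(odm_xml)
    items = []
    for item_def in root.iter("ItemDef"):
        question = item_def.findtext("./Question/TranslatedText") or item_def.get("Name")
        items.append({
            "linkId": item_def.get("OID"),
            "text": question,
            "type": TYPE_MAP.get(item_def.get("DataType"), "string"),
        })
    return {"resourceType": "Questionnaire", "status": "draft", "item": items}

print(json.dumps(odm_to_questionnaire(ODM_SNIPPET), indent=2))
```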
Beštek, Mate; Stanimirović, Dalibor
2017-08-09
The main aims of the paper are to characterize and examine potential interoperability approaches. These include openEHR, SNOMED, IHE, and Continua as combined interoperability approaches, the possibilities for their incorporation into the eHealth environment, and identification of the main success factors in the field, which are necessary for achieving the required interoperability and, consequently, for the successful implementation of eHealth projects in general. The paper presents an in-depth analysis of the potential application of the openEHR, SNOMED, IHE and Continua approaches in the development and implementation of eHealth in Slovenia. The research method used is both exploratory and deductive in nature. The methodological framework is grounded in information retrieval, with a special focus on reviewing and charting existing experience in the field and sources, both electronic and written, that cover interoperability concepts and related implementation issues. The paper tries to answer the following complementary inquiries: 1. Scrutinizing the potential approaches that could alleviate the pertinent interoperability issues in the Slovenian eHealth context. 2. Analyzing the possibilities (requirements) for their inclusion in the construction process for individual eHealth solutions. 3. Identifying and charting the main success factors in the interoperability field that critically influence the efficient development and implementation of eHealth projects. The provided insights and identified success factors could serve as strategic starting points for the continuous integration of interoperability principles into the healthcare domain. Moreover, the general implementation of the identified success factors could facilitate better penetration of ICT into the healthcare environment and enable the eHealth-based transformation of the health system, especially in countries that are still in an early phase of eHealth planning and development and are often confronted with differing interests, requirements, and contending strategies.
A Pragmatic Approach to Sustainable Interoperability for the Web 2.0 World
NASA Astrophysics Data System (ADS)
Wright, D. J.; Sankaran, S.
2015-12-01
In the geosciences, interoperability is a fundamental requirement. Members of various standards organizations such as the OGC and ISO/TC 211 have done yeoman service promoting a standards-centric approach to managing the interoperability challenges that organizations face today. The specific challenges that organizations face when adopting interoperability patterns are numerous. One approach, that of mandating the use of specific standards, has been reasonably successful. But scientific communities, like all others, ultimately want their solutions to be widely accepted and used, and to this end there is a pressing need to explore all possible interoperability patterns without restricting the choices to mandated standards. Standards are created by a slow, deliberative process that sometimes takes a long time to come to fruition and therefore can fall short of user expectations. Organizations are therefore left with a set of seemingly orthogonal requirements when they want to pursue interoperability: they want a robust but agile solution, a mature approach that also satisfies the latest technology trends, and so on. Sustainable interoperability patterns need to be forward looking and should adopt the patterns and paradigms of the Web 2.0 generation. To this end, the key is to choose platform technologies that embrace multiple interoperability mechanisms built on fundamental "open" principles and aligned with popular mainstream patterns. We seek to explore data-, metadata- and web service-related interoperability patterns through the prism of building solutions that encourage strong implementer and end-user engagement, improved usability and scalability, and appealing developer frameworks that can grow the audience. The path to tread is not new: the geocommunity only needs to observe and align its end goals with current Web 2.0 patterns to realize all the benefits that today we take for granted in our everyday use of technology.
Extending the GI Brokering Suite to Support New Interoperability Specifications
NASA Astrophysics Data System (ADS)
Boldrini, E.; Papeschi, F.; Santoro, M.; Nativi, S.
2014-12-01
The GI brokering suite provides the discovery, access, and semantic brokers (i.e. GI-cat, GI-axe, GI-sem) that empower a brokering framework for multi-disciplinary and multi-organizational interoperability. The GI suite has been successfully deployed in the framework of several programmes and initiatives, such as European Union funded projects, NSF BCube, and the intergovernmental coordinated effort Global Earth Observation System of Systems (GEOSS). Each GI suite broker facilitates interoperability for a particular functionality (i.e. discovery, access, semantic extension) among a set of brokered resources published by autonomous providers (e.g. data repositories, web services, semantic assets) and a set of heterogeneous consumers (e.g. client applications, portals, apps). A wide set of data models, encoding formats, and service protocols is already supported by the GI suite, such as those defined by international standardization organizations like OGC and ISO (e.g. WxS, CSW, SWE, GML, netCDF) and by community specifications (e.g. THREDDS, OpenSearch, OPeNDAP, ESRI APIs). Using the GI suite, resources published by a particular community or organization through their specific technology (e.g. OPeNDAP/netCDF) can be transparently discovered, accessed, and used by different communities utilizing their preferred tools (e.g. a GIS visualizing WMS layers). Since information technology is a moving target, new standards and technologies continuously emerge and are adopted in the Earth Science context too. Therefore, the GI brokering suite was conceived to be flexible and to accommodate new interoperability protocols and data models. For example, the GI suite has recently added support for widely used specifications introduced to implement Linked Data, the Semantic Web, and specific community needs. Among others, these include DCAT, an RDF vocabulary designed to facilitate interoperability between Web data catalogs; CKAN, a data management system for data distribution, particularly used by public administrations; CERIF, used by CRIS (Current Research Information System) instances; and the Hyrax server, a scientific dataset publishing component. This presentation will discuss these and other recent GI suite extensions implemented to support new interoperability protocols in use by the Earth Science communities.
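To give a feel for what "transparent discovery" through a broker looks like at the protocol level, the sketch below issues an OpenSearch-style keyword query against a hypothetical broker endpoint and lists the returned Atom entry titles. The endpoint URL and query parameter names are assumptions for illustration, not the GI-cat API.

```python
import requests
import xml.etree.ElementTree as ET

# Hypothetical OpenSearch endpoint exposed by a discovery broker.
ENDPOINT = "https://broker.example.org/opensearch"
ATOM_NS = "{http://www.w3.org/2005/Atom}"

def search(query: str, count: int = 10):
    """Run a keyword search and return the titles of the matching Atom entries."""
    resp = requests.get(ENDPOINT, params={"q": query, "count": count},
                        headers={"Accept": "application/atom+xml"})
    resp.raise_for_status()
    feed = ET.fromstring(resp.content)
    return [entry.findtext(f"{ATOM_NS}title")
            for entry in feed.findall(f"{ATOM_NS}entry")]

if __name__ == "__main__":
    for title in search("sea surface temperature"):
        print(title)
```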
Documenting Models for Interoperability and Reusability (proceedings)
Many modeling frameworks compartmentalize science via individual models that link sets of small components to create larger modeling workflows. Developing integrated watershed models increasingly requires coupling multidisciplinary, independent models, as well as collaboration be...
Knowledge Discovery from Biomedical Ontologies in Cross Domains.
Shen, Feichen; Lee, Yugyung
2016-01-01
In recent years, there is an increasing demand for sharing and integration of medical data in biomedical research. Improving a health care system requires supporting the integration of data by facilitating semantically interoperable systems and practices. Semantic interoperability is difficult to achieve in these systems, as the conceptual models underlying the datasets are not fully exploited. In this paper, we propose a semantic framework, called Medical Knowledge Discovery and Data Mining (MedKDD), that aims to build a topic hierarchy and to serve semantic interoperability between different ontologies. To this end, we focus on the discovery of semantic patterns about the association of relations in a heterogeneous information network representing different types of objects and relationships in multiple biological ontologies, and on the creation of a topic hierarchy through the analysis of the discovered patterns. These patterns are used to cluster heterogeneous information networks into a set of smaller topic graphs in a hierarchical manner, and then to conduct cross-domain knowledge discovery from the multiple biological ontologies; the patterns thus make a substantial contribution to knowledge discovery across ontologies. We have demonstrated cross-domain knowledge discovery in the MedKDD framework using a case study with 9 primary biological ontologies from Bio2RDF and compared it with SLAP, a cross-domain query processing approach. We have confirmed the effectiveness of the MedKDD framework in knowledge discovery from multiple medical ontologies.
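As a loose, toy-scale illustration of the general idea of splitting a heterogeneous information network into smaller topic graphs (not the MedKDD algorithm itself), the sketch below builds a small typed graph with networkx and extracts its connected components as candidate "topic" subgraphs.

```python
import networkx as nx

# Toy heterogeneous network: nodes carry a type, edges carry a relation label.
G = nx.Graph()
G.add_node("Aspirin", kind="Drug")
G.add_node("COX-1", kind="Protein")
G.add_node("Inflammation", kind="Phenotype")
G.add_node("BRCA1", kind="Gene")
G.add_node("Breast cancer", kind="Disease")
G.add_edge("Aspirin", "COX-1", relation="inhibits")
G.add_edge("COX-1", "Inflammation", relation="involved_in")
G.add_edge("BRCA1", "Breast cancer", relation="associated_with")

# Each connected component acts as a (very crude) stand-in for a topic graph.
for i, component in enumerate(nx.connected_components(G), start=1):
    topic = G.subgraph(component)
    print(f"topic {i}: nodes={sorted(topic.nodes)} edges={topic.number_of_edges()}")
```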
Brokerage services for Earth Science data: the EuroGEOSS legacy (Invited)
NASA Astrophysics Data System (ADS)
Nativi, S.; Craglia, M.; Pearlman, J.
2013-12-01
Global sustainability research requires an integrated multidisciplinary effort underpinned by a collaborative environment for discovering and accessing heterogeneous data across disciplines. Traditionally, interoperability has been achieved by implementing federations of systems; the federating approach entails the adoption of a set of common technologies and standards. This presentation argues that for complex (and uncontrolled) environments (such as global, multidisciplinary, and voluntary-based infrastructures) federated solutions must be complemented and enhanced by a brokering approach, i.e. by making available a set of brokerage services. In fact, brokerage services allow a cyber-infrastructure to lower entry barriers (for both data producers and users) and to better address the different domain specificities. The brokering interoperability approach was successfully trialled in the EuroGEOSS project, funded by the European Commission in the FP7 framework (see http://www.eurogeoss.eu). The EuroGEOSS Brokering framework provided the EuroGEOSS Capacity with multidisciplinary interoperability functionalities. This platform was developed applying several of the principles and requirements that characterize the System of Systems (SoS) approach and the Internet of Services (IoS) philosophy. The framework consists of three main brokers (middleware components implementing intermediation and harmonization services): a basic Discovery Broker, an advanced Semantic Discovery Broker, and an Access Broker. They are empowered by a suite of tools developed by the ESSI-lab of the CNR-IIA, called GI-cat, GI-sem, and GI-axe. The EuroGEOSS brokering framework was considered and successfully adopted by cross-disciplinary initiatives (notably GEOSS: Global Earth Observation System of Systems). The brokerage services have been advanced and extended; the new brokering framework is called the GEO DAB (Discovery and Access Broker). New brokerage services have been developed in the framework of other European Commission funded projects (e.g. GeoViQua). More recently, the NSF EarthCube initiative decided to fund a project dealing with brokerage services. In the framework of the GEO AIP-6 (Architecture Implementation Pilot, phase 6), the presented brokerage platform has been used by the Water Working Group to carry out improved data access for parameterization and model development.
Building a Global Earth Observation System of Systems (GEOSS) and Its Interoperability Challenges
NASA Astrophysics Data System (ADS)
Ryan, B. J.
2015-12-01
Launched in 2005 by industrialized nations, the Group on Earth Observations (GEO) began building the Global Earth Observation System of Systems (GEOSS). Consisting of both a policy framework and an information infrastructure, GEOSS was intended to link and/or integrate the multitude of Earth observation systems, primarily operated by its Member Countries and Participating Organizations, so that users could more readily benefit from global information assets for a number of society's key environmental issues. It was recognized that having ready access to observations from multiple systems was a prerequisite for both environmental decision-making and economic development. From the very start, it was also recognized that the sheer complexity of the Earth's system cannot be captured by any single observation system, and that a federated, interoperable approach was necessary. While this international effort has met with much success, primarily in advancing broad, open data policies and practices, challenges remain. In 2014 (Geneva, Switzerland) and 2015 (Mexico City, Mexico), Ministers from GEO's Member Countries, including the European Commission, came together to assess progress made during the first decade (2005 to 2015), and to approve implementation strategies and mechanisms for the second decade (2016 to 2025), respectively. The approved implementation strategies and mechanisms are intended to advance GEOSS development, thereby facilitating the increased uptake of Earth observations for informed decision-making. Clearly there are interoperability challenges that are technological in nature, and several will be discussed in this presentation. There are, however, interoperability challenges that can be better characterized as economic, governmental and/or political in nature, and these will be discussed as well. With the emergence of the Sustainable Development Goals (SDGs), the World Conference on Disaster Risk Reduction (WCDRR), and the United Nations Framework Convention on Climate Change (UNFCCC) having occurred this year, it will be essential that the interoperability challenges described herein, regardless of their nature, be expeditiously addressed so that Earth observations can indeed inform societal decision-making.
Geoscience Information Network (USGIN) Solutions for Interoperable Open Data Access Requirements
NASA Astrophysics Data System (ADS)
Allison, M. L.; Richard, S. M.; Patten, K.
2014-12-01
The geosciences are leading the development of free, interoperable, open access to data. The US Geoscience Information Network (USGIN) is a freely available data integration framework, jointly developed by the USGS and the Association of American State Geologists (AASG), in compliance with international standards and protocols to provide easy discovery, access, and interoperability for geoscience data. USGIN standards include the geologic exchange language GeoSciML (v3.2), which enables instant interoperability of geologic formation data and is also the base standard used by the 117-nation OneGeology consortium. The USGIN deployment of the National Geothermal Data System (NGDS) serves as a continent-scale operational demonstration of the expanded OneGeology vision to provide access to all geoscience data worldwide. USGIN is developed to accommodate a variety of applications; for example, the International Renewable Energy Agency streams data live to the Global Atlas of Renewable Energy. Alternatively, users without robust data sharing systems can download and implement a free software package, "GINstack", to easily deploy web services for exposing data online for discovery and access. The White House Open Data Access Initiative requires all federally funded research projects and federal agencies to make their data publicly accessible in an open source, interoperable format, with metadata. USGIN currently incorporates all aspects of the Initiative, as it emphasizes interoperability. The system is successfully deployed as NGDS, which was officially launched at the White House Energy Datapalooza in May 2014. The USGIN Foundation has been established to ensure this technology continues to be accessible and available.
A Working Framework for Enabling International Science Data System Interoperability
NASA Astrophysics Data System (ADS)
Hughes, J. Steven; Hardman, Sean; Crichton, Daniel J.; Martinez, Santa; Law, Emily; Gordon, Mitchell K.
2016-07-01
For diverse scientific disciplines to interoperate, they must be able to exchange information based on a shared understanding. To capture this shared understanding, we have developed a knowledge representation framework that leverages ISO-level reference models for metadata registries and digital archives. This framework provides multi-level governance, evolves independently of the implementation technologies, and promotes agile development, namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. The knowledge representation is captured in an ontology through a process of knowledge acquisition. Discipline experts in the role of stewards at the common, discipline, and project levels work to design and populate the ontology model. The result is a formal and consistent knowledge base that provides requirements for data representation, integrity, provenance, context, identification, and relationship. The contents of the knowledge base are translated and written to files in suitable formats to configure system software and services, provide user documentation, validate input, and support data analytics. This presentation will provide an overview of the framework, present a use case that has been adopted by an entire science discipline at the international level, and share some important lessons learned.
Yuksel, Mustafa; Gonul, Suat; Laleci Erturkmen, Gokce Banu; Sinaci, Ali Anil; Invernizzi, Paolo; Facchinetti, Sara; Migliavacca, Andrea; Bergvall, Tomas; Depraetere, Kristof; De Roo, Jos
2016-01-01
Depending mostly on voluntarily sent spontaneous reports, pharmacovigilance studies are hampered by the low quantity and quality of patient data. Our objective is to improve postmarket safety studies by enabling safety analysts to seamlessly access a wide range of EHR sources for collecting deidentified medical data sets of selected patient populations and tracing the reported incidents back to original EHRs. We have developed an ontological framework where EHR sources and target clinical research systems can continue using their own local data models, interfaces, and terminology systems, while structural and semantic interoperability are handled through rule-based reasoning on formal representations of the different models and terminology systems maintained in the SALUS Semantic Resource Set. The SALUS Common Information Model at the core of this set acts as the common mediator. We demonstrate the capabilities of our framework through one of the SALUS safety analysis tools, namely the Case Series Characterization Tool, which has been deployed on top of the regional EHR Data Warehouse of the Lombardy Region, containing about 1 billion records from 16 million patients, and validated by several pharmacovigilance researchers with real-life cases. The results confirm significant improvements in signal detection and evaluation compared to traditional methods that lack this background information. PMID:27123451
Cardea: Providing Support for Dynamic Resource Access in a Distributed Computing Environment
NASA Technical Reports Server (NTRS)
Lepro, Rebekah
2003-01-01
The environment framing the modern authorization process spans domains of administration, relies on many different authentication sources, and manages complex attributes as part of the authorization process. Cardea facilitates dynamic access control within this environment as a central function of an interoperable authorization framework. The system departs from the traditional authorization model by separating the authentication and authorization processes, distributing the responsibility for authorization data and allowing collaborating domains to retain control over their implementation mechanisms. Critical features of the system architecture and its handling of the authorization process differentiate the system from existing authorization components by addressing common needs not adequately addressed by existing systems. Continuing system research seeks to enhance the implementation of the current authorization model employed in Cardea, increase the robustness of current features, further the framework for establishing trust and promote interoperability with existing security mechanisms.
Interoperability between phenotype and anatomy ontologies.
Hoehndorf, Robert; Oellrich, Anika; Rebholz-Schuhmann, Dietrich
2010-12-15
Phenotypic information is important for the analysis of the molecular mechanisms underlying disease. A formal ontological representation of phenotypic information can help to identify, interpret and infer phenotypic traits based on experimental findings. The methods that are currently used to represent data and information about phenotypes fail to make the semantics of the phenotypic trait explicit and do not interoperate with ontologies of anatomy and other domains. Therefore, valuable resources for the analysis of phenotype studies remain unconnected and inaccessible to automated analysis and reasoning. We provide a framework to formalize phenotypic descriptions and make their semantics explicit. Based on this formalization, we provide the means to integrate phenotypic descriptions with ontologies of other domains, in particular anatomy and physiology. We demonstrate how our framework leads to the capability to represent disease phenotypes, perform powerful queries that were not possible before and infer additional knowledge. http://bioonto.de/pmwiki.php/Main/PheneOntology.
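To illustrate what making the semantics of a phenotypic trait explicit and linking it to an anatomy ontology can look like in practice, the sketch below builds a tiny RDF graph with rdflib and queries it with SPARQL. The class IRIs and property names are invented for the example; they are not the identifiers used by the cited framework.

```python
from rdflib import Graph, Namespace, RDF, RDFS, Literal

EX = Namespace("http://example.org/pheno#")  # hypothetical namespace
g = Graph()

# A phenotype class explicitly tied to the anatomical entity it affects.
g.add((EX.EnlargedHeart, RDF.type, EX.Phenotype))
g.add((EX.EnlargedHeart, RDFS.label, Literal("enlarged heart")))
g.add((EX.EnlargedHeart, EX.affectsAnatomicalEntity, EX.Heart))
g.add((EX.Heart, RDF.type, EX.AnatomicalEntity))

# Which phenotypes affect the heart?
query = """
PREFIX ex: <http://example.org/pheno#>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
SELECT ?label WHERE {
  ?p a ex:Phenotype ;
     ex:affectsAnatomicalEntity ex:Heart ;
     rdfs:label ?label .
}
"""
for row in g.query(query):
    print(row.label)
```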
Towards E-Society Policy Interoperability
NASA Astrophysics Data System (ADS)
Iannella, Renato
The move towards the Policy-Oriented Web is destined to provide support for policy expression and management in the core web layers. One of the most promising areas that can drive this new technology adoption is e-Society communities. With so much user-generated content being shared by these social networks, there is the real danger that the implicit sharing rules that communities have developed over time will be lost in translation in the new digital communities. This will lead to a corresponding loss in confidence in e-Society sites. The Policy-Oriented Web attempts to turn the implicit into the explicit with a common framework for policy language interoperability and awareness. This paper reports on the policy driving factors from the Social Networks experiences using real-world use cases and scenarios. In particular, the key functions of policy-awareness - for privacy, rights, and identity - will be the driving force that enables the e-Society to appreciate new interoperable policy regimes.
Interconnecting Multidisciplinary Data Infrastructures: From Federation to Brokering Framework
NASA Astrophysics Data System (ADS)
Nativi, S.
2014-12-01
Standardization and federation activities have played an essential role in pushing interoperability at the disciplinary and cross-disciplinary levels. However, they have proved insufficient to resolve important interoperability challenges, including disciplinary heterogeneity, cross-organization diversity, and cultural differences. Significant international initiatives like GEOSS, IODE, and CEOS have demonstrated that a federated system dealing with a global and multi-disciplinary domain turns out to be rather complex, further raising the already high entry barriers for both providers and users. In particular, GEOSS demonstrated that standardization and federation actions must be accompanied and complemented by a brokering approach. A brokering architecture and its implementing technologies are able to realize an effective level of interoperability among multi-disciplinary systems, lowering the entry barriers for both data providers and users. This presentation will discuss the brokering philosophy as an approach complementary to standardization and federation for interconnecting existing, heterogeneous infrastructures and systems. The GEOSS experience, in particular, will be analyzed.
Economic impact of a nationwide interoperable e-Health system using the PENG evaluation tool.
Parv, L; Saluse, J; Aaviksoo, A; Tiik, M; Sepper, R; Ross, P
2012-01-01
The aim of this paper is to evaluate the costs and benefits of the Estonian interoperable health information exchange system. In addition, a framework will be built for follow-up monitoring and analysis of a nationwide HIE system. The PENG evaluation tool was used to map and quantify the costs and benefits arising from type II diabetic patient management for patients, providers and society. The analysis concludes with a quantification based on real costs and potential benefits identified by a panel of experts. Setting up a countrywide interoperable eHealth system incurs a large initial investment; however, if the system works seamlessly, benefits will surpass costs within three years. The results show that while society stands to benefit the most, the costs will be mainly borne by the healthcare providers. Therefore, new government policies should be devised to encourage providers to invest, so as to ensure society-wide benefits.
Earth System Modeling Framework (ESMF) Software and Application Development
Deluca, Cecelia; NESII/CIRES/NOAA Earth System Research Laboratory, 325 Broadway, Boulder, CO
2015-09-30
…originate from NASA, NOAA, and community modeling efforts, and support for creation of the suite was shared by sponsors from other agencies. ESPS… …Capability (NUOPC) was established between NOAA and Navy to develop a common software architecture for easy and efficient interoperability. …
Watershed and Economic Data InterOperability (WEDO)
The annual public meeting of the Federal Interagency Steering Committee on Multimedia Environmental Modeling (ISCMEM) will convene to discuss some of the latest developments in environmental modeling applications, tools and frameworks, as well as new operational initiatives for F...
IHE cross-enterprise document sharing for imaging: interoperability testing software
2010-01-01
Background: With the deployments of Electronic Health Records (EHR), interoperability testing in healthcare is becoming crucial. EHR enables access to prior diagnostic information in order to assist in health decisions. It is a virtual system that results from the cooperation of several heterogeneous distributed systems. Interoperability between peers is therefore essential. Achieving interoperability requires various types of testing. Implementations need to be tested using software that simulates communication partners, and that provides test data and test plans. Results: In this paper we describe a software that is used to test systems that are involved in sharing medical images within the EHR. Our software is used as part of the Integrating the Healthcare Enterprise (IHE) testing process to test the Cross Enterprise Document Sharing for imaging (XDS-I) integration profile. We describe its architecture and functionalities; we also expose the challenges encountered and discuss the elected design solutions. Conclusions: EHR is being deployed in several countries. The EHR infrastructure will be continuously evolving to embrace advances in the information technology domain. Our software is built on a web framework to allow for an easy evolution with web technology. The testing software is publicly available; it can be used by system implementers to test their implementations. It can also be used by site integrators to verify and test the interoperability of systems, or by developers to understand specifications ambiguities, or to resolve implementations difficulties. PMID:20858241
IHE cross-enterprise document sharing for imaging: interoperability testing software.
Noumeir, Rita; Renaud, Bérubé
2010-09-21
With the deployments of Electronic Health Records (EHR), interoperability testing in healthcare is becoming crucial. EHR enables access to prior diagnostic information in order to assist in health decisions. It is a virtual system that results from the cooperation of several heterogeneous distributed systems. Interoperability between peers is therefore essential. Achieving interoperability requires various types of testing. Implementations need to be tested using software that simulates communication partners, and that provides test data and test plans. In this paper we describe a software that is used to test systems that are involved in sharing medical images within the EHR. Our software is used as part of the Integrating the Healthcare Enterprise (IHE) testing process to test the Cross Enterprise Document Sharing for imaging (XDS-I) integration profile. We describe its architecture and functionalities; we also expose the challenges encountered and discuss the elected design solutions. EHR is being deployed in several countries. The EHR infrastructure will be continuously evolving to embrace advances in the information technology domain. Our software is built on a web framework to allow for an easy evolution with web technology. The testing software is publicly available; it can be used by system implementers to test their implementations. It can also be used by site integrators to verify and test the interoperability of systems, or by developers to understand specifications ambiguities, or to resolve implementations difficulties.
2011-01-01
Background: The practice and research of medicine generates considerable quantities of data and model resources (DMRs). Although in principle biomedical resources are re-usable, in practice few can currently be shared. In particular, the clinical communities in physiology and pharmacology research, as well as medical education, (i.e. PPME communities) are facing considerable operational and technical obstacles in sharing data and models. Findings: We outline the efforts of the PPME communities to achieve automated semantic interoperability for clinical resource documentation in collaboration with the RICORDO project. Current community practices in resource documentation and knowledge management are overviewed. Furthermore, requirements and improvements sought by the PPME communities to current documentation practices are discussed. The RICORDO plan and effort in creating a representational framework and associated open software toolkit for the automated management of PPME metadata resources is also described. Conclusions: RICORDO is providing the PPME community with tools to effect, share and reason over clinical resource annotations. This work is contributing to the semantic interoperability of DMRs through ontology-based annotation by (i) supporting more effective navigation and re-use of clinical DMRs, as well as (ii) sustaining interoperability operations based on the criterion of biological similarity. Operations facilitated by RICORDO will range from automated dataset matching to model merging and managing complex simulation workflows. In effect, RICORDO is contributing to community standards for resource sharing and interoperability. PMID:21878109
NASA Astrophysics Data System (ADS)
Mueller, Wolfgang; Mueller, Henning; Marchand-Maillet, Stephane; Pun, Thierry; Squire, David M.; Pecenovic, Zoran; Giess, Christoph; de Vries, Arjen P.
2000-10-01
While in the area of relational databases interoperability is ensured by common communication protocols (e.g. ODBC/JDBC using SQL), Content Based Image Retrieval Systems (CBIRS) and other multimedia retrieval systems lack both a common query language and a common communication protocol. Besides its obvious short-term convenience, interoperability of systems is crucial for the exchange and analysis of user data. In this paper, we present and describe an extensible XML-based query markup language, called MRML (Multimedia Retrieval Markup Language). MRML is primarily designed to ensure interoperability between different content-based multimedia retrieval systems. Further, MRML allows researchers to preserve their freedom in extending their system as needed. MRML encapsulates multimedia queries in a way that enables multimedia (MM) query languages, MM content descriptions, MM query engines, and MM user interfaces to grow independently from each other, reaching a maximum of interoperability while ensuring a maximum of freedom for the developer. To benefit from this, only a few simple design principles have to be respected when extending MRML for one's private needs. The design of extensions within the MRML framework is described in detail in the paper. MRML has been implemented and tested for the CBIRS Viper, using the user interface Snake Charmer. Both are part of the GNU project and can be downloaded at our site.
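To make the idea of an XML-encapsulated multimedia query concrete, the sketch below assembles a query-by-example message in the spirit of MRML with ElementTree. The element and attribute names are illustrative guesses, not the actual MRML schema.

```python
import xml.etree.ElementTree as ET

def build_query(session_id: str, example_image_url: str, relevance: float = 1.0) -> str:
    """Wrap a query-by-example request in an MRML-like XML envelope (illustrative)."""
    root = ET.Element("mrml", {"session-id": session_id})
    query = ET.SubElement(root, "query-step", {"result-size": "20"})
    examples = ET.SubElement(query, "user-relevance-element-list")
    ET.SubElement(examples, "user-relevance-element",
                  {"image-location": example_image_url,
                   "user-relevance": str(relevance)})
    return ET.tostring(root, encoding="unicode")

print(build_query("demo-session", "http://example.org/images/sunset.jpg"))
```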
Model and Interoperability using Meta Data Annotations
NASA Astrophysics Data System (ADS)
David, O.
2011-12-01
Software frameworks and architectures need metadata to efficiently support model integration. Modelers have to know the context of a model, often stepping into modeling semantics and auxiliary information usually not provided in a concise structure and universal format consumable by a range of (modeling) tools. XML often seems the obvious solution for capturing metadata, but its wide adoption to facilitate model interoperability is limited by XML schema fragmentation, complexity, and verbosity outside of a data-automation process. Ontologies seem to overcome those shortcomings; however, the practical significance of their use remains to be demonstrated. OMS version 3 took a different approach to metadata representation. The fundamental building block of a modular model in OMS is a software component representing a single physical process, calibration method, or data access approach. Here, programming language features known as Annotations or Attributes were adopted. Within other (non-modeling) frameworks it has been observed that annotations lead to cleaner and leaner application code. Framework-supported model integration, traditionally accomplished using Application Programming Interface (API) calls, is now achieved using descriptive code annotations. Fully annotated components for various hydrological and Ag-system models now provide information directly for (i) model assembly and building, (ii) data flow analysis for implicit multi-threading or visualization, (iii) automated and comprehensive model documentation of component dependencies and physical data properties, (iv) automated model and component testing, calibration, and optimization, and (v) automated audit-traceability to account for all model resources leading to a particular simulation result. Such a non-invasive methodology leads to models and modeling components with only minimal dependencies on the modeling framework but a strong reference to their originating code. Since models and modeling components are not directly bound to the framework by the use of specific APIs and/or data types, they can more easily be reused both within the framework and outside it. While providing all those capabilities, a significant reduction in the size of the model source code was achieved. To assess the benefit of annotations for a modeler, studies were conducted to compare the effectiveness of an annotation-based framework approach with other modeling frameworks and libraries, and a framework-invasiveness study was conducted to evaluate the effects of framework design on model code quality. A typical hydrological model was implemented across several modeling frameworks and several software metrics were collected. The metrics selected were measures of non-invasive design methods for modeling frameworks from a software engineering perspective. It appears that the use of annotations positively impacts several software quality measures. Experience to date has demonstrated the multi-purpose value of using annotations, and annotations are also a feasible and practical method to enable interoperability among models and modeling frameworks.
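The annotation idea amounts to attaching declarative metadata directly to component code; OMS3 does this with Java annotations. As a rough Python analogue, assumed here purely for illustration and not the OMS3 API, the sketch below marks component fields and the entry-point method so a framework could introspect them for assembly, documentation, and data-flow analysis.

```python
import inspect

def role(direction, description, unit=None):
    """Attach declarative metadata to a component attribute."""
    return {"direction": direction, "description": description, "unit": unit}

def execute(fn):
    """Mark the method a framework should call when running the component."""
    fn._is_execute = True
    return fn

class Evapotranspiration:
    # Declarative metadata the framework can read instead of wiring via API calls.
    temperature = role("in", "daily mean air temperature", "degC")
    radiation = role("in", "incoming solar radiation", "MJ/m2")
    et = role("out", "potential evapotranspiration", "mm/day")

    @execute
    def run(self):
        # Toy computation standing in for the physical process.
        self.et = 0.0135 * (self.temperature + 17.78) * self.radiation

def describe(component_cls):
    """Framework-side helper: list declared inputs/outputs and the entry point."""
    meta = {k: v for k, v in vars(component_cls).items() if isinstance(v, dict)}
    entry = [name for name, fn in inspect.getmembers(component_cls, inspect.isfunction)
             if getattr(fn, "_is_execute", False)]
    return meta, entry

# The framework sets the declared inputs, runs the entry point, and reads the outputs.
component = Evapotranspiration()
component.temperature, component.radiation = 20.0, 15.0
component.run()
print(describe(Evapotranspiration), round(component.et, 2))
```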
Pyke, Christopher R; Madan, Isaac
2013-08-01
The real estate industry routinely uses specialized information systems for functions including design, construction, facilities management, brokerage, tax assessment, and utilities. These systems are mature and effective within vertically integrated market segments. However, new questions are reaching across these traditional information silos. For example, buyers may be interested in evaluating the design, energy efficiency characteristics, and operational performance of a commercial building. This requires the integration of information across multiple databases held by different institutions. Today, this type of data integration is difficult to automate and prone to errors due, in part, to the lack of generally accepted building and space identifiers. Moving forward, the real estate industry needs a new mechanism to assign identifiers for whole buildings and interior spaces for the purpose of interoperability, data exchange, and integration. This paper describes a systematic process to identify activities occurring at buildings or within interior spaces to provide a foundation for exchange and interoperability. We demonstrate the application of the approach with a prototype Web application. This concept and demonstration illustrate the elements of a practical interoperability framework that can increase productivity, create new business opportunities, and reduce errors, waste, and redundancy. © 2013 New York Academy of Sciences.
A Framework for Resilient Remote Monitoring
2014-08-01
…of low-level observables are available, audited, and recorded. This establishes the need for a remote monitoring framework that can integrate with…
An Approach to Information Management for AIR7000 with Metadata and Ontologies
2009-10-01
…metadata. We then propose an approach based on Semantic Technologies, including the Resource Description Framework (RDF) and Upper Ontologies, for the… …mandating specific metadata schemas can result in interoperability problems. For example, many standards within the ADO mandate the use of XML for metadata… …such problems, we propose an architecture in which different metadata schemes can interoperate. By using RDF (Resource Description Framework) as a…
The MMI Semantic Framework: Rosetta Stones for Earth Sciences
NASA Astrophysics Data System (ADS)
Rueda, C.; Bermudez, L. E.; Graybeal, J.; Alexander, P.
2009-12-01
Semantic interoperability—the exchange of meaning among computer systems—is needed to successfully share data in Ocean Science and across all Earth sciences. The best approach toward semantic interoperability requires a designed framework, and operationally tested tools and infrastructure within that framework. Currently available technologies make a scientific semantic framework feasible, but its development requires sustainable architectural vision and development processes. This presentation outlines the MMI Semantic Framework, including recent progress on it and its client applications. The MMI Semantic Framework consists of tools, infrastructure, and operational and community procedures and best practices, to meet short-term and long-term semantic interoperability goals. The design and prioritization of the semantic framework capabilities are based on real-world scenarios in Earth observation systems. We describe some key use cases, as well as the associated requirements for building the overall infrastructure, which is realized through the MMI Ontology Registry and Repository. This system includes support for community creation and sharing of semantic content, ontology registration, version management, and seamless integration of user-friendly tools and application programming interfaces. The presentation describes the architectural components for semantic mediation, registry and repository for vocabularies, ontology, and term mappings. We show how the technologies and approaches in the framework can address community needs for managing and exchanging semantic information. We will demonstrate how different types of users and client applications exploit the tools and services for data aggregation, visualization, archiving, and integration. Specific examples from OOSTethys (http://www.oostethys.org) and the Ocean Observatories Initiative Cyberinfrastructure (http://www.oceanobservatories.org) will be cited. Finally, we show how semantic augmentation of web services standards could be performed using framework tools.
NASA Astrophysics Data System (ADS)
Leskiw, Donald M.; Zhau, Junmei
2000-06-01
This paper reports on results from an ongoing project to develop methodologies for representing and managing multiple, concurrent levels of detail and enabling high performance computing using parallel arrays within distributed object-based simulation frameworks. At this time we present the methodology for representing and managing multiple, concurrent levels of detail and modeling accuracy by using a representation based on the Kalman approach for estimation. The Kalman System Model equations are used to represent model accuracy, Kalman Measurement Model equations provide transformations between heterogeneous levels of detail, and interoperability among disparate abstractions is provided using a form of the Kalman Update equations.
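For reference, the standard discrete-time Kalman filter equations the abstract alludes to are sketched below; how the paper maps "levels of detail" onto the state and measurement spaces is not spelled out in the abstract, so the notation here is the generic textbook form.

```latex
% System (process) model, with process noise w_k ~ N(0, Q_k):
x_k = F_k x_{k-1} + w_k
% Measurement model, with measurement noise v_k ~ N(0, R_k):
z_k = H_k x_k + v_k
% Update (correction) step:
K_k = P_{k|k-1} H_k^{\top} \left( H_k P_{k|k-1} H_k^{\top} + R_k \right)^{-1}
\hat{x}_{k|k} = \hat{x}_{k|k-1} + K_k \left( z_k - H_k \hat{x}_{k|k-1} \right)
P_{k|k} = \left( I - K_k H_k \right) P_{k|k-1}
```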
Interoperability challenges for the Sustainable Management of seagrass meadows (Invited)
NASA Astrophysics Data System (ADS)
Nativi, S.; Pastres, R.; Bigagli, L.; Venier, C.; Zucchetta, M.; Santoro, M.
2013-12-01
Seagrass meadows (marine angiosperm plants) occupy less than 0.2% of the global ocean surface, yet annually store about 10-18% of the so-called 'Blue Carbon', i.e. the carbon stored in coastal vegetated areas. Recent literature estimates that the flux to the long-term carbon sink in seagrasses represents 10-20% of seagrasses' global average production. Such figures can be translated into economic benefits, taking into account that a ton of carbon dioxide in Europe is paid at around 15 € in the carbon market. This means that the organic carbon retained in seagrass sediments in the Mediterranean is worth 138-1128 billion €, which represents 6-23 € per square meter. This is 9-35 times more than one square meter of tropical forest soil (0.66 € per square meter), or 5-17 times when considering both the above- and belowground compartments in tropical forests. According to the most conservative estimates, about 10% of the Mediterranean meadows have been lost during the last century. In the framework of the GEOSS (Global Earth Observation System of Systems) initiative, the MEDINA project (funded by the European Commission and coordinated by the University of Ca'Foscari in Venice) prepared a showcase as part of the GEOSS Architecture Interoperability Pilot, phase 6 (AIP-6). This showcase aims at providing a tool for the sustainable management of seagrass meadows along the Mediterranean coastline. The application is based on an interoperability framework providing a set of brokerage services to easily ingest and run a Habitat Suitability model (a model predicting the probability that a given site provides a suitable habitat for the development of a seagrass meadow, and the average coverage expected). The presentation discusses this framework, explaining how the input data are discovered, accessed, and processed to feed the model (developed in the MEDINA project). Furthermore, the brokerage framework provides the necessary services to run the model and visualize results, with a low entry barrier for scientists.
Lessons learned in detailed clinical modeling at Intermountain Healthcare
Oniki, Thomas A; Coyle, Joseph F; Parker, Craig G; Huff, Stanley M
2014-01-01
Background and objective: Intermountain Healthcare has a long history of using coded terminology and detailed clinical models (DCMs) to govern storage of clinical data to facilitate decision support and semantic interoperability. The latest iteration of DCMs at Intermountain is called the clinical element model (CEM). We describe the lessons learned from our CEM efforts with regard to subjective decisions a modeler frequently needs to make in creating a CEM. We present insights and guidelines, but also describe situations in which use cases conflict with the guidelines. We propose strategies that can help reconcile the conflicts. The hope is that these lessons will be helpful to others who are developing and maintaining DCMs in order to promote sharing and interoperability. Methods: We have used the Clinical Element Modeling Language (CEML) to author approximately 5000 CEMs. Results: Based on our experience, we have formulated guidelines to lead our modelers through the subjective decisions they need to make when authoring models. Reported here are guidelines regarding precoordination/postcoordination, dividing content between the model and the terminology, modeling logical attributes, and creating iso-semantic models. We place our lessons in context, exploring the potential benefits of an implementation layer, an iso-semantic modeling framework, and ontologic technologies. Conclusions: We assert that detailed clinical models can advance interoperability and sharing, and that our guidelines, an implementation layer, and an iso-semantic framework will support our progress toward that goal. PMID:24993546
Semantic Integration for Marine Science Interoperability Using Web Technologies
NASA Astrophysics Data System (ADS)
Rueda, C.; Bermudez, L.; Graybeal, J.; Isenor, A. W.
2008-12-01
The Marine Metadata Interoperability Project, MMI (http://marinemetadata.org) promotes the exchange, integration, and use of marine data through enhanced data publishing, discovery, documentation, and accessibility. A key effort is the definition of an Architectural Framework and Operational Concept for Semantic Interoperability (http://marinemetadata.org/sfc), which is complemented with the development of tools that realize critical use cases in semantic interoperability. In this presentation, we describe a set of such Semantic Web tools that allow performing important interoperability tasks, ranging from the creation of controlled vocabularies and the mapping of terms across multiple ontologies, to the online registration, storage, and search services needed to work with the ontologies (http://mmisw.org). This set of services uses Web standards and technologies, including the Resource Description Framework (RDF), the Web Ontology Language (OWL), Web services, and toolkits for Rich Internet Application development. We will describe the following components: MMI Ontology Registry: The MMI Ontology Registry and Repository provides registry and storage services for ontologies. Entries in the registry are associated with projects defined by the registered users. Also, sophisticated search functions, for example according to metadata items and vocabulary terms, are provided. Client applications can submit search requests using the W3C SPARQL Query Language for RDF. Voc2RDF: This component converts an ASCII comma-delimited set of terms and definitions into an RDF file. Voc2RDF facilitates the creation of controlled vocabularies by using a simple form-based user interface. Created vocabularies and their descriptive metadata can be submitted to the MMI Ontology Registry for versioning and community access. VINE: The Vocabulary Integration Environment component allows the user to map vocabulary terms across multiple ontologies. Various relationships can be established, for example exactMatch, narrowerThan, and subClassOf. VINE can compute inferred mappings based on the given associations. Attributes about each mapping, like comments and a confidence level, can also be included. VINE also supports registering and storing resulting mapping files in the Ontology Registry. The presentation will describe the application of semantic technologies in general, and our planned applications in particular, to solve data management problems in the marine and environmental sciences.
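As a small illustration of the kind of client interaction described above (a client submitting a SPARQL request to a registry), the sketch below sends a term search to a hypothetical SPARQL endpoint with the SPARQLWrapper library; the endpoint URL and the vocabulary used in the query are assumptions, not the actual MMI service.

```python
from SPARQLWrapper import SPARQLWrapper, JSON

# Hypothetical SPARQL endpoint of an ontology registry.
sparql = SPARQLWrapper("https://registry.example.org/sparql")
sparql.setReturnFormat(JSON)

# Find registered terms whose label mentions "temperature".
sparql.setQuery("""
PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
SELECT ?term ?label WHERE {
  ?term skos:prefLabel ?label .
  FILTER regex(str(?label), "temperature", "i")
}
LIMIT 10
""")

results = sparql.query().convert()
for binding in results["results"]["bindings"]:
    print(binding["term"]["value"], "-", binding["label"]["value"])
```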
Meshkati, Najmedin; Tabibzadeh, Maryam; Farshid, Ali; Rahimi, Mansour; Alhanaee, Ghena
2016-02-01
The aim of this study is to identify the interdependencies of human and organizational subsystems of multiple complex, safety-sensitive technological systems and their interoperability in the context of sustainability and resilience of an ecosystem. Recent technological disasters with severe environmental impact are attributed to human factors and safety culture causes. One of the most populous and environmentally sensitive regions in the world, the (Persian) Gulf, lies at the confluence of two exponentially growing industries--nuclear power and seawater desalination plants--that are changing its land- and seascape. Building upon Rasmussen's model, a macrosystem integrative framework, based on the broader context of human factors, is developed, which can be considered in this context as a "meta-ergonomics" paradigm, for the analysis of interactions, design of interoperability, and integration of decisions of major actors whose actions can affect safety and sustainability of the focused industries during routine and nonroutine (emergency) operations. Based on the emerging realities in the Gulf region, it is concluded that without such a systematic approach toward addressing the interdependencies of water and energy sources, sustainability will be only a short-lived dream and prosperity will be a disappearing mirage for millions of people in the region. This multilayered framework for the integration of people, technology, and ecosystem--which has been applied to the (Persian) Gulf--offers a viable and vital approach to the design and operation of large-scale complex systems wherever the nexus of water, energy, and food sources is concerned, such as the Black Sea. © 2016, Human Factors and Ergonomics Society.
Aggregate Exposure Pathways in Support of Risk Assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tan, Yu-Mei; Leonard, Jeremy A.; Edwards, Stephen
Over time, risk assessment has shifted from establishing relationships between exposure to a single chemical and a resulting adverse health outcome, to evaluation of multiple chemicals and disease outcomes simultaneously. As a result, there is an increasing need to better understand the complex mechanisms that influence risk of chemical and non-chemical stressors, beginning at their source and ending at a biological endpoint relevant to human or ecosystem health risk assessment. Just as the Adverse Outcome Pathway (AOP) framework has emerged as a means of providing insight into mechanism-based toxicity, the exposure science community has seen the recent introduction of the Aggregate Exposure Pathway (AEP) framework. AEPs aid in making exposure data applicable to the FAIR (i.e., findable, accessible, interoperable, and reusable) principle, especially by (1) organizing continuous flow of disjointed exposure information; (2) identifying data gaps, to focus resources on acquiring the most relevant data; (3) optimizing use and repurposing of existing exposure data; and (4) facilitating interoperability among predictive models. Herein, we discuss integration of the AOP and AEP frameworks and how such integration can improve confidence in both traditional and cumulative risk assessment approaches.
Aggregate Exposure Pathways in Support of Risk Assessment
Tan, Yu-Mei; Leonard, Jeremy A.; Edwards, Stephen; ...
2018-03-29
Over time, risk assessment has shifted from establishing relationships between exposure to a single chemical and a resulting adverse health outcome, to evaluation of multiple chemicals and disease outcomes simultaneously. As a result, there is an increasing need to better understand the complex mechanisms that influence risk of chemical and non-chemical stressors, beginning at their source and ending at a biological endpoint relevant to human or ecosystem health risk assessment. Just as the Adverse Outcome Pathway (AOP) framework has emerged as a means of providing insight into mechanism-based toxicity, the exposure science community has seen the recent introduction of the Aggregate Exposure Pathway (AEP) framework. AEPs aid in making exposure data applicable to the FAIR (i.e., findable, accessible, interoperable, and reusable) principle, especially by (1) organizing continuous flow of disjointed exposure information; (2) identifying data gaps, to focus resources on acquiring the most relevant data; (3) optimizing use and repurposing of existing exposure data; and (4) facilitating interoperability among predictive models. Herein, we discuss integration of the AOP and AEP frameworks and how such integration can improve confidence in both traditional and cumulative risk assessment approaches.
A Framework for Integration of Heterogeneous Medical Imaging Networks
Viana-Ferreira, Carlos; Ribeiro, Luís S; Costa, Carlos
2014-01-01
Medical imaging is of increasing importance in medical diagnosis and treatment support. Much of this is due to computers, which have revolutionized medical imaging not only in the acquisition process but also in the way images are visualized, stored, exchanged and managed. Picture Archiving and Communication Systems (PACS) are an example of how medical imaging takes advantage of computers. To solve interoperability problems between PACS and medical imaging equipment, the Digital Imaging and Communications in Medicine (DICOM) standard was defined and widely implemented in current solutions. More recently, the need to exchange medical data between distinct institutions resulted in the Integrating the Healthcare Enterprise (IHE) initiative, which contains a content profile especially conceived for medical imaging exchange: Cross Enterprise Document Sharing for imaging (XDS-i). Moreover, due to application requirements, many solutions developed private networks to support their services; for instance, some applications support enhanced query and retrieve over DICOM object metadata. This paper proposes an integration framework for medical imaging networks that provides protocol interoperability and data federation services. It is an extensible plugin system that supports standard approaches (DICOM and XDS-I), but is also capable of supporting private protocols. The framework is being used in the Dicoogle Open Source PACS. PMID:25279021
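The extensible plugin design described above amounts to a common adapter interface that both standard (DICOM, XDS-I) and private protocols can implement, with the framework dispatching queries to whichever adapters are registered. The sketch below is a minimal illustration of that pattern; the class and method names are invented for the example and do not correspond to the Dicoogle SDK.

```python
from abc import ABC, abstractmethod
from typing import Dict, Iterable, List

class ProtocolAdapter(ABC):
    """Common interface every imaging-network plugin implements."""

    @abstractmethod
    def query(self, criteria: Dict[str, str]) -> Iterable[Dict[str, str]]:
        """Return metadata records matching the given criteria."""

class DicomAdapter(ProtocolAdapter):
    def query(self, criteria):
        # A real plugin would issue a DICOM C-FIND to a remote PACS here.
        return [{"protocol": "DICOM", "PatientID": criteria.get("PatientID", "*")}]

class PrivateArchiveAdapter(ProtocolAdapter):
    def query(self, criteria):
        # A real plugin would call a proprietary archive's own API here.
        return [{"protocol": "private", "PatientID": criteria.get("PatientID", "*")}]

class FederationService:
    """Fans a query out to all registered adapters and merges the results."""

    def __init__(self):
        self._adapters: List[ProtocolAdapter] = []

    def register(self, adapter: ProtocolAdapter) -> None:
        self._adapters.append(adapter)

    def federated_query(self, criteria: Dict[str, str]) -> List[Dict[str, str]]:
        results: List[Dict[str, str]] = []
        for adapter in self._adapters:
            results.extend(adapter.query(criteria))
        return results

service = FederationService()
service.register(DicomAdapter())
service.register(PrivateArchiveAdapter())
print(service.federated_query({"PatientID": "12345"}))
```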
U-Compare: share and compare text mining tools with UIMA.
Kano, Yoshinobu; Baumgartner, William A; McCrohon, Luke; Ananiadou, Sophia; Cohen, K Bretonnel; Hunter, Lawrence; Tsujii, Jun'ichi
2009-08-01
Due to the increasing number of text mining resources (tools and corpora) available to biologists, interoperability issues between these resources are becoming significant obstacles to using them effectively. UIMA, the Unstructured Information Management Architecture, is an open framework designed to aid in the construction of more interoperable tools. U-Compare is built on top of the UIMA framework, and provides both a concrete framework for out-of-the-box text mining and a sophisticated evaluation platform allowing users to run specific tools on any target text, generating both detailed statistics and instance-based visualizations of outputs. U-Compare is a joint project, providing the world's largest, and still growing, collection of UIMA-compatible resources. These resources, originally developed by different groups for a variety of domains, include many famous tools and corpora. U-Compare can be launched straight from the web, without needing to be manually installed. All U-Compare components are provided ready-to-use and can be combined easily via a drag-and-drop interface without any programming. External UIMA components can also simply be mixed with U-Compare components, without distinguishing between locally and remotely deployed resources. http://u-compare.org/
A framework for integration of heterogeneous medical imaging networks.
Viana-Ferreira, Carlos; Ribeiro, Luís S; Costa, Carlos
2014-01-01
Medical imaging is increasing its importance in matters of medical diagnosis and in treatment support. Much is due to computers that have revolutionized medical imaging not only in acquisition process but also in the way it is visualized, stored, exchanged and managed. Picture Archiving and Communication Systems (PACS) is an example of how medical imaging takes advantage of computers. To solve problems of interoperability of PACS and medical imaging equipment, the Digital Imaging and Communications in Medicine (DICOM) standard was defined and widely implemented in current solutions. More recently, the need to exchange medical data between distinct institutions resulted in Integrating the Healthcare Enterprise (IHE) initiative that contains a content profile especially conceived for medical imaging exchange: Cross Enterprise Document Sharing for imaging (XDS-i). Moreover, due to application requirements, many solutions developed private networks to support their services. For instance, some applications support enhanced query and retrieve over DICOM objects metadata. This paper proposes anintegration framework to medical imaging networks that provides protocols interoperability and data federation services. It is an extensible plugin system that supports standard approaches (DICOM and XDS-I), but is also capable of supporting private protocols. The framework is being used in the Dicoogle Open Source PACS.
EUnetHTA information management system: development and lessons learned.
Chalon, Patrice X; Kraemer, Peter
2014-11-01
The aim of this study was to describe the techniques used in achieving consensus on common standards to be implemented in the EUnetHTA Information Management System (IMS); and to describe how interoperability between tools was explored. Three face to face meetings were organized to identify and agree on common standards to the development of online tools. Two tools were created to demonstrate the added value of implementing interoperability standards at local levels. Developers of tools outside EUnetHTA were identified and contacted. Four common standards have been agreed on by consensus; and consequently all EUnetHTA tools have been modified or designed accordingly. RDF Site Summary (RSS) has demonstrated a good potential to support rapid dissemination of HTA information. Contacts outside EUnetHTA resulted in direct collaboration (HTA glossary, HTAi Vortal), evaluation of options for interoperability between tools (CRD HTA database) or a formal framework to prepare cooperation on concrete projects (INAHTA projects database). While being entitled a project on IT infrastructure, the work program was also about people. When having to agree on complex topics, fostering a cohesive group dynamic and hosting face to face meetings brings added value and enhances understanding between partners. The adoption of widespread standards enhanced the homogeneity of the EUnetHTA tools and should thus contribute to their wider use, therefore, to the general objective of EUnetHTA. The initiatives on interoperability of systems need to be developed further to support a general interoperable information system that could benefit the whole HTA community.
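As a brief illustration of the RSS-based dissemination mentioned above, the sketch below reads a feed with the Python feedparser library; the feed URL is hypothetical rather than an actual EUnetHTA endpoint.

import feedparser

# Hypothetical feed URL; an actual HTA information service would publish its own.
feed = feedparser.parse("https://example.org/hta/new-reports.rss")
for entry in feed.entries[:5]:
    print(entry.get("published", "n/a"), "-", entry.title)
    print("   ", entry.link)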
Ivkovic, Sinisa; Simonovic, Janko; Tijanic, Nebojsa; Davis-Dusenbery, Brandi; Kural, Deniz
2016-01-01
As biomedical data has become increasingly easy to generate in large quantities, the methods used to analyze it have proliferated rapidly. Reproducible and reusable methods are required to learn from large volumes of data reliably. To address this issue, numerous groups have developed workflow specifications or execution engines, which provide a framework with which to perform a sequence of analyses. One such specification is the Common Workflow Language, an emerging standard which provides a robust and flexible framework for describing data analysis tools and workflows. In addition, reproducibility can be furthered by executors or workflow engines which interpret the specification and enable additional features, such as error logging, file organization, optimizations to computation and job scheduling, and allow for easy computing on large volumes of data. To this end, we have developed the Rabix Executor, an open-source workflow engine for the purposes of improving reproducibility through reusability and interoperability of workflow descriptions. PMID:27896971
Kaushik, Gaurav; Ivkovic, Sinisa; Simonovic, Janko; Tijanic, Nebojsa; Davis-Dusenbery, Brandi; Kural, Deniz
2017-01-01
As biomedical data has become increasingly easy to generate in large quantities, the methods used to analyze it have proliferated rapidly. Reproducible and reusable methods are required to learn from large volumes of data reliably. To address this issue, numerous groups have developed workflow specifications or execution engines, which provide a framework with which to perform a sequence of analyses. One such specification is the Common Workflow Language, an emerging standard which provides a robust and flexible framework for describing data analysis tools and workflows. In addition, reproducibility can be furthered by executors or workflow engines which interpret the specification and enable additional features, such as error logging, file organization, optimizations to computation and job scheduling, and allow for easy computing on large volumes of data. To this end, we have developed the Rabix Executor, an open-source workflow engine for the purposes of improving reproducibility through reusability and interoperability of workflow descriptions.
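To make the workflow-description idea concrete, the following sketch assembles a deliberately small Common Workflow Language CommandLineTool in Python and writes it to a .cwl file. The tool, file names and field values are illustrative, and the result could in principle be run by any CWL-aware engine such as the Rabix Executor or cwltool.

import yaml

# A deliberately small CommandLineTool: count the lines of an input file.
tool = {
    "cwlVersion": "v1.0",
    "class": "CommandLineTool",
    "baseCommand": ["wc", "-l"],
    "inputs": {
        "counts_file": {"type": "File", "inputBinding": {"position": 1}},
    },
    "stdout": "line_count.txt",
    "outputs": {
        "line_count": {"type": "stdout"},
    },
}

with open("count_lines.cwl", "w") as handle:
    yaml.safe_dump(tool, handle, sort_keys=False)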
Integrated Nationwide Electronic Health Records system: Semi-distributed architecture approach.
Fragidis, Leonidas L; Chatzoglou, Prodromos D; Aggelidis, Vassilios P
2016-11-14
The integration of heterogeneous electronic health record systems by building an interoperable nationwide electronic health record system provides undisputable benefits in health care, like superior health information quality, medical error prevention and cost saving. This paper proposes a semi-distributed system architecture approach for an integrated national electronic health record system, incorporating the advantages of the two dominant approaches, the centralized architecture and the distributed architecture. The high-level design of the main elements of the proposed architecture is provided, along with diagrams of execution and operation and the data synchronization architecture for the proposed solution. The proposed approach effectively handles issues related to redundancy, consistency, security, privacy, availability, load balancing, maintainability, complexity and interoperability of citizens' health data. The proposed semi-distributed architecture offers a robust interoperability framework without requiring healthcare providers to change their local EHR systems. It is a pragmatic approach that takes into account the characteristics of the Greek national healthcare system along with the national public administration data communication network infrastructure, for achieving EHR integration with acceptable implementation cost.
An ontology for component-based models of water resource systems
NASA Astrophysics Data System (ADS)
Elag, Mostafa; Goodall, Jonathan L.
2013-08-01
Component-based modeling is an approach for simulating water resource systems where a model is composed of a set of components, each with a defined modeling objective, interlinked through data exchanges. Component-based modeling frameworks are used within the hydrologic, atmospheric, and earth surface dynamics modeling communities. While these efforts have been advancing, it has become clear that the water resources modeling community in particular, and arguably the larger earth science modeling community as well, faces a challenge of fully and precisely defining the metadata for model components. The lack of a unified framework for model component metadata limits interoperability between modeling communities and the reuse of models across modeling frameworks due to ambiguity about the model and its capabilities. To address this need, we propose an ontology for water resources model components that describes core concepts and relationships using the Web Ontology Language (OWL). The ontology that we present, which is termed the Water Resources Component (WRC) ontology, is meant to serve as a starting point that can be refined over time through engagement by the larger community until a robust knowledge framework for water resource model components is achieved. This paper presents the methodology used to arrive at the WRC ontology, the WRC ontology itself, and examples of how the ontology can aid in component-based water resources modeling by (i) assisting in identifying relevant models, (ii) encouraging proper model coupling, and (iii) facilitating interoperability across earth science modeling frameworks.
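For readers unfamiliar with OWL tooling, the sketch below uses the Python rdflib library to declare a component class, an object property and an individual in the style described above; the namespace and term names are placeholders rather than the published WRC ontology.

from rdflib import Graph, Literal, Namespace
from rdflib.namespace import OWL, RDF, RDFS

WRC = Namespace("http://example.org/wrc#")   # placeholder IRI, not the published ontology

g = Graph()
g.bind("wrc", WRC)

# Two classes and an object property linking them.
g.add((WRC.ModelComponent, RDF.type, OWL.Class))
g.add((WRC.ExchangeItem, RDF.type, OWL.Class))
g.add((WRC.hasOutputExchangeItem, RDF.type, OWL.ObjectProperty))
g.add((WRC.hasOutputExchangeItem, RDFS.domain, WRC.ModelComponent))
g.add((WRC.hasOutputExchangeItem, RDFS.range, WRC.ExchangeItem))

# An example individual describing one model component.
g.add((WRC.SnowmeltComponent, RDF.type, WRC.ModelComponent))
g.add((WRC.SnowmeltComponent, RDFS.label, Literal("Snowmelt runoff component")))

print(g.serialize(format="turtle"))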
Pape-Haugaard, Louise; Frank, Lars
2011-01-01
A major obstacle to ensuring ubiquitous information is the use of heterogeneous systems in eHealth. The objective of this paper is to illustrate how an architecture for distributed eHealth databases can be designed without losing the characteristic features of traditional sustainable databases. The approach is first to explain traditional architectures for central and homogeneous distributed database computing, followed by a possible approach that uses an architectural framework to obtain sustainability across disparate systems, i.e. heterogeneous databases, and concluded with a discussion. It is shown that, through a method of using relaxed ACID properties on a service-oriented architecture, it is possible to achieve the data consistency which is essential when ensuring sustainable interoperability.
ImageJ-MATLAB: a bidirectional framework for scientific image analysis interoperability.
Hiner, Mark C; Rueden, Curtis T; Eliceiri, Kevin W
2017-02-15
ImageJ-MATLAB is a lightweight Java library facilitating bi-directional interoperability between MATLAB and ImageJ. By defining a standard for translation between matrix and image data structures, researchers are empowered to select the best tool for their image-analysis tasks. Freely available extension to ImageJ2 (http://imagej.net/Downloads). Installation and use instructions available at http://imagej.net/MATLAB_Scripting. Tested with ImageJ 2.0.0-rc-54, Java 1.8.0_66 and MATLAB R2015b. eliceiri@wisc.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
Ontology Mappings to Improve Learning Resource Search
ERIC Educational Resources Information Center
Gasevic, Dragan; Hatala, Marek
2006-01-01
This paper proposes an ontology mapping-based framework that allows searching for learning resources using multiple ontologies. The present applications of ontologies in e-learning use various ontologies (e.g., domain, curriculum, context), but they do not give a solution on how to interoperate e-learning systems based on different ontologies. The…
Modeling Educational Content: The Cognitive Approach of the PALO Language
ERIC Educational Resources Information Center
Rodriguez-Artacho, Miguel; Verdejo Maillo, M. Felisa
2004-01-01
This paper presents a reference framework to describe educational material. It introduces the PALO Language as a cognitive based approach to Educational Modeling Languages (EML). In accordance with recent trends for reusability and interoperability in Learning Technologies, EML constitutes an evolution of the current content-centered…
Reusable Models of Pedagogical Concepts--A Framework for Pedagogical and Content Design.
ERIC Educational Resources Information Center
Pawlowski, Jan M.
Standardization initiatives in the field of learning technologies have produced standards for the interoperability of learning environments and learning management systems. Learning resources based on these standards can be reused, recombined, and adapted to the user. However, these standards follow a content-oriented approach; the process of…
Multimedia content description framework
NASA Technical Reports Server (NTRS)
Bergman, Lawrence David (Inventor); Mohan, Rakesh (Inventor); Li, Chung-Sheng (Inventor); Smith, John Richard (Inventor); Kim, Michelle Yoonk Yung (Inventor)
2003-01-01
A framework is provided for describing multimedia content and a system in which a plurality of multimedia storage devices employing the content description methods of the present invention can interoperate. In accordance with one form of the present invention, the content description framework is a description scheme (DS) for describing streams or aggregations of multimedia objects, which may comprise audio, images, video, text, time series, and various other modalities. This description scheme can accommodate an essentially limitless number of descriptors in terms of features, semantics or metadata, and facilitate content-based search, index, and retrieval, among other capabilities, for both streamed or aggregated multimedia objects.
A framework for evaluating and utilizing medical terminology mappings.
Hussain, Sajjad; Sun, Hong; Sinaci, Anil; Erturkmen, Gokce Banu Laleci; Mead, Charles; Gray, Alasdair J G; McGuinness, Deborah L; Prud'Hommeaux, Eric; Daniel, Christel; Forsberg, Kerstin
2014-01-01
Use of medical terminologies and mappings across them are considered to be crucial pre-requisites for achieving interoperable eHealth applications. Built upon the outcomes of several research projects, we introduce a framework for evaluating and utilizing terminology mappings that offers a platform for i) performing various mapping strategies, ii) representing terminology mappings together with their provenance information, and iii) enabling terminology reasoning for inferring both new and erroneous mappings. We present the results of the introduced framework from the SALUS project, where we evaluated the quality of both existing and inferred terminology mappings among standard terminologies.
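A minimal sketch of how a single terminology mapping and its provenance might be recorded is given below, using rdflib with SKOS and PROV terms; the concept URIs, namespace and generating activity are invented for illustration and do not reproduce the SALUS representation.

from rdflib import Graph, Namespace, URIRef
from rdflib.namespace import RDF, SKOS

PROV = Namespace("http://www.w3.org/ns/prov#")
EX = Namespace("http://example.org/mappings/")        # placeholder namespace

g = Graph()
g.bind("skos", SKOS)
g.bind("prov", PROV)

source = URIRef("http://example.org/icd10/I10")        # illustrative concept URIs
target = URIRef("http://example.org/snomed/38341003")

# The mapping itself, expressed with a SKOS mapping property.
g.add((source, SKOS.exactMatch, target))

# Minimal provenance: record which process generated this mapping set.
g.add((EX.mappingSet1, RDF.type, PROV.Entity))
g.add((EX.mappingSet1, PROV.wasGeneratedBy, EX.lexicalMatchingRun))

print(g.serialize(format="turtle"))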
Ambient assisted living healthcare frameworks, platforms, standards, and quality attributes.
Memon, Mukhtiar; Wagner, Stefan Rahr; Pedersen, Christian Fischer; Beevi, Femina Hassan Aysha; Hansen, Finn Overgaard
2014-03-04
Ambient Assisted Living (AAL) is an emerging multi-disciplinary field aiming at exploiting information and communication technologies in personal healthcare and telehealth systems to counter the effects of a growing elderly population. AAL systems are developed for personalized, adaptive, and anticipatory requirements, necessitating high quality-of-service to achieve interoperability, usability, security, and accuracy. The aim of this paper is to provide a comprehensive review of the AAL field with a focus on healthcare frameworks, platforms, standards, and quality attributes. To achieve this, we conducted a literature survey of state-of-the-art AAL frameworks, systems and platforms to identify the essential aspects of AAL systems and investigate the critical issues from the design, technology, quality-of-service, and user experience perspectives. In addition, we conducted an email-based survey for collecting usage data and the current status of contemporary AAL systems. We found that most AAL systems are confined to a limited set of features, ignoring many of the essential AAL system aspects. Standards and technologies are used in a limited and isolated manner, while quality attributes are often addressed insufficiently. In conclusion, we found that more inter-organizational collaboration, user-centered studies, increased standardization efforts, and a focus on open systems are needed to achieve more interoperable and synergetic AAL solutions.
U-Compare: share and compare text mining tools with UIMA
Kano, Yoshinobu; Baumgartner, William A.; McCrohon, Luke; Ananiadou, Sophia; Cohen, K. Bretonnel; Hunter, Lawrence; Tsujii, Jun'ichi
2009-01-01
Summary: Due to the increasing number of text mining resources (tools and corpora) available to biologists, interoperability issues between these resources are becoming significant obstacles to using them effectively. UIMA, the Unstructured Information Management Architecture, is an open framework designed to aid in the construction of more interoperable tools. U-Compare is built on top of the UIMA framework, and provides both a concrete framework for out-of-the-box text mining and a sophisticated evaluation platform allowing users to run specific tools on any target text, generating both detailed statistics and instance-based visualizations of outputs. U-Compare is a joint project, providing the world's largest, and still growing, collection of UIMA-compatible resources. These resources, originally developed by different groups for a variety of domains, include many famous tools and corpora. U-Compare can be launched straight from the web, without needing to be manually installed. All U-Compare components are provided ready-to-use and can be combined easily via a drag-and-drop interface without any programming. External UIMA components can also simply be mixed with U-Compare components, without distinguishing between locally and remotely deployed resources. Availability: http://u-compare.org/ Contact: kano@is.s.u-tokyo.ac.jp PMID:19414535
Archetype Model-Driven Development Framework for EHR Web System.
Kobayashi, Shinji; Kimura, Eizen; Ishihara, Ken
2013-12-01
This article describes the Web application framework for Electronic Health Records (EHRs) we have developed to reduce construction costs for EHR systems. The openEHR project has developed clinical model driven architecture for future-proof interoperable EHR systems. This project provides the specifications to standardize clinical domain model implementations, upon which the ISO/CEN 13606 standards are based. The reference implementation has been formally described in Eiffel. Moreover, C# and Java implementations have been developed as references. While scripting languages had been more popular because of their higher efficiency and faster development in recent years, they had not been involved in the openEHR implementations. From 2007, we have used the Ruby language and Ruby on Rails (RoR) as an agile development platform to implement EHR systems, in conformity with the openEHR specifications. We implemented almost all of the specifications, the Archetype Definition Language parser, and a RoR scaffold generator from archetypes. Although some problems have emerged, most of them have been resolved. We have provided an agile EHR Web framework, which can build up Web systems from archetype models using RoR. The feasibility of the archetype model to provide semantic interoperability of EHRs has been demonstrated and we have verified that it is suitable for the construction of EHR systems.
Ambient Assisted Living Healthcare Frameworks, Platforms, Standards, and Quality Attributes
Memon, Mukhtiar; Wagner, Stefan Rahr; Pedersen, Christian Fischer; Beevi, Femina Hassan Aysha; Hansen, Finn Overgaard
2014-01-01
Ambient Assisted Living (AAL) is an emerging multi-disciplinary field aiming at exploiting information and communication technologies in personal healthcare and telehealth systems to counter the effects of a growing elderly population. AAL systems are developed for personalized, adaptive, and anticipatory requirements, necessitating high quality-of-service to achieve interoperability, usability, security, and accuracy. The aim of this paper is to provide a comprehensive review of the AAL field with a focus on healthcare frameworks, platforms, standards, and quality attributes. To achieve this, we conducted a literature survey of state-of-the-art AAL frameworks, systems and platforms to identify the essential aspects of AAL systems and investigate the critical issues from the design, technology, quality-of-service, and user experience perspectives. In addition, we conducted an email-based survey for collecting usage data and the current status of contemporary AAL systems. We found that most AAL systems are confined to a limited set of features, ignoring many of the essential AAL system aspects. Standards and technologies are used in a limited and isolated manner, while quality attributes are often addressed insufficiently. In conclusion, we found that more inter-organizational collaboration, user-centered studies, increased standardization efforts, and a focus on open systems are needed to achieve more interoperable and synergetic AAL solutions. PMID:24599192
NASA Astrophysics Data System (ADS)
Glaves, Helen; Schaap, Dick
2017-04-01
In recent years there has been a paradigm shift in marine research, moving from the traditional discipline-based methodology employed at the national level by one or more organizations, to a multidisciplinary, ecosystem level approach conducted on an international scale. This increasingly holistic approach to marine research is in part being driven by policy and legislation. For example, the European Commission's Blue Growth strategy promotes sustainable growth in the marine environment including the development of sea-basin strategies (European Commission 2014). As well as this policy-driven shift to ecosystem level marine research there are also scientific and economic drivers for a basin level approach. Marine monitoring is essential for assessing the health of an ecosystem and determining the impacts of specific factors and activities on it. The availability of large volumes of good quality data is fundamental to this increasingly holistic approach to ocean research, but there are significant barriers to its re-use. These are due to the heterogeneity of the data, resulting from having been collected by many organizations around the globe using a variety of sensors mounted on a range of different platforms. The data is then delivered and archived in a range of formats, using various spatial coordinate systems and aligned with different standards. This heterogeneity, coupled with organizational and national policies on data sharing, makes access and re-use of marine data problematic. In response to the need for greater sharing of marine data, a number of e-infrastructures have been developed, but these have different levels of granularity, with the majority having been developed at the regional level to address specific requirements for data, e.g. SeaDataNet in Europe and the Australian Ocean Data Network (AODN). These data infrastructures are also frequently aligned with the priorities of the local funding agencies and have been created in isolation from those developed elsewhere. To add a further layer of complexity, there are also global initiatives providing marine data infrastructures, e.g. IOC-IODE and POGO, as well as those with a wider remit which includes environmental data, e.g. GEOSS, COPERNICUS, etc. Ecosystem level marine research requires a common framework for marine data management that supports the sharing of data across these regional and global data systems, and provides the user with access to the data available from these services via a single point of access. This framework must be based on existing data systems and established by developing interoperability between them. The Ocean Data and Interoperability Platform (ODIP/ODIP II) project brings together those organisations responsible for maintaining selected regional data infrastructures along with other relevant experts in order to identify the common standards and best practice necessary to underpin this framework, and to evaluate the differences and commonalities between the regional data infrastructures in order to establish interoperability between them for the purposes of data sharing. This coordinated approach is being demonstrated and validated through the development of a series of prototype interoperability solutions that demonstrate the mechanisms and standards necessary to facilitate the sharing of marine data across these existing data infrastructures.
NASA Astrophysics Data System (ADS)
Yang, C.; Wong, D. W.; Phillips, T.; Wright, R. A.; Lindsey, S.; Kafatos, M.
2005-12-01
As a teamed partnership of the Center for Earth Observing and Space Research (CEOSR) at George Mason University (GMU), the Virginia Department of Transportation (VDOT), the Bureau of Transportation Statistics at the Department of Transportation (BTS/DOT), and Intergraph, we established Transportation Framework Data Services using the Open Geospatial Consortium (OGC) Web Feature Service (WFS) Specification to enable the sharing of transportation data across the federal level (with data from BTS/DOT), the state level (through VDOT), and industry (through Intergraph). CEOSR develops WFS solutions using Intergraph software. Relevant technical documents are also developed and disseminated through the partners. The WFS is integrated with operational geospatial systems at CEOSR and VDOT. CEOSR works with Intergraph on developing WFS solutions and technical documents. The GeoMedia WebMap WFS toolkit is used with software and technical support from Intergraph. The ESRI ArcIMS WFS connector is used with GMU's campus license of ESRI products. Tested solutions are integrated with framework data service operational systems, including 1) CEOSR's interoperable geospatial information services, FGDC clearinghouse Node, Geospatial One Stop (GOS) portal, and WMS services, 2) VDOT's state transportation data and GIS infrastructure, and 3) BTS/DOT's national transportation data. The project: 1) develops and deploys an operational OGC WFS 1.1 interface at CEOSR for registering with the FGDC/GOS Portal and responding to Web "POST" requests for transportation Framework data as listed in Table 1; 2) builds a WFS service that can return data conforming to the drafted ANSI/INCITS L1 Standard (when available) for each identified theme in the format given by OGC Geography Markup Language (GML) Version 3.0 or higher; 3) integrates the OGC WFS with CEOSR's clearinghouse nodes; 4) establishes a formal partnership to develop and share WFS-based geospatial interoperability technology among GMU, VDOT, BTS/DOT, and Intergraph; and 5) develops WFS-based solutions and technical documents using the GeoMedia WebMap WFS toolkit. The Geospatial Web Feature Service is demonstrated to be more efficient in sharing vector data and supports direct Internet access to transportation data. The developed WFS solutions also enhanced the interoperable services provided by CEOSR through the FGDC clearinghouse node and the GOS Portal.
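For orientation, a WFS GetFeature request is essentially a parameterized HTTP call; the Python sketch below issues one with the requests library against a placeholder endpoint and feature type, not the CEOSR or VDOT services.

import requests

# Placeholder endpoint and feature type; a real deployment publishes its own.
params = {
    "service": "WFS",
    "version": "1.1.0",
    "request": "GetFeature",
    "typeName": "transport:roads",
    "maxFeatures": 10,
}
response = requests.get("https://example.org/wfs", params=params, timeout=30)
response.raise_for_status()
print(response.text[:400])   # GML-encoded features, per the WFS specification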
Governance in the Digital Age: A Research and Action Framework for an Uncertain Future
ERIC Educational Resources Information Center
Dawes, Sharon S.
2009-01-01
Research into relationships among government, society and technology has grown substantially over the past 30 years. However, most research and most advances in practice address narrowly defined categories of concern such as government organization, citizen services, interoperability, or personal privacy. By contrast, the future presents complex…
Enabling Interoperable and Selective Data Sharing among Social Networking Sites
NASA Astrophysics Data System (ADS)
Shin, Dongwan; Lopes, Rodrigo
With the widespread use of social networking (SN) sites and even the introduction of a social component in non-social oriented services, there is a growing concern over user privacy in general, and over how to handle and share user profiles across SN sites in particular. Although there have been several proprietary or open source-based approaches to unifying the creation of third party applications, the availability and retrieval of user profile information are still limited to the site where the third party application is run, mostly devoid of support for data interoperability. In this paper we propose an approach to enabling interoperable and selective data sharing among SN sites. To support selective data sharing, we discuss an authenticated dictionary (ADT)-based credential which enables a user to share only a subset of her information certified by external SN sites with applications running on an SN site. For interoperable data sharing, we propose an extension to the OpenSocial API so that it can provide an open source-based framework for allowing the ADT-based credential to be used seamlessly among different SN sites.
Cross-language Babel structs—making scientific interfaces more efficient
NASA Astrophysics Data System (ADS)
Prantl, Adrian; Ebner, Dietmar; Epperly, Thomas G. W.
2013-01-01
Babel is an open-source language interoperability framework tailored to the needs of high-performance scientific computing. As an integral element of the Common Component Architecture, it is employed in a wide range of scientific applications where it is used to connect components written in different programming languages. In this paper we describe how we extended Babel to support interoperable tuple data types (structs). Structs are a common idiom in (mono-lingual) scientific application programming interfaces (APIs); they are an efficient way to pass tuples of nonuniform data between functions, and are supported natively by most programming languages. Using our extended version of Babel, developers of scientific codes can now pass structs as arguments between functions implemented in any of the supported languages. In C, C++, Fortran 2003/2008 and Chapel, structs can be passed without the overhead of data marshaling or copying, providing language interoperability at minimal cost. Other supported languages are Fortran 77, Fortran 90/95, Java and Python. We will show how we designed a struct implementation that is interoperable with all of the supported languages and present benchmark data to compare the performance of all language bindings, highlighting the differences between languages that offer native struct support and an object-oriented interface with getter/setter methods. A case study shows how structs can help simplify the interfaces of scientific codes significantly.
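Babel itself is not reproduced here, but the following Python ctypes sketch conveys the underlying idea of declaring a C-compatible struct once and passing it across a language boundary without per-field marshaling; the struct layout and the commented-out shared library are invented.

import ctypes

# A C-compatible tuple of non-uniform data, declared once and shareable with native code.
class GridPoint(ctypes.Structure):
    _fields_ = [("x", ctypes.c_double),
                ("y", ctypes.c_double),
                ("wet", ctypes.c_int)]

p = GridPoint(1.5, 2.5, 1)
print(p.x, p.y, p.wet)

# If a shared library exposing `double distance(GridPoint a, GridPoint b)` existed,
# it could be called without copying field by field:
# lib = ctypes.CDLL("./libgrid.so")              # hypothetical library
# lib.distance.argtypes = [GridPoint, GridPoint]
# lib.distance.restype = ctypes.c_double
# print(lib.distance(p, GridPoint(0.0, 0.0, 0)))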
Seebregts, Christopher; Dane, Pierre; Parsons, Annie Neo; Fogwill, Thomas; Rogers, Debbie; Bekker, Marcha; Shaw, Vincent; Barron, Peter
2018-01-01
MomConnect is a national initiative coordinated by the South African National Department of Health that sends text-based mobile phone messages free of charge to pregnant women who voluntarily register at any public healthcare facility in South Africa. We describe the system design and architecture of the MomConnect technical platform, planned as a nationally scalable and extensible initiative. It uses a health information exchange that can connect any standards-compliant electronic front-end application to any standards-compliant electronic back-end database. The implementation of the MomConnect technical platform, in turn, is a national reference application for electronic interoperability in line with the South African National Health Normative Standards Framework. The use of open content and messaging standards enables the architecture to include any application adhering to the selected standards. Its national implementation at scale demonstrates both the use of this technology and a key objective of global health information systems, which is to achieve implementation scale. The system’s limited clinical information, initially, allowed the architecture to focus on the base standards and profiles for interoperability in a resource-constrained environment with limited connectivity and infrastructural capacity. Maintenance of the system requires mobilisation of national resources. Future work aims to use the standard interfaces to include data from additional applications as well as to extend and interface the framework with other public health information systems in South Africa. The development of this platform has also shown the benefits of interoperability at both an organisational and technical level in South Africa. PMID:29713506
Seebregts, Christopher; Dane, Pierre; Parsons, Annie Neo; Fogwill, Thomas; Rogers, Debbie; Bekker, Marcha; Shaw, Vincent; Barron, Peter
2018-01-01
MomConnect is a national initiative coordinated by the South African National Department of Health that sends text-based mobile phone messages free of charge to pregnant women who voluntarily register at any public healthcare facility in South Africa. We describe the system design and architecture of the MomConnect technical platform, planned as a nationally scalable and extensible initiative. It uses a health information exchange that can connect any standards-compliant electronic front-end application to any standards-compliant electronic back-end database. The implementation of the MomConnect technical platform, in turn, is a national reference application for electronic interoperability in line with the South African National Health Normative Standards Framework. The use of open content and messaging standards enables the architecture to include any application adhering to the selected standards. Its national implementation at scale demonstrates both the use of this technology and a key objective of global health information systems, which is to achieve implementation scale. The system's limited clinical information, initially, allowed the architecture to focus on the base standards and profiles for interoperability in a resource-constrained environment with limited connectivity and infrastructural capacity. Maintenance of the system requires mobilisation of national resources. Future work aims to use the standard interfaces to include data from additional applications as well as to extend and interface the framework with other public health information systems in South Africa. The development of this platform has also shown the benefits of interoperability at both an organisational and technical level in South Africa.
Creating personalised clinical pathways by semantic interoperability with electronic health records.
Wang, Hua-Qiong; Li, Jing-Song; Zhang, Yi-Fan; Suzuki, Muneou; Araki, Kenji
2013-06-01
There is a growing realisation that clinical pathways (CPs) are vital for improving the treatment quality of healthcare organisations. However, treatment personalisation is one of the main challenges when implementing CPs, and the inadequate dynamic adaptability restricts the practicality of CPs. The purpose of this study is to improve the practicality of CPs using semantic interoperability between knowledge-based CPs and semantic electronic health records (EHRs). Simple protocol and resource description framework query language is used to gather patient information from semantic EHRs. The gathered patient information is entered into the CP ontology represented by web ontology language. Then, after reasoning over rules described by semantic web rule language in the Jena semantic framework, we adjust the standardised CPs to meet different patients' practical needs. A CP for acute appendicitis is used as an example to illustrate how to achieve CP customisation based on the semantic interoperability between knowledge-based CPs and semantic EHRs. A personalised care plan is generated by comprehensively analysing the patient's personal allergy history and past medical history, which are stored in semantic EHRs. Additionally, by monitoring the patient's clinical information, an exception is recorded and handled during CP execution. According to execution results of the actual example, the solutions we present are shown to be technically feasible. This study contributes towards improving the clinical personalised practicality of standardised CPs. In addition, this study establishes the foundation for future work on the research and development of an independent CP system. Copyright © 2013 Elsevier B.V. All rights reserved.
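The following rdflib sketch illustrates the kind of SPARQL query used to gather patient information from a semantic EHR graph, as described above; the vocabulary, patient identifier and clinical values are invented for illustration.

from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF

EHR = Namespace("http://example.org/ehr#")    # invented EHR vocabulary

g = Graph()
g.add((EHR.patient42, RDF.type, EHR.Patient))
g.add((EHR.patient42, EHR.hasAllergy, Literal("penicillin")))
g.add((EHR.patient42, EHR.hasDiagnosis, Literal("acute appendicitis")))

results = g.query("""
    PREFIX ehr: <http://example.org/ehr#>
    SELECT ?allergy WHERE { ehr:patient42 ehr:hasAllergy ?allergy . }
""")
for row in results:
    print(row.allergy)    # would feed the rules that adapt the standardised pathway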
Architectural approaches for HL7-based health information systems implementation.
López, D M; Blobel, B
2010-01-01
Information systems integration is hard, especially when semantic and business process interoperability requirements need to be met. To succeed, a unified methodology, approaching different aspects of systems architecture such as the business, information, computational, engineering and technology viewpoints, has to be considered. The paper contributes an analysis and demonstration of how the HL7 standard set can support health information systems integration. Based on the Health Information Systems Development Framework (HIS-DF), common architectural models for HIS integration are analyzed. The framework is a standard-based, consistent, comprehensive, customizable, scalable methodology that supports the design of semantically interoperable health information systems and components. Three main architectural models for system integration are analyzed: the point-to-point interface, the message server and the mediator models. The point-to-point interface and message server models are completely supported by traditional HL7 version 2 and version 3 messaging. The HL7 v3 standard specification, combined with the service-oriented, model-driven approaches provided by HIS-DF, makes the mediator model possible. The different integration scenarios are illustrated by describing a proof-of-concept implementation of an integrated public health surveillance system based on Enterprise Java Beans technology. Selecting the appropriate integration architecture is a fundamental issue for any software development project. HIS-DF provides a unique methodological approach guiding the development of healthcare integration projects. The mediator model, offered by HIS-DF and supported by HL7 v3 artifacts, is the most promising one, promoting the development of open, reusable, flexible, semantically interoperable, platform-independent, service-oriented and standard-based health information systems.
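To show the shape of the messages exchanged in the point-to-point and message server models, the sketch below parses an abbreviated, fictitious HL7 v2 ADT message with plain Python string handling; field positions follow the HL7 v2 segment layout, and all content is illustrative.

# An abbreviated, fictitious HL7 v2 ADT message; segments are separated by carriage returns.
raw = "\r".join([
    "MSH|^~\\&|SEND_APP|SEND_FAC|RECV_APP|RECV_FAC|202401010930||ADT^A01|MSG0001|P|2.5",
    "PID|1||123456^^^HOSP^MR||DOE^JOHN||19800101|M",
])

segments = {}
for line in raw.split("\r"):
    fields = line.split("|")
    segments.setdefault(fields[0], []).append(fields)

print(segments["MSH"][0][8])   # message type: ADT^A01
print(segments["PID"][0][3])   # patient identifier list: 123456^^^HOSP^MR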
A Standardization Framework for Electronic Government Service Portals
NASA Astrophysics Data System (ADS)
Sarantis, Demetrios; Tsiakaliaris, Christos; Lampathaki, Fenareti; Charalabidis, Yannis
Although most eGovernment interoperability frameworks (eGIFs) cover adequately the technical aspects of developing and supporting the provision of electronic services to citizens and businesses, they do not exclusively address several important areas regarding the organization, presentation, accessibility and security of the content and the electronic services offered through government portals. This chapter extends the scope of existing eGIFs presenting the overall architecture and the basic concepts of the Greek standardization framework for electronic government service portals which, for the first time in Europe, is part of a country's eGovernment framework. The proposed standardization framework includes standards, guidelines and recommendations regarding the design, development and operation of government portals that support the provision of administrative information and services to citizens and businesses. By applying the guidelines of the framework, the design, development and operation of portals in central, regional and municipal government can be systematically addressed resulting in an applicable, sustainable and ever-expanding framework.
An Open Source Extensible Smart Energy Framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rankin, Linda
Aggregated distributed energy resources are the subject of much interest in the energy industry and are expected to play an important role in meeting our future energy needs by changing how we use, distribute and generate electricity. This energy future includes an increased amount of energy from renewable resources, load management techniques to improve resiliency and reliability, and distributed energy storage and generation capabilities that can be managed to meet the needs of the grid as well as individual customers. These energy assets are commonly referred to as Distributed Energy Resources (DER). DERs rely on a means to communicate information between an energy provider and multitudes of devices. Today DER control systems are typically vendor-specific, using custom hardware and software solutions. As a result, customers are locked into communication transport protocols, applications, tools, and data formats. Today's systems are often difficult to extend to meet new application requirements, resulting in stranded assets when business requirements or energy management models evolve. By partnering with industry advisors and researchers, a DER research platform called the Smart Energy Framework (SEF) was developed and implemented. The hypothesis of this research was that an open source Internet of Things (IoT) framework could play a role in creating a commodity-based eco-system for DER assets that would reduce costs and provide interoperable products. SEF is based on the AllJoyn™ IoT open source framework. The demonstration system incorporated DER assets, specifically batteries and smart water heaters. To verify the behavior of the distributed system, models of water heaters and batteries were also developed. An IoT interface for communicating between the assets and a control server was defined. This interface supports a series of "events" and telemetry reporting, similar to those defined by current smart grid communication standards. The results of this effort demonstrated the feasibility and application potential of using IoT frameworks for the creation of commodity-based DER systems. All of the identified commodity-based system requirements were met by the AllJoyn framework. By having commodity solutions, small vendors can enter the market and the cost of implementation for all parties is reduced. Utilities and aggregators can choose from multiple interoperable products, reducing the risk of stranded assets. Based on this research it is recommended that interfaces based on existing smart grid communication protocol standards be created for these emerging IoT frameworks. These interfaces should be standardized as part of the IoT framework, allowing for interoperability testing and certification. Similarly, IoT frameworks are introducing application-level security. This type of security is needed for protecting applications and platforms and will be important moving forward. It is recommended that, along with DER-based data model interfaces, platform and application security requirements also be prescribed when IoT devices support DER applications.
Holistic Framework For Establishing Interoperability of Heterogeneous Software Development Tools
2003-06-01
Introduction to Architectures: HSCB Information - What It Is and How It Fits (or Doesn’t Fit)
2010-10-01
Enhancing security and improving interoperability in healthcare information systems.
Gritzalis, D A
1998-01-01
Security is a key issue in healthcare information systems, since most aspects of security become of considerable or even critical importance when handling healthcare information. In addition, the intense need for information exchange has revealed interoperability of systems and applications as another key issue. Standardization can play an important role towards both these issues. In this paper, relevant standardization activities are briefly presented, and existing and emerging healthcare information security standards are identified and critically analysed. The analysis is based on a framework which has been developed for this reason. Therefore, the identification of gaps and inconsistencies in current standardization, the description of the conflicts of standards with legislation, and the analysis of implications of these standards to user organizations, are the main results of this paper.
NASA Astrophysics Data System (ADS)
Peckham, S. D.; DeLuca, C.; Gochis, D. J.; Arrigo, J.; Kelbert, A.; Choi, E.; Dunlap, R.
2014-12-01
In order to better understand and predict environmental hazards of weather/climate, ecology and deep earth processes, geoscientists develop and use physics-based computational models. These models are used widely both in academic and federal communities. Because of the large effort required to develop and test models, there is widespread interest in component-based modeling, which promotes model reuse and simplified coupling to tackle problems that often cross discipline boundaries. In component-based modeling, the goal is to make relatively small changes to models that make it easy to reuse them as "plug-and-play" components. Sophisticated modeling frameworks exist to rapidly couple these components to create new composite models. They allow component models to exchange variables while accommodating different programming languages, computational grids, time-stepping schemes, variable names and units. Modeling frameworks have arisen in many modeling communities. CSDMS (Community Surface Dynamics Modeling System) serves the academic earth surface process dynamics community, while ESMF (Earth System Modeling Framework) serves many federal Earth system modeling projects. Others exist in both the academic and federal domains and each satisfies design criteria that are determined by the community they serve. While they may use different interface standards or semantic mediation strategies, they share fundamental similarities. The purpose of the Earth System Bridge project is to develop mechanisms for interoperability between modeling frameworks, such as the ability to share a model or service component. This project has three main goals: (1) Develop a Framework Description Language (ES-FDL) that allows modeling frameworks to be described in a standard way so that their differences and similarities can be assessed. (2) Demonstrate that if a model is augmented with a framework-agnostic Basic Model Interface (BMI), then simple, universal adapters can go from BMI to a modeling framework's native component interface. (3) Create semantic mappings between modeling frameworks that support semantic mediation. This third goal involves creating a crosswalk between the CF Standard Names and the CSDMS Standard Names (a set of naming conventions). This talk will summarize progress towards these goals.
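A toy example of a framework-agnostic interface in the spirit of the Basic Model Interface is sketched below; the method names follow published BMI conventions (initialize, update, get_value, finalize), while the linear reservoir model and its variable name are invented for illustration.

class LinearReservoir:
    """Toy model exposing BMI-style methods; not a complete BMI implementation."""

    def initialize(self, config_file=None):
        self.storage = 10.0     # arbitrary initial storage volume
        self.k = 0.1            # assumed recession coefficient
        self.time = 0.0

    def update(self):
        outflow = self.k * self.storage
        self.storage -= outflow
        self.time += 1.0

    def get_current_time(self):
        return self.time

    def get_value(self, name):
        if name == "reservoir_water_volume":    # invented variable name
            return self.storage
        raise KeyError(name)

    def finalize(self):
        self.storage = None


model = LinearReservoir()
model.initialize()
for _ in range(5):
    model.update()
print(model.get_current_time(), model.get_value("reservoir_water_volume"))
model.finalize()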
Archetype Model-Driven Development Framework for EHR Web System
Kimura, Eizen; Ishihara, Ken
2013-01-01
Objectives This article describes the Web application framework for Electronic Health Records (EHRs) we have developed to reduce construction costs for EHR systems. Methods The openEHR project has developed clinical model driven architecture for future-proof interoperable EHR systems. This project provides the specifications to standardize clinical domain model implementations, upon which the ISO/CEN 13606 standards are based. The reference implementation has been formally described in Eiffel. Moreover, C# and Java implementations have been developed as references. While scripting languages had been more popular because of their higher efficiency and faster development in recent years, they had not been involved in the openEHR implementations. From 2007, we have used the Ruby language and Ruby on Rails (RoR) as an agile development platform to implement EHR systems, in conformity with the openEHR specifications. Results We implemented almost all of the specifications, the Archetype Definition Language parser, and a RoR scaffold generator from archetypes. Although some problems have emerged, most of them have been resolved. Conclusions We have provided an agile EHR Web framework, which can build up Web systems from archetype models using RoR. The feasibility of the archetype model to provide semantic interoperability of EHRs has been demonstrated and we have verified that it is suitable for the construction of EHR systems. PMID:24523991
Interoperability challenges in river discharge modelling: A cross domain application scenario
NASA Astrophysics Data System (ADS)
Santoro, Mattia; Andres, Volker; Jirka, Simon; Koike, Toshio; Looser, Ulrich; Nativi, Stefano; Pappenberger, Florian; Schlummer, Manuela; Strauch, Adrian; Utech, Michael; Zsoter, Ervin
2018-06-01
River discharge is a critical water cycle variable, as it integrates all the processes (e.g. runoff and evapotranspiration) occurring within a river basin and provides a hydrological output variable that can be readily measured. Its prediction is of invaluable help for many water-related tasks including water resources assessment and management, flood protection, and disaster mitigation. Observations of river discharge are important to calibrate and validate hydrological or coupled land, atmosphere and ocean models. This requires using datasets from different scientific domains (Water, Weather, etc.). Typically, such datasets are provided using different technological solutions. This complicates the integration of new hydrological data sources into application systems. Therefore, a considerable effort is often spent on data access issues instead of the actual scientific question. This paper describes the work performed to address multidisciplinary interoperability challenges related to river discharge modeling and validation. This includes definition and standardization of domain specific interoperability standards for hydrological data sharing and their support in global frameworks such as the Global Earth Observation System of Systems (GEOSS). The research was developed in the context of the EU FP7-funded project GEOWOW (GEOSS Interoperability for Weather, Ocean and Water), which implemented a "River Discharge" application scenario. This scenario demonstrates the combination of river discharge observations data from the Global Runoff Data Centre (GRDC) database and model outputs produced by the European Centre for Medium-Range Weather Forecasts (ECMWF) predicting river discharge based on weather forecast information in the context of the GEOSS.
A common layer of interoperability for biomedical ontologies based on OWL EL.
Hoehndorf, Robert; Dumontier, Michel; Oellrich, Anika; Wimalaratne, Sarala; Rebholz-Schuhmann, Dietrich; Schofield, Paul; Gkoutos, Georgios V
2011-04-01
Ontologies are essential in biomedical research due to their ability to semantically integrate content from different scientific databases and resources. Their application improves capabilities for querying and mining biological knowledge. An increasing number of ontologies is being developed for this purpose, and considerable effort is invested into formally defining them in order to represent their semantics explicitly. However, current biomedical ontologies do not facilitate data integration and interoperability yet, since reasoning over these ontologies is very complex and cannot be performed efficiently or is even impossible. We propose the use of less expressive subsets of ontology representation languages to enable efficient reasoning and achieve the goal of genuine interoperability between ontologies. We present and evaluate EL Vira, a framework that transforms OWL ontologies into the OWL EL subset, thereby enabling the use of tractable reasoning. We illustrate which OWL constructs and inferences are kept and lost following the conversion and demonstrate the performance gain of reasoning indicated by the significant reduction of processing time. We applied EL Vira to the open biomedical ontologies and provide a repository of ontologies resulting from this conversion. EL Vira creates a common layer of ontological interoperability that, for the first time, enables the creation of software solutions that can employ biomedical ontologies to perform inferences and answer complex queries to support scientific analyses. The EL Vira software is available from http://el-vira.googlecode.com and converted OBO ontologies and their mappings are available from http://bioonto.gen.cam.ac.uk/el-ont.
ERIC Educational Resources Information Center
Thornton, Bradley D.; Smalley, Robert A.
2008-01-01
Building information modeling (BIM) uses three-dimensional modeling concepts, information technology and interoperable software to design, construct and operate a facility. However, BIM can be more than a tool for virtual modeling--it can provide schools with a 3-D walkthrough of a project while it still is on the electronic drawing board. BIM can…
The EuroGEOSS Advanced Operating Capacity
NASA Astrophysics Data System (ADS)
Nativi, S.; Vaccari, L.; Stock, K.; Diaz, L.; Santoro, M.
2012-04-01
The concept of multidisciplinary interoperability for managing societal issues is a major challenge presently faced by the Earth and Space Science Informatics community. With this in mind, EuroGEOSS project was launched on May 1st 2009 for a three year period aiming to demonstrate the added value to the scientific community and society of providing existing earth observing systems and applications in an interoperable manner and used within the GEOSS and INSPIRE frameworks. In the first period, the project built an Initial Operating Capability (IOC) in the three strategic areas of Drought, Forestry and Biodiversity; this was then enhanced into an Advanced Operating Capacity (AOC) for multidisciplinary interoperability. Finally, the project extended the infrastructure to other scientific domains (geology, hydrology, etc.). The EuroGEOSS multidisciplinary AOC is based on the Brokering Approach. This approach aims to achieve multidisciplinary interoperability by developing an extended SOA (Service Oriented Architecture) where a new type of "expert" components is introduced: the Broker. These implement all mediation and distribution functionalities needed to interconnect the distributed and heterogeneous resources characterizing a System of Systems (SoS) environment. The EuroGEOSS AOC is comprised of the following components: • EuroGEOSS Discovery Broker: providing harmonized discovery functionalities by mediating and distributing user queries against tens of heterogeneous services; • EuroGEOSS Access Broker: enabling users to seamlessly access and use heterogeneous remote resources via a unique and standard service; • EuroGEOSS Web 2.0 Broker: enhancing the capabilities of the Discovery Broker with queries towards the new Web 2.0 services; • EuroGEOSS Semantic Discovery Broker: enhancing the capabilities of the Discovery Broker with semantic query-expansion; • EuroGEOSS Natural Language Search Component: providing users with the possibilities to search for resources using natural language queries; • Service Composition Broker: allowing users to compose and execute complex Business Processes, based on the technology developed by the FP7 UncertWeb project. Recently, the EuroGEOSS Brokering framework was presented at the GEO-VIII Plenary and Exhibition in Istanbul and introduced into the GEOSS Common Infrastructure.
Development of high performance scientific components for interoperability of computing packages
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gulabani, Teena Pratap
2008-01-01
Three major high-performance quantum chemistry packages, NWChem, GAMESS and MPQC, have been developed by different research efforts following different design patterns. The goal is to achieve interoperability among these packages by overcoming the challenges caused by their different communication patterns and software designs. Developing a chemistry algorithm is hard and time consuming; integrating large quantum chemistry packages allows resource sharing and avoids reinventing the wheel. Creating connections between these incompatible packages is the major motivation of the proposed work. This interoperability is achieved by bringing the benefits of Component Based Software Engineering through a plug-and-play component framework called the Common Component Architecture (CCA). In this thesis, I present a strategy and process for interfacing two widely used computational chemistry methodologies: Quantum Mechanics and Molecular Mechanics. To show the feasibility of the proposed approach, the Tuning and Analysis Utility (TAU) has been coupled with the NWChem code and its CCA components. Results show that the overhead is negligible compared to the ease and potential of organizing and coping with large-scale software applications.
A Research on E - learning Resources Construction Based on Semantic Web
NASA Astrophysics Data System (ADS)
Rui, Liu; Maode, Deng
Traditional e-learning platforms have the flaw that resources are difficult to query and locate, and that cross-platform sharing and interoperability are hard to realize. In this paper, the semantic web and metadata standards are discussed, and an e-learning system framework based on the semantic web is put forward to address these flaws of traditional e-learning platforms.
The CEOS WGISS Atmospheric Composition Portal
NASA Technical Reports Server (NTRS)
Lynnes, Chris
2010-01-01
Goal: Demonstrate the feasibility of connecting distributed atmospheric composition data and analysis tools into a common and shared web framework. Initial effort focused on: a) collaboratively creating a web application within WDC-RSAT for comparison of satellite-derived atmospheric composition datasets accessed from distributed data sources; b) implementing data access and interoperability standards; c) soliciting feedback from users, especially from ACC participants.
Analysis of ISO/IEEE 11073 built-in security and its potential IHE-based extensibility.
Rubio, Óscar J; Trigo, Jesús D; Alesanco, Álvaro; Serrano, Luis; García, José
2016-04-01
The ISO/IEEE 11073 standard for Personal Health Devices (X73PHD) aims to ensure interoperability between Personal Health Devices and aggregators-e.g. health appliances, routers-in ambulatory setups. The Integrating the Healthcare Enterprise (IHE) initiative promotes the coordinated use of different standards in healthcare systems (e.g. Personal/Electronic Health Records, alert managers, Clinical Decision Support Systems) by defining profiles intended for medical use cases. X73PHD provides a robust syntactic model and a comprehensive terminology, but it places limited emphasis on security and on interoperability with IHE-compliant systems and frameworks. However, the implementation of eHealth/mHealth applications in environments such as health and fitness monitoring, independent living and disease management (i.e. the X73PHD domains) increasingly requires features such as secure connections to mobile aggregators-e.g. smartphones, tablets-, the sharing of devices among different users with privacy, and interoperability with certain IHE-compliant healthcare systems. This work proposes a comprehensive IHE-based X73PHD extension consisting of additive layers adapted to different eHealth/mHealth applications, after having analyzed the features of X73PHD (especially its built-in security), IHE profiles related with these applications and other research works. Both the new features proposed for each layer and the procedures to support them have been carefully chosen to minimize the impact on X73PHD, on its architecture (in terms of delays and overhead) and on its framework. Such implications are thoroughly analyzed in this paper. As a result, an extended model of X73PHD is proposed, preserving its essential features while extending them with added value. Copyright © 2016 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Xie, Jibo; Li, Guoqing
2015-04-01
Earth observation (EO) data obtained by air-borne or space-borne sensors are heterogeneous and stored in geographically distributed archives. These data sources belong to different organizations or agencies whose data management and storage methods differ considerably, and each provides its own publication platform or portal. As more remote sensing sensors are used for EO missions, space agencies have accumulated massive, distributed EO data archives. This distribution of archives and the heterogeneity of systems make it difficult to use geospatial data efficiently in many EO applications, such as hazard mitigation. To solve the interoperability problems between different EO data systems, this paper introduces an advanced architecture for a distributed geospatial data infrastructure that addresses the complexity of integrating and processing distributed, heterogeneous EO data on demand. The concept and architecture of a geospatial data service gateway (GDSG) is proposed to connect to heterogeneous EO data sources so that EO data can be retrieved and accessed through unified interfaces. The GDSG consists of a set of tools and services that encapsulate heterogeneous geospatial data sources into homogeneous service modules, including EO metadata harvesters and translators, adaptors for different types of data systems, unified data query and access interfaces, EO data cache management, and a gateway GUI. The GDSG framework is used to implement interoperability and synchronization between distributed EO data sources with heterogeneous architectures. An on-demand distributed EO data platform was developed to validate the GDSG architecture and implementation techniques; several distributed EO data archives were used for testing, with flood and earthquake response serving as two use-case scenarios for distributed EO data integration and interoperability.
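The following sketch illustrates the gateway pattern the abstract describes: per-source harvesters and translators map native metadata records into one common schema so that a unified query interface can span all sources. The field names, translators and sample records are assumptions made up for the example, not the actual GDSG interfaces.

# Illustrative metadata harvest-and-translate gateway over heterogeneous EO archives.
def translate_agency_a(record):
    return {"id": record["granule_id"], "sensor": record["instrument"], "date": record["acq_date"]}

def translate_agency_b(record):
    return {"id": record["sceneId"], "sensor": record["platform"], "date": record["sensingTime"][:10]}

HARVESTERS = [
    (lambda: [{"granule_id": "A-001", "instrument": "SAR", "acq_date": "2014-08-03"}], translate_agency_a),
    (lambda: [{"sceneId": "B-17", "platform": "MODIS", "sensingTime": "2014-08-03T10:30:00Z"}], translate_agency_b),
]

def unified_query(sensor=None):
    """Unified access interface over all harvested records, translated to one schema."""
    catalogue = [translate(rec) for harvest, translate in HARVESTERS for rec in harvest()]
    return [r for r in catalogue if sensor is None or r["sensor"] == sensor]

print(unified_query(sensor="SAR"))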
Framework for Architecture Trade Study Using MBSE and Performance Simulation
NASA Technical Reports Server (NTRS)
Ryan, Jessica; Sarkani, Shahram; Mazzuchi, Thomas
2012-01-01
Increasing complexity in modern systems as well as cost and schedule constraints require a new paradigm of systems engineering to fulfill stakeholder needs. Challenges facing efficient trade studies include poor tool interoperability, lack of simulation coordination (design parameters) and requirements flowdown. A recent trend toward Model Based Systems Engineering (MBSE) includes flexible architecture definition, program documentation, requirements traceability and systems engineering reuse. As a new domain, MBSE still lacks governing standards and commonly accepted frameworks. This paper proposes a framework for efficient architecture definition using MBSE in conjunction with domain-specific simulation to evaluate trade studies. A general framework is provided, followed by a specific example that includes a method for designing a trade study, defining candidate architectures, planning simulations to fulfill requirements and, finally, applying a weighted decision analysis to optimize system objectives.
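As a concrete illustration of the weighted decision analysis step mentioned above, the short sketch below ranks candidate architectures with a weighted-sum score. The criteria, weights and scores are invented example values, not those used in the paper.

# Toy weighted-sum decision matrix for a trade study over candidate architectures.
criteria = {"cost": 0.4, "performance": 0.35, "interoperability": 0.25}   # weights sum to 1.0

candidates = {
    "Architecture A": {"cost": 7, "performance": 9, "interoperability": 6},
    "Architecture B": {"cost": 8, "performance": 6, "interoperability": 9},
}

def weighted_score(scores):
    return sum(criteria[c] * scores[c] for c in criteria)

ranked = sorted(candidates.items(), key=lambda kv: weighted_score(kv[1]), reverse=True)
for name, scores in ranked:
    print(f"{name}: {weighted_score(scores):.2f}")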
NASA Astrophysics Data System (ADS)
Schaap, D.
2015-12-01
Europe, the USA, and Australia are making significant progress in facilitating the discovery, access and long term stewardship of ocean and marine data through the development, implementation, population and operation of national, regional or international distributed ocean and marine observing and data management infrastructures such as SeaDataNet, EMODnet, IOOS, R2R, and IMOS. All of these developments are resulting in the development of standards and services implemented and used by their regional communities. The Ocean Data Interoperability Platform (ODIP) project is supported by the EU FP7 Research Infrastructures programme, National Science Foundation (USA) and Australian government and has been initiated 1st October 2012. Recently the project has been continued as ODIP 2 for another 3 years with EU HORIZON 2020 funding. ODIP includes all the major organisations engaged in ocean data management in EU, US, and Australia. ODIP is also supported by the IOC-IODE, closely linking this activity with its Ocean Data Portal (ODP) and Ocean Data Standards Best Practices (ODSBP) projects. The ODIP platform aims to ease interoperability between the regional marine data management infrastructures. Therefore it facilitates an organised dialogue between the key infrastructure representatives by means of publishing best practice, organising a series of international workshops and fostering the development of common standards and interoperability solutions. These are evaluated and tested by means of prototype projects. The presentation will give further background on the ODIP projects and the latest information on the progress of three prototype projects addressing: establishing interoperability between the regional EU, USA and Australia data discovery and access services (SeaDataNet CDI, US NODC, and IMOS MCP) and contributing to the global GEOSS and IODE-ODP portals; establishing interoperability between cruise summary reporting systems in Europe, the USA and Australia for routine harvesting of cruise data for delivery via the Partnership for Observation of Global Oceans (POGO) global portal; establishing common standards for a Sensor Observation Service (SOS) for selected sensors installed on vessels and in real-time monitoring systems using sensor web enablement (SWE)
NASA Astrophysics Data System (ADS)
Schaap, Dick M. A.; Glaves, Helen
2016-04-01
Europe, the USA, and Australia are making significant progress in facilitating the discovery, access and long term stewardship of ocean and marine data through the development, implementation, population and operation of national, regional or international distributed ocean and marine observing and data management infrastructures such as SeaDataNet, EMODnet, IOOS, R2R, and IMOS. All of these developments are resulting in the development of standards and services implemented and used by their regional communities. The Ocean Data Interoperability Platform (ODIP) project is supported by the EU FP7 Research Infrastructures programme, National Science Foundation (USA) and Australian government and has been initiated 1st October 2012. Recently the project has been continued as ODIP II for another 3 years with EU HORIZON 2020 funding. ODIP includes all the major organisations engaged in ocean data management in EU, US, and Australia. ODIP is also supported by the IOC-IODE, closely linking this activity with its Ocean Data Portal (ODP) and Ocean Data Standards Best Practices (ODSBP) projects. The ODIP platform aims to ease interoperability between the regional marine data management infrastructures. Therefore it facilitates an organised dialogue between the key infrastructure representatives by means of publishing best practice, organising a series of international workshops and fostering the development of common standards and interoperability solutions. These are evaluated and tested by means of prototype projects. The presentation will give further background on the ODIP projects and the latest information on the progress of three prototype projects addressing: 1. establishing interoperability between the regional EU, USA and Australia data discovery and access services (SeaDataNet CDI, US NODC, and IMOS MCP) and contributing to the global GEOSS and IODE-ODP portals; 2. establishing interoperability between cruise summary reporting systems in Europe, the USA and Australia for routine harvesting of cruise data for delivery via the Partnership for Observation of Global Oceans (POGO) global portal; 3. the establishment of common standards for a Sensor Observation Service (SOS) for selected sensors installed on vessels and in real-time monitoring systems using sensor web enablement (SWE)
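A hedged sketch of the kind of standard Sensor Observation Service request that the third prototype builds on: a KVP GetObservation call issued over HTTP. The endpoint URL and the offering and observed-property identifiers are placeholders for illustration, not actual ODIP or SeaDataNet services.

# Minimal SOS 2.0 KVP GetObservation request against a placeholder endpoint.
import requests

SOS_ENDPOINT = "https://example.org/sos"   # placeholder endpoint, not a real service

params = {
    "service": "SOS",
    "version": "2.0.0",
    "request": "GetObservation",
    "offering": "sea_surface_temperature_offering",    # assumed identifier
    "observedProperty": "sea_surface_temperature",      # assumed identifier
}

response = requests.get(SOS_ENDPOINT, params=params, timeout=30)
response.raise_for_status()
print(response.text[:500])   # raw O&M XML; a real client would parse it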
Analysis model for personal eHealth solutions and services.
Mykkänen, Juha; Tuomainen, Mika; Luukkonen, Irmeli; Itälä, Timo
2010-01-01
In this paper, we present a framework for analysing and assessing various features of personal wellbeing information management services and solutions, such as personal health records and citizen-oriented eHealth services. The model is based on general functional and interoperability standards for personal health management applications and on generic frameworks for different aspects of analysis. It has been developed and used in the MyWellbeing project in Finland to provide a baseline for the research, development and comparison of many different personal wellbeing and health management solutions and to support the development of the unified "Coper" concept for citizen empowerment.
An ontology-based framework for bioinformatics workflows.
Digiampietri, Luciano A; Perez-Alcazar, Jose de J; Medeiros, Claudia Bauzer
2007-01-01
The proliferation of bioinformatics activities brings new challenges: how to understand and organise these resources, how to exchange and reuse successful experimental procedures, and how to provide interoperability among data and tools. This paper describes an effort in these directions. It is based on combining research on ontology management, AI and scientific workflows to design, reuse and annotate bioinformatics experiments. The resulting framework supports automatic or interactive composition of tasks based on AI planning techniques and takes advantage of ontologies to support the specification and annotation of bioinformatics workflows. We validate our proposal with a prototype running on real data.
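To make the composition idea concrete, the toy sketch below chains bioinformatics tools by matching their annotated input and output concepts, a much-simplified stand-in for the AI planning and ontology machinery the paper describes. The tool names and concept labels are invented for the example.

# Toy backward/forward chaining over tool annotations to compose a workflow.
TOOLS = [
    {"name": "translate", "in": "dna_sequence", "out": "protein_sequence"},
    {"name": "blastp",    "in": "protein_sequence", "out": "similarity_report"},
]

def plan(available, goal, chain=()):
    """Return a sequence of tool names producing `goal` from `available`, or None."""
    if available == goal:
        return list(chain)
    for tool in TOOLS:
        if tool["in"] == available and tool["name"] not in chain:
            result = plan(tool["out"], goal, chain + (tool["name"],))
            if result is not None:
                return result
    return None

print(plan("dna_sequence", "similarity_report"))   # -> ['translate', 'blastp']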
Building the Synergy between Public Sector and Research Data Infrastructures
NASA Astrophysics Data System (ADS)
Craglia, Massimo; Friis-Christensen, Anders; Ostländer, Nicole; Perego, Andrea
2014-05-01
INSPIRE is a European Directive aiming to establish an EU-wide spatial data infrastructure to give cross-border access to information that can be used to support EU environmental policies, as well as other policies and activities having an impact on the environment. In order to ensure cross-border interoperability of data infrastructures operated by EU Member States, INSPIRE sets out a framework based on common specifications for metadata, data, network services, data and service sharing, monitoring and reporting. The implementation of INSPIRE has reached important milestones: the INSPIRE Geoportal was launched in 2011, providing a single access point for the discovery of INSPIRE data and services across EU Member States (currently about 300K), while all the technical specifications for the interoperability of data across the 34 INSPIRE themes were adopted at the end of 2013. During this period a number of EU and international initiatives have been launched concerning cross-domain interoperability and (Linked) Open Data. In particular, the EU Open Data Portal, launched in December 2012, provides access to government and scientific data from EU institutions and bodies, and the EU ISA Programme (Interoperability Solutions for European Public Administrations) promotes cross-sector interoperability by sharing and re-using EU-wide and national standards and components. Moreover, the Research Data Alliance (RDA), an initiative jointly funded by the European Commission, the US National Science Foundation and the Australian Research Council, was launched in March 2013 to promote scientific data sharing and interoperability. The Joint Research Centre of the European Commission (JRC), besides being the technical coordinator of the implementation of INSPIRE, is also actively involved in the initiatives promoting cross-sector re-use in INSPIRE, and in sustainable approaches to address the evolution of technologies, in particular how to support Linked Data in INSPIRE and the use of global persistent identifiers. It is evident that government and scientific data infrastructures are currently facing a number of issues that have already been addressed in INSPIRE. Sharing experiences and competencies will avoid re-inventing the wheel and help promote the cross-domain adoption of consistent solutions. Indeed, one of the lessons learnt from INSPIRE and the initiatives in which JRC is involved is that government and research data are not two separate worlds: government data are commonly used as a basis to create scientific data, and vice versa. Consequently, it is fundamental to adopt a consistent approach to address the interoperability and data management issues shared by both government and scientific data. The presentation illustrates some of the lessons learnt during the implementation of INSPIRE and in work on data and service interoperability coordinated with European and international initiatives. We describe a number of critical interoperability issues and barriers affecting both scientific and government data, concerning, e.g., data terminologies, quality and licensing, and propose how these problems could be effectively addressed by a closer collaboration of the government and scientific communities and the sharing of experiences and practices.
Multinational Experiment 7: Protecting Access to Space
2013-07-08
(Garbled full-text excerpt; the recoverable fragments discuss how assuring access to space adds cost to spacecraft design, engineering, production and operation, a stepwise approach in which interoperability is implemented once a framework has been agreed, and mitigation through alternative means such as high-altitude airships.)
NASA Astrophysics Data System (ADS)
Asmi, Ari; Powers, Lindsay
2015-04-01
Research Infrastructures (RIs) are major long-term investments supporting innovative, bottom-up research activities. In environmental research, they range from high-atmosphere radars to field observation networks and coordinated laboratory facilities. The Earth system is highly interactive, and each part of the system is interconnected across spatial and disciplinary borders. However, for practical and historical reasons, RIs are built from disciplinary points of view and separately in different parts of the world, with differing standards, policies, methods and research cultures. This heterogeneity provides the diversity necessary to study the complex Earth system, but makes cross-disciplinary and/or global interoperability a challenge. Global actions towards better interoperability are surfacing, especially in the EU and US. For example, recent mandates within the US government prioritize open data for federal agencies and federally funded science, and encourage collaboration among agencies to reduce duplication of effort and increase efficient use of resources. There are several existing initiatives working toward these goals (e.g., COOPEUS, EarthCube, RDA, ICSU-WDS, DataOne, ESIP, USGEO, GEO). However, there is no cohesive framework to coordinate efforts among these and other entities. COOPEUS and EarthCube have now begun to map the landscape of interoperability efforts across earth science domains. The COOPEUS mapping effort describes the EU and US landscape of environmental research infrastructures in order to identify gaps in services (data provision) necessary to address societal priorities, provide guidance for the development of future research infrastructures, and identify opportunities for Research Infrastructures (RIs) to collaborate on issues of common interest. The EarthCube mapping effort identifies opportunities to engage a broader community by identifying scientific domain organizations and entities. We present the current status of this landscape analysis as part of a sustainable effort towards removing barriers to interoperability on a global scale.
Lowering Entry Barriers for Multidisciplinary Cyber(e)-Infrastructures
NASA Astrophysics Data System (ADS)
Nativi, S.
2012-04-01
Multidisciplinarity is more and more important for studying the Earth System and addressing Global Changes, and multidisciplinary cyber(e)-infrastructures are an important instrument to achieve that. In recent years, several European, US and international initiatives have been started to build multidisciplinary infrastructures, including the Spatial Information in the European Community (INSPIRE), the Global Monitoring for Environment and Security (GMES), the Data Observation Network for Earth (DataONE), and the Global Earth Observation System of Systems (GEOSS). The majority of these initiatives are developing service-based digital infrastructures that ask scientific Communities (i.e. disciplinary Users and data Producers) to implement a set of standards for information interoperability. For scientific Communities, this has represented an entry barrier which has proved to be high in several cases. In fact, neither data Producers nor Users seem willing to invest precious resources in becoming experts on interoperability solutions; on the contrary, they are focused on developing disciplinary and thematic capacities. Therefore, an important research topic is lowering the entry barriers for joining multidisciplinary cyber(e)-Infrastructures. This presentation will introduce a new approach to achieve the multidisciplinary interoperability underpinning multidisciplinary infrastructures while lowering the present entry barriers for both Users and data Producers. This is called the Brokering approach: it extends the service-based paradigm by introducing a new Brokering layer, or cloud, which is in charge of managing all the interoperability complexity (e.g. data discovery, access, and use), thus easing Users' and Producers' burden. This approach was successfully tested in the framework of several European FP7 Projects and in GEOSS.
A Consideration of Quality-Attribute-Property for Interoperability of Quality Data
NASA Astrophysics Data System (ADS)
Tarumi, Shinya; Kozaki, Kouji; Kitamura, Yoshinobu; Mizoguchi, Riichiro
Descriptions of attributes and qualities are essential elements in ontology development. Science data are, in essence, descriptions of attributes of target things, and it is an important role of ontologies to support the validity of, and interoperability between, such descriptions. Although upper ontologies such as DOLCE and BFO have already been developed and are extensively used, a careful examination reveals room for improvement. While each ontology covers quality and quantity, mutual interchangeability among these ontologies has not been considered, because each has been designed with the intention of developing a "correct" ontology of quality and quantity. Furthermore, due to the variety of ways in which data are described, no single ontology can cover all existing scientific data. In this paper, we investigate "quality" and "value" from an ontological viewpoint and propose a conceptual framework to deal with attribute, property and quality as they appear in existing data descriptions in the nanotechnology domain. This framework can be considered a reference ontology for describing quality with an existing upper ontology. Furthermore, on the basis of this analysis, we evaluate and refine a conceptual hierarchy of material functions that has been built by nanomaterials researchers. Through the evaluation process, we discuss the effect of defining such a conceptual framework on building and refining ontologies. This conceptual consideration of quality and value is not only a problem for the nanomaterials domain but also a first step toward the intelligent sharing of scientific data in e-Science.
Architecture of a framework for providing information services for public transport.
García, Carmelo R; Pérez, Ricardo; Lorenzo, Alvaro; Quesada-Arencibia, Alexis; Alayón, Francisco; Padrón, Gabino
2012-01-01
This paper presents OnRoute, a framework for developing and running ubiquitous software that provides information services to passengers of public transportation, including payment systems and on-route guidance services. To achieve a high level of interoperability, accessibility and context awareness, OnRoute uses the ubiquitous computing paradigm. To guarantee the quality of the software produced, the reliable software principles used in critical contexts, such as automotive systems, are also considered by the framework. The main components of its architecture (run-time, system services, software components and development discipline) and how they are deployed in the transportation network (stations and vehicles) are described in this paper. Finally, to illustrate the use of OnRoute, the development of a guidance service for travellers is explained.
Recommendations for a service framework to access astronomical archives
NASA Technical Reports Server (NTRS)
Travisano, J. J.; Pollizzi, J.
1992-01-01
There are a large number of astronomical archives and catalogs on-line for network access, with many different user interfaces and features. Some systems are moving towards distributed access, supplying users with client software for their home sites which connects to servers at the archive site. Many of the issues involved in defining a standard framework of services that archive/catalog suppliers can use to achieve a basic level of interoperability are described. Such a framework would simplify the development of client and server programs to access the wide variety of astronomical archive systems. The primary services that are supplied by current systems include: catalog browsing, dataset retrieval, name resolution, and data analysis. The following issues (and probably more) need to be considered in establishing a standard set of client/server interfaces and protocols: Archive Access - dataset retrieval, delivery, file formats, data browsing, analysis, etc.; Catalog Access - database management systems, query languages, data formats, synchronous/asynchronous mode of operation, etc.; Interoperability - transaction/message protocols, distributed processing mechanisms (DCE, ONC/SunRPC, etc), networking protocols, etc.; Security - user registration, authorization/authentication mechanisms, etc.; Service Directory - service registration, lookup, port/task mapping, parameters, etc.; Software - public vs proprietary, client/server software, standard interfaces to client/server functions, software distribution, operating system portability, data portability, etc. Several archive/catalog groups, notably the Astrophysics Data System (ADS), are already working in many of these areas. In the process of developing StarView, which is the user interface to the Space Telescope Data Archive and Distribution Service (ST-DADS), these issues and the work of others were analyzed. A framework of standard interfaces for accessing services on any archive system which would benefit archive user and supplier alike is proposed.
da Silva, Kátia Regina; Costa, Roberto; Crevelari, Elizabeth Sartori; Lacerda, Marianna Sobral; de Moraes Albertini, Caio Marcos; Filho, Martino Martinelli; Santana, José Eduardo; Vissoci, João Ricardo Nickenig; Pietrobon, Ricardo; Barros, Jacson V
2013-01-01
The ability to apply standard and interoperable solutions for implementing and managing medical registries, as well as to aggregate, reproduce, and access data sets from legacy formats and platforms into advanced standard formats and operating systems, is crucial for both clinical healthcare and biomedical research settings. Our study describes a reproducible, highly scalable, standard framework for a device registry implementation addressing both local data quality components and global linking problems. We developed a device registry framework involving the following steps: (1) data standards definition and representation of the research workflow, (2) development of electronic case report forms using REDCap (Research Electronic Data Capture), (3) data collection according to the clinical research workflow, (4) data augmentation by enriching the registry database with local electronic health records, governmental databases and linked open data collections, (5) data quality control and (6) data dissemination through the registry Web site. Our registry adopted all applicable standardized data elements proposed by the American College of Cardiology/American Heart Association Clinical Data Standards, as well as variables derived from randomized trials of cardiac devices and from the Clinical Data Interchange Standards Consortium. Local interoperability was established between REDCap and data derived from the Electronic Health Record system. The original data set was also augmented by incorporating the amounts reimbursed by the Brazilian government for pacemaker implantation hospitalizations. By linking our registry to the open data collection repository Linked Clinical Trials (LinkedCT), we found 130 clinical trials that are potentially correlated with our pacemaker registry. This study demonstrates how standard and reproducible solutions can be applied in the implementation of medical registries to constitute a reusable framework. Such an approach has the potential to facilitate data integration between healthcare and research settings and to be a useful framework for other biomedical registries.
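A hedged sketch of pulling registry records out of REDCap through its standard API, the sort of step that precedes the augmentation and linking described above. The instance URL and token are placeholders, and the export parameters shown are the generic REDCap record-export fields rather than anything specific to this registry.

# Export flat JSON records from a (placeholder) REDCap project via its API.
import requests

REDCAP_URL = "https://redcap.example.org/api/"   # placeholder instance URL
payload = {
    "token": "REPLACE_WITH_PROJECT_TOKEN",       # per-project API token
    "content": "record",
    "format": "json",
    "type": "flat",
}

response = requests.post(REDCAP_URL, data=payload, timeout=60)
response.raise_for_status()
records = response.json()
print(f"exported {len(records)} registry records")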
Next generation of weather generators on web service framework
NASA Astrophysics Data System (ADS)
Chinnachodteeranun, R.; Hung, N. D.; Honda, K.; Ines, A. V. M.
2016-12-01
Weather generator is a statistical model that synthesizes possible realization of long-term historical weather in future. It generates several tens to hundreds of realizations stochastically based on statistical analysis. Realization is essential information as a crop modeling's input for simulating crop growth and yield. Moreover, they can be contributed to analyzing uncertainty of weather to crop development stage and to decision support system on e.g. water management and fertilizer management. Performing crop modeling requires multidisciplinary skills which limit the usage of weather generator only in a research group who developed it as well as a barrier for newcomers. To improve the procedures of performing weather generators as well as the methodology to acquire the realization in a standard way, we implemented a framework for providing weather generators as web services, which support service interoperability. Legacy weather generator programs were wrapped in the web service framework. The service interfaces were implemented based on an international standard that was Sensor Observation Service (SOS) defined by Open Geospatial Consortium (OGC). Clients can request realizations generated by the model through SOS Web service. Hierarchical data preparation processes required for weather generator are also implemented as web services and seamlessly wired. Analysts and applications can invoke services over a network easily. The services facilitate the development of agricultural applications and also reduce the workload of analysts on iterative data preparation and handle legacy weather generator program. This architectural design and implementation can be a prototype for constructing further services on top of interoperable sensor network system. This framework opens an opportunity for other sectors such as application developers and scientists in other fields to utilize weather generators.
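A minimal sketch of one way to wrap a legacy generator program behind a web service so clients can request realizations over HTTP, assuming Flask is available and assuming a hypothetical command-line executable called weather_gen. The route, parameters and JSON shape are illustrative only and are not the paper's actual SOS-based interface.

# Expose a legacy weather generator as a simple HTTP service.
import subprocess
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/realizations")
def realizations():
    station = request.args.get("station", "default_station")
    n = int(request.args.get("n", 10))
    try:
        # Call the hypothetical legacy executable with station id and realization count.
        out = subprocess.run(["weather_gen", station, str(n)],
                             capture_output=True, text=True, check=True).stdout
    except FileNotFoundError:
        # Stand-in output when the legacy binary is absent, so the sketch still runs.
        out = "\n".join(f"realization {i}" for i in range(n))
    return jsonify({"station": station, "count": n, "realizations": out.splitlines()})

if __name__ == "__main__":
    app.run(port=8080)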
Lee, Jaehoon; Hulse, Nathan C; Wood, Grant M; Oniki, Thomas A; Huff, Stanley M
2016-01-01
In this study we developed a Fast Healthcare Interoperability Resources (FHIR) profile to support exchanging a full pedigree based family health history (FHH) information across multiple systems and applications used by clinicians, patients, and researchers. We used previously developed clinical element models (CEMs) that are capable of representing the FHH information, and derived essential data elements including attributes, constraints, and value sets. We analyzed gaps between the FHH CEM elements and existing FHIR resources. Based on the analysis, we developed a profile that consists of 1) FHIR resources for essential FHH data elements, 2) extensions for additional elements that were not covered by the resources, and 3) a structured definition to integrate patient and family member information in a FHIR message. We implemented the profile using an open-source based FHIR framework and validated it using patient-entered FHH data that was captured through a locally developed FHH tool.
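For orientation, the sketch below builds a plain FHIR FamilyMemberHistory resource, the kind of resource such a profile constrains and extends, as a Python dict serialised to JSON. The patient reference, the relationship and condition codings and the onset age are example values, not the paper's actual profile constraints or extensions.

# Example FamilyMemberHistory resource (illustrative values only).
import json

family_member_history = {
    "resourceType": "FamilyMemberHistory",
    "status": "completed",
    "patient": {"reference": "Patient/example-123"},          # example reference
    "relationship": {
        "coding": [{"system": "http://terminology.hl7.org/CodeSystem/v3-RoleCode",
                    "code": "MTH", "display": "mother"}]
    },
    "condition": [{
        "code": {"coding": [{"system": "http://snomed.info/sct",
                             "code": "73211009", "display": "Diabetes mellitus"}]},
        "onsetAge": {"value": 52, "unit": "yr"}
    }]
}

print(json.dumps(family_member_history, indent=2))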
GéoSAS: A modular and interoperable Open Source Spatial Data Infrastructure for research
NASA Astrophysics Data System (ADS)
Bera, R.; Squividant, H.; Le Henaff, G.; Pichelin, P.; Ruiz, L.; Launay, J.; Vanhouteghem, J.; Aurousseau, P.; Cudennec, C.
2015-05-01
To date, the commonest way to deal with geographical information and processes is still to consume local resources, i.e. locally stored data processed on a local desktop or server. The maturity and growing use of OGC standards to exchange data on the World Wide Web, enhanced in Europe by the INSPIRE Directive, is bound to change the way people (among them research scientists, especially in the environmental sciences) make use of, and manage, spatial data. Clever use of OGC standards can help scientists to better store, share and use data, in particular for modelling. We propose a framework for online processing that makes intensive use of OGC standards. We illustrate it using GéoSAS, the Spatial Data Infrastructure (SDI) set up for researchers' needs in our department, which is based on the existing open-source, modular and interoperable spatial data architecture geOrchestra.
Local, regional and national interoperability in hospital-level systems architecture.
Mykkänen, J; Korpela, M; Ripatti, S; Rannanheimo, J; Sorri, J
2007-01-01
Interoperability of applications in health care is faced with various needs by patients, health professionals, organizations and policy makers. A combination of existing and new applications is a necessity. Hospitals are in a position to drive many integration solutions, but need approaches which combine local, regional and national requirements and initiatives with open standards to support flexible processes and applications on a local hospital level. We discuss systems architecture of hospitals in relation to various processes and applications, and highlight current challenges and prospects using a service-oriented architecture approach. We also illustrate these aspects with examples from Finnish hospitals. A set of main services and elements of service-oriented architectures for health care facilities are identified, with medium-term focus which acknowledges existing systems as a core part of service-oriented solutions. The services and elements are grouped according to functional and interoperability cohesion. A transition towards service-oriented architecture in health care must acknowledge existing health information systems and promote the specification of central processes and software services locally and across organizations. Software industry best practices such as SOA must be combined with health care knowledge to respond to central challenges such as continuous change in health care. A service-oriented approach cannot entirely rely on common standards and frameworks but it must be locally adapted and complemented.
GEOSS AIP-2 Climate Change and Biodiversity Use Scenarios: Interoperability Infrastructures
NASA Astrophysics Data System (ADS)
Nativi, Stefano; Santoro, Mattia
2010-05-01
In recent years, the scientific community has been making great efforts to study the effects of climate change on life on Earth. In this general framework, a key role is played by the impact of climate change on biodiversity. To assess this, several use scenarios require modeling the impact of climatological change on the regional distribution of biodiversity species. Designing and developing interoperability infrastructures which enable scientists to search, discover, access and use multi-disciplinary resources (i.e. datasets, services, models, etc.) is currently one of the main research fields in Earth and Space Science Informatics. This presentation introduces and discusses an interoperability infrastructure which implements the discovery, access, and chaining of loosely coupled resources in the climatology and biodiversity domains, making it possible to set up and run forecast and processing models. The presented framework was successfully developed and tested in the context of the GEOSS AIP-2 (Global Earth Observation System of Systems, Architecture Implementation Pilot - Phase 2) Climate Change & Biodiversity thematic Working Group. This interoperability infrastructure comprises the following main components and services: a) GEO Portal: through this component the end user is able to search, find and access the services needed for scenario execution; b) Graphical User Interface (GUI): this component provides user interaction functionality and drives the workflow manager to perform the required operations for scenario implementation; c) Use Scenario Controller: this component acts as a workflow controller implementing the scenario business process, i.e. a typical climate change & biodiversity projection scenario; d) Service Broker implementing Mediation Services: this component realizes a distributed catalogue which federates several discovery and access components (exposing them through a unique CSW standard interface); federated components publish climate, environmental and biodiversity datasets; e) Ecological Niche Model Server: this component is able to run one or more Ecological Niche Models (ENM) on selected biodiversity and climate datasets; f) Data Access Transaction Server: this component publishes the model outputs. This framework was assessed in two use scenarios of the GEOSS AIP-2 Climate Change and Biodiversity WG. Both scenarios concern the prediction of species distributions driven by climatological change forecasts. The first scenario dealt with the regional distribution of pikas in the Great Basin area (North America), while the second concerned the modeling of Arctic food chain species in the North Pole area, analysing the relationships between different environmental parameters and polar bear distribution. The scientific patronage was provided by the University of Colorado and the University of Alaska, respectively. Results are published on the GEOSS AIP-2 web site: http://www.ogcnetwork.net/AIP2develop.
Semantics Enabled Queries in EuroGEOSS: a Discovery Augmentation Approach
NASA Astrophysics Data System (ADS)
Santoro, M.; Mazzetti, P.; Fugazza, C.; Nativi, S.; Craglia, M.
2010-12-01
One of the main challenges in Earth Science Informatics is to build interoperability frameworks which allow users to discover, evaluate, and use information from different scientific domains. This needs to address multidisciplinary interoperability challenges concerning both technological and scientific aspects. From the technological point of view, it is necessary to provide a set of special interoperability arrangement in order to develop flexible frameworks that allow a variety of loosely-coupled services to interact with each other. From a scientific point of view, it is necessary to document clearly the theoretical and methodological assumptions underpinning applications in different scientific domains, and develop cross-domain ontologies to facilitate interdisciplinary dialogue and understanding. In this presentation we discuss a brokering approach that extends the traditional Service Oriented Architecture (SOA) adopted by most Spatial Data Infrastructures (SDIs) to provide the necessary special interoperability arrangements. In the EC-funded EuroGEOSS (A European approach to GEOSS) project, we distinguish among three possible functional brokering components: discovery, access and semantics brokers. This presentation focuses on the semantics broker, the Discovery Augmentation Component (DAC), which was specifically developed to address the three thematic areas covered by the EuroGEOSS project: biodiversity, forestry and drought. The EuroGEOSS DAC federates both semantics (e.g. SKOS repositories) and ISO-compliant geospatial catalog services. The DAC can be queried using common geospatial constraints (i.e. what, where, when, etc.). Two different augmented discovery styles are supported: a) automatic query expansion; b) user assisted query expansion. In the first case, the main discovery steps are: i. the query keywords (the what constraint) are “expanded” with related concepts/terms retrieved from the set of federated semantic services. A default expansion regards the multilinguality relationship; ii. The resulting queries are submitted to the federated catalog services; iii. The DAC performs a “smart” aggregation of the queries results and provides them back to the client. In the second case, the main discovery steps are: i. the user browses the federated semantic repositories and selects the concepts/terms-of-interest; ii. The DAC creates the set of geospatial queries based on the selected concepts/terms and submits them to the federated catalog services; iii. The DAC performs a “smart” aggregation of the queries results and provides them back to the client. A Graphical User Interface (GUI) was also developed for testing and interacting with the DAC. The entire brokering framework is deployed in the context of EuroGEOSS infrastructure and it is used in a couple of GEOSS AIP-3 use scenarios: the “e-Habitat Use Scenario” for the Biodiversity and Climate Change topic, and the “Comprehensive Drought Index Use Scenario” for Water/Drought topic
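The toy sketch below illustrates the automatic query-expansion style described above: a keyword is expanded with related terms from a SKOS-like thesaurus before the query is fanned out to federated catalogues. The thesaurus content and the catalogue callable are invented stand-ins, not the EuroGEOSS DAC interfaces.

# Toy discovery augmentation: expand the "what" keyword, then query each catalogue.
SKOS_RELATED = {
    "drought": ["aridity", "water scarcity", "precipitation deficit"],
    "forest": ["forestry", "woodland"],
}

def expand(what):
    """Return the original keyword plus related terms, deduplicated, order preserved."""
    terms = [what] + SKOS_RELATED.get(what.lower(), [])
    return list(dict.fromkeys(terms))

def augmented_discovery(what, catalogues):
    queries = expand(what)
    results = []
    for search in catalogues:              # each catalogue is a callable taking one keyword
        for term in queries:
            results.extend(search(term))
    return results

fake_catalogue = lambda term: [f"dataset matching '{term}'"]
print(augmented_discovery("drought", [fake_catalogue]))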
Meeting People's Needs in a Fully Interoperable Domotic Environment
Miori, Vittorio; Russo, Dario; Concordia, Cesare
2012-01-01
The key idea underlying many Ambient Intelligence (AmI) projects and applications is context awareness, which is based mainly on their capacity to identify users and their locations. The actual computing capacity should remain in the background, in the periphery of our awareness, and should only move to the center if and when necessary. Computing thus becomes ‘invisible’, as it is embedded in the environment and everyday objects. The research project described herein aims to realize an Ambient Intelligence-based environment able to improve users' quality of life by learning their habits and anticipating their needs. This environment is part of an adaptive, context-aware framework designed to make today's incompatible heterogeneous domotic systems fully interoperable, not only for connecting sensors and actuators, but for providing comprehensive connections of devices to users. The solution is a middleware architecture based on open and widely recognized standards capable of abstracting the peculiarities of underlying heterogeneous technologies and enabling them to co-exist and interwork, without however eliminating their differences. At the highest level of this infrastructure, the Ambient Intelligence framework, integrated with the domotic sensors, can enable the system to recognize any unusual or dangerous situations and anticipate health problems or special user needs in a technological living environment, such as a house or a public space. PMID:22969322
Argobots: A Lightweight Low-Level Threading and Tasking Framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seo, Sangmin; Amer, Abdelhalim; Balaji, Pavan
In the past few decades, a number of user-level threading and tasking models have been proposed in the literature to address the shortcomings of OS-level threads, primarily with respect to cost and flexibility. Current state-of-the-art user-level threading and tasking models, however, are either too specific to applications or architectures or are not as powerful or flexible. In this paper, we present Argobots, a lightweight, low-level threading and tasking framework that is designed as a portable and performant substrate for high-level programming models or runtime systems. Argobots offers a carefully designed execution model that balances generality of functionality with providing a rich set of controls to allow specialization by the user or high-level programming model. We describe the design, implementation, and optimization of Argobots and present integrations with three example high-level models: OpenMP, MPI, and a co-located I/O service. Evaluations show that (1) Argobots outperforms existing generic threading runtimes; (2) our OpenMP runtime offers more efficient interoperability capabilities than production OpenMP runtimes do; (3) when MPI interoperates with Argobots instead of Pthreads, it enjoys reduced synchronization costs and better latency hiding capabilities; and (4) the I/O service with Argobots reduces interference with co-located applications, achieving performance competitive with that of the Pthreads version.
Meeting people's needs in a fully interoperable domotic environment.
Miori, Vittorio; Russo, Dario; Concordia, Cesare
2012-01-01
The key idea underlying many Ambient Intelligence (AmI) projects and applications is context awareness, which is based mainly on their capacity to identify users and their locations. The actual computing capacity should remain in the background, in the periphery of our awareness, and should only move to the center if and when necessary. Computing thus becomes 'invisible', as it is embedded in the environment and everyday objects. The research project described herein aims to realize an Ambient Intelligence-based environment able to improve users' quality of life by learning their habits and anticipating their needs. This environment is part of an adaptive, context-aware framework designed to make today's incompatible heterogeneous domotic systems fully interoperable, not only for connecting sensors and actuators, but for providing comprehensive connections of devices to users. The solution is a middleware architecture based on open and widely recognized standards capable of abstracting the peculiarities of underlying heterogeneous technologies and enabling them to co-exist and interwork, without however eliminating their differences. At the highest level of this infrastructure, the Ambient Intelligence framework, integrated with the domotic sensors, can enable the system to recognize any unusual or dangerous situations and anticipate health problems or special user needs in a technological living environment, such as a house or a public space.
A unified framework for managing provenance information in translational research
2011-01-01
Background A critical aspect of the NIH Translational Research roadmap, which seeks to accelerate the delivery of "bench-side" discoveries to patient's "bedside," is the management of the provenance metadata that keeps track of the origin and history of data resources as they traverse the path from the bench to the bedside and back. A comprehensive provenance framework is essential for researchers to verify the quality of data, reproduce scientific results published in peer-reviewed literature, validate scientific process, and associate trust value with data and results. Traditional approaches to provenance management have focused on only partial sections of the translational research life cycle and they do not incorporate "domain semantics", which is essential to support domain-specific querying and analysis by scientists. Results We identify a common set of challenges in managing provenance information across the pre-publication and post-publication phases of data in the translational research lifecycle. We define the semantic provenance framework (SPF), underpinned by the Provenir upper-level provenance ontology, to address these challenges in the four stages of provenance metadata: (a) Provenance collection - during data generation (b) Provenance representation - to support interoperability, reasoning, and incorporate domain semantics (c) Provenance storage and propagation - to allow efficient storage and seamless propagation of provenance as the data is transferred across applications (d) Provenance query - to support queries with increasing complexity over large data size and also support knowledge discovery applications We apply the SPF to two exemplar translational research projects, namely the Semantic Problem Solving Environment for Trypanosoma cruzi (T.cruzi SPSE) and the Biomedical Knowledge Repository (BKR) project, to demonstrate its effectiveness. Conclusions The SPF provides a unified framework to effectively manage provenance of translational research data during pre and post-publication phases. This framework is underpinned by an upper-level provenance ontology called Provenir that is extended to create domain-specific provenance ontologies to facilitate provenance interoperability, seamless propagation of provenance, automated querying, and analysis. PMID:22126369
Latest developments for the IAGOS database: Interoperability and metadata
NASA Astrophysics Data System (ADS)
Boulanger, Damien; Gautron, Benoit; Thouret, Valérie; Schultz, Martin; van Velthoven, Peter; Broetz, Bjoern; Rauthe-Schöch, Armin; Brissebrat, Guillaume
2014-05-01
In-service Aircraft for a Global Observing System (IAGOS, http://www.iagos.org) aims at the provision of long-term, frequent, regular, accurate, and spatially resolved in situ observations of the atmospheric composition. IAGOS observation systems are deployed on a fleet of commercial aircraft. The IAGOS database is an essential part of the global atmospheric monitoring network. Data access is handled by open access policy based on the submission of research requests which are reviewed by the PIs. Users can access the data through the following web sites: http://www.iagos.fr or http://www.pole-ether.fr as the IAGOS database is part of the French atmospheric chemistry data centre ETHER (CNES and CNRS). The database is in continuous development and improvement. In the framework of the IGAS project (IAGOS for GMES/COPERNICUS Atmospheric Service), major achievements will be reached, such as metadata and format standardisation in order to interoperate with international portals and other databases, QA/QC procedures and traceability, CARIBIC (Civil Aircraft for the Regular Investigation of the Atmosphere Based on an Instrument Container) data integration within the central database, and the real-time data transmission. IGAS work package 2 aims at providing the IAGOS data to users in a standardized format including the necessary metadata and information on data processing, data quality and uncertainties. We are currently redefining and standardizing the IAGOS metadata for interoperable use within GMES/Copernicus. The metadata are compliant with the ISO 19115, INSPIRE and NetCDF-CF conventions. IAGOS data will be provided to users in NetCDF or NASA Ames format. We also are implementing interoperability between all the involved IAGOS data services, including the central IAGOS database, the former MOZAIC and CARIBIC databases, Aircraft Research DLR database and the Jülich WCS web application JOIN (Jülich OWS Interface) which combines model outputs with in situ data for intercomparison. The optimal data transfer protocol is being investigated to insure the interoperability. To facilitate satellite and model validation, tools will be made available for co-location and comparison with IAGOS. We will enhance the JOIN application in order to properly display aircraft data as vertical profiles and along individual flight tracks and to allow for graphical comparison to model results that are accessible through interoperable web services, such as the daily products from the GMES/Copernicus atmospheric service.
Sampling-based ensemble segmentation against inter-operator variability
NASA Astrophysics Data System (ADS)
Huo, Jing; Okada, Kazunori; Pope, Whitney; Brown, Matthew
2011-03-01
Inconsistency and a lack of reproducibility are commonly associated with semi-automated segmentation methods. In this study, we developed an ensemble approach to improve reproducibility and applied it to glioblastoma multiforme (GBM) brain tumor segmentation on T1-weighted contrast-enhanced MR volumes. The proposed approach combines sampling-based simulations and ensemble segmentation into a single framework; it generates a set of segmentations by perturbing user initialization and user-specified internal parameters, then fuses the set of segmentations into a single consensus result. Three combination algorithms were applied: majority voting, averaging and expectation-maximization (EM). The reproducibility of the proposed framework was evaluated in a controlled experiment on 16 tumor cases from a multicenter drug trial. The ensemble framework had significantly better reproducibility than the individual base Otsu thresholding method (p<.001).
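As an illustration of the majority-voting fusion step, the short sketch below combines several binary segmentations of the same (toy) image voxel-wise into one consensus mask. The arrays are invented examples, not the study's data or its full framework.

# Voxel-wise majority voting over a stack of binary segmentation masks.
import numpy as np

segmentations = np.stack([
    np.array([[1, 1, 0], [0, 1, 0]]),
    np.array([[1, 0, 0], [0, 1, 1]]),
    np.array([[1, 1, 0], [0, 0, 0]]),
])  # shape: (n_runs, H, W), each run from a perturbed initialization

def majority_vote(masks):
    """A voxel is foreground if more than half of the runs label it foreground."""
    return (masks.mean(axis=0) > 0.5).astype(np.uint8)

print(majority_vote(segmentations))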
Architecture of a Framework for Providing Information Services for Public Transport
García, Carmelo R.; Pérez, Ricardo; Lorenzo, Álvaro; Quesada-Arencibia, Alexis; Alayón, Francisco; Padrón, Gabino
2012-01-01
This paper presents OnRoute, a framework for developing and running ubiquitous software that provides information services to passengers of public transportation, including payment systems and on-route guidance services. To achieve a high level of interoperability, accessibility and context awareness, OnRoute uses the ubiquitous computing paradigm. To guarantee the quality of the software produced, the reliable software principles used in critical contexts, such as automotive systems, are also considered by the framework. The main components of its architecture (run-time, system services, software components and development discipline) and how they are deployed in the transportation network (stations and vehicles) are described in this paper. Finally, to illustrate the use of OnRoute, the development of a guidance service for travellers is explained. PMID:22778585
Clinical Knowledge Governance Framework for Nationwide Data Infrastructure Projects.
Wulff, Antje; Haarbrandt, Birger; Marschollek, Michael
2018-01-01
The availability of semantically enriched and interoperable clinical information models is crucial for reusing once-collected data across institutions, as aspired to in the German HiGHmed project. Funded by the Federal Ministry of Education and Research, this nationwide data infrastructure project adopts the openEHR approach for semantic modelling, where strong governance is required to define high-quality and reusable models. Objective: design of a clinical knowledge governance framework for openEHR modelling in cross-institutional settings like HiGHmed. Methods: analysis of successful practices from international projects, published ideas on archetype governance and our own modelling experience, as well as modelling of BPMN processes. Results: we designed a framework covering archetype variations, roles and responsibilities, IT support and modelling workflows. Our framework has great potential to make the openEHR modelling efforts manageable. Because practical experience is still rare, our work is well placed to prospectively evaluate the benefits of such structured governance approaches.
Daboul, Amro; Ivanovska, Tatyana; Bülow, Robin; Biffar, Reiner; Cardini, Andrea
2018-01-01
Using 3D anatomical landmarks from adult human head MRIs, we assessed the magnitude of inter-operator differences in Procrustes-based geometric morphometric analyses. An in depth analysis of both absolute and relative error was performed in a subsample of individuals with replicated digitization by three different operators. The effect of inter-operator differences was also explored in a large sample of more than 900 individuals. Although absolute error was not unusual for MRI measurements, including bone landmarks, shape was particularly affected by differences among operators, with up to more than 30% of sample variation accounted for by this type of error. The magnitude of the bias was such that it dominated the main pattern of bone and total (all landmarks included) shape variation, largely surpassing the effect of sex differences between hundreds of men and women. In contrast, however, we found higher reproducibility in soft-tissue nasal landmarks, despite relatively larger errors in estimates of nasal size. Our study exemplifies the assessment of measurement error using geometric morphometrics on landmarks from MRIs and stresses the importance of relating it to total sample variance within the specific methodological framework being used. In summary, precise landmarks may not necessarily imply negligible errors, especially in shape data; indeed, size and shape may be differentially impacted by measurement error and different types of landmarks may have relatively larger or smaller errors. Importantly, and consistently with other recent studies using geometric morphometrics on digital images (which, however, were not specific to MRI data), this study showed that inter-operator biases can be a major source of error in the analysis of large samples, as those that are becoming increasingly common in the 'era of big data'.
Computational toxicology using the OpenTox application programming interface and Bioclipse
2011-01-01
Background Toxicity is a complex phenomenon involving the potential adverse effect on a range of biological functions. Predicting toxicity involves using a combination of experimental data (endpoints) and computational methods to generate a set of predictive models. Such models rely strongly on being able to integrate information from many sources. The required integration of biological and chemical information sources requires, however, a common language to express our knowledge ontologically, and interoperating services to build reliable predictive toxicology applications. Findings This article describes progress in extending the integrative bio- and cheminformatics platform Bioclipse to interoperate with OpenTox, a semantic web framework which supports open data exchange and toxicology model building. The Bioclipse workbench environment enables functionality from OpenTox web services and easy access to OpenTox resources for evaluating toxicity properties of query molecules. Relevant cases and interfaces based on ten neurotoxins are described to demonstrate the capabilities provided to the user. The integration takes advantage of semantic web technologies, thereby providing an open and simplifying communication standard. Additionally, the use of ontologies ensures proper interoperation and reliable integration of toxicity information from both experimental and computational sources. Conclusions A novel computational toxicity assessment platform was generated from integration of two open science platforms related to toxicology: Bioclipse, that combines a rich scriptable and graphical workbench environment for integration of diverse sets of information sources, and OpenTox, a platform for interoperable toxicology data and computational services. The combination provides improved reliability and operability for handling large data sets by the use of the Open Standards from the OpenTox Application Programming Interface. This enables simultaneous access to a variety of distributed predictive toxicology databases, and algorithm and model resources, taking advantage of the Bioclipse workbench handling the technical layers. PMID:22075173
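OpenTox exposes datasets, algorithms and models as REST resources that clients such as Bioclipse consume. The sketch below shows the general shape of such an interaction from Python; the service address, resource paths and parameter names are placeholders in the spirit of the OpenTox API, not verified calls against a live deployment.

```python
import requests

# Hypothetical service addresses; real OpenTox deployments expose their own URIs.
SERVICE = "https://opentox.example.org"
MODEL_URI = f"{SERVICE}/model/toxicity-demo"

# A query structure is typically registered first (here as a toy single-compound dataset);
# the parameter names are assumptions, not verified against a live service.
dataset = requests.post(
    f"{SERVICE}/dataset",
    data={"compound": "CCO"},              # ethanol as a toy query molecule (SMILES)
    headers={"Accept": "text/uri-list"},
    timeout=30,
)
dataset_uri = dataset.text.strip()

# Ask the model service to predict toxicity endpoints for the registered dataset.
prediction = requests.post(
    MODEL_URI,
    data={"dataset_uri": dataset_uri},
    headers={"Accept": "text/uri-list"},
    timeout=30,
)
print("result dataset:", prediction.text.strip())
```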
Multi-disciplinary interoperability challenges (Ian McHarg Medal Lecture)
NASA Astrophysics Data System (ADS)
Annoni, Alessandro
2013-04-01
Global sustainability research requires multi-disciplinary efforts to address the key research challenges and to increase our understanding of the complex relationships between environment and society. For this reason, dependence on the interoperability of ICT systems is growing rapidly; yet, although some relevant technological improvements can be observed, operational interoperable solutions are still lacking in practice. Among the causes is the absence of a generally accepted definition of "interoperability" in all its broader aspects. The more popular definitions remain conceptual and do not address all the challenges involved in realizing operational interoperable solutions. The problem becomes even more complex when multi-disciplinary interoperability is required, because in that case solutions for making different interoperable solutions work together must also be envisaged. In this lecture the following definition will be used: "interoperability is the ability to exchange information and to use it". The main challenges in addressing multi-disciplinary interoperability will be presented, and a set of proposed approaches and solutions briefly introduced.
Hong, Na; Wen, Andrew; Shen, Feichen; Sohn, Sunghwan; Liu, Sijia; Liu, Hongfang; Jiang, Guoqian
2018-01-01
Standards-based modeling of electronic health records (EHR) data holds great significance for data interoperability and large-scale usage. Integration of unstructured data into a standard data model, however, poses unique challenges partially due to heterogeneous type systems used in existing clinical NLP systems. We introduce a scalable and standards-based framework for integrating structured and unstructured EHR data leveraging the HL7 Fast Healthcare Interoperability Resources (FHIR) specification. We implemented a clinical NLP pipeline enhanced with an FHIR-based type system and performed a case study using medication data from Mayo Clinic's EHR. Two UIMA-based NLP tools known as MedXN and MedTime were integrated in the pipeline to extract FHIR MedicationStatement resources and related attributes from unstructured medication lists. We developed a rule-based approach for assigning the NLP output types to the FHIR elements represented in the type system, whereas we investigated the FHIR elements belonging to the source of the structured EMR data. We used the FHIR resource "MedicationStatement" as an example to illustrate our integration framework and methods. For evaluation, we manually annotated FHIR elements in 166 medication statements from 14 clinical notes generated by Mayo Clinic in the course of patient care, and used standard performance measures (precision, recall and f-measure). The F-scores achieved ranged from 0.73 to 0.99 for the various FHIR element representations. The results demonstrated that our framework based on the FHIR type system is feasible for normalizing and integrating both structured and unstructured EHR data.
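As an illustration of the kind of normalization described above, the sketch below maps a hypothetical NLP extraction onto a FHIR MedicationStatement resource expressed as plain JSON. The element names (resourceType, medicationCodeableConcept, dosage) follow the FHIR specification, but the mapping rules, the RxNorm code and the field names of the NLP output are illustrative assumptions, not the study's actual pipeline.

```python
import json

def medication_statement_from_nlp(extraction: dict) -> dict:
    """Map a hypothetical NLP extraction to a FHIR MedicationStatement resource.

    Element names follow the FHIR specification; the RxNorm code and the
    mapping itself are illustrative placeholders, not the study's rules.
    """
    return {
        "resourceType": "MedicationStatement",
        "status": "active",
        "subject": {"reference": f"Patient/{extraction['patient_id']}"},
        "medicationCodeableConcept": {
            "coding": [{
                "system": "http://www.nlm.nih.gov/research/umls/rxnorm",
                "code": extraction["rxnorm_code"],
                "display": extraction["drug"],
            }],
            "text": extraction["drug"],
        },
        "effectiveDateTime": extraction["date"],
        "dosage": [{"text": extraction["dosage_text"]}],
    }

# Example output of a medication NLP tool run on an unstructured medication list.
nlp_output = {
    "patient_id": "example-001",
    "drug": "aspirin 81 MG oral tablet",
    "rxnorm_code": "243670",
    "dosage_text": "81 mg orally once daily",
    "date": "2018-01-01",
}
print(json.dumps(medication_statement_from_nlp(nlp_output), indent=2))
```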
Understanding the CCA Standard Through Decaf
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kumfert, G
2003-04-17
This document is a tutorial on the CCA Standard as realized through the Decaf implementation. Decaf is not the CCA standard, much in the same way that Microsoft Visual C++ is not ANSI/ISO C++. This document was created because the CCA standard is evolving and still too fluid to nail down in a tutorial document. Because of its fluidity, and because it represents a hotbed of research and development, beginners can only start learning CCA by choosing one of the frameworks (warts and all). Decaf has just enough functionality to be a useful tool for beginners in the CCA to get started on. Though it lacks many features of the bigger CCA frameworks (CCAFE [3], XCAT [10], and SciRUN [8]) where the heavy-duty research is still going on, it is the first CCA framework that is underpinned by Babel, which provides its language interoperability features. This document can also serve the dual purpose of providing a reasonable-sized example of building an application using Babel. The entire source for Decaf is included in the examples/ subdirectory of the Babel code distribution. This manual assumes the reader is a programmer who has a conceptual understanding of the Babel Language Interoperability Tool. They should be proficient in two or more of the following languages: Fortran77, C, C++, Java, or Python. Furthermore, this manual assumes the reader is familiar with the SPMD programming model that pervades the scientific computing community. Knowledge of and experience with MPI programming is helpful, but not strictly required.
Selected Tracking and Fusion Applications for the Defence and Security Domain
2010-05-01
The applications described are characterized, for example, by sensor ranges from less than a meter to hundreds of kilometers and by time scales ranging from less than a second to a few ... The work has been carried out within the framework of a multinational technology program called MAJIIC (Multi-Sensor Aerospace-Ground Joint ISR Interoperability Coalition).
A Framework for Building and Reasoning with Adaptive and Interoperable PMESII Models
2007-11-01
Acronyms defined in the report include DL (Description Logic), SOA (Service Oriented Architecture), SPARQL (Simple Protocol And RDF Query Language), SQL (Structured Query Language) and SROM (Stability and ...). One model can be related to another by providing a more expressive ontological structure for one of the models; for example, semantic networks can be mapped to first-order logical ... Pellet is an open-source reasoner that works with OWL-DL. It accepts the SPARQL protocol and RDF query language (SPARQL) and provides a Java API to ...
de Bruin, Jeroen S; Zeckl, Julia; Adlassnig, Katharina; Blacky, Alexander; Koller, Walter; Rappelsberger, Andrea; Adlassnig, Klaus-Peter
2017-01-01
The creation of clinical decision support systems has gained strong momentum in recent years, but their integration into clinical routine has lagged behind, partly owing to a lack of interoperability and of trust by physicians. We report on the implementation of a clinical foundation framework in Arden Syntax, comprising knowledge units for (a) preprocessing raw clinical data, (b) determining single clinical concepts, and (c) representing more complex medical knowledge, which can be modeled through the composition and configuration of knowledge units in this framework. The framework can thus be tailored to clinical institutions or patients' caregivers. In the present version, we integrated knowledge units for several infection-related clinical concepts into the framework and developed a clinical event monitoring system on top of it that employs three different scenarios for monitoring clinical signs of bloodstream infection. The clinical event monitoring system was tested using data from intensive care units at Vienna General Hospital, Austria.
Theoretical Framework for Integrating Distributed Energy Resources into Distribution Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lian, Jianming; Wu, Di; Kalsi, Karanjit
This paper focuses on developing a novel theoretical framework for effective coordination and control of a large number of distributed energy resources in distribution systems in order to more reliably manage the future U.S. electric power grid under the high penetration of renewable generation. The proposed framework provides a systematic view of the overall structure of the future distribution systems along with the underlying information flow, functional organization, and operational procedures. It is characterized by the features of being open, flexible and interoperable with the potential to support dynamic system configuration. Under the proposed framework, the energy consumption of various DERs is coordinated and controlled in a hierarchical way by using market-based approaches. The real-time voltage control is simultaneously considered to complement the real power control in order to keep nodal voltages stable within acceptable ranges during real time. In addition, computational challenges associated with the proposed framework are also discussed with recommended practices.
Tool and data interoperability in the SSE system
NASA Technical Reports Server (NTRS)
Shotton, Chuck
1988-01-01
Information is given in viewgraph form on tool and data interoperability in the Software Support Environment (SSE). Information is given on industry problems, SSE system interoperability issues, SSE solutions to tool and data interoperability, and attainment of heterogeneous tool/data interoperability.
UHF (Ultra High Frequency) Military Satellite Communications Ground Equipment Interoperability.
1986-10-06
Crisis management requires interoperability between the various services; these short-term crises often arise from unforeseen circumstances in which ... Scheduler. Qualcomm has prepared an interoperability study for the JTC3A (Reference 15) as a TA/CE for USCINCLANT ROC 5-84 requirements. It has defined a ... interoperability is fundamental. A number of operational crises have occurred where interoperable communications, or the lack of them, ...
Ocean Data Interoperability Platform (ODIP): using regional data systems for global ocean research
NASA Astrophysics Data System (ADS)
Schaap, D.; Thijsse, P.; Glaves, H.
2017-12-01
Ocean acidification, loss of coral reefs, and sustainable exploitation of the marine environment are just a few of the challenges researchers around the world are currently attempting to understand and address. However, studies of these ecosystem-level challenges are impossible unless researchers can discover and re-use the large volumes of interoperable multidisciplinary data that are currently only accessible through regional and global data systems serving discrete, and often discipline-specific, user communities. The plethora of marine data systems currently in existence also uses different standards, technologies and best practices, making re-use of the data problematic for those engaged in interdisciplinary marine research. The Ocean Data Interoperability Platform (ODIP) is responding to this growing demand for discoverable, accessible and reusable data by establishing the foundations for a common global framework for marine data management. But creation of such an infrastructure is a major undertaking, and one that needs to be achieved in part by establishing different levels of interoperability across existing regional and global marine e-infrastructures. Workshops organised by ODIP II facilitate dialogue between selected regional and global marine data systems in an effort to identify potential solutions that integrate these marine e-infrastructures. The outcomes of these discussions have formed the basis for a number of prototype development tasks that aim to demonstrate effective sharing of data across multiple data systems, and to allow users to access data from more than one system through a single access point. The ODIP II project is currently developing four prototype solutions that are establishing interoperability between selected regional marine data management infrastructures in Europe, the USA, Canada and Australia, and with the global POGO, IODE Ocean Data Portal (ODP) and GEOSS systems. The potential impact of implementing these solutions for the individual marine data infrastructures is also being evaluated to determine both the technical and financial implications of their integration within existing systems. These impact assessments form part of the strategy to encourage wider adoption of the ODIP solutions and approach beyond the current scope of the project.
Facilitating Semantic Interoperability Among Ocean Data Systems: ODIP-R2R Student Outcomes
NASA Astrophysics Data System (ADS)
Stocks, K. I.; Chen, Y.; Shepherd, A.; Chandler, C. L.; Dockery, N.; Elya, J. L.; Smith, S. R.; Ferreira, R.; Fu, L.; Arko, R. A.
2014-12-01
With informatics providing an increasingly important set of tools for geoscientists, it is critical to train the next generation of scientists in information and data techniques. The NSF-supported Rolling Deck to Repository (R2R) Program works with the academic fleet community to routinely document, assess, and preserve the underway sensor data from U.S. research vessels. The Ocean Data Interoperability Platform (ODIP) is an EU-US-Australian collaboration fostering interoperability among regional e-infrastructures through workshops and joint prototype development. The need to align terminology between systems is a common challenge across all of the ODIP prototypes. Five R2R students were supported to address aspects of semantic interoperability within ODIP: (1) developing a vocabulary matching service that links terms with similar concepts across different vocabularies; the service implements the Google Refine reconciliation service interface so that users can leverage the Google Refine application as a friendly user interface while linking vocabulary terms; (2) developing Resource Description Framework (RDF) resources that map Shipboard Automated Meteorological Oceanographic System (SAMOS) vocabularies to internationally served vocabularies; each SAMOS vocabulary term (data parameter and quality control flag) is described as an RDF resource page, allowing enhanced discoverability and retrieval of SAMOS data through parameter-based searches; (3) improving data retrieval and interoperability by exposing data and mapped vocabularies using Semantic Web technologies, collaborating with ODIP participating organizations to build a generalized data model that populates a SPARQL endpoint and provides expressive querying over our data files; (4) mapping local and regional vocabularies used by R2R to those used by ODIP partners (described more fully in a companion poster); and (5) making the published Linked Data Web developer-friendly with a RESTful service, achieved by defining a proxy layer on top of the existing SPARQL endpoint that translates HTTP requests into SPARQL queries and renders the returned results as required by the request sender using content negotiation, suffixes and parameters.
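A minimal sketch of the last two activities, querying a SPARQL endpoint over HTTP and wrapping it in a developer-friendly RESTful facade, is given below. The endpoint URL, the SKOS-based query and the tiny Flask proxy are illustrative assumptions; the actual R2R/ODIP services and vocabularies differ.

```python
import requests
from flask import Flask, jsonify

# Hypothetical endpoint; the real R2R/ODIP endpoints and vocabularies differ.
SPARQL_ENDPOINT = "https://sparql.example.org/r2r"

def run_sparql(query: str) -> dict:
    """Send a SPARQL query over HTTP and return the JSON result bindings."""
    resp = requests.get(
        SPARQL_ENDPOINT,
        params={"query": query},
        headers={"Accept": "application/sparql-results+json"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

app = Flask(__name__)

@app.route("/parameters/<term>")
def lookup_parameter(term: str):
    """RESTful facade: translate a simple GET into a SPARQL query over mapped vocabularies."""
    query = f"""
        PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
        SELECT ?concept ?match WHERE {{
            ?concept skos:prefLabel "{term}"@en ;
                     skos:exactMatch ?match .
        }}"""
    results = run_sparql(query)
    return jsonify(results["results"]["bindings"])

if __name__ == "__main__":
    app.run(port=8080)
```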
Integrating Data and Networks: Human Factors
NASA Astrophysics Data System (ADS)
Chen, R. S.
2012-12-01
The development of technical linkages and interoperability between scientific networks is a necessary but not sufficient step towards integrated use and application of networked data and information for scientific and societal benefit. A range of "human factors" must also be addressed to ensure the long-term integration, sustainability, and utility of both the interoperable networks themselves and the scientific data and information to which they provide access. These human factors encompass the behavior of both individual humans and human institutions, and include system governance, a common framework for intellectual property rights and data sharing, consensus on terminology, metadata, and quality control processes, agreement on key system metrics and milestones, the compatibility of "business models" in the short and long term, harmonization of incentives for cooperation, and minimization of disincentives. Experience with several national and international initiatives and research programs such as the International Polar Year, the Group on Earth Observations, the NASA Earth Observing Data and Information System, the U.S. National Spatial Data Infrastructure, the Global Earthquake Model, and the United Nations Spatial Data Infrastructure provide a range of lessons regarding these human factors. Ongoing changes in science, technology, institutions, relationships, and even culture are creating both opportunities and challenges for expanded interoperability of scientific networks and significant improvement in data integration to advance science and the use of scientific data and information to achieve benefits for society as a whole.
Towards global environmental information and data management
NASA Astrophysics Data System (ADS)
Gurney, Robert; Allison, Lee; Cesar, Roberto; Cossu, Roberto; Dietz, Volkmar; Gemeinholzer, Birgit; Koike, Toshio; Mokrane, Mustapha; Peters, Dale; Thaller-Honold, Svetlana; Treloar, Andrew; Vilotte, Jean-Pierre; Waldmann, Christoph
2014-05-01
The Belmont Forum, a coalition of national science agencies from 13 countries, is supporting an 18-month effort to implement a 'Knowledge Hub' community-building and strategy development program as a first step to coordinate and streamline international efforts on community governance, interoperability and system architectures so that environmental data and information can be exchanged internationally and across subject domains easily and efficiently. This initiative represents a first step to build collaboratively an international capacity and e-infrastructure framework to address societally relevant global environmental change challenges. The project will deliver a community-owned strategy and implementation plan, which will prioritize international funding opportunities for Belmont Forum members to build pilots and exemplars in order to accelerate delivery of end-to end global change decision support systems. In 2012, the Belmont Forum held a series of public town hall meetings, and a two-day scoping meeting of scientists and program officers, which concluded that transformative approaches and innovative technologies are needed for heterogeneous data/information to be integrated and made interoperable for researchers in disparate fields and for myriad uses across international, institutional, disciplinary, spatial and temporal boundaries. Pooling Belmont Forum members' resources to bring communities together for further integration, cooperation, and leveraging of existing initiatives and resources has the potential to develop the e-infrastructure framework necessary to solve pressing environmental problems, and to support the aims of many international data sharing initiatives. The plan is expected to serve as the foundation of future Belmont Forum calls for proposals for e-Infrastructures and Data Management. The Belmont Forum is uniquely able to align resources of major national funders to support global environmental change research on specific technical and governance challenges, and the development of focused pilot systems that could be complementary to other initiatives such as GEOSS, ICSU World Data System, and Global Framework for Climate Services (GFCS). The development of this Belmont Forum Knowledge Hub represents an extraordinary effort to bring together international leaders in interoperability, governance and other fields pertinent to decision-support systems in global environmental change research. It is also addressing related issues such as ensuring a cohort of environmental scientists who can use up-to-date computing techniques for data and information management, and investigating which legal issues need common international attention.
Warthog: A MOOSE-Based Application for the Direct Code Coupling of BISON and PROTEUS
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCaskey, Alexander J.; Slattery, Stuart; Billings, Jay Jay
The Nuclear Energy Advanced Modeling and Simulation (NEAMS) program from the Department of Energy's Office of Nuclear Energy provides a robust toolkit for the modeling and simulation of current and future advanced nuclear reactor designs. This toolkit provides these technologies organized across product lines: two divisions targeted at fuels and end-to-end reactor modeling, and a third for integration, coupling, and high-level workflow management. The Fuels Product Line and the Reactor Product Line provide advanced computational technologies that serve each respective field well; however, their current lack of integration presents a major impediment to future improvements of simulation solution fidelity. There is a desire for the capability to mix and match tools across Product Lines in an effort to utilize the best from both to improve NEAMS modeling and simulation technologies. This report details a new effort to provide this Product Line interoperability through the development of a new application called Warthog. This application couples the BISON Fuel Performance application from the Fuels Product Line and the PROTEUS Core Neutronics application from the Reactors Product Line in an effort to utilize the best from all parts of the NEAMS toolkit and improve overall solution fidelity of nuclear fuel simulations. To achieve this, Warthog leverages as much prior work from the NEAMS program as possible, and in doing so, enables interoperability between the disparate MOOSE and SHARP frameworks, and the libMesh and MOAB mesh data formats. This report describes this work in full. We begin with a detailed look at the individual NEAMS framework technologies used and developed in the various Product Lines, and the current status of their interoperability. We then introduce the Warthog application: its overall architecture and the ways it leverages the best existing tools from across the NEAMS toolkit to enable BISON-PROTEUS integration. Furthermore, we show how Warthog leverages a tool known as DataTransferKit to seamlessly enable the transfer of solution data between disparate frameworks and mesh formats. To end, we demonstrate tests for the direct software coupling of BISON and PROTEUS using Warthog, and discuss current impediments and solutions to the construction of physically realistic input models for this coupled BISON-PROTEUS system.
COMP Superscalar, an interoperable programming framework
NASA Astrophysics Data System (ADS)
Badia, Rosa M.; Conejero, Javier; Diaz, Carlos; Ejarque, Jorge; Lezzi, Daniele; Lordan, Francesc; Ramon-Cortes, Cristian; Sirvent, Raul
2015-12-01
COMPSs is a programming framework that aims to facilitate the parallelization of existing applications written in Java, C/C++ and Python scripts. For that purpose, it offers a simple programming model based on sequential development in which the user is mainly responsible for (i) identifying the functions to be executed as asynchronous parallel tasks and (ii) annotating them with annotations or standard Python decorators. A runtime system is in charge of exploiting the inherent concurrency of the code, automatically detecting and enforcing the data dependencies between tasks and spawning these tasks to the available resources, which can be nodes in a cluster, clouds or grids. In cloud environments, COMPSs provides scalability and elasticity features allowing the dynamic provision of resources.
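A minimal sketch of the Python flavour of this programming model is shown below: functions are marked as tasks with decorators and the runtime resolves the data dependencies. The module paths and decorator arguments follow the PyCOMPSs documentation as commonly presented and should be treated as assumptions rather than a verified example for a particular COMPSs release.

```python
# Sketch of the COMPSs Python (PyCOMPSs) programming model described above:
# the user marks functions as tasks and the runtime resolves data dependencies.
# Module paths and decorator arguments are assumptions based on PyCOMPSs
# documentation, not verified against a specific release.
from pycompss.api.task import task
from pycompss.api.api import compss_wait_on

@task(returns=1)
def square(x):
    # Each call becomes an asynchronous task; the runtime schedules it on
    # whatever resources (cluster nodes, cloud VMs, grids) are available.
    return x * x

@task(returns=1)
def add(a, b):
    return a + b

def main():
    partials = [square(i) for i in range(10)]   # tasks spawned, no blocking here
    total = partials[0]
    for p in partials[1:]:
        total = add(total, p)                   # data dependencies detected automatically
    total = compss_wait_on(total)               # synchronise only when the value is needed
    print("sum of squares:", total)

if __name__ == "__main__":
    main()
```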
[Framework for the strengthening of health information systems in Peru].
Curioso, Walter H; Espinoza-Portilla, Elizabeth
2015-01-01
In this article we present the essential components and policies that are most relevant regarding the conceptual framework to strengthen the health information systems in Peru. The article also presents the main policies, actions and strategies made in the field of electronic health in Peru that are most significant. The health information systems in Peru play a key role and are expected to achieve an integrated and interoperable information system. This will allow health information to be complete, efficient, of good quality and available in a timely manner to achieve better quality of life for people and allow meaningful modernization of public health in the context of health reform in Peru.
2011-04-01
... designed to be interoperable with the Army's Universal Ground Control Station (UGCS) and uses the same hardware and software as the UGCS. Part of the ... efforts face similar basing, launch and recovery, and ground support requirements. Currently, the LEMV program has adapted for use the UGCS used by all ...
2008 Homeland Security S and T Stakeholders Conference West. Volume 4. Wednesday
2008-01-16
Presentation fragments: Polonium-210 interoperability lessons; major incident CBRN response (images courtesy of the BBC; www.npia.police.uk); Boscastle 2007 ...; Training Session 37: Preparing First Responders for Food Systems Disasters (Jerry Gillespie, DVM, PhD, Director, Western Institute for Food Safety and Security); Training Session 39: Technology Adoption & Innovation (Dr. Neal Thornberry, Innovation Chair, Graduate School of ...).
2011-12-01
Citation fragments from the report: "Task Based Approach to Planning," Paper 08F-SIW-033, Proceedings of the Fall Simulation Interoperability Workshop, Simulation Interoperability Standards Organization; Paper 06F-SIW-003, Proceedings of the Fall Simulation Interoperability Workshop, Simulation Interoperability Standards Organization; "... (MSDL)," Paper 10S-SIW-003, Proceedings of the Spring Simulation Interoperability Workshop, Simulation Interoperability Standards Organization.
Maturity model for enterprise interoperability
NASA Astrophysics Data System (ADS)
Guédria, Wided; Naudet, Yannick; Chen, David
2015-01-01
Historically, progress occurs when entities communicate, share information and together create something that no one individually could do alone. Moving beyond people to machines and systems, interoperability is becoming a key factor of success in all domains. In particular, interoperability has become a challenge for enterprises, to exploit market opportunities, to meet their own objectives of cooperation or simply to survive in a growing competitive world where the networked enterprise is becoming a standard. Within this context, many research works have been conducted over the past few years and enterprise interoperability has become an important area of research, ensuring the competitiveness and growth of European enterprises. Among others, enterprises have to control their interoperability strategy and enhance their ability to interoperate. This is the purpose of the interoperability assessment. Assessing interoperability maturity allows a company to know its strengths and weaknesses in terms of interoperability with its current and potential partners, and to prioritise actions for improvement. The objective of this paper is to define a maturity model for enterprise interoperability that takes into account existing maturity models while extending the coverage of the interoperability domain. The assessment methodology is also presented. Both are demonstrated with a real case study.
Harmonising Nursing Terminologies Using a Conceptual Framework.
Jansen, Kay; Kim, Tae Youn; Coenen, Amy; Saba, Virginia; Hardiker, Nicholas
2016-01-01
The International Classification for Nursing Practice (ICNP®) and the Clinical Care Classification (CCC) System are standardised nursing terminologies that identify discrete elements of nursing practice, including nursing diagnoses, interventions, and outcomes. While CCC uses a conceptual framework or model with 21 Care Components to classify these elements, ICNP, built on a formal Web Ontology Language (OWL) description logic foundation, uses a logical hierarchical framework that is useful for computing and maintenance of ICNP. Since the logical framework of ICNP may not always align with the needs of nursing practice, an informal framework may be a more useful organisational tool to represent nursing content. The purpose of this study was to classify ICNP nursing diagnoses using the 21 Care Components of the CCC as a conceptual framework to facilitate usability and interoperability of nursing diagnoses in electronic health records. All 521 ICNP diagnoses were assigned to one of the 21 CCC Care Components. Further research is needed to validate the resulting product of this study with practitioners and to develop recommendations for improvement of both terminologies.
Bridging the Gap. European C4ISR Capabilities and Transatlantic Interoperability
2004-10-01
... the Messagerie Universelle Sécurisée (MUSE), a secure e-mail system for the French MOD; a future naval intranet system; and the Airborne Laser Optical Link (Liaison Optique Laser Aéroportée, or LOLA), a demonstrator that in 2006 ... The report also considers the EU/NATO relationship so that progress can continue in both frameworks; in addition, whether through the EU or NATO, a failure to allocate European fiscal ...
Argobots: A Lightweight Low-Level Threading and Tasking Framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seo, Sangmin; Amer, Abdelhalim; Balaji, Pavan
In the past few decades, a number of user-level threading and tasking models have been proposed in the literature to address the shortcomings of OS-level threads, primarily with respect to cost and flexibility. Current state-of-the-art user-level threading and tasking models, however, either are too specific to applications or architectures or are not as powerful or flexible. In this paper, we present Argobots, a lightweight, low-level threading and tasking framework that is designed as a portable and performant substrate for high-level programming models or runtime systems. Argobots offers a carefully designed execution model that balances generality of functionality with providing a rich set of controls to allow specialization by end users or high-level programming models. We describe the design, implementation, and performance characterization of Argobots and present integrations with three high-level models: OpenMP, MPI, and colocated I/O services. Evaluations show that (1) Argobots, while providing richer capabilities, is competitive with existing simpler generic threading runtimes; (2) our OpenMP runtime offers more efficient interoperability capabilities than production OpenMP runtimes do; (3) when MPI interoperates with Argobots instead of Pthreads, it enjoys reduced synchronization costs and better latency-hiding capabilities; and (4) I/O services with Argobots reduce interference with colocated applications while achieving performance competitive with that of a Pthreads approach.
Argobots: A Lightweight Low-Level Threading and Tasking Framework
Seo, Sangmin; Amer, Abdelhalim; Balaji, Pavan; ...
2017-10-24
In the past few decades, a number of user-level threading and tasking models have been proposed in the literature to address the shortcomings of OS-level threads, primarily with respect to cost and flexibility. Current state-of-the-art user-level threading and tasking models, however, are either too specific to applications or architectures or are not as powerful or flexible. In this article, we present Argobots, a lightweight, low-level threading and tasking framework that is designed as a portable and performant substrate for high-level programming models or runtime systems. Argobots offers a carefully designed execution model that balances generality of functionality with providing a rich set of controls to allow specialization by the user or high-level programming model. Here, we describe the design, implementation, and optimization of Argobots and present integrations with three example high-level models: OpenMP, MPI, and co-located I/O service. Evaluations show that (1) Argobots outperforms existing generic threading runtimes; (2) our OpenMP runtime offers more efficient interoperability capabilities than production OpenMP runtimes do; (3) when MPI interoperates with Argobots instead of Pthreads, it enjoys reduced synchronization costs and better latency hiding capabilities; and (4) I/O service with Argobots reduces interference with co-located applications, achieving performance competitive with that of the Pthreads version.
The Java Image Science Toolkit (JIST) for rapid prototyping and publishing of neuroimaging software.
Lucas, Blake C; Bogovic, John A; Carass, Aaron; Bazin, Pierre-Louis; Prince, Jerry L; Pham, Dzung L; Landman, Bennett A
2010-03-01
Non-invasive neuroimaging techniques enable extraordinarily sensitive and specific in vivo study of the structure, functional response and connectivity of biological mechanisms. With these advanced methods comes a heavy reliance on computer-based processing, analysis and interpretation. While the neuroimaging community has produced many excellent academic and commercial tool packages, new tools are often required to interpret new modalities and paradigms. Developing custom tools and ensuring interoperability with existing tools is a significant hurdle. To address these limitations, we present a new framework for algorithm development that implicitly ensures tool interoperability, generates graphical user interfaces, provides advanced batch processing tools, and, most importantly, requires minimal additional programming or computational overhead. Java-based rapid prototyping with this system is an efficient and practical approach to evaluate new algorithms since the proposed system ensures that rapidly constructed prototypes are actually fully functional processing modules with support for multiple GUIs, a broad range of file formats, and distributed computation. Herein, we demonstrate MRI image processing with the proposed system for cortical surface extraction in large cross-sectional cohorts, provide a system for fully automated diffusion tensor image analysis, and illustrate how the system can be used as a simulation framework for the development of a new image analysis method. The system is released as open source under the Lesser GNU Public License (LGPL) through the Neuroimaging Informatics Tools and Resources Clearinghouse (NITRC). PMID:20077162
Evaluating Sustainability Models for Interoperability through Brokering Software
NASA Astrophysics Data System (ADS)
Pearlman, Jay; Benedict, Karl; Best, Mairi; Fyfe, Sue; Jacobs, Cliff; Michener, William; Nativi, Stefano; Powers, Lindsay; Turner, Andrew
2016-04-01
Sustainability of software and research support systems is an element of innovation that is not often discussed. Yet, sustainment is essential if we expect research communities to make the time investment to learn and adopt new technologies. As the Research Data Alliance (RDA) is developing new approaches to interoperability, the question of uptake and sustainability is important. Brokering software sustainability is one of the areas that is being addressed in RDA. The Business Models Team of the Research Data Alliance Brokering Governance Working Group examined several support models proposed to promote the long-term sustainability of brokering middleware. The business model analysis includes examination of funding source, implementation frameworks and challenges, and policy and legal considerations. Results of this comprehensive analysis highlight advantages and disadvantages of the various models with respect to the specific requirements for brokering services. We offer recommendations based on the outcomes of this analysis that suggest that hybrid funding models present the most likely avenue to long term sustainability.
2014-01-01
The application of semantic technologies to the integration of biological data and the interoperability of bioinformatics analysis and visualization tools has been the common theme of a series of annual BioHackathons hosted in Japan for the past five years. Here we provide a review of the activities and outcomes from the BioHackathons held in 2011 in Kyoto and 2012 in Toyama. In order to efficiently implement semantic technologies in the life sciences, participants formed various sub-groups and worked on the following topics: Resource Description Framework (RDF) models for specific domains, text mining of the literature, ontology development, essential metadata for biological databases, platforms to enable efficient Semantic Web technology development and interoperability, and the development of applications for Semantic Web data. In this review, we briefly introduce the themes covered by these sub-groups. The observations made, conclusions drawn, and software development projects that emerged from these activities are discussed. PMID:24495517
Augmented reality and haptic interfaces for robot-assisted surgery.
Yamamoto, Tomonori; Abolhassani, Niki; Jung, Sung; Okamura, Allison M; Judkins, Timothy N
2012-03-01
Current teleoperated robot-assisted minimally invasive surgical systems do not take full advantage of the potential performance enhancements offered by various forms of haptic feedback to the surgeon. Direct and graphical haptic feedback systems can be integrated with vision and robot control systems in order to provide haptic feedback to improve safety and tissue mechanical property identification. An interoperable interface for teleoperated robot-assisted minimally invasive surgery was developed to provide haptic feedback and augmented visual feedback using three-dimensional (3D) graphical overlays. The software framework consists of control and command software, robot plug-ins, image processing plug-ins and 3D surface reconstructions. The feasibility of the interface was demonstrated in two tasks performed with artificial tissue: palpation to detect hard lumps and surface tracing, using vision-based forbidden-region virtual fixtures to prevent the patient-side manipulator from entering unwanted regions of the workspace. The interoperable interface enables fast development and successful implementation of effective haptic feedback methods in teleoperation. Copyright © 2011 John Wiley & Sons, Ltd.
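To make the forbidden-region idea concrete, the sketch below clamps a commanded tool position at a planar boundary and returns a spring-like resistive force for haptic display. The planar geometry, stiffness gain and frame conventions are illustrative assumptions, not the implementation used in the paper.

```python
import numpy as np

def apply_forbidden_region(cmd_pos, plane_point, plane_normal, stiffness=500.0):
    """Clamp a commanded tool position at a planar forbidden-region boundary.

    cmd_pos, plane_point: 3-vectors in the robot frame; plane_normal points
    toward the allowed side. Returns (safe_position, feedback_force).
    Gains and geometry are illustrative, not the paper's values.
    """
    n = plane_normal / np.linalg.norm(plane_normal)
    penetration = np.dot(plane_point - cmd_pos, n)      # >0 means inside the forbidden region
    if penetration <= 0.0:
        return cmd_pos, np.zeros(3)
    safe_pos = cmd_pos + penetration * n                 # project back onto the boundary
    force = stiffness * penetration * n                  # spring-like resistive feedback
    return safe_pos, force

# Example: boundary plane z = 0, allowed side z >= 0, commanded motion dips below it.
pos, f = apply_forbidden_region(np.array([0.02, 0.0, -0.003]),
                                plane_point=np.array([0.0, 0.0, 0.0]),
                                plane_normal=np.array([0.0, 0.0, 1.0]))
print("clamped position:", pos, "feedback force [N]:", f)
```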
Use of Annotations for Component and Framework Interoperability
NASA Astrophysics Data System (ADS)
David, O.; Lloyd, W.; Carlson, J.; Leavesley, G. H.; Geter, F.
2009-12-01
The popular programming languages Java and C# provide annotations, a form of meta-data construct. Software frameworks for web integration, web services, database access, and unit testing now take advantage of annotations to reduce the complexity of APIs and the quantity of integration code between the application and framework infrastructure. Adopting annotation features in frameworks has been observed to lead to cleaner and leaner application code. The USDA Object Modeling System (OMS) version 3.0 fully embraces the annotation approach and additionally defines a meta-data standard for components and models. In version 3.0 framework/model integration previously accomplished using API calls is now achieved using descriptive annotations. This enables the framework to provide additional functionality non-invasively such as implicit multithreading, and auto-documenting capabilities while achieving a significant reduction in the size of the model source code. Using a non-invasive methodology leads to models and modeling components with only minimal dependencies on the modeling framework. Since models and modeling components are not directly bound to framework by the use of specific APIs and/or data types they can more easily be reused both within the framework as well as outside of it. To study the effectiveness of an annotation based framework approach with other modeling frameworks, a framework-invasiveness study was conducted to evaluate the effects of framework design on model code quality. A monthly water balance model was implemented across several modeling frameworks and several software metrics were collected. The metrics selected were measures of non-invasive design methods for modeling frameworks from a software engineering perspective. It appears that the use of annotations positively impacts several software quality measures. In a next step, the PRMS model was implemented in OMS 3.0 and is currently being implemented for water supply forecasting in the western United States at the USDA NRCS National Water and Climate Center. PRMS is a component based modular precipitation-runoff model developed to evaluate the impacts of various combinations of precipitation, climate, and land use on streamflow and general basin hydrology. The new OMS 3.0 PRMS model source code is more concise and flexible as a result of using the new framework’s annotation based approach. The fully annotated components are now providing information directly for (i) model assembly and building, (ii) dataflow analysis for implicit multithreading, (iii) automated and comprehensive model documentation of component dependencies, physical data properties, (iv) automated model and component testing, and (v) automated audit-traceability to account for all model resources leading to a particular simulation result. Experience to date has demonstrated the multi-purpose value of using annotations. Annotations are also a feasible and practical method to enable interoperability among models and modeling frameworks. As a prototype example, model code annotations were used to generate binding and mediation code to allow the use of OMS 3.0 model components within the OpenMI context.
da Silva, Kátia Regina; Costa, Roberto; Crevelari, Elizabeth Sartori; Lacerda, Marianna Sobral; de Moraes Albertini, Caio Marcos; Filho, Martino Martinelli; Santana, José Eduardo; Vissoci, João Ricardo Nickenig; Pietrobon, Ricardo; Barros, Jacson V.
2013-01-01
Background The ability to apply standard and interoperable solutions for implementing and managing medical registries as well as aggregate, reproduce, and access data sets from legacy formats and platforms to advanced standard formats and operating systems are crucial for both clinical healthcare and biomedical research settings. Purpose Our study describes a reproducible, highly scalable, standard framework for a device registry implementation addressing both local data quality components and global linking problems. Methods and Results We developed a device registry framework involving the following steps: (1) Data standards definition and representation of the research workflow, (2) Development of electronic case report forms using REDCap (Research Electronic Data Capture), (3) Data collection according to the clinical research workflow and, (4) Data augmentation by enriching the registry database with local electronic health records, governmental database and linked open data collections, (5) Data quality control and (6) Data dissemination through the registry Web site. Our registry adopted all applicable standardized data elements proposed by American College Cardiology / American Heart Association Clinical Data Standards, as well as variables derived from cardiac devices randomized trials and Clinical Data Interchange Standards Consortium. Local interoperability was performed between REDCap and data derived from Electronic Health Record system. The original data set was also augmented by incorporating the reimbursed values paid by the Brazilian government during a hospitalization for pacemaker implantation. By linking our registry to the open data collection repository Linked Clinical Trials (LinkedCT) we found 130 clinical trials which are potentially correlated with our pacemaker registry. Conclusion This study demonstrates how standard and reproducible solutions can be applied in the implementation of medical registries to constitute a re-usable framework. Such approach has the potential to facilitate data integration between healthcare and research settings, also being a useful framework to be used in other biomedical registries. PMID:23936257
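Registries built on REDCap, as described above, are typically read and written through its HTTP API. The sketch below exports records as JSON; the instance URL, API token and field names are placeholders, and the indexed 'fields[i]' parameter style reflects common REDCap API usage rather than a verified call against the registry in question.

```python
import requests

# Hypothetical REDCap instance and token; both are placeholders for illustration.
REDCAP_URL = "https://redcap.example.org/api/"
API_TOKEN = "REPLACE_WITH_PROJECT_TOKEN"

def export_records(fields=None):
    """Export registry records from REDCap as JSON via its HTTP API.

    Field names used in the example call (e.g. 'record_id') are placeholders;
    the registry's actual instruments follow the ACC/AHA standardized elements.
    """
    payload = {
        "token": API_TOKEN,
        "content": "record",
        "format": "json",
        "type": "flat",
    }
    if fields:
        # REDCap accepts indexed field parameters, e.g. fields[0], fields[1], ...
        payload.update({f"fields[{i}]": f for i, f in enumerate(fields)})
    resp = requests.post(REDCAP_URL, data=payload, timeout=60)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    records = export_records(fields=["record_id", "implant_date", "device_model"])
    print(f"exported {len(records)} records")
```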
Development of Extended Content Standards for Biodiversity Data
NASA Astrophysics Data System (ADS)
Hugo, Wim; Schmidt, Jochen; Saarenmaa, Hannu
2013-04-01
Interoperability in the field of Biodiversity observation has been strongly driven by the development of a number of global initiatives (GEO, GBIF, OGC, TDWG, GenBank, …) and its supporting standards (OGC-WxS, OGC-SOS, Darwin Core (DwC), NetCDF, …). To a large extent, these initiatives have focused on discoverability and standardization of syntactic and schematic interoperability. Semantic interoperability is more complex, requiring development of domain-dependent conceptual data models, and extension of these models with appropriate ontologies (typically manifested as controlled vocabularies). Biodiversity content has been standardized partly, for example through Darwin Core for occurrence data and associated taxonomy, and through Genbank for genetic data, but other contexts of biodiversity observation have lagged behind - making it difficult to achieve semantic interoperability between distributed data sources. With this in mind, WG8 of GEO BON (charged with data and systems interoperability) has started a work programme to address a number of concerns, one of which is the gap in content standards required to make Biodiversity data truly interoperable. The paper reports on the framework developed by WG8 for the classification of Biodiversity observation data into 'families' of use cases and its supporting data schema, where gaps, if any, in the availability if content standards have been identified, and how these are to be addressed by way of an abstract data model and the development of associated content standards. It is proposed that a minimum set of standards (1) will be required to address the scope of Biodiversity content, aligned with levels and dimensions of observation, and based on the 'Essential Biodiversity Variables' (2) being developed by GEO BON . The content standards are envisaged as loosely separated from the syntactic and schematic standards used for the base data exchange: typically, services would offer an existing data standard (DwC, WFS, SOS, NetCDF), with a use-case dependent 'payload' embedded into the data stream. This enables the re-use of the abstract schema, and sometimes the implementation specification (for example XML, JSON, or NetCDF conventions) across services. An explicit aim will be to make the XML implementation specification re-usable as a DwC and a GML (SOS end WFS) extension. (1) Olga Lyashevska, Keith D. Farnsworth, How many dimensions of biodiversity do we need?, Ecological Indicators, Volume 18, July 2012, Pages 485-492, ISSN 1470-160X, 10.1016/j.ecolind.2011.12.016. (2) GEO BON: Workshop on Essential Biodiversity Variables (27-29 February 2012, Frascati, Italy). (http://www.earthobservations.org/geobon_docs_20120227.shtml)
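The 'payload' idea described above can be pictured as a standard Darwin Core occurrence record carrying an embedded, use-case dependent content block aligned with an Essential Biodiversity Variable. The sketch below is purely illustrative: the Darwin Core terms are standard, but the payload field names are hypothetical and do not represent an adopted GEO BON standard.

```python
import json

# Illustrative occurrence record: standard Darwin Core terms plus a hypothetical
# use-case dependent payload aligned with an Essential Biodiversity Variable.
occurrence = {
    # Darwin Core terms (standard)
    "occurrenceID": "urn:example:obs:0001",
    "scientificName": "Quercus robur",
    "eventDate": "2012-07-15",
    "decimalLatitude": 52.1,
    "decimalLongitude": 5.3,
    "basisOfRecord": "HumanObservation",
    # Use-case dependent payload (hypothetical extension)
    "measurementPayload": {
        "essentialBiodiversityVariable": "species population abundance",
        "samplingProtocol": "fixed-radius plot",
        "value": 14,
        "unit": "individuals per plot",
    },
}
print(json.dumps(occurrence, indent=2))
```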
Towards an Open, Distributed Software Architecture for UxS Operations
NASA Technical Reports Server (NTRS)
Cross, Charles D.; Motter, Mark A.; Neilan, James H.; Qualls, Garry D.; Rothhaar, Paul M.; Tran, Loc; Trujillo, Anna C.; Allen, B. Danette
2015-01-01
To address the growing need to evaluate, test, and certify an ever expanding ecosystem of UxS platforms in preparation of cultural integration, NASA Langley Research Center's Autonomy Incubator (AI) has taken on the challenge of developing a software framework in which UxS platforms developed by third parties can be integrated into a single system which provides evaluation and testing, mission planning and operation, and out-of-the-box autonomy and data fusion capabilities. This software framework, named AEON (Autonomous Entity Operations Network), has two main goals. The first goal is the development of a cross-platform, extensible, onboard software system that provides autonomy at the mission execution and course-planning level, a highly configurable data fusion framework sensitive to the platform's available sensor hardware, and plug-and-play compatibility with a wide array of computer systems, sensors, software, and controls hardware. The second goal is the development of a ground control system that acts as a test-bed for integration of the proposed heterogeneous fleet, and allows for complex mission planning, tracking, and debugging capabilities. The ground control system should also be highly extensible and allow plug-and-play interoperability with third party software systems. In order to achieve these goals, this paper proposes an open, distributed software architecture which utilizes at its core the Data Distribution Service (DDS) standards, established by the Object Management Group (OMG), for inter-process communication and data flow. The design decisions proposed herein leverage the advantages of existing robotics software architectures and the DDS standards to develop software that is scalable, high-performance, fault tolerant, modular, and readily interoperable with external platforms and software.
NASA Astrophysics Data System (ADS)
Lu, H.; Yi, D.
2010-12-01
Deep exploration is one of the important approaches to geoscience research. We began this work in the 1980s and have since acquired a large volume of data. Researchers usually integrate data from both space exploration and deep exploration to study geological structures and represent the Earth's subsurface, and then analyse and interpret the integrated data. Because the exploration approaches differ, the resulting data are heterogeneous, and data access has therefore long been a confusing issue for researchers. The problem of data sharing and interaction has to be solved during the development of the SinoProbe research project. Drawing on well-known domestic and overseas exploration projects and geoscience data platforms, this work explores a solution for data sharing and interaction. Based on SOA, we present a deep-exploration data-sharing framework comprising three levels: the data level handles data storage and the integration of heterogeneous data; the middle level provides data services for geophysics, geochemistry and other disciplines by means of Web services, and supports application composition using GIS middleware and the Eclipse RCP; and the interaction level gives professional and non-professional users access to data at different levels of accuracy. The framework adopts the GeoSciML approach to data interaction. GeoSciML is a geoscience information markup language built as an application of the OpenGIS Consortium (OGC) Geography Markup Language (GML); it brings heterogeneous data into one Earth frame and enables interoperation. In this article we describe how to integrate heterogeneous data and share them within the SinoProbe project.
Glusman, Gustavo; Rose, Peter W; Prlić, Andreas; Dougherty, Jennifer; Duarte, José M; Hoffman, Andrew S; Barton, Geoffrey J; Bendixen, Emøke; Bergquist, Timothy; Bock, Christian; Brunk, Elizabeth; Buljan, Marija; Burley, Stephen K; Cai, Binghuang; Carter, Hannah; Gao, JianJiong; Godzik, Adam; Heuer, Michael; Hicks, Michael; Hrabe, Thomas; Karchin, Rachel; Leman, Julia Koehler; Lane, Lydie; Masica, David L; Mooney, Sean D; Moult, John; Omenn, Gilbert S; Pearl, Frances; Pejaver, Vikas; Reynolds, Sheila M; Rokem, Ariel; Schwede, Torsten; Song, Sicheng; Tilgner, Hagen; Valasatava, Yana; Zhang, Yang; Deutsch, Eric W
2017-12-18
The translation of personal genomics to precision medicine depends on the accurate interpretation of the multitude of genetic variants observed for each individual. However, even when genetic variants are predicted to modify a protein, their functional implications may be unclear. Many diseases are caused by genetic variants affecting important protein features, such as enzyme active sites or interaction interfaces. The scientific community has catalogued millions of genetic variants in genomic databases and thousands of protein structures in the Protein Data Bank. Mapping mutations onto three-dimensional (3D) structures enables atomic-level analyses of protein positions that may be important for the stability or formation of interactions; these may explain the effect of mutations and in some cases even open a path for targeted drug development. To accelerate progress in the integration of these data types, we held a two-day Gene Variation to 3D (GVto3D) workshop to report on the latest advances and to discuss unmet needs. The overarching goal of the workshop was to address the question: what can be done together as a community to advance the integration of genetic variants and 3D protein structures that could not be done by a single investigator or laboratory? Here we describe the workshop outcomes, review the state of the field, and propose the development of a framework with which to promote progress in this arena. The framework will include a set of standard formats, common ontologies, a common application programming interface to enable interoperation of the resources, and a Tool Registry to make it easy to find and apply the tools to specific analysis problems. Interoperability will enable integration of diverse data sources and tools and collaborative development of variant effect prediction methods.
NASA Astrophysics Data System (ADS)
Loescher, H.; Fundamental Instrument Unit
2013-05-01
Ecological research addresses challenges relating to the dynamics of the planet, such as changes in climate, biodiversity, ecosystem functioning and services, carbon and energy cycles, natural and human-induced hazards, and adaptation and mitigation strategies that involve many science and engineering disciplines and cross national boundaries. Because of the global nature of these challenges, greater international collaboration is required for knowledge sharing and technology deployment to advance earth science investigations and enhance societal benefits. For example, the Working Group on Biodiversity Preservation and Ecosystem Services (PCAST 2011) noted the scale and complexity of the physical and human resources needed to address these challenges. Many of the most pressing ecological research questions require global-scale data and global-scale solutions (Suresh 2012), e.g., interdisciplinary data access from data centers managing ecological resources and hazards, drought, heat islands, the carbon cycle, or data used to forecast the rate of spread of invasive species or zoonotic diseases. Variability and change at one location or in one region may well result from the superposition of global processes coupled together with regional and local modes of variability. For example, we know that the large-scale modes of variability of the El Niño-Southern Oscillation in the coupled terrestrial-aquatic-atmospheric system correlate with variability in regional rainfall and ecosystem functions. It is therefore a high priority of government and non-government organizations to develop the necessary large-scale, world-class research infrastructures for environmental research, and the framework by which these data can be shared, discovered, and utilized by a broad user community of scientists and policymakers alike. Given that there are many, albeit nascent, efforts to build new environmental observatories/networks globally (e.g., EU-ICOS, EU-Lifewatch, AU-TERN, China-CERN, GEOSS, GEO-BON, NutNet, etc.) and domestically (e.g., NSF-CZO, USDA-LTAR, DOE-NGEE, Soil Carbon Network, etc.), there is a strong and mutual desire to assure interoperability of data. The degree of interoperability is defined by how well each of the following is mapped between observatories (entities): i) science requirements linked to science questions, ii) traceability of measurements to nationally and internationally accepted standards, iii) how data products are derived, i.e., algorithms, procedures, and methods, and iv) the bioinformatics, which broadly include data formats, metadata, controlled vocabularies, and semantics. Here, we explore the rationale and focus areas for interoperability, the governance and work structures, example projects (NSF-NEON, EU-ICOS, and AU-TERN), and the emergent roles of scientists in these endeavors.
A framework for secure and decentralized sharing of medical imaging data via blockchain consensus.
Patel, Vishal
2018-04-01
The electronic sharing of medical imaging data is an important element of modern healthcare systems, but current infrastructure for cross-site image transfer depends on trust in third-party intermediaries. In this work, we examine the blockchain concept, which enables parties to establish consensus without relying on a central authority. We develop a framework for cross-domain image sharing that uses a blockchain as a distributed data store to establish a ledger of radiological studies and patient-defined access permissions. The blockchain framework is shown to eliminate third-party access to protected health information, satisfy many criteria of an interoperable health system, and readily generalize to domains beyond medical imaging. Relative drawbacks of the framework include the complexity of the privacy and security models and an unclear regulatory environment. Ultimately, the large-scale feasibility of such an approach remains to be demonstrated and will depend on a number of factors which we discuss in detail.
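To make the ledger idea above concrete, here is a small, self-contained Python sketch of a hash-chained ledger recording study registrations and patient-defined access grants. It is illustrative only; it omits consensus, networking, and the paper's actual data structures, and the event and field names are invented.

    # Toy hash-chained ledger: each block commits to its predecessor's hash, making the
    # record of studies and access permissions tamper-evident. Not the paper's framework.
    import hashlib, json, time

    def make_block(prev_hash, payload):
        block = {"time": time.time(), "prev": prev_hash, "payload": payload}
        block["hash"] = hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()
        return block

    def verify(chain):
        # Check only the linkage: every block must reference the hash of its predecessor.
        return all(chain[i]["prev"] == chain[i - 1]["hash"] for i in range(1, len(chain)))

    chain = [make_block("0" * 64, {"event": "genesis"})]
    chain.append(make_block(chain[-1]["hash"],
                            {"event": "study_registered", "study_id": "CT-123", "patient": "P-42"}))
    chain.append(make_block(chain[-1]["hash"],
                            {"event": "access_granted", "study_id": "CT-123", "grantee": "hospital-B"}))
    print("ledger intact:", verify(chain))

Altering any earlier payload changes that block's hash and breaks the chain, which is the property that lets participating sites audit access permissions without a central authority.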
[Research on tumor information grid framework].
Zhang, Haowei; Qin, Zhu; Liu, Ying; Tan, Jianghao; Cao, Haitao; Chen, Youping; Zhang, Ke; Ding, Yuqing
2013-10-01
To realize tumor disease information sharing and unified management, we used grid technology to integrate the data and software resources distributed across various medical institutions, making these heterogeneous resources consistent and interoperable in both semantics and syntax. This article describes the tumor grid framework: service types are packaged in the Web Service Description Language (WSDL) and XML Schema Definition (XSD), and clients use the serialized documents to operate on the distributed resources. Service objects can be built with the Unified Modeling Language (UML) as middleware to create application programming interfaces. All grid resources are registered in the index and released as Web services based on the Web Services Resource Framework (WSRF). Using the system, a multi-center, large-sample, networked tumor disease resource sharing framework can be built to raise the level of development in medical research institutions and improve patients' quality of life.
Accountable care around the world: a framework to guide reform strategies.
McClellan, Mark; Kent, James; Beales, Stephen J; Cohen, Samuel I A; Macdonnell, Michael; Thoumi, Andrea; Abdulmalik, Mariam; Darzi, Ara
2014-09-01
Accountable care--a way to align health care payments with patient-focused reform goals--is currently being pursued in the United States, but its principles are also being applied in many other countries. In this article we review experiences with such reforms to offer a globally applicable definition of an accountable care system and propose a conceptual framework for characterizing and assessing accountable care reforms. The framework consists of five components: population, outcomes, metrics and learning, payments and incentives, and coordinated delivery. We describe how the framework applies to accountable care reforms that are already being implemented in Spain and Singapore. We also describe how it can be used to map progress through increasingly sophisticated levels of reforms. We recommend that policy makers pursuing accountable care reforms emphasize the following steps: highlight population health and wellness instead of just treating illness; pay for outcomes instead of activities; create a more favorable environment for collaboration and coordinated care; and promote interoperable data systems. Project HOPE—The People-to-People Health Foundation, Inc.
STAR Online Framework: from Metadata Collection to Event Analysis and System Control
NASA Astrophysics Data System (ADS)
Arkhipkin, D.; Lauret, J.
2015-05-01
In preparation for the new era of RHIC running (RHIC-II upgrades and possibly the eRHIC era), the STAR experiment is expanding its modular Message Interface and Reliable Architecture framework (MIRA). MIRA allowed STAR to integrate meta-data collection, monitoring, and online QA components in a very agile and efficient manner using a messaging infrastructure approach. In this paper, we briefly summarize our past achievements, provide an overview of the recent development activities focused on messaging patterns, and describe our experience with the complex event processor (CEP) recently integrated into the MIRA framework. CEP was used in the recent RHIC Run 14, which provided practical use cases. Finally, we present our requirements and expectations for the planned expansion of our systems, which will allow our framework to acquire features typically associated with Detector Control Systems. Special attention is given to latency, scalability, and interoperability within the heterogeneous set of services and the various data and meta-data acquisition components coexisting in the STAR online domain.
Designing learning management system interoperability in semantic web
NASA Astrophysics Data System (ADS)
Anistyasari, Y.; Sarno, R.; Rochmawati, N.
2018-01-01
The extensive adoption of learning management systems (LMS) has put the focus on the interoperability requirement. Interoperability is the ability of different computer systems, applications or services to communicate, share and exchange data, information, and knowledge in a precise, effective and consistent way. Semantic web technology and the use of ontologies can provide the computational semantics and interoperability required for the automation of tasks in an LMS. The purpose of this study is to design learning management system interoperability in the semantic web, which has not yet been investigated deeply. Moodle is used as the basis for the design: several Moodle database tables are enhanced and some features are added. Semantic web interoperability is provided by exploiting an ontology of the content materials, which is then used as a search tool to match users' queries with available courses. We conclude that LMS interoperability in the Semantic Web is feasible.
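The following tiny Python sketch shows the kind of ontology-assisted search the abstract describes: a query term is expanded with narrower concepts before matching against course tags. The concept hierarchy and course list are invented for illustration and are not taken from the study.

    # Minimal sketch of ontology-broadened course search; all data here is made up.
    ONTOLOGY = {
        "programming": {"python", "java", "algorithms"},   # narrower concepts
        "databases":   {"sql", "nosql"},
    }

    COURSES = {
        "CS101": {"python", "algorithms"},
        "IS210": {"sql"},
        "ENG300": {"writing"},
    }

    def expand(term):
        """Return the query term plus any narrower concepts from the ontology."""
        return {term} | ONTOLOGY.get(term, set())

    def search(term):
        wanted = expand(term)
        return [cid for cid, tags in COURSES.items() if tags & wanted]

    print(search("programming"))   # matches CS101 even though it is tagged 'python'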
Interoperable Risk Management in a Joint Interagency Multinational Environment
2007-08-01
... is then examined against the approach mandated by the Treasury Board. A review conducted by the Chief – Review Services (2004 ... regarding the DND/CF's understanding of risk management. In addition, a review conducted by the Chief – Review Services (2004) ... to support further improvement. The Treasury Board of Canada Secretariat's framework for risk management is also reflected in a companion ...
Coalition Interoperability Measurement Frameworks Literature Survey
2011-08-01
National Defence, 2011. © Her Majesty the Queen (in Right of Canada), as represented by the Minister of National Defence, 2011. Abstract ... [a joint organization] is currently developing this framework, and the work is sponsored by the Assistant Deputy Minister (Information Management) Group. The interest ... Assistant Deputy Minister (Information Management) Group. The type of interoperability of interest to us: 1. involves the five nations (Australia ...
A Framework for Improving Integrative Factors in C3I Systems of the Argentine Army
1991-03-01
and installed without regard to system integration or interoperability. The SIIFE Project (Sistema Informático Integrado de la Fuerza Ejército) was the ... de la Fuerza Ejército). At the same time, a secure and independent system for the intelligence area was also developed. In the 1980s, the development ... Comando y Control Para las Tropas del Ejército, Buenos Aires, September 1989. 38. Levine, Judith, "Networking for Desert Shield," Communications Week
Implementing a Standards Development Framework for the Coalition Battle Management Language
2013-06-01
and M. Hieb, "Coalition Battle Management (C-BML) Study Group Report", Paper 05F-SIW-041, Fall Simulation Interoperability Workshop, Sept 2006. [3] ... J. Abbott, S. Levine, M. Pullen: "Answering The Question Why A BML Standard Has Taken So Long To Be Established?", Fall 2011 SIW, Orlando, USA. [4] K. Heffner et al., "A Systems Engineering Approach to M&S Standards Development: Application to the Coalition Battle Management Language", 13S-SIW-002
PharmML in Action: an Interoperable Language for Modeling and Simulation
Bizzotto, R; Smith, G; Yvon, F; Kristensen, NR; Swat, MJ
2017-01-01
PharmML is an XML-based exchange format created with a focus on nonlinear mixed-effect (NLME) models used in pharmacometrics, but providing a very general framework that also allows describing mathematical and statistical models such as single-subject or nonlinear and multivariate regression models. This tutorial provides an overview of the structure of this language, brief suggestions on how to work with it, and use cases demonstrating its power and flexibility. PMID:28575551
Medical Device Plug-and-Play Interoperability Standards and Technology Leadership
2017-10-01
Award Number: W81XWH-09-1-0705. Title: "Medical Device Plug-and-Play Interoperability Standards and Technology Leadership". Principal Investigator: ... Reporting period: Sept 2016 – 20 Sept 2017. ... efficiency through interoperable medical technologies. We played a leadership role on interoperability safety standards (AAMI, AAMI/UL Joint ...
Performance measurement integrated information framework in e-Manufacturing
NASA Astrophysics Data System (ADS)
Teran, Hilaida; Hernandez, Juan Carlos; Vizán, Antonio; Ríos, José
2014-11-01
The implementation of Internet technologies has led to e-Manufacturing technologies becoming more widely used and to the development of tools for compiling, transforming and synchronising manufacturing data through the Web. In this context, a potential area for development is the extension of virtual manufacturing to performance measurement (PM) processes, a critical area for decision making and implementing improvement actions in manufacturing. This paper proposes a PM information framework to integrate decision support systems in e-Manufacturing. Specifically, the proposed framework offers a homogeneous PM information exchange model that can be applied through decision support in e-Manufacturing environment. Its application improves the necessary interoperability in decision-making data processing tasks. It comprises three sub-systems: a data model, a PM information platform and PM-Web services architecture. A practical example of data exchange for measurement processes in the area of equipment maintenance is shown to demonstrate the utility of the model.
Framework for adaptive interoperability of manufacturing enterprises (FAIME): a case study
NASA Astrophysics Data System (ADS)
Sims, John E.; Chu, Bei Tseng B.; Long, Junshen; Matthews, Mike; Barnes, Johnny G.; Jones, Chris H.; Anderson, Rayne A.; Lambert, Russ; Drake, Doug C.; Hamilton, Mark A.; Connard, Mark
1997-01-01
In today's global economy, manufacturing industries need to connect disparate applications seamlessly. They need not only to exchange data and transactions, but also to present a single business process image to their employees in the office, at headquarters, and on the plant floor. It is also imperative that small and medium-sized manufacturing companies deploy manufacturing execution system applications in conjunction with modern enterprise resource planning programs for cycle-time reduction and better quality. This paper presents experiences and reflections on a project that created a tool set to help accomplish the above not only in a shorter cycle time, with better and more predictable quality, and with an object-oriented framework, but also while allowing the manufacturer to keep using legacy applications. The framework has plug-and-play capability so that future migrations and re-engineering of processes are more productive.
NASA Astrophysics Data System (ADS)
Bikakis, Nikos; Gioldasis, Nektarios; Tsinaraki, Chrisa; Christodoulakis, Stavros
SPARQL is today the standard access language for Semantic Web data. In the recent years XML databases have also acquired industrial importance due to the widespread applicability of XML in the Web. In this paper we present a framework that bridges the heterogeneity gap and creates an interoperable environment where SPARQL queries are used to access XML databases. Our approach assumes that fairly generic mappings between ontology constructs and XML Schema constructs have been automatically derived or manually specified. The mappings are used to automatically translate SPARQL queries to semantically equivalent XQuery queries which are used to access the XML databases. We present the algorithms and the implementation of SPARQL2XQuery framework, which is used for answering SPARQL queries over XML databases.
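The following Python snippet is a very reduced illustration of the mapping-driven translation idea behind SPARQL2XQuery: a predicate-to-XPath mapping turns one SPARQL triple pattern into an XQuery expression. The mappings, document name, and element names are invented; the real framework handles full SPARQL graph patterns, filters, and schema-derived mappings.

    # Toy mapping-driven translation of a single triple pattern to XQuery; illustrative only.
    MAPPING = {  # ontology property -> XPath relative to a person element (assumed mapping)
        "foaf:name": "name/text()",
        "foaf:mbox": "email/text()",
    }

    def triple_to_xquery(subject_path, predicate, var):
        xpath = MAPPING[predicate]
        return (f"for $p in doc('people.xml')//{subject_path}\n"
                f"return $p/{xpath}  (: bound to {var} :)")

    print(triple_to_xquery("person", "foaf:name", "?n"))
    # for $p in doc('people.xml')//person
    # return $p/name/text()  (: bound to ?n :)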
A Framework for Sharing and Integrating Remote Sensing and GIS Models Based on Web Service
Chen, Zeqiang; Lin, Hui; Chen, Min; Liu, Deer; Bao, Ying; Ding, Yulin
2014-01-01
Sharing and integrating Remote Sensing (RS) and Geographic Information System/Science (GIS) models are critical for developing practical application systems. Facilitating model sharing and model integration is a problem for model publishers and model users, respectively. To address this problem, a framework based on a Web service for sharing and integrating RS and GIS models is proposed in this paper. The fundamental idea of the framework is to publish heterogeneous RS and GIS models into standard Web services for sharing and interoperation and then to integrate the RS and GIS models using Web services. For the former, a “black box” and a visual method are employed to facilitate the publishing of the models as Web services. For the latter, model integration based on the geospatial workflow and semantic supported marching method is introduced. Under this framework, model sharing and integration is applied for developing the Pearl River Delta water environment monitoring system. The results show that the framework can facilitate model sharing and model integration for model publishers and model users. PMID:24901016
Unified Framework for Development, Deployment and Robust Testing of Neuroimaging Algorithms
Joshi, Alark; Scheinost, Dustin; Okuda, Hirohito; Belhachemi, Dominique; Murphy, Isabella; Staib, Lawrence H.; Papademetris, Xenophon
2011-01-01
Developing both graphical and command-line user interfaces for neuroimaging algorithms requires considerable effort. Neuroimaging algorithms can meet their potential only if they can be easily and frequently used by their intended users. Deployment of a large suite of such algorithms on multiple platforms requires consistency of user interface controls, consistent results across various platforms and thorough testing. We present the design and implementation of a novel object-oriented framework that allows for rapid development of complex image analysis algorithms with many reusable components and the ability to easily add graphical user interface controls. Our framework also allows for simplified yet robust nightly testing of the algorithms to ensure stability and cross platform interoperability. All of the functionality is encapsulated into a software object requiring no separate source code for user interfaces, testing or deployment. This formulation makes our framework ideal for developing novel, stable and easy-to-use algorithms for medical image analysis and computer assisted interventions. The framework has been both deployed at Yale and released for public use in the open source multi-platform image analysis software—BioImage Suite (bioimagesuite.org). PMID:21249532
Common Data Model for Neuroscience Data and Data Model Exchange
Gardner, Daniel; Knuth, Kevin H.; Abato, Michael; Erde, Steven M.; White, Thomas; DeBellis, Robert; Gardner, Esther P.
2001-01-01
Objective: Generalizing the data models underlying two prototype neurophysiology databases, the authors describe and propose the Common Data Model (CDM) as a framework for federating a broad spectrum of disparate neuroscience information resources. Design: Each component of the CDM derives from one of five superclasses—data, site, method, model, and reference—or from relations defined between them. A hierarchic attribute-value scheme for metadata enables interoperability with variable tree depth to serve specific intra- or broad inter-domain queries. To mediate data exchange between disparate systems, the authors propose a set of XML-derived schema for describing not only data sets but data models. These include biophysical description markup language (BDML), which mediates interoperability between data resources by providing a meta-description for the CDM. Results: The set of superclasses potentially spans data needs of contemporary neuroscience. Data elements abstracted from neurophysiology time series and histogram data represent data sets that differ in dimension and concordance. Site elements transcend neurons to describe subcellular compartments, circuits, regions, or slices; non-neuroanatomic sites include sequences to patients. Methods and models are highly domain-dependent. Conclusions: True federation of data resources requires explicit public description, in a metalanguage, of the contents, query methods, data formats, and data models of each data resource. Any data model that can be derived from the defined superclasses is potentially conformant and interoperability can be enabled by recognition of BDML-described compatibilities. Such metadescriptions can buffer technologic changes. PMID:11141510
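The hierarchic attribute-value scheme described above can be pictured as a nested metadata record whose depth varies by record; the short Python sketch below shows how such a record might be queried with a dotted path. The field names are invented for illustration and are not taken from the CDM specification.

    # Sketch of a hierarchic attribute-value metadata record with variable tree depth.
    record = {
        "data": {"type": "time series", "units": "mV", "sampling": {"rate_hz": 20000}},
        "site": {"region": "S1 cortex", "subcellular": {"compartment": "soma"}},
        "method": {"recording": "intracellular"},
    }

    def get(metadata, dotted_path, default=None):
        """Resolve 'site.subcellular.compartment'-style paths; depth can vary per record."""
        node = metadata
        for key in dotted_path.split("."):
            if not isinstance(node, dict) or key not in node:
                return default
            node = node[key]
        return node

    print(get(record, "site.subcellular.compartment"))        # -> 'soma'
    print(get(record, "site.layer", default="unspecified"))   # shallow records still answer

Deep records can serve narrow intra-domain queries while shallow records still answer broad inter-domain ones, which is the interoperability trade-off the abstract describes.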
Supply Chain Interoperability Measurement
2015-06-19
Supply Chain Interoperability Measurement. Dissertation, June 2015, presented to the Faculty, Department of Operational Sciences ... by Christos E. Chalyvidis, BS, MSc, Major, Hellenic Air Force (ENS-DS-15-J-001). Committee chair: Dr. A. W. Johnson.
Federal Register 2010, 2011, 2012, 2013, 2014
2015-08-03
...] Promoting Semantic Interoperability of Laboratory Data; Public Workshop; Request for Comments AGENCY: Food... workshop entitled ``FDA/CDC/NLM Workshop on Promoting Semantic Interoperability of Laboratory Data.'' The... to promoting the semantic interoperability of laboratory data between in vitro diagnostic devices and...
Federal Register 2010, 2011, 2012, 2013, 2014
2016-10-04
...] Workshop on Promoting Semantic Interoperability of Laboratory Data; Public Workshop; Request for Comments... Semantic Interoperability of Laboratory Data.'' The purpose of this public workshop is to receive and... Semantic Interoperability of Laboratory Data.'' Received comments will be placed in the docket and, except...
Leverage and Delegation in Developing an Information Model for Geology
NASA Astrophysics Data System (ADS)
Cox, S. J.
2007-12-01
GeoSciML is an information model and XML encoding developed by a group of primarily geologic survey organizations under the auspices of the IUGS CGI. The scope of the core model broadly corresponds with information traditionally portrayed on a geologic map, viz. interpreted geology, some observations, the map legend and accompanying memoir. The development of GeoSciML has followed the methodology specified for an Application Schema defined by OGC and ISO 19100 series standards. This requires agreement within a community concerning their domain model, its formal representation using UML, documentation as a Feature Type Catalogue, with an XML Schema implementation generated from the model by applying a rule-based transformation. The framework and technology support a modular governance process. Standard datatypes and GI components (geometry, the feature and coverage metamodels, metadata) are imported from the ISO framework. The observation and sampling model (including boreholes) is imported from OGC. The scale used for most scalar literal values (terms, codes, measures) allows for localization where necessary. Wildcards and abstract base-classes provide explicit extensibility points. Link attributes appear in a regular way in the encodings, allowing reference to external resources using URIs. The encoding is compatible with generic GI data-service interfaces (WFS, WMS, SOS). For maximum interoperability within a community, the interfaces may be specialised through domain-specified constraints (e.g. feature-types, scale and vocabulary bindings, query-models). Formalization using UML and XML allows use of standard validation and processing tools. Use of upper-level elements defined for generic GI application reduces the development effort and governance responsibility, while maximising cross-domain interoperability. On the other hand, enabling specialization to be delegated in a controlled manner is essential to adoption across a range of subdisciplines and jurisdictions. The GeoSciML design team is responsible only for the part of the model that is unique to geology but for which general agreement can be reached within the domain. This paper is presented on behalf of the Interoperability Working Group of the IUGS Commission for Geoscience Information (CGI) - follow web-link for details of the membership.
Interoperability Context-Setting Framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
Widergren, Steven E.; Hardin, Dave; Ambrosio, Ron
2007-01-31
As the deployment of automation technology advances, it touches upon many areas of our corporate and personal lives. A trend is emerging where systems are growing to the extent that integration is taking place with other systems to provide even greater capabilities more efficiently and effectively. GridWise™ provides a vision for this type of integration as it applies to the electric system. Imagine a time in the not too distant future when homeowners can offer the management of their electricity demand to participate in a more efficient and environmentally friendly operation of the electric power grid. They will do this using technology that acts on their behalf in response to information from other components of the electric system. This technology will recognize their preferences to parameters such as comfort and the price of energy to form responses that optimize the local need to a signal that satisfies a higher-level need in the grid. For example, consider a particularly hot day with air stagnation in an area with a significant dependence on wind generation. To manage the forecasted peak electricity demand, the bulk system operator issues a critical peak price warning. Their automation systems alert electric service providers who distribute electricity from the wholesale electricity system to consumers. In response, the electric service providers use their automation systems to inform consumers of impending price increases for electricity. This information is passed to an energy management system at the premises, which acts on the homeowner's behalf, to adjust the electricity usage of the onsite equipment (which might include generation from such sources as a fuel cell). The objective of such a system is to honor the agreement with the electricity service provider and reduce the homeowner's bill while keeping the occupants as comfortable as possible. This will include actions such as moving the thermostat on the heating, ventilation, and air-conditioning (HVAC) unit up several degrees. The resulting load reduction becomes part of an aggregated response from the electricity service provider to the bulk system operator who is now in a better position to manage total system load with available generation. Looking across the electric system, from generating plants, to transmission substations, to the distribution system, to factories, office parks, and buildings, automation is growing, and the opportunities for unleashing new value propositions are exciting. How can we facilitate this change and do so in a way that ensures the reliability of electric resources for the wellbeing of our economy and security? The GridWise Architecture Council (GWAC) mission is to enable interoperability among the many entities that interact with the electric power system. A good definition of interoperability is, “The capability of two or more networks, systems, devices, applications, or components to exchange information between them and to use the information so exchanged.” As a step in the direction of enabling interoperability, the GWAC proposes a context-setting framework to organize concepts and terminology so that interoperability issues can be identified and debated, improvements to address issues articulated, and actions prioritized and coordinated across the electric power community.
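The premises-side logic in the scenario above can be sketched as a small price-responsive rule. The thresholds, maximum offset, and linear trade-off below are invented purely to make the example concrete; they are not part of the GWAC framework.

    # Toy sketch of a premises energy-management rule: raise the cooling setpoint as the
    # price rises, damped by how strongly the occupant weights comfort. Assumed numbers.
    def hvac_setpoint(base_setpoint_f, price_per_kwh, comfort_weight):
        """comfort_weight in [0, 1]; 1.0 means never trade comfort for cost."""
        CRITICAL_PEAK = 0.30        # $/kWh threshold for a critical peak price event (assumed)
        MAX_OFFSET_F = 6.0          # never drift more than 6 F from the preferred setpoint
        if price_per_kwh < CRITICAL_PEAK:
            return base_setpoint_f
        overshoot = (price_per_kwh - CRITICAL_PEAK) / CRITICAL_PEAK
        offset = min(MAX_OFFSET_F, overshoot * MAX_OFFSET_F) * (1.0 - comfort_weight)
        return base_setpoint_f + offset

    print(hvac_setpoint(74.0, 0.45, comfort_weight=0.5))   # modest setback during a price spike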
Robotics Systems Joint Project Office (RSJPO) Interoperability Profiles (IOPS) 101
2012-07-01
interoperability, although they are supported by some interoperability attributes. For example, stair climbing: stair climbing is not something that ... IOPs need to specify; however, the mobility and actuation related interoperable messages can be used to provide stair climbing. Also ... interoperability can enable management of different poses or modes, one of which may be stair climbing.
Warfighter IT Interoperability Standards Study
2012-07-22
data (e.g. messages) between systems? ii) What process did you use to validate and certify semantic interoperability between your ... other systems at this time. There was no requirement to validate and certify semantic interoperability. The DLS program exchanges data with ... semantics. Testing for System Compliance with Data Models. Verify and Certify Interoperability Using Data ...
Enabling interoperability in planetary sciences and heliophysics: The case for an information model
NASA Astrophysics Data System (ADS)
Hughes, J. Steven; Crichton, Daniel J.; Raugh, Anne C.; Cecconi, Baptiste; Guinness, Edward A.; Isbell, Christopher E.; Mafi, Joseph N.; Gordon, Mitchell K.; Hardman, Sean H.; Joyner, Ronald S.
2018-01-01
The Planetary Data System has developed the PDS4 Information Model to enable interoperability across diverse science disciplines. The Information Model is based on an integration of International Organization for Standardization (ISO) level standards for trusted digital archives, information model development, and metadata registries. Where controlled vocabularies provide a basic level of interoperability by supplying a common set of terms for communication between both machines and humans, the Information Model improves interoperability by means of an ontology that provides semantic information or additional related context for the terms. The information model was defined by a team of computer scientists and science experts from each of the diverse disciplines in the Planetary Science community, including Atmospheres, Geosciences, Cartography and Imaging Sciences, Navigational and Ancillary Information, Planetary Plasma Interactions, Ring-Moon Systems, and Small Bodies. The model was designed to be extensible beyond the Planetary Science community; for example, there are overlaps between certain PDS disciplines and the Heliophysics and Astrophysics disciplines. "Interoperability" can apply to many aspects of both the developer and the end-user experience, for example agency-to-agency, semantic level, and application level interoperability. We define these types of interoperability and focus on semantic level interoperability, the type of interoperability most directly enabled by an information model.
NASA Astrophysics Data System (ADS)
Tisdale, M.
2016-12-01
NASA's Atmospheric Science Data Center (ASDC) is operationally using the Esri ArcGIS Platform to improve data discoverability, accessibility and interoperability to meet the diversifying government, private, public and academic communities' driven requirements. The ASDC is actively working to provide their mission essential datasets as ArcGIS Image Services, Open Geospatial Consortium (OGC) Web Mapping Services (WMS), OGC Web Coverage Services (WCS) and leveraging the ArcGIS multidimensional mosaic dataset structure. Science teams and ASDC are utilizing these services, developing applications using the Web AppBuilder for ArcGIS and ArcGIS API for Javascript, and evaluating restructuring their data production and access scripts within the ArcGIS Python Toolbox framework and Geoprocessing service environment. These capabilities yield a greater usage and exposure of ASDC data holdings and provide improved geospatial analytical tools for a mission critical understanding in the areas of the earth's radiation budget, clouds, aerosols, and tropospheric chemistry.
NASA Technical Reports Server (NTRS)
Kazz, G. J.; Greenberg, E.
2000-01-01
Over the next decade, international plans and commitments are underway to develop an infrastructure at Mars to support future exploration of the red planet. The purpose of this infrastructure is to provide reliable global communication and navigation coverage for on-approach, landed, roving, and in-flight assets at Mars. The claim is that this infrastructure will: 1) eliminate the need of these assets to carry Direct to Earth (DTE) communications equipment, 2) significantly increase data return and connectivity, 3) enable small mission exploration of Mars without DTE equipment, 4) provide precision navigation i.e., 10 to 100m position resolution, 5) supply timing reference accurate to 10ms. This paper in particular focuses on two CCSDS recommendations for that infrastructure: CCSDS Proximity-1 Space Link Protocol and CCSDS File Delivery Protocol (CFDP). A key aspect of Mars exploration will be the ability of future missions to interoperate. These protocols establish a framework for interoperability by providing standard communication, navigation, and timing services. In addition, these services include strategies to recover gracefully from communication interruptions and interference while ensuring backward compatibility with previous missions from previous phases of exploration.
OntoCR: A CEN/ISO-13606 clinical repository based on ontologies.
Lozano-Rubí, Raimundo; Muñoz Carrero, Adolfo; Serrano Balazote, Pablo; Pastor, Xavier
2016-04-01
To design a new semantically interoperable clinical repository, based on ontologies, conforming to CEN/ISO 13606 standard. The approach followed is to extend OntoCRF, a framework for the development of clinical repositories based on ontologies. The meta-model of OntoCRF has been extended by incorporating an OWL model integrating CEN/ISO 13606, ISO 21090 and SNOMED CT structure. This approach has demonstrated a complete evaluation cycle involving the creation of the meta-model in OWL format, the creation of a simple test application, and the communication of standardized extracts to another organization. Using a CEN/ISO 13606 based system, an indefinite number of archetypes can be merged (and reused) to build new applications. Our approach, based on the use of ontologies, maintains data storage independent of content specification. With this approach, relational technology can be used for storage, maintaining extensibility capabilities. The present work demonstrates that it is possible to build a native CEN/ISO 13606 repository for the storage of clinical data. We have demonstrated semantic interoperability of clinical information using CEN/ISO 13606 extracts. Copyright © 2016 Elsevier Inc. All rights reserved.
Data distribution service-based interoperability framework for smart grid testbed infrastructure
Youssef, Tarek A.; Elsayed, Ahmed T.; Mohammed, Osama A.
2016-03-02
This study presents the design and implementation of a communication and control infrastructure for smart grid operation. The proposed infrastructure enhances the reliability of the measurements and control network. The advantages of utilizing the data-centric over message-centric communication approach are discussed in the context of smart grid applications. The data distribution service (DDS) is used to implement a data-centric common data bus for the smart grid. This common data bus improves the communication reliability, enabling distributed control and smart load management. These enhancements are achieved by avoiding a single point of failure while enabling peer-to-peer communication and an automatic discovery feature for dynamic participating nodes. The infrastructure and ideas presented in this paper were implemented and tested on the smart grid testbed. A toolbox and application programming interface for the testbed infrastructure are developed in order to facilitate interoperability and remote access to the testbed. This interface allows control, monitoring, and performing of experiments remotely. Furthermore, it could be used to integrate multidisciplinary testbeds to study complex cyber-physical systems (CPS).
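The data-centric idea mentioned above can be illustrated with a tiny Python sketch: readers ask for the current value of a named data object (topic plus key) rather than parsing point-to-point messages, so a late-joining controller immediately sees the latest state. This is a toy analogue, not the paper's implementation and not a real DDS library; the topic and field names are invented.

    # Toy "last value" data bus illustrating data-centric (state-oriented) communication.
    class DataBus:
        def __init__(self):
            self._store = {}                      # (topic, key) -> latest sample

        def write(self, topic, key, sample):
            self._store[(topic, key)] = sample    # publishing updates the shared data object

        def read(self, topic, key):
            return self._store.get((topic, key))  # any participant can read the current state

    bus = DataBus()
    bus.write("grid/measurement", "feeder-7", {"voltage_v": 238.1, "freq_hz": 59.98})
    # A load-management node that joins later still sees the most recent measurement:
    print(bus.read("grid/measurement", "feeder-7"))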
Creating and Sharing Understanding: GEOSS and ArcGIS Online
NASA Astrophysics Data System (ADS)
White, C. E.; Hogeweg, M.; Foust, J.
2014-12-01
The GEOSS program brokers various forms of earth observation data and information via its online platform Discovery and Access Broker (DAB). The platform connects relevant information systems and infrastructures through the world. Esri and the National Research Council of Italy Institute of Atmospheric Pollution Research (CNR-IIA) are building two-way technology between DAB framework and ArcGIS Online using the ArcGIS Online API. Developers will engineer Esri and DAB interfaces and build interoperable web services that connect the two systems. This collaboration makes GEOSS earth observation data and services available to the ArcGIS Online community, and ArcGIS Online a significant part of the GEOSS DAB infrastructure. ArcGIS Online subscribers can discover and access the resources published by GEOSS, use GEOSS data services, and build applications. Making GEOSS content available in ArcGIS Online increases opportunities for scientists in other communities to visualize information in greater context. Moreover, because the platform supports authoritative and crowd-sourcing information, GEOSS members can build networks into other disciplines. This talk will discuss the power of interoperable service architectures that make such a collaboration possible, and the results thus far.
Reasoning about Resources and Hierarchical Tasks Using OWL and SWRL
NASA Astrophysics Data System (ADS)
Elenius, Daniel; Martin, David; Ford, Reginald; Denker, Grit
Military training and testing events are highly complex affairs, potentially involving dozens of legacy systems that need to interoperate in a meaningful way. There are superficial interoperability concerns (such as two systems not sharing the same messaging formats), but also substantive problems such as different systems not sharing the same understanding of the terrain, positions of entities, and so forth. We describe our approach to facilitating such events: describe the systems and requirements in great detail using ontologies, and use automated reasoning to automatically find and help resolve problems. The complexity of our problem took us to the limits of what one can do with OWL, and we needed to introduce some innovative techniques of using and extending it. We describe our novel ways of using SWRL and discuss its limitations as well as extensions to it that we found necessary or desirable. Another innovation is our representation of hierarchical tasks in OWL, and an engine that reasons about them. Our task ontology has proved to be a very flexible and expressive framework to describe requirements on resources and their capabilities in order to achieve some purpose.
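As a rough analogue of the hierarchical-task reasoning described above (expressed here in plain Python rather than OWL/SWRL), the sketch below checks whether a composite task is satisfiable by matching its leaf sub-tasks against the capabilities offered by available resources. The task and resource names are invented examples.

    # Toy hierarchical task / capability matching; not the authors' OWL ontology or engine.
    TASKS = {
        "run_training_event": ["simulate_terrain", "exchange_tracks"],   # sub-tasks
        "simulate_terrain":   [],     # leaf: needs a capability of the same name
        "exchange_tracks":    [],
    }

    RESOURCES = {
        "sim-A": {"simulate_terrain"},
        "gateway-B": {"exchange_tracks", "translate_formats"},
    }

    def satisfiable(task, resources=RESOURCES):
        subtasks = TASKS.get(task, [])
        if not subtasks:   # leaf task: some resource must offer the matching capability
            return any(task in caps for caps in resources.values())
        return all(satisfiable(t, resources) for t in subtasks)

    print(satisfiable("run_training_event"))   # True with the toy resource pool above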
Henderson, Michael L; Dayhoff, Ruth E; Titton, Csaba P; Casertano, Andrew
2006-01-01
As part of its patient care mission, the U.S. Veterans Health Administration performs diagnostic imaging procedures at 141 medical centers and 850 outpatient clinics. VHA's VistA Imaging Package provides a full archival, display, and communications infrastructure and interfaces to radiology and other HIS modules as well as modalities and a worklist provider. In addition, various medical center entities within VHA have elected to install commercial picture archiving and communications systems to enable image organization and interpretation. To evaluate interfaces between commercial PACS, the VistA hospital information system, and imaging modalities, VHA has built a fully constrained specification that is based on the Integrating the Healthcare Enterprise (IHE) Radiology Technical Framework (Rad-TF). The Health Level Seven normative conformance mechanism was applied to the IHE Rad-TF and agency requirements to arrive at a baseline set of message specifications. VHA provides a thorough implementation and testing process to promote the adoption of standards-based interoperability by all PACS vendors that want to interface with VistA Imaging.
Documenting Models for Interoperability and Reusability ...
Many modeling frameworks compartmentalize science via individual models that link sets of small components to create larger modeling workflows. Developing integrated watershed models increasingly requires coupling multidisciplinary, independent models, as well as collaboration between scientific communities, since component-based modeling can integrate models from different disciplines. Integrated Environmental Modeling (IEM) systems focus on transferring information between components by capturing a conceptual site model; establishing local metadata standards for input/output of models and databases; managing data flow between models and throughout the system; facilitating quality control of data exchanges (e.g., checking units, unit conversions, transfers between software languages); warning and error handling; and coordinating sensitivity/uncertainty analyses. Although many computational software systems facilitate communication between, and execution of, components, there are no common approaches, protocols, or standards for turn-key linkages between software systems and models, especially if modifying components is not the intent. Using a standard ontology, this paper reviews how models can be described for discovery, understanding, evaluation, access, and implementation to facilitate interoperability and reusability. In the proceedings of the International Environmental Modelling and Software Society (iEMSs), 8th International Congress on Environmental Mod
Building a VO-compliant Radio Astronomical DAta Model for Single-dish radio telescopes (RADAMS)
NASA Astrophysics Data System (ADS)
Santander-Vela, Juan de Dios; García, Emilio; Leon, Stephane; Espigares, Victor; Ruiz, José Enrique; Verdes-Montenegro, Lourdes; Solano, Enrique
2012-11-01
The Virtual Observatory (VO) is becoming the de-facto standard for astronomical data publication. However, the number of radio astronomical archives is still low in general, and the amount of radio astronomical data available through the VO is even lower. To facilitate the building of new radio astronomical archives, while easing their interoperability with the VO framework, we have developed a VO-compliant data model which provides interoperable data semantics for radio data. That model, which we call the Radio Astronomical DAta Model for Single-dish (RADAMS), has been built using standards of (and recommendations from) the International Virtual Observatory Alliance (IVOA). This article describes the RADAMS and its components, including archived entities and their relationships to VO metadata. We show that by using IVOA principles and concepts, the effort needed for both the development of the archives and their VO compatibility has been lowered, and the joint development of two radio astronomical archives has been possible. We plan to adapt RADAMS to deal with interferometry data in the future.
Lowering the barrier to a decentralized NHIN using the open healthcare framework.
Smith, Eishay; Kaufman, James H
2006-01-01
In this paper, we discuss two important elements in lowering the barrier to creation of a National Health Information Network. The first element is the adoption of standards that will enable interoperability while guaranteeing open interfaces (and preventing vendor lock-in). The second element is the role of open source. While adoption of open standards by large EMR vendors is critically important to enterprise healthcare providers and payors, the availability of inexpensive (or free) standardized Healthcare Information Technology for small physician practices is critical. By analogy to the emergence of the World Wide Web, a framework for creating inexpensive and open source applications for physicians will be as important to realizing a National Health Information Network as the availability of free browser technology was to the growth of the internet.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-18
... Docket 07-100; FCC 11-6] Implementing a Nationwide, Broadband, Interoperable Public Safety Network in the... interoperable public safety broadband network. The establishment of a common air interface for 700 MHz public safety broadband networks will create a foundation for interoperability and provide a clear path for the...
Juzwishin, Donald W M
2009-01-01
Achieving effective health informatics interoperability in a fragmented and uncoordinated health system is by definition not possible. Interoperability requires the simultaneous integration of health care processes and information across different types and levels of care (systems thinking). The fundamental argument of this paper is that information system interoperability will remain an unfulfilled hope until health reforms effectively address the governance (accountability), structural and process barriers to interoperability of health care delivery. The ascendency of Web 2.0 and 3.0, although still unproven, signals the opportunity to accelerate patients' access to health information and their health record. Policy suggestions for simultaneously advancing health system delivery and information system interoperability are posited.
D-ATM, a working example of health care interoperability: From dirt path to gravel road.
DeClaris, John-William
2009-01-01
For many years, there have been calls for interoperability within health care systems. The technology currently exists and is being used in business areas like banking and commerce, to name a few. Yet the question remains, why has interoperability not been achieved in health care? This paper examines issues encountered and success achieved with interoperability during the development of the Digital Access To Medication (D-ATM) project, sponsored by the Substance Abuse and Mental Health Services Administration (SAMHSA). D-ATM is the first government funded interoperable patient management system. The goal of this paper is to provide lessons learned and propose one possible road map for health care interoperability within private industry and how government can help.
Contextual cloud-based service oriented architecture for clinical workflow.
Moreno-Conde, Jesús; Moreno-Conde, Alberto; Núñez-Benjumea, Francisco J; Parra-Calderón, Carlos
2015-01-01
Regarding the acceptance of systems within the healthcare domain, multiple papers have highlighted the importance of integrating tools with the clinical workflow. This paper analyses how clinical context management could be deployed in order to promote the adoption of advanced cloud services within the clinical workflow. This deployment can be integrated with the specifications promoted by the eHealth European Interoperability Framework. The paper proposes a cloud-based service-oriented architecture that implements a context management system aligned with the HL7 standard known as CCOW.
2003-01-01
dependencies, and conceptual independencies. Taken together, the three views provide a framework to ensure interoperability, regardless of system ... products for COP users. It enables a shared situational awareness that significantly improves the ability of commanders at all levels to quickly make ... Review, March-April 1998. 5. Eric K. Shinseki, General, U.S. Army, "The Army Transformation: A Historic Opportunity," 2001-02 Army Green Book.
EuroGEOSS/GENESIS ``e-Habitat'' AIP-3 Use Scenario
NASA Astrophysics Data System (ADS)
Mazzetti, P.; Dubois, G.; Santoro, M.; Peedell, S.; de Longueville, B.; Nativi, S.; Craglia, M.
2010-12-01
Natural ecosystems are in rapid decline. Major habitats are disappearing at a speed never observed before. The current rate of species extinction is several orders of magnitude higher than the background rate from the fossil record. Protected Areas (PAs) and Protected Area Systems are designed to conserve natural and cultural resources, to maintain biodiversity (ecosystems, species, genes) and ecosystem services. The scientific challenge of understanding how environmental and climatological factors impact on ecosystems and habitats requires the use of information from different scientific domains. Thus, multidisciplinary interoperability is a crucial requirement for a framework aiming to support scientists. The Group on Earth Observations (or GEO) is coordinating international efforts to build a Global Earth Observation System of Systems (GEOSS). This emerging public infrastructure is interconnecting a diverse and growing array of instruments and systems for monitoring and forecasting changes in the global environment. This “system of systems” supports multidisciplinary and cross-disciplinary scientific researches. The presented GEOSS-based interoperability framework facilitates the discovery and exploitation of datasets and models from heterogeneous scientific domains and Information Technology services (data sources). The GEO Architecture and Data Committee (ADC) launched the Architecture Implementation Pilot (AIP) Initiative to develop and deploy new processes and infrastructure components for the GEOSS Common Infrastructure (GCI) and the broader GEOSS architecture. The current AIP Phase 3 (AIP-3) aims to increase GEOSS capacity to support several strategic Societal Benefit Areas (SBAs) including: Disaster Management, Health/Air Quality, Biodiversity, Energy, Health/Disease and Water. As to Biodiversity, the EC-funded EuroGEOSS (http://www.eurogeoss.eu) and GENESIS (http://www.genesis-fp7.eu) projects have developed a use scenario called “e-Habitat”. This scenario demonstrates how a GEOSS-based interoperability infrastructure can aid decision makers to assess and possibly forecast the irreplaceability of a given protected area, an essential indicator for assessing the criticality of threats this protected area is exposed to. Based on the previous AIP-Phase2 experience, the EuroGEOSS and GENESIS projects enhanced the successfully experimented interoperability infrastructure with: a) a discovery broker service which underpins semantics enabled queries: the EuroGEOSS/GENESIS Discovery Augmentation Component (DAC); b) environmental modeling components (i.e. OGC WPS instances) implementing algorithms to predict evolution of PAs ecosystems; c) a workflow engine to: i) browse semantic repositories; ii) retrieve concepts of interest; iii) search for resources (i.e. datasets and models) related to such concepts; iv) execute WPS instances. This presentation introduces the enhanced infrastructure developed by the EuroGEOSS/GENESIS AIP-3 Pilot to implement the “e-Habitat” use scenario. The presented infrastructure is accessible through the GEO Portal and is going to be used for demonstrating the “e-Habitat” model at the GEO Ministerial Meeting - Beijing, November 2010.
A Common Metadata System for Marine Data Portals
NASA Astrophysics Data System (ADS)
Wosniok, C.; Breitbach, G.; Lehfeldt, R.
2012-04-01
Processing and allocation of marine datasets depend on the nature of the data resulting from field campaigns, continuous monitoring and numerical modeling. Two research and development projects in northern Germany manage different types of marine data. Due to different data characteristics and institutional frameworks separate data portals are required. This paper describes the integration of distributed marine data in Germany. The Marine Data Infrastructure of Germany (MDI-DE) supports public authorities in the German coastal zone with the implementation of European directives like INSPIRE or the Marine Strategy Framework Directive. This is carried out through setting up standardized web services within a network of participating coastal agencies and the installation of a common data portal (http://www.mdi-de.org), which integrates distributed marine data concerning coastal engineering, coastal water protection and nature conservation in an interoperable and harmonized manner for administrative and scientific purposes as well as for information of the general public. The Coastal Observation System for Northern and Arctic Seas (COSYNA) aims at developing and testing analysis systems for the operational synoptic description of the environmental status of the North Sea and of Arctic coastal waters. This is done by establishing a network of monitoring facilities and the provision of its data in near-real-time. In situ measurements with poles, ferry boxes, and buoys, together with remote sensing measurements, and the data assimilation of these data into simulation results enables COSYNA to provide pre-operational 'products', that are beyond the present routinely applied techniques in observation and modelling. The data allocation in near-real-time requires thoroughly executed data validation, which is processed on the fly before data is passed on to the COSYNA portal (http://kofserver2.hzg.de/codm/). Both projects apply OGC standards such as Web Mapping Service (WMS), Web Feature Service (WFS) and Sensor Observation Service (SOS), which ensures interoperability and extensibility. In addition, metadata as crucial components for searching and finding information in large data infrastructures is provided via the Catalogue Web Service (CS-W). MDI-DE and COSYNA rely on the metadata information system for marine metadata NOKIS, which reflects a metadata profile tailored for marine data according to the specifications of German coastal authorities. In spite of this common software base, interoperability between the two data collections requires constant alignments of the diverse data processed by the two portals. While monitoring data in the MDI-DE is currently rather campaign-based, COSYNA has to fit constantly evolving time series into metadata sets. With all data following the same metadata profile, we now reach full interoperability between the different data collections. The distributed marine information system provides options to search, find and visualise the harmonised results from continuous monitoring, field campaigns, numerical modeling and other data in one web client.
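The standardized OGC interfaces mentioned above can be exercised with very little client code; the sketch below issues a standard WMS 1.3.0 GetCapabilities request over HTTP. The endpoint URL is a placeholder, not one of the MDI-DE or COSYNA services.

    # Sketch of querying an OGC WMS endpoint; the key-value parameters are the standard
    # WMS 1.3.0 GetCapabilities request defined by OGC. Endpoint URL is a placeholder.
    import requests

    ENDPOINT = "https://example.org/geoserver/ows"   # placeholder service endpoint

    params = {
        "service": "WMS",
        "version": "1.3.0",
        "request": "GetCapabilities",
    }
    response = requests.get(ENDPOINT, params=params, timeout=30)
    response.raise_for_status()
    # The returned XML capabilities document lists the layers, CRSs, and formats offered.
    print(response.text[:200])

The same pattern (service, version, request parameters against a single endpoint) applies to WFS, SOS, and CS-W, which is what makes the two portals interoperable despite their different data holdings.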
Interoperability of Information Systems Managed and Used by the Local Health Departments.
Shah, Gulzar H; Leider, Jonathon P; Luo, Huabin; Kaur, Ravneet
2016-01-01
In the post-Affordable Care Act era marked by interorganizational collaborations and availability of large amounts of electronic data from other community partners, it is imperative to assess the interoperability of information systems used by the local health departments (LHDs). To describe the level of interoperability of LHD information systems and identify factors associated with lack of interoperability. This mixed-methods research uses data from the 2015 Informatics Capacity and Needs Assessment Survey, with a target population of all LHDs in the United States. A representative sample of 650 LHDs was drawn using a stratified random sampling design. A total of 324 completed responses were received (50% response rate). Qualitative data were used from a key informant interview study of LHD informatics staff from across the United States. Qualitative data were independently coded by 2 researchers and analyzed thematically. Survey data were cleaned, bivariate comparisons were conducted, and a multivariable logistic regression was run to characterize factors associated with interoperability. For 30% of LHDs, no systems were interoperable, and 38% of LHD respondents indicated some of the systems were interoperable. Significant determinants of interoperability included LHDs having leadership support (adjusted odds ratio [AOR] = 3.54), control of information technology budget allocation (AOR = 2.48), control of data systems (AOR = 2.31), having a strategic plan for information systems (AOR = 1.92), and existence of business process analysis and redesign (AOR = 1.49). Interoperability of all systems may be an informatics goal, but only a small proportion of LHDs reported having interoperable systems, pointing to a substantial need among LHDs nationwide.
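For readers unfamiliar with how the reported adjusted odds ratios (AORs) relate to a multivariable logistic regression, the sketch below shows the usual derivation (AOR = exp of the fitted coefficient). It uses synthetic data and invented variable names loosely mirroring the survey factors; it is not the study's analysis code.

```python
# Illustrative sketch only: deriving adjusted odds ratios (AORs) from a
# multivariable logistic regression. Data below are synthetic.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 324  # matches the number of completed survey responses
df = pd.DataFrame({
    "leadership_support": rng.integers(0, 2, n),
    "controls_it_budget": rng.integers(0, 2, n),
    "has_strategic_plan": rng.integers(0, 2, n),
})
# Synthetic outcome: whether any systems are interoperable
logits = -0.5 + 1.2 * df["leadership_support"] + 0.9 * df["controls_it_budget"]
df["interoperable"] = rng.binomial(1, 1 / (1 + np.exp(-logits)))

X = sm.add_constant(df[["leadership_support", "controls_it_budget", "has_strategic_plan"]])
result = sm.Logit(df["interoperable"], X).fit(disp=False)

# The adjusted odds ratio for each factor is exp(coefficient).
print(np.exp(result.params).round(2))
```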
National electronic health record interoperability chronology.
Hufnagel, Stephen P
2009-05-01
The federal initiative for electronic health record (EHR) interoperability began in 2000 and set the stage for the establishment of the 2004 Executive Order for EHR interoperability by 2014. This article discusses the chronology from the 2001 e-Government Consolidated Health Informatics (CHI) initiative through the current congressional mandates for an aligned, interoperable, and agile DoD AHLTA and VA VistA.
Enhanced semantic interoperability by profiling health informatics standards.
López, Diego M; Blobel, Bernd
2009-01-01
Several standards applied to the healthcare domain support semantic interoperability. These standards are far from being completely adopted in health information system development, however. The objective of this paper is to provide a method, and to suggest the necessary tooling, for reusing standard health information models, thereby supporting the development of semantically interoperable systems and components. The approach is based on the definition of UML Profiles. UML profiling is a formal modeling mechanism for specializing reference meta-models in such a way that it is possible to adapt those meta-models to specific platforms or domains. A health information model can be considered such a meta-model. The first step of the introduced method identifies the standard health information models and the tasks in the software development process in which healthcare information models can be reused. Then, the selected information model is formalized as a UML Profile. That Profile is finally applied to system models, annotating them with the semantics of the information model. The approach is supported by Eclipse-based UML modeling tools. The method is integrated into a comprehensive framework for health information systems development, and the feasibility of the approach is demonstrated in the analysis, design, and implementation of a public health surveillance system, reusing HL7 RIM and DIMs specifications. The paper describes a method and the necessary tooling for reusing standard healthcare information models. UML offers several advantages such as tooling support, graphical notation, exchangeability, extensibility, and semi-automatic code generation. The approach presented is also applicable for harmonizing different standard specifications.
Daskalakis, S; Mantas, J
2009-01-01
This study evaluates a service-oriented prototype implementation for healthcare interoperability. A prototype framework was developed, aiming to exploit service-oriented architecture (SOA) concepts for achieving healthcare interoperability and to move towards a virtual patient record (VPR) paradigm. The prototype implementation was evaluated for its hypothetical adoption. The evaluation strategy was based on the initial proposition of the DeLone and McLean model of information systems (IS) success [1], as modeled by Iivari [2]. A set of SOA and VPR characteristics were empirically encapsulated within the dimensions of the IS success model, combined with measures from previous research works. The data gathered were analyzed using partial least squares (PLS). The results highlighted that system quality is a partial predictor of system use but not of user satisfaction. On the contrary, information quality proved to be a significant predictor of user satisfaction and partially a strong significant predictor of system use. Moreover, system use did not prove to be a significant predictor of individual impact, whereas the bi-directional relation between use and user satisfaction was not confirmed. Additionally, user satisfaction was found to be a strong significant predictor of individual impact. Finally, individual impact proved to be a strong significant predictor of organizational impact. The empirical study attempted to obtain hypothetical, but still useful, beliefs and perceptions regarding the SOA prototype implementation. The deduced observations can form the basis for further investigation regarding the adaptability of SOA implementations with VPR characteristics in the healthcare domain.
Kano, Yoshinobu; Nguyen, Ngan; Saetre, Rune; Yoshida, Kazuhiro; Miyao, Yusuke; Tsuruoka, Yoshimasa; Matsubayashi, Yuichiro; Ananiadou, Sophia; Tsujii, Jun'ichi
2008-01-01
Recently, several text mining programs have reached a near-practical level of performance. Some systems are already being used by biologists and database curators. However, it has also been recognized that current Natural Language Processing (NLP) and Text Mining (TM) technology is not easy to deploy, since research groups tend to develop systems that cater specifically to their own requirements. One of the major reasons for the difficulty of deployment of NLP/TM technology is that re-usability and interoperability of software tools are typically not considered during development. While some effort has been invested in making interoperable NLP/TM toolkits, the developers of end-to-end systems still often struggle to reuse NLP/TM tools, and often opt to develop similar programs from scratch instead. This is particularly the case in BioNLP, since the requirements of biologists are so diverse that NLP tools have to be adapted and re-organized in a much more extensive manner than was originally expected. Although generic frameworks like UIMA (Unstructured Information Management Architecture) provide promising ways to solve this problem, the solution that they provide is only partial. In order for truly interoperable toolkits to become a reality, we also need sharable type systems and a developer-friendly environment for software integration that includes functionality for systematic comparisons of available tools, a simple I/O interface, and visualization tools. In this paper, we describe such an environment that was developed based on UIMA, and we show its feasibility through our experience in developing a protein-protein interaction (PPI) extraction system.
The Development of Clinical Document Standards for Semantic Interoperability in China
Yang, Peng; Pan, Feng; Wan, Yi; Tu, Haibo; Tang, Xuejun; Hu, Jianping
2011-01-01
Objectives This study is aimed at developing a set of data groups (DGs) to be employed as reusable building blocks for the construction of the eight most common clinical documents used in China's general hospitals in order to achieve their structural and semantic standardization. Methods The Diagnostics knowledge framework, the related approaches taken from the Health Level Seven (HL7), the Integrating the Healthcare Enterprise (IHE), and the Healthcare Information Technology Standards Panel (HITSP) and 1,487 original clinical records were considered together to form the DG architecture and data sets. The internal structure, content, and semantics of each DG were then defined by mapping each DG data set to a corresponding Clinical Document Architecture data element and matching each DG data set to the metadata in the Chinese National Health Data Dictionary. By using the DGs as reusable building blocks, standardized structures and semantics regarding the clinical documents for semantic interoperability were able to be constructed. Results Altogether, 5 header DGs, 48 section DGs, and 17 entry DGs were developed. Several issues regarding the DGs, including their internal structure, identifiers, data set names, definitions, length and format, data types, and value sets, were further defined. Standardized structures and semantics regarding the eight clinical documents were structured by the DGs. Conclusions This approach of constructing clinical document standards using DGs is a feasible standard-driven solution useful in preparing documents possessing semantic interoperability among the disparate information systems in China. These standards need to be validated and refined through further study. PMID:22259722
On the formal definition of the systems' interoperability capability: an anthropomorphic approach
NASA Astrophysics Data System (ADS)
Zdravković, Milan; Luis-Ferreira, Fernando; Jardim-Goncalves, Ricardo; Trajanović, Miroslav
2017-03-01
The extended view of enterprise information systems in the Internet of Things (IoT) introduces additional complexity to the interoperability problems. In response to this, the problem of systems' interoperability is revisited by taking into account different aspects of philosophy, psychology, linguistics and artificial intelligence, namely by analysing the potential analogies between the processes of human and system communication. The capability to interoperate, as a property of the system, is then defined as a complex ability to seamlessly sense and perceive a stimulus from its environment (presumably, a message from any other system), make an informed decision about this perception and, consequently, articulate a meaningful and useful action or response based on this decision. Although this capability is defined on the basis of existing interoperability theories, the proposed approach to its definition excludes the assumption of awareness of the co-existence of two interoperating systems. Thus, it establishes links between research on the interoperability of systems and research on intelligent software agents, as one of the systems' digital identities.
HCI∧2 framework: a software framework for multimodal human-computer interaction systems.
Shen, Jie; Pantic, Maja
2013-12-01
This paper presents a novel software framework for the development and research in the area of multimodal human-computer interface (MHCI) systems. The proposed software framework, which is called the HCI∧2 Framework, is built upon publish/subscribe (P/S) architecture. It implements a shared-memory-based data transport protocol for message delivery and a TCP-based system management protocol. The latter ensures that the integrity of system structure is maintained at runtime. With the inclusion of bridging modules, the HCI∧2 Framework is interoperable with other software frameworks including Psyclone and ActiveMQ. In addition to the core communication middleware, we also present the integrated development environment (IDE) of the HCI∧2 Framework. It provides a complete graphical environment to support every step in a typical MHCI system development process, including module development, debugging, packaging, and management, as well as the whole system management and testing. The quantitative evaluation indicates that our framework outperforms other similar tools in terms of average message latency and maximum data throughput under a typical single PC scenario. To demonstrate HCI∧2 Framework's capabilities in integrating heterogeneous modules, we present several example modules working with a variety of hardware and software. We also present an example of a full system developed using the proposed HCI∧2 Framework, which is called the CamGame system and represents a computer game based on hand-held marker(s) and low-cost camera(s).
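A minimal, purely conceptual sketch of the publish/subscribe pattern the HCI∧2 Framework is built on is given below. The real framework uses a shared-memory transport and a TCP-based management protocol; none of the class or topic names here come from its API.

```python
# Conceptual sketch only: a minimal in-process publish/subscribe broker
# illustrating the messaging pattern the HCI^2 Framework is built on.
from collections import defaultdict
from typing import Any, Callable

class MiniBroker:
    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[Any], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[Any], None]) -> None:
        """Register a module's handler for messages on a topic."""
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, message: Any) -> None:
        """Deliver a message to every module subscribed to the topic."""
        for handler in self._subscribers[topic]:
            handler(message)

broker = MiniBroker()
broker.subscribe("face.expression", lambda msg: print("fusion module got:", msg))
broker.publish("face.expression", {"label": "smile", "confidence": 0.92})
```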
Empowering open systems through cross-platform interoperability
NASA Astrophysics Data System (ADS)
Lyke, James C.
2014-06-01
Most of the motivations for open systems lie in the expectation of interoperability, sometimes referred to as "plug-and-play". Nothing in the notion of "open-ness", however, guarantees this outcome, which makes the increased interest in open architecture more perplexing. In this paper, we explore certain themes of open architecture. We introduce the concept of "windows of interoperability", which can be used to align disparate portions of architecture. Such "windows of interoperability", which concentrate on a reduced set of protocol and interface features, might achieve many of the broader purposes assigned as benefits in open architecture. Since it is possible to engineer proprietary systems that interoperate effectively, this nuanced definition of interoperability may in fact be a more important concept to understand and nurture for effective systems engineering and maintenance.
SMART on FHIR: a standards-based, interoperable apps platform for electronic health records
Kreda, David A; Mandl, Kenneth D; Kohane, Isaac S; Ramoni, Rachel B
2016-01-01
Objective In early 2010, Harvard Medical School and Boston Children’s Hospital began an interoperability project with the distinctive goal of developing a platform to enable medical applications to be written once and run unmodified across different healthcare IT systems. The project was called Substitutable Medical Applications and Reusable Technologies (SMART). Methods We adopted contemporary web standards for application programming interface transport, authorization, and user interface, and standard medical terminologies for coded data. In our initial design, we created our own openly licensed clinical data models to enforce consistency and simplicity. During the second half of 2013, we updated SMART to take advantage of the clinical data models and the application-programming interface described in a new, openly licensed Health Level Seven draft standard called Fast Health Interoperability Resources (FHIR). Signaling our adoption of the emerging FHIR standard, we called the new platform SMART on FHIR. Results We introduced the SMART on FHIR platform with a demonstration that included several commercial healthcare IT vendors and app developers showcasing prototypes at the Health Information Management Systems Society conference in February 2014. This established the feasibility of SMART on FHIR, while highlighting the need for commonly accepted pragmatic constraints on the base FHIR specification. Conclusion In this paper, we describe the creation of SMART on FHIR, relate the experience of the vendors and developers who built SMART on FHIR prototypes, and discuss some challenges in going from early industry prototyping to industry-wide production use. PMID:26911829
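As a rough illustration of the kind of interaction a SMART on FHIR app performs, the sketch below reads a Patient resource from a FHIR server over the standard REST interface after an OAuth2 access token has been obtained. The base URL, patient id and token are placeholders.

```python
# Illustrative sketch (not from the paper): reading a Patient resource from a
# FHIR server the way a SMART on FHIR app would, once an OAuth2 access token
# has been obtained. Base URL, patient id and token are placeholders.
import json
from urllib.request import Request, urlopen

FHIR_BASE = "https://example.org/fhir"   # hypothetical FHIR endpoint
ACCESS_TOKEN = "<token-from-smart-authorization>"
PATIENT_ID = "123"

def read_patient(patient_id: str) -> dict:
    """GET [base]/Patient/[id] and return the parsed JSON resource."""
    req = Request(
        f"{FHIR_BASE}/Patient/{patient_id}",
        headers={
            "Accept": "application/fhir+json",
            "Authorization": f"Bearer {ACCESS_TOKEN}",
        },
    )
    with urlopen(req) as resp:
        return json.load(resp)

if __name__ == "__main__":
    patient = read_patient(PATIENT_ID)
    print(patient.get("resourceType"), patient.get("id"))
```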
MAPI: a software framework for distributed biomedical applications
2013-01-01
Background: The number of web-based resources (databases, tools, etc.) in biomedicine has increased, but the integrated usage of those resources is complex due to differences in access protocols and data formats. However, distributed data processing is becoming inevitable in several domains, in particular in biomedicine, where researchers face rapidly increasing data sizes. These big data are difficult to process locally because of the large processing, memory and storage capacity required. Results: This manuscript describes a framework, called MAPI, which provides a uniform representation of resources available over the Internet, in particular for Web Services. The framework enhances their interoperability and collaborative use by enabling uniform and remote access. The framework functionality is organized in modules that can be combined and configured in different ways to fulfil concrete development requirements. Conclusions: The framework has been tested in the biomedical application domain, where it has served as a base for developing several clients that are able to integrate different web resources. The MAPI binaries and documentation are freely available at http://www.bitlab-es.com/mapi under the Creative Commons Attribution-No Derivative Works 2.5 Spain License. The MAPI source code is available by request (GPL v3 license). PMID:23311574
Larkins, Sarah L; Preston, Robyn; Matte, Marie C; Lindemann, Iris C; Samson, Rex; Tandinco, Filedito D; Buso, David; Ross, Simone J; Pálsdóttir, Björg; Neusy, André-Jacques
2013-01-01
Health professional schools are responsible for producing graduates with the competencies and attitudes to address health inequities and respond to priority health needs. Health professional schools striving towards social accountability founded the Training for Health Equity Network (THEnet). This article describes the development of the THEnet evaluation framework for socially accountable health professional education, presents the framework to be used as a tool by other schools, and discusses the findings of pilot implementation at five schools. The framework was designed collaboratively and built on Boelen and Woollard's conceptualization, production and usability model. It includes key components linked to aspirational statements, indicators and suggested measurement tools. Five schools completed pilot implementation, involving workshops, document/data review and focus group discussions with faculty, students and community members. Three sections of the framework consider: How does our school work? What do we do? and What difference do we make? Pilot testing showed that the evaluation framework was acceptable and feasible across contexts and produced findings useful at the school level and for comparing schools. The framework is designed as a formative exercise to help schools take a critical look at their performance and progress towards social accountability. Initiatives to implement the framework more widely are underway. The framework effectively aids in identifying strengths, weaknesses and gaps, with a view to schools striving for continuous self-improvement. The THEnet evaluation framework is applicable and useful across contexts. It is possible and desirable to assess progress towards social accountability in health professional schools, and this is an important step in producing health professionals with the knowledge, attitudes and skills to meet the challenges of priority health needs of underserved populations.
Interoperability of Information Systems Managed and Used by the Local Health Departments
Leider, Jonathon P.; Luo, Huabin; Kaur, Ravneet
2016-01-01
Background: In the post-Affordable Care Act era marked by interorganizational collaborations and availability of large amounts of electronic data from other community partners, it is imperative to assess the interoperability of information systems used by the local health departments (LHDs). Objectives: To describe the level of interoperability of LHD information systems and identify factors associated with lack of interoperability. Data and Methods: This mixed-methods research uses data from the 2015 Informatics Capacity and Needs Assessment Survey, with a target population of all LHDs in the United States. A representative sample of 650 LHDs was drawn using a stratified random sampling design. A total of 324 completed responses were received (50% response rate). Qualitative data were used from a key informant interview study of LHD informatics staff from across the United States. Qualitative data were independently coded by 2 researchers and analyzed thematically. Survey data were cleaned, bivariate comparisons were conducted, and a multivariable logistic regression was run to characterize factors associated with interoperability. Results: For 30% of LHDs, no systems were interoperable, and 38% of LHD respondents indicated some of the systems were interoperable. Significant determinants of interoperability included LHDs having leadership support (adjusted odds ratio [AOR] = 3.54), control of information technology budget allocation (AOR = 2.48), control of data systems (AOR = 2.31), having a strategic plan for information systems (AOR = 1.92), and existence of business process analysis and redesign (AOR = 1.49). Conclusion: Interoperability of all systems may be an informatics goal, but only a small proportion of LHDs reported having interoperable systems, pointing to a substantial need among LHDs nationwide. PMID:27684616
European security framework for healthcare.
Ruotsalainen, Pekka; Pohjonen, Hanna
2003-01-01
eHealth and telemedicine services are promising business areas in Europe. It is clear that eHealth products and services will be sold and ordered from a distance and across national borders in the future. However, there are many barriers to overcome. For both national and pan-European eHealth and telemedicine applications, a common security framework is needed. Such frameworks set the security requirements needed for cross-border eHealth services. The next step is to build a security infrastructure that is independent of technical platforms. Most of the European eHealth platforms are regional or territorial. Some countries are looking towards a Public Key Infrastructure, but no large-scale solutions exist in healthcare. There is no clear candidate solution for a Europe-wide interoperable eHealth platform. Cross-platform integration seems to be the most practical integration method at a European level in the short run. The use of the Internet as a European integration platform is a promising solution in the long run.
UniDA: Uniform Device Access Framework for Human Interaction Environments
Varela, Gervasio; Paz-Lopez, Alejandro; Becerra, Jose Antonio; Vazquez-Rodriguez, Santiago; Duro, Richard José
2011-01-01
Human interaction environments (HIE) must be understood as any place where people carry out their daily life, including their work, family life, leisure and social life, interacting with technology to enhance or facilitate the experience. The integration of technology in these environments has been achieved in a disorderly and incompatible way, with devices operating in isolated islands with artificial edges delimited by the manufacturers. In this paper we present the UniDA framework, an integral solution for the development of systems that require the integration and interoperation of devices and technologies in HIEs. It provides developers and installers with a uniform conceptual framework capable of modelling an HIE, together with a set of libraries, tools and devices to build distributed instrumentation networks with support for transparent integration of other technologies. A series of use case examples and a comparison to many of the existing technologies in the field have been included in order to show the benefits of using UniDA. PMID:22163700
EarthCube's Assessment Framework: Ensuring Return on Investment
NASA Astrophysics Data System (ADS)
Lehnert, K.
2016-12-01
EarthCube is a community-governed, NSF-funded initiative to transform geoscience research by developing cyberinfrastructure that improves access, sharing, visualization, and analysis of all forms of geoscience data and related resources. EarthCube's goal is to enable geoscientists to tackle the challenges of understanding and predicting complex and evolving solid Earth, hydrosphere, atmosphere, and space environment systems. EarthCube's infrastructure requires capabilities for data, software, and systems. To demonstrate its value to the science community and its return on investment for the NSF, EarthCube must be able to determine the value of new capabilities for the community and track the progress of the overall effort. EarthCube is therefore developing an assessment framework for research proposals, for projects funded by EarthCube, and for the overall EarthCube program. As a first step, a software assessment framework has been developed that addresses the EarthCube Strategic Vision by promoting best practices in software development, complete and useful documentation, interoperability, standards adherence, open science, and education and training opportunities for research developers.
USGEO Common Framework For Earth Observation Data
NASA Astrophysics Data System (ADS)
Walter, J.; de la Beaujardiere, J.; Bristol, S.
2015-12-01
The United States Group on Earth Observations (USGEO) Data Management Working Group (DMWG) is an interagency body established by the White House Office of Science and Technology Policy (OSTP). The primary purpose of this group is to foster interagency cooperation and collaboration for improving the life cycle data management practices and interoperability of federally held earth observation data consistent with White House documents including the National Strategy for Civil Earth Observations, the National Plan for Civil Earth Observations, and the May 2013 Executive Order on Open Data (M-13-13). The members of the USGEO DMWG are working on developing a Common Framework for Earth Observation Data that consists of recommended standards and approaches for realizing these goals as well as improving the discoverability, accessibility, and usability of federally held earth observation data. These recommendations will also guide work being performed under the Big Earth Data Initiative (BEDI). This talk will summarize the Common Framework, the philosophy behind it, and next steps forward.
Sustainable Management of Seagrass Meadows: the GEOSS AIP-6 Pilot
NASA Astrophysics Data System (ADS)
Santoro, Mattia; Pastres, Roberto; Zucchetta, Matteo; Venier, Chiara; Roncella, Roberto; Bigagli, Lorenzo; Mangin, Antoine; Amine Taji, Mohamed; Gonzalo Malvarez, Gonzalo; Nativi, Stefano
2014-05-01
Seagrass meadows (marine angiosperm plants) occupy less than 0.2% of the global ocean surface, yet annually store about 10-18% of the so-called "Blue Carbon", i.e. the carbon stored in coastal vegetated areas. Recent literature estimates that the flux to the long-term carbon sink in seagrasses represents 10-20% of seagrasses' global average production. Such figures can be translated into economic benefits, taking into account that a ton of carbon dioxide in Europe is traded at around 15 € in the carbon market. This means that the organic carbon retained in seagrass sediments in the Mediterranean is worth 138-1128 billion €, which represents 6-23 € per square meter. This is 9-35 times more than one square meter of tropical forest soil (0.66 € per square meter), or 5-17 times when considering both the above- and belowground compartments in tropical forests. According to the most conservative estimates, about 10% of the Mediterranean meadows have been lost during the last century. To estimate seagrass meadow distribution, a Species Distribution Model (SDM) can be used. An SDM is a tool used to evaluate the potential distribution of a given species (e.g. Posidonia oceanica for seagrass) on the basis of the features (bio-chemical-physical parameters) of the studied environment. In the framework of the GEOSS (Global Earth Observation System of Systems) initiative, the FP7 project MEDINA developed a showcase as part of the GEOSS Architecture Interoperability Pilot - phase 6 (AIP-6). The showcase aims at providing a tool for the sustainable management of seagrass meadows along the Mediterranean coastline by integrating the SDM with available GEOSS resources. This way, the required input data can be searched, accessed and ingested into the model by leveraging the brokering framework of the GEOSS Common Infrastructure (GCI). This framework comprises a set of middleware components (Brokers) that are in charge of implementing the interoperability arrangements needed to interconnect the heterogeneous and distributed capacities contributing to GEOSS. The presentation discusses this framework, explaining how the input data are discovered, accessed and processed to feed the model. The research leading to these results has received funding from the European Community's Seventh Framework Programme (FP7/2007-2013) under grant agreement n. 282977.
Towards semantic interoperability for electronic health records.
Garde, Sebastian; Knaup, Petra; Hovenga, Evelyn; Heard, Sam
2007-01-01
In the field of open electronic health records (EHRs), openEHR as an archetype-based approach is being increasingly recognised. It is the objective of this paper to shortly describe this approach, and to analyse how openEHR archetypes impact on health professionals and semantic interoperability. Analysis of current approaches to EHR systems, terminology and standards developments. In addition to literature reviews, we organised face-to-face and additional telephone interviews and tele-conferences with members of relevant organisations and committees. The openEHR archetypes approach enables syntactic interoperability and semantic interpretability -- both important prerequisites for semantic interoperability. Archetypes enable the formal definition of clinical content by clinicians. To enable comprehensive semantic interoperability, the development and maintenance of archetypes needs to be coordinated internationally and across health professions. Domain knowledge governance comprises a set of processes that enable the creation, development, organisation, sharing, dissemination, use and continuous maintenance of archetypes. It needs to be supported by information technology. To enable EHRs, semantic interoperability is essential. The openEHR archetypes approach enables syntactic interoperability and semantic interpretability. However, without coordinated archetype development and maintenance, 'rank growth' of archetypes would jeopardize semantic interoperability. We therefore believe that openEHR archetypes and domain knowledge governance together create the knowledge environment required to adopt EHRs.
The OGC Sensor Web Enablement framework
NASA Astrophysics Data System (ADS)
Cox, S. J.; Botts, M.
2006-12-01
Sensor observations are at the core of the natural sciences. Improvements in data-sharing technologies offer the promise of much greater utilisation of observational data. A key to this is interoperable data standards. The Open Geospatial Consortium's (OGC) Sensor Web Enablement (SWE) initiative is developing open standards for web interfaces for the discovery, exchange and processing of sensor observations, and for the tasking of sensor systems. The goal is to support the construction of complex sensor applications through real-time composition of service chains from standard components. The framework is based around a suite of standard interfaces and standard encodings for the messages transferred between services. The SWE interfaces include: Sensor Observation Service (SOS) - parameterized observation requests (by observation time, feature of interest, property, sensor); Sensor Planning Service (SPS) - tasking a sensor system to undertake future observations; Sensor Alert Service (SAS) - subscription to an alert, usually triggered by a sensor result exceeding some value. The interface design generally follows the pattern established in the OGC Web Map Service (WMS) and Web Feature Service (WFS) interfaces, where the interaction between a client and a service follows a standard sequence of requests and responses. The first obtains a general description of the service capabilities, followed by a request for the detail required to formulate a data request, and finally a request for a data instance or stream. These may be implemented in a stateless "REST" idiom, or using conventional "web-services" (SOAP) messaging. In a deployed system, the SWE interfaces are supplemented by Catalogue, data (WFS) and portrayal (WMS) services, as well as authentication and rights management. The standard SWE data formats are Observations and Measurements (O&M), which encodes observation metadata and results; Sensor Model Language (SensorML), which describes sensor systems; Transducer Model Language (TML), which covers low-level data streams; and domain-specific GML Application Schemas for definitions of the target feature types. The SWE framework has been demonstrated in several interoperability testbeds, based around emergency management, security, contamination and environmental monitoring scenarios.
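The sketch below expresses the request sequence described above (capabilities first, then a parameterized observation request) as plain HTTP GETs. The endpoint is hypothetical, and the GetObservation parameter names follow the later SOS 2.0 KVP binding rather than any particular testbed deployment.

```python
# Illustrative sketch: the SOS request sequence as simple HTTP GETs.
# The endpoint is hypothetical; GetObservation parameters assume SOS 2.0 KVP.
from urllib.parse import urlencode
from urllib.request import urlopen

SOS_ENDPOINT = "https://example.org/sos"  # hypothetical Sensor Observation Service

def get_capabilities() -> bytes:
    """Step 1: ask the service to describe what it offers."""
    query = urlencode({"service": "SOS", "request": "GetCapabilities"})
    with urlopen(f"{SOS_ENDPOINT}?{query}") as resp:
        return resp.read()

def get_observation(offering: str, observed_property: str) -> bytes:
    """Step 2: request observations for one offering and property."""
    query = urlencode({
        "service": "SOS",
        "version": "2.0.0",
        "request": "GetObservation",
        "offering": offering,
        "observedProperty": observed_property,
    })
    with urlopen(f"{SOS_ENDPOINT}?{query}") as resp:
        return resp.read()  # an O&M-encoded XML document

if __name__ == "__main__":
    print(get_capabilities()[:300])
```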
Innovations in Arizona's Accountability Policies and Frameworks for Alternative Schools
ERIC Educational Resources Information Center
Schlessman, Amy
2014-01-01
This study presents Arizona's innovations in academic accountability policy and academic accountability frameworks for alternative schools. A timeline of statutes and regulations including the State Board of Education approved alternative school definition provides Arizona's context for alternative school accountability policy and frameworks.…
Kinjo, Akira R.; Suzuki, Hirofumi; Yamashita, Reiko; Ikegawa, Yasuyo; Kudou, Takahiro; Igarashi, Reiko; Kengaku, Yumiko; Cho, Hasumi; Standley, Daron M.; Nakagawa, Atsushi; Nakamura, Haruki
2012-01-01
The Protein Data Bank Japan (PDBj, http://pdbj.org) is a member of the worldwide Protein Data Bank (wwPDB) and accepts and processes the deposited data of experimentally determined macromolecular structures. While maintaining the archive in collaboration with other wwPDB partners, PDBj also provides a wide range of services and tools for analyzing structures and functions of proteins, which are summarized in this article. To enhance the interoperability of the PDB data, we have recently developed PDB/RDF, PDB data in the Resource Description Framework (RDF) format, along with its ontology in the Web Ontology Language (OWL) based on the PDB mmCIF Exchange Dictionary. Being in the standard format for the Semantic Web, the PDB/RDF data provide a means to integrate the PDB with other biological information resources. PMID:21976737
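As a small illustration of how PDB/RDF data can be consumed with off-the-shelf Semantic Web tooling, the sketch below loads one (hypothetical) PDB/RDF file with rdflib and runs a generic SPARQL query; the file name is a placeholder and no specific PDBx ontology terms are assumed.

```python
# Illustrative sketch only: loading a PDB/RDF entry with rdflib and running
# a generic SPARQL query. The file name is a placeholder.
from rdflib import Graph

g = Graph()
g.parse("1abc.rdf", format="xml")   # hypothetical PDB/RDF file for one entry

print(f"Loaded {len(g)} triples")

# Generic SPARQL query: list a few triples of the entry.
query = """
    SELECT ?s ?p ?o
    WHERE { ?s ?p ?o }
    LIMIT 10
"""
for s, p, o in g.query(query):
    print(s, p, o)
```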
DOE Office of Scientific and Technical Information (OSTI.GOV)
Widergren, Steven E.; Knight, Mark R.; Melton, Ronald B.
The Interoperability Strategic Vision whitepaper aims to promote a common understanding of the meaning and characteristics of interoperability and to provide a strategy to advance the state of interoperability as applied to integration challenges facing grid modernization. This includes addressing the quality of integrating devices and systems and the discipline to improve the process of successfully integrating these components as business models and information technology improve over time. The strategic vision for interoperability described in this document applies throughout the electric energy generation, delivery, and end-use supply chain. Its scope includes interactive technologies and business processes from bulk energy levels to lower voltage level equipment and the millions of appliances that are becoming equipped with processing power and communication interfaces. A transformational aspect of a vision for interoperability in the future electric system is the coordinated operation of intelligent devices and systems at the edges of grid infrastructure. This challenge offers an example for addressing interoperability concerns throughout the electric system.
Space Network Interoperability Panel (SNIP) study
NASA Technical Reports Server (NTRS)
Ryan, Thomas; Lenhart, Klaus; Hara, Hideo
1991-01-01
The Space Network Interoperability Panel (SNIP) study is a tripartite effort that involves the National Aeronautics and Space Administration (NASA), the European Space Agency (ESA), and the National Space Development Agency (NASDA) of Japan. SNIP involves an ongoing interoperability study of the Data Relay Satellite (DRS) systems of the three organizations. The study is broken down into two parts: Phase one deals with S-band (2 GHz) interoperability and Phase two deals with Ka-band (20/30 GHz) interoperability (in addition to S-band). In 1987 the SNIP formed a Working Group to define and study operations concepts and technical subjects to assure compatibility of the international data relay systems. Since that time a number of Panel and Working Group meetings have been held to continue the study. Interoperability is of interest to the three agencies because it offers a number of potential operational and economic benefits. This paper presents the history and status of the SNIP study.
NASA Astrophysics Data System (ADS)
Oggioni, A.; Tagliolato, P.; Schleidt, K.; Carrara, P.; Grellet, S.; Sarretta, A.
2016-02-01
The state of the art in biodiversity data management unfortunately encompasses a plethora of diverse data formats. Compared to other research fields, there is a lack of harmonization and standardization of these data. While data from traditional biodiversity collections (e.g. from museums) can be easily represented by existing standards as provided by TDWG, the growing number of field observations stemming from both VGI activities (e.g. iNaturalist) and automated systems (e.g. animal biotelemetry) would at the very least require upgrades of current formats. Moreover, from an eco-informatics perspective, the integration and use of data from different scientific fields is the norm (abiotic data, geographic information, etc.); the possibility to represent this information and biodiversity data in a homogeneous way would be an advantage for interoperability, allowing for easy integration across environmental media. We will discuss the possibility of exploiting the Open Geospatial Consortium/ISO standard Observations and Measurements (O&M) [1], a generic conceptual model developed for observation data but with strong analogies to the biodiversity-oriented OBOE ontology [2]. The applicability of OGC O&M for the provision of biodiversity occurrence data has been suggested by the INSPIRE Cross Thematic Working Group on Observations & Measurements [3], the INSPIRE Environmental Monitoring Facilities Thematic Working Group [4] and the New Zealand Environmental Information Interoperability Framework [5]. This approach, in our opinion, could be an advantage for the biodiversity community. We will provide some examples of encoding biodiversity occurrence data using the O&M standard, in addition to highlighting the advantages offered by O&M in comparison to other representation formats. [1] Cox, S. (2013). Geographic information - Observations and measurements - OGC and ISO 19156. [2] Madin, J., Bowers, S., Schildhauer, M., Krivov, S., Pennington, D., & Villa, F. (2007). An ontology for describing and synthesizing ecological observation data. Ecological Informatics, 2(3), 279-296. [3] INSPIRE_D2.9_O&M_Guidelines_v2.0rc3.pdf [4] INSPIRE_DataSpecification_EF_v3.0.pdf [5] Watkins, A. (2012) Biodiversity Interoperability through Open Geospatial Standards
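To make the proposal concrete, the sketch below expresses a single species-occurrence record using the core slots of the O&M observation model (phenomenonTime, procedure, observedProperty, featureOfInterest, result). The values and identifiers are invented for illustration; a real encoding would use the O&M XML schema or an agreed profile of it.

```python
# Conceptual sketch: one species-occurrence record expressed with the core
# O&M observation slots. All values and identifiers are invented examples.
occurrence_as_observation = {
    "type": "Observation",
    "phenomenonTime": "2015-06-12T09:30:00Z",   # when the organism was observed
    "resultTime": "2015-06-12T10:00:00Z",       # when the record was produced
    "procedure": "visual-census-transect",      # how the observation was made
    "observedProperty": "occurrence-of-taxon",
    "featureOfInterest": {
        "name": "sampling-station-42",
        "geometry": {"type": "Point", "coordinates": [9.19, 45.46]},
    },
    "result": {
        "taxon": "Posidonia oceanica",
        "present": True,
        "abundance": 17,
    },
}

print(occurrence_as_observation["result"]["taxon"])
```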
Test Protocols for Advanced Inverter Interoperability Functions – Main Document
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Jay Dean; Gonzalez, Sigifredo; Ralph, Mark E.
2013-11-01
Distributed energy resources (DER) such as photovoltaic (PV) systems, when deployed in a large scale, are capable of influencing significantly the operation of power systems. Looking to the future, stakeholders are working on standards to make it possible to manage the potentially complex interactions between DER and the power system. In 2009, the Electric Power Research Institute (EPRI), Sandia National Laboratories (SNL) with the U.S. Department of Energy (DOE), and the Solar Electric Power Association (SEPA) initiated a large industry collaborative to identify and standardize definitions for a set of DER grid support functions. While the initial effort concentrated on grid-tied PV inverters and energy storage systems, the concepts have applicability to all DER. A partial product of this on-going effort is a reference definitions document (IEC TR 61850-90-7, Object models for power converters in distributed energy resources (DER) systems) that has become a basis for expansion of related International Electrotechnical Commission (IEC) standards, and is supported by US National Institute of Standards and Technology (NIST) Smart Grid Interoperability Panel (SGIP). Some industry-led organizations advancing communications protocols have also embraced this work. As standards continue to evolve, it is necessary to develop test protocols to independently verify that the inverters are properly executing the advanced functions. Interoperability is assured by establishing common definitions for the functions and a method to test compliance with operational requirements. This document describes test protocols developed by SNL to evaluate the electrical performance and operational capabilities of PV inverters and energy storage, as described in IEC TR 61850-90-7. While many of these functions are not currently required by existing grid codes or may not be widely available commercially, the industry is rapidly moving in that direction. Interoperability issues are already apparent as some of these inverter capabilities are being incorporated in large demonstration and commercial projects. The test protocols are intended to be used to verify acceptable performance of inverters within the standard framework described in IEC TR 61850-90-7. These test protocols, as they are refined and validated over time, can become precursors for future certification test procedures for DER advanced grid support functions.
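As one example of the kind of grid-support behaviour such protocols exercise, the sketch below implements a piecewise-linear volt-var characteristic of the sort defined among the IEC TR 61850-90-7 functions. The curve points are invented example values, and the code is not part of the SNL protocols.

```python
# Illustrative sketch (not part of the SNL protocols): a piecewise-linear
# volt-var characteristic. Curve points are invented example values.
def volt_var_setpoint(v_pu: float, curve=((0.95, 0.44), (0.98, 0.0),
                                          (1.02, 0.0), (1.05, -0.44))) -> float:
    """Return reactive power (per unit of rated VA) for a given voltage (p.u.)."""
    points = sorted(curve)
    if v_pu <= points[0][0]:
        return points[0][1]
    if v_pu >= points[-1][0]:
        return points[-1][1]
    for (v1, q1), (v2, q2) in zip(points, points[1:]):
        if v1 <= v_pu <= v2:
            # linear interpolation between adjacent curve points
            return q1 + (q2 - q1) * (v_pu - v1) / (v2 - v1)
    return 0.0

for v in (0.94, 0.97, 1.00, 1.04, 1.06):
    print(f"V = {v:.2f} p.u. -> Q = {volt_var_setpoint(v):+.3f} p.u.")
```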
Test Protocols for Advanced Inverter Interoperability Functions - Appendices
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Jay Dean; Gonzalez, Sigifredo; Ralph, Mark E.
2013-11-01
Distributed energy resources (DER) such as photovoltaic (PV) systems, when deployed in a large scale, are capable of influencing significantly the operation of power systems. Looking to the future, stakeholders are working on standards to make it possible to manage the potentially complex interactions between DER and the power system. In 2009, the Electric Power Research Institute (EPRI), Sandia National Laboratories (SNL) with the U.S. Department of Energy (DOE), and the Solar Electric Power Association (SEPA) initiated a large industry collaborative to identify and standardize definitions for a set of DER grid support functions. While the initial effort concentrated on grid-tied PV inverters and energy storage systems, the concepts have applicability to all DER. A partial product of this on-going effort is a reference definitions document (IEC TR 61850-90-7, Object models for power converters in distributed energy resources (DER) systems) that has become a basis for expansion of related International Electrotechnical Commission (IEC) standards, and is supported by US National Institute of Standards and Technology (NIST) Smart Grid Interoperability Panel (SGIP). Some industry-led organizations advancing communications protocols have also embraced this work. As standards continue to evolve, it is necessary to develop test protocols to independently verify that the inverters are properly executing the advanced functions. Interoperability is assured by establishing common definitions for the functions and a method to test compliance with operational requirements. This document describes test protocols developed by SNL to evaluate the electrical performance and operational capabilities of PV inverters and energy storage, as described in IEC TR 61850-90-7. While many of these functions are not now required by existing grid codes or may not be widely available commercially, the industry is rapidly moving in that direction. Interoperability issues are already apparent as some of these inverter capabilities are being incorporated in large demonstration and commercial projects. The test protocols are intended to be used to verify acceptable performance of inverters within the standard framework described in IEC TR 61850-90-7. These test protocols, as they are refined and validated over time, can become precursors for future certification test procedures for DER advanced grid support functions.
Cicutto, Lisa; Gleason, Melanie; Haas-Howard, Christy; Jenkins-Nygren, Lynn; Labonde, Susan; Patrick, Kathy
2017-08-01
School health teams commonly address the needs of students with asthma, which requires specific knowledge and skills. To develop a skilled school health team, a competency-based framework for managing asthma in schools was developed. A modified Delphi with 31 panelists was completed. Consensus (≥80% agreement) was reached for all 148 items regarding their appropriateness as minimum competencies for asthma care in schools. The resultant Colorado Competency Framework for Asthma Care in Schools guided the development and pilot testing of a continuing education curriculum for school nurses. Pre- and postassessments demonstrated significant improvements in knowledge and self-confidence related to asthma care in schools and in inhaler technique skills. This work is the first to use a consensus process to identify a framework of minimum competencies for providing asthma care in schools. This framework informed a continuing education curriculum that resulted in improved knowledge, confidence, and skills for school nurses.
Implementing Interoperability in the Seafood Industry: Learning from Experiences in Other Sectors.
Bhatt, Tejas; Gooch, Martin; Dent, Benjamin; Sylvia, Gilbert
2017-08-01
Interoperability of communication and information technologies within and between businesses operating along supply chains is being pursued and implemented in numerous industries worldwide to increase the efficiency and effectiveness of operations. The desire for greater interoperability is also driven by the need to reduce business risk through more informed management decisions. Interoperability is achieved by the development of a technology architecture that guides the design and implementation of communication systems existing within individual businesses and between businesses comprising the supply chain. Technology architectures are developed through a purposeful dialogue about why the architecture is required, the benefits and opportunities that the architecture offers the industry, and how the architecture will translate into practical results. An assessment of how the finance, travel, and health industries and a sector of the food industry (fresh produce) have implemented interoperability was conducted to identify lessons learned that can aid the development of interoperability in the seafood industry. The findings include identification of the need for strong, effective governance during the establishment and operation of an interoperability initiative to ensure the existence of common protocols and standards. The resulting insights were distilled into a series of principles for enabling syntactic and semantic interoperability in any industry, which we summarize in this article. Categorized as "structural," "operational," and "integrative," the principles describe requirements and solutions that are pivotal to enabling businesses to create and capture value from full chain interoperability. The principles are also fundamental to allowing governments and advocacy groups to use traceability for public good. © 2017 Institute of Food Technologists®.
Changing Medical School IT to Support Medical Education Transformation.
Spickard, Anderson; Ahmed, Toufeeq; Lomis, Kimberly; Johnson, Kevin; Miller, Bonnie
2016-01-01
Many medical schools are modifying curricula to reflect the rapidly evolving health care environment, but schools struggle to provide the educational information technology (IT) support to make the necessary changes. Often a medical school's IT support for the education mission derives from isolated work units employing separate technologies that are not interoperable. We launched a redesigned, tightly integrated, and novel IT infrastructure to support a completely revamped curriculum at the Vanderbilt School of Medicine. This system uses coordinated and interoperable technologies to support new instructional methods, capture students' effort, and manage feedback, allowing the monitoring of students' progress toward specific competency goals across settings and programs. The new undergraduate medical education program at Vanderbilt, entitled Curriculum 2.0, is a competency-based curriculum in which the ultimate goal is medical student advancement based on performance outcomes and personal goals rather than a time-based sequence of courses. IT support was essential in the creation of Curriculum 2.0. In addition to typical learning and curriculum management functions, IT was needed to capture data in the learning workflow for analysis, as well as for informing individual and programmatic success. We aligned people, processes, and technology to provide the IT infrastructure for the organizational transformation. Educational IT personnel were successfully realigned to create the new IT system. The IT infrastructure enabled monitoring of student performance within each competency domain across settings and time via personal student electronic portfolios. Students use aggregated performance data, derived in real time from the portfolio, for mentor-guided performance assessment, and for creation of individual learning goals and plans. Poorly performing students were identified earlier through online communication systems that alert the appropriate instructor or coach of low quiz grades or missed learning goals. Graphical and narrative displays of a student's competency performance across courses and clinical experiences informed high-stakes decisions made about student progress by the promotions committee. Similarly, graphical display of aggregate student outcomes provided education leaders with information needed to adjust and improve the curriculum. With the alignment of people, processes, and technology, educational IT can facilitate transformational steps in the training of medical students.
ERIC Educational Resources Information Center
Anwar-McHenry, Julia; Donovan, Robert John; Nicholas, Amberlee; Kerrigan, Simone; Francas, Stephanie; Phan, Tina
2016-01-01
Purpose: Mentally Healthy WA developed and implemented the Mentally Healthy Schools Framework in 2010 in response to demand from schools wanting to promote the community-based Act-Belong-Commit mental health promotion message within a school setting. Schools are an important setting for mental health promotion, therefore, the Framework encourages…
Impact of coalition interoperability on PKI
NASA Astrophysics Data System (ADS)
Krall, Edward J.
2003-07-01
This paper examines methods for providing PKI interoperability among units of a coalition of armed forces drawn from different nations. The area in question is tactical identity management, for the purposes of confidentiality, integrity and non-repudiation in such a dynamic coalition. The interoperating applications under consideration range from email and other forms of store-and-forward messaging to TLS and IPSEC-protected real-time communications. Six interoperability architectures are examined with advantages and disadvantages of each described in the paper.
Telemedicine system interoperability architecture: concept description and architecture overview.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Craft, Richard Layne, II
2004-05-01
In order for telemedicine to realize the vision of anywhere, anytime access to care, it must address the question of how to create a fully interoperable infrastructure. This paper describes the reasons for pursuing interoperability, outlines operational requirements that any interoperability approach needs to consider, proposes an abstract architecture for meeting these needs, identifies candidate technologies that might be used for rendering this architecture, and suggests a path forward that the telemedicine community might follow.
47 CFR 0.192 - Emergency Response Interoperability Center.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 47 Telecommunication 1 2014-10-01 2014-10-01 false Emergency Response Interoperability Center. 0.192 Section 0.192 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL COMMISSION ORGANIZATION Organization Public Safety and Homeland Security Bureau § 0.192 Emergency Response Interoperability Center. (a...
47 CFR 0.192 - Emergency Response Interoperability Center.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 47 Telecommunication 1 2013-10-01 2013-10-01 false Emergency Response Interoperability Center. 0.192 Section 0.192 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL COMMISSION ORGANIZATION Organization Public Safety and Homeland Security Bureau § 0.192 Emergency Response Interoperability Center. (a...
47 CFR 0.192 - Emergency Response Interoperability Center.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 47 Telecommunication 1 2011-10-01 2011-10-01 false Emergency Response Interoperability Center. 0.192 Section 0.192 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL COMMISSION ORGANIZATION Organization Public Safety and Homeland Security Bureau § 0.192 Emergency Response Interoperability Center. (a...
Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...
PharmML in Action: an Interoperable Language for Modeling and Simulation.
Bizzotto, R; Comets, E; Smith, G; Yvon, F; Kristensen, N R; Swat, M J
2017-10-01
PharmML is an XML-based exchange format created with a focus on nonlinear mixed-effect (NLME) models used in pharmacometrics, but providing a very general framework that also allows describing mathematical and statistical models such as single-subject or nonlinear and multivariate regression models. This tutorial provides an overview of the structure of this language, brief suggestions on how to work with it, and use cases demonstrating its power and flexibility. © 2017 The Authors CPT: Pharmacometrics & Systems Pharmacology published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.
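Since PharmML is plain XML, a consumer can inspect a document with any XML tooling before handing it to a dedicated converter. The sketch below lists the top-level elements of a (hypothetical) exported PharmML file; the file name is a placeholder and no specific PharmML element names are assumed.

```python
# Illustrative sketch only: reading a PharmML document as plain XML and
# listing its top-level elements. The file name is a placeholder.
import xml.etree.ElementTree as ET

def summarize_pharmml(path: str) -> None:
    """Print the tag of each direct child of the PharmML root element."""
    tree = ET.parse(path)
    root = tree.getroot()
    print("root:", root.tag)
    for child in root:
        # Tags arrive namespace-qualified, e.g. '{namespace}SomeElement'
        print("  child:", child.tag)

if __name__ == "__main__":
    summarize_pharmml("example_model.pharmml")  # hypothetical exported file
```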
Retrieving and Indexing Spatial Data in the Cloud Computing Environment
NASA Astrophysics Data System (ADS)
Wang, Yonggang; Wang, Sheng; Zhou, Daliang
In order to overcome the drawbacks of spatial data storage in common Cloud Computing platforms, we design and present a framework for retrieving, indexing, accessing and managing spatial data in the Cloud environment. An interoperable spatial data object model is provided, based on the Simple Feature coding rules from the OGC such as Well Known Binary (WKB) and Well Known Text (WKT). The classic spatial indexing algorithms such as Quad-Tree and R-Tree are re-designed for the Cloud Computing environment. Finally, we develop prototype software based on Google App Engine to implement the proposed model.
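A minimal point quadtree with insert and range query is sketched below to illustrate the kind of spatial index the paper re-designs for the cloud environment; it is a generic textbook structure, not the paper's Google App Engine implementation.

```python
# Conceptual sketch (not the paper's implementation): a minimal point
# quadtree supporting insert and rectangular range queries.
class QuadTree:
    CAPACITY = 4  # max points per node before subdividing

    def __init__(self, x0, y0, x1, y1):
        self.bounds = (x0, y0, x1, y1)
        self.points = []
        self.children = None  # four child QuadTrees after subdivision

    def _contains(self, x, y):
        x0, y0, x1, y1 = self.bounds
        return x0 <= x < x1 and y0 <= y < y1

    def insert(self, x, y):
        if not self._contains(x, y):
            return False
        if self.children is None and len(self.points) < self.CAPACITY:
            self.points.append((x, y))
            return True
        if self.children is None:
            self._subdivide()
        return any(child.insert(x, y) for child in self.children)

    def _subdivide(self):
        x0, y0, x1, y1 = self.bounds
        mx, my = (x0 + x1) / 2, (y0 + y1) / 2
        self.children = [QuadTree(x0, y0, mx, my), QuadTree(mx, y0, x1, my),
                         QuadTree(x0, my, mx, y1), QuadTree(mx, my, x1, y1)]
        for px, py in self.points:
            any(c.insert(px, py) for c in self.children)
        self.points = []

    def query(self, qx0, qy0, qx1, qy1):
        """Return all stored points inside the query rectangle."""
        x0, y0, x1, y1 = self.bounds
        if qx1 < x0 or qx0 > x1 or qy1 < y0 or qy0 > y1:
            return []
        hits = [(x, y) for x, y in self.points if qx0 <= x <= qx1 and qy0 <= y <= qy1]
        if self.children:
            for child in self.children:
                hits.extend(child.query(qx0, qy0, qx1, qy1))
        return hits

tree = QuadTree(0, 0, 100, 100)
for pt in [(10, 10), (20, 80), (55, 55), (60, 60), (61, 61), (90, 5)]:
    tree.insert(*pt)
print(tree.query(50, 50, 70, 70))  # -> [(55, 55), (60, 60), (61, 61)]
```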
Predicting the behavior of techno-social systems.
Vespignani, Alessandro
2009-07-24
We live in an increasingly interconnected world of techno-social systems, in which infrastructures composed of different technological layers are interoperating within the social component that drives their use and development. Examples are provided by the Internet, the World Wide Web, WiFi communication technologies, and transportation and mobility infrastructures. The multiscale nature and complexity of these networks are crucial features in understanding and managing the networks. The accessibility of new data and the advances in the theory and modeling of complex networks are providing an integrated framework that brings us closer to achieving true predictive power of the behavior of techno-social systems.
Strategic Adaptation of SCA for STRS
NASA Technical Reports Server (NTRS)
Quinn, Todd; Kacpura, Thomas
2007-01-01
The Space Telecommunication Radio System (STRS) architecture is being developed to provide a standard framework for future NASA space radios with greater degrees of interoperability and flexibility to meet new mission requirements. The space environment imposes unique operational requirements with restrictive size, weight, and power constraints that are significantly smaller than terrestrial-based military communication systems. With the harsh radiation environment of space, the computing and processing resources are typically one or two generations behind current terrestrial technologies. Despite these differences, there are elements of the SCA that can be adapted to facilitate the design and implementation of the STRS architecture.
A Business-to-Business Interoperability Testbed: An Overview
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kulvatunyou, Boonserm; Ivezic, Nenad; Monica, Martin
In this paper, we describe a business-to-business (B2B) testbed co-sponsored by the Open Applications Group, Inc. (OAGI) and the National Institute of Standards and Technology (NIST) to advance enterprise e-commerce standards. We describe the business and technical objectives and initial activities within the B2B Testbed. We summarize our initial lessons learned to form the requirements that drive the next-generation testbed development. We also give an overview of a promising testing framework architecture in which to drive the testbed developments. We outline the future plans for the testbed development.
HuPSON: the human physiology simulation ontology.
Gündel, Michaela; Younesi, Erfan; Malhotra, Ashutosh; Wang, Jiali; Li, Hui; Zhang, Bijun; de Bono, Bernard; Mevissen, Heinz-Theodor; Hofmann-Apitius, Martin
2013-11-22
Large biomedical simulation initiatives, such as the Virtual Physiological Human (VPH), are substantially dependent on controlled vocabularies to facilitate the exchange of information, of data and of models. Hindering these initiatives is a lack of a comprehensive ontology that covers the essential concepts of the simulation domain. We propose a first version of a newly constructed ontology, HuPSON, as a basis for shared semantics and interoperability of simulations, of models, of algorithms and of other resources in this domain. The ontology is based on the Basic Formal Ontology, and adheres to the MIREOT principles; the constructed ontology has been evaluated via structural features, competency questions and use case scenarios. The ontology is freely available at: http://www.scai.fraunhofer.de/en/business-research-areas/bioinformatics/downloads.html (owl files) and http://bishop.scai.fraunhofer.de/scaiview/ (browser). HuPSON provides a framework for a) annotating simulation experiments, b) retrieving relevant information that is required for modelling, c) enabling interoperability of algorithmic approaches used in biomedical simulation, d) comparing simulation results and e) linking knowledge-based approaches to simulation-based approaches. It is meant to foster a more rapid uptake of semantic technologies in the modelling and simulation domain, with particular focus on the VPH domain.
Open IoT Ecosystem for Enhanced Interoperability in Smart Cities-Example of Métropole De Lyon.
Robert, Jérémy; Kubler, Sylvain; Kolbe, Niklas; Cerioni, Alessandro; Gastaud, Emmanuel; Främling, Kary
2017-12-08
The Internet of Things (IoT) has promised a future where everything gets connected. Unfortunately, building a single global ecosystem of Things that communicate with each other seamlessly is virtually impossible today. The reason is that the IoT is essentially a collection of isolated "Intranets of Things", also referred to as "vertical silos", which cannot easily and efficiently interact with each other. Smart cities are perhaps the most striking examples of this problem since they comprise a wide range of stakeholders and service providers who must work together, including urban planners, financial organisations, public and private service providers, telecommunication providers, industries, citizens, and so forth. Within this context, the contribution of this paper is threefold: (i) discuss business and technological implications as well as challenges of creating successful open innovation ecosystems, (ii) present the technological building blocks underlying an IoT ecosystem developed in the framework of the EU Horizon 2020 programme, (iii) present a smart city pilot (Heat Wave Mitigation in Métropole de Lyon ) for which the proposed ecosystem significantly contributes to improving interoperability between a number of system components, and reducing regulatory barriers for joint service co-creation practices.
Open IoT Ecosystem for Enhanced Interoperability in Smart Cities—Example of Métropole De Lyon
Robert, Jérémy; Kolbe, Niklas; Cerioni, Alessandro; Gastaud, Emmanuel
2017-01-01
The Internet of Things (IoT) has promised a future where everything gets connected. Unfortunately, building a single global ecosystem of Things that communicate with each other seamlessly is virtually impossible today. The reason is that the IoT is essentially a collection of isolated “Intranets of Things”, also referred to as “vertical silos”, which cannot easily and efficiently interact with each other. Smart cities are perhaps the most striking examples of this problem since they comprise a wide range of stakeholders and service providers who must work together, including urban planners, financial organisations, public and private service providers, telecommunication providers, industries, citizens, and so forth. Within this context, the contribution of this paper is threefold: (i) discuss business and technological implications as well as challenges of creating successful open innovation ecosystems, (ii) present the technological building blocks underlying an IoT ecosystem developed in the framework of the EU Horizon 2020 programme, (iii) present a smart city pilot (Heat Wave Mitigation in Métropole de Lyon) for which the proposed ecosystem significantly contributes to improving interoperability between a number of system components, and reducing regulatory barriers for joint service co-creation practices. PMID:29292719
Patient Privacy, Consent, and Identity Management in Health Information Exchange
Hosek, Susan D.; Straus, Susan G.
2013-01-01
Abstract The Military Health System (MHS) and the Veterans Health Administration (VHA) have been among the nation's leaders in health information technology (IT), including the development of health IT systems and electronic health records that summarize patients' care from multiple providers. Health IT interoperability within MHS and across MHS partners, including VHA, is one of ten goals in the current MHS Strategic Plan. As a step toward achieving improved interoperability, the MHS is seeking to develop a research roadmap to better coordinate health IT research efforts, address IT capability gaps, and reduce programmatic risk for its enterprise projects. This article contributes to that effort by identifying gaps in research, policy, and practice involving patient privacy, consent, and identity management that need to be addressed to bring about improved quality and efficiency of care through health information exchange. Major challenges include (1) designing a meaningful patient consent procedure, (2) recording patients' consent preferences and designing procedures to implement restrictions on disclosures of protected health information, and (3) advancing knowledge regarding the best technical approaches to performing patient identity matches and how best to monitor results over time. Using a sociotechnical framework, this article suggests steps for overcoming these challenges and topics for future research. PMID:28083296
Enabling OpenID Authentication for VO-integrated Portals
NASA Astrophysics Data System (ADS)
Plante, R.; Yekkirala, V.; Baker, W.
2012-09-01
To support interoperating services that share proprietary data and other user-specific information, the VAO Project provides login services for browser-based portals built on the open standard, OpenID. To help portal developers take advantage of this service, we have developed a downloadable toolkit for integrating OpenID single sign-on support into any portal. This toolkit provides APIs in a few languages commonly used on the server-side as well as a command-line version for use in any language. In addition to describing how to use this toolkit, we also discuss the general VAO framework for single sign-on. While a portal may, if it wishes, support any OpenID provider, the VAO service provides a few extra features to support VO interoperability. This includes a portal's ability to retrieve (with the user's permission) an X.509 certificate representing the authenticated user so that the portal can access other restricted services on the user's behalf. Other standard features of OpenID allow portals to request other information about the user; this feature will be used in the future for sharing information about a user's group membership to enable sharing within a group of collaborating scientists.
Rezaeibagha, Fatemeh; Win, Khin Than; Susilo, Willy
Even though many safeguards and policies for electronic health record (EHR) security have been implemented, barriers to the privacy and security protection of EHR systems persist. This article presents the results of a systematic literature review regarding frequently adopted security and privacy technical features of EHR systems. Our inclusion criteria were full articles that dealt with the security and privacy of technical implementations of EHR systems published in English in peer-reviewed journals and conference proceedings between 1998 and 2013; 55 selected studies were reviewed in detail. We analysed the review results using two International Organization for Standardization (ISO) standards (29100 and 27002) in order to consolidate the study findings. Using this process, we identified 13 features that are essential to security and privacy in EHRs. These included system and application access control, compliance with security requirements, interoperability, integration and sharing, consent and choice mechanisms, policies and regulation, applicability and scalability, and cryptography techniques. This review highlights the importance to EHR security and privacy of technical features such as mandated access control policies, consent mechanisms for capturing patients' consent, scalability through proper architecture and frameworks, and interoperability of health information systems.
Komatsoulis, George A; Warzel, Denise B; Hartel, Francis W; Shanbhag, Krishnakant; Chilukuri, Ram; Fragoso, Gilberto; Coronado, Sherri de; Reeves, Dianne M; Hadfield, Jillaine B; Ludet, Christophe; Covitz, Peter A
2008-02-01
One of the requirements for a federated information system is interoperability, the ability of one computer system to access and use the resources of another system. This feature is particularly important in biomedical research systems, which need to coordinate a variety of disparate types of data. In order to meet this need, the National Cancer Institute Center for Bioinformatics (NCICB) has created the cancer Common Ontologic Representation Environment (caCORE), an interoperability infrastructure based on Model Driven Architecture. The caCORE infrastructure provides a mechanism to create interoperable biomedical information systems. Systems built using the caCORE paradigm address both aspects of interoperability: the ability to access data (syntactic interoperability) and understand the data once retrieved (semantic interoperability). This infrastructure consists of an integrated set of three major components: a controlled terminology service (Enterprise Vocabulary Services), a standards-based metadata repository (the cancer Data Standards Repository) and an information system with an Application Programming Interface (API) based on Domain Model Driven Architecture. This infrastructure is being leveraged to create a Semantic Service-Oriented Architecture (SSOA) for cancer research by the National Cancer Institute's cancer Biomedical Informatics Grid (caBIG).
Komatsoulis, George A.; Warzel, Denise B.; Hartel, Frank W.; Shanbhag, Krishnakant; Chilukuri, Ram; Fragoso, Gilberto; de Coronado, Sherri; Reeves, Dianne M.; Hadfield, Jillaine B.; Ludet, Christophe; Covitz, Peter A.
2008-01-01
One of the requirements for a federated information system is interoperability, the ability of one computer system to access and use the resources of another system. This feature is particularly important in biomedical research systems, which need to coordinate a variety of disparate types of data. In order to meet this need, the National Cancer Institute Center for Bioinformatics (NCICB) has created the cancer Common Ontologic Representation Environment (caCORE), an interoperability infrastructure based on Model Driven Architecture. The caCORE infrastructure provides a mechanism to create interoperable biomedical information systems. Systems built using the caCORE paradigm address both aspects of interoperability: the ability to access data (syntactic interoperability) and understand the data once retrieved (semantic interoperability). This infrastructure consists of an integrated set of three major components: a controlled terminology service (Enterprise Vocabulary Services), a standards-based metadata repository (the cancer Data Standards Repository) and an information system with an Application Programming Interface (API) based on Domain Model Driven Architecture. This infrastructure is being leveraged to create a Semantic Service Oriented Architecture (SSOA) for cancer research by the National Cancer Institute’s cancer Biomedical Informatics Grid (caBIG™). PMID:17512259
Maturity Model for Advancing Smart Grid Interoperability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Knight, Mark; Widergren, Steven E.; Mater, J.
2013-10-28
Abstract—Interoperability is about the properties of devices and systems to connect and work properly. Advancing interoperability eases integration and maintenance of the resulting interconnection. This leads to faster integration, lower labor and component costs, predictability of projects and the resulting performance, and evolutionary paths for upgrade. When specifications are shared and standardized, competition and novel solutions can bring new value streams to the community of stakeholders involved. Advancing interoperability involves reaching agreement for how things join at their interfaces. The quality of the agreements and the alignment of parties involved in the agreement present challenges that are best met with process improvement techniques. The GridWise® Architecture Council (GWAC) sponsored by the United States Department of Energy is supporting an effort to use concepts from capability maturity models used in the software industry to advance interoperability of smart grid technology. An interoperability maturity model has been drafted and experience is being gained through trials on various types of projects and community efforts. This paper describes the value and objectives of maturity models, the nature of the interoperability maturity model and how it compares with other maturity models, and experiences gained with its use.
Laplante-Lévesque, Ariane; Abrams, Harvey; Bülow, Maja; Lunner, Thomas; Nelson, John; Riis, Søren Kamaric; Vanpoucke, Filiep
2016-10-01
This article describes the perspectives of hearing device manufacturers regarding the exciting developments that the Internet makes possible. Specifically, it proposes to join forces toward interoperability and standardization of Internet and audiology. A summary of why such a collaborative effort is required is provided from historical and scientific perspectives. A roadmap toward interoperability and standardization is proposed. Information and communication technologies improve the flow of health care data and pave the way to better health care. However, hearing-related products, features, and services are notoriously heterogeneous and incompatible with other health care systems (no interoperability). Standardization is the process of developing and implementing technical standards (e.g., Noah hearing database). All parties involved in interoperability and standardization realize mutual gains by making mutually consistent decisions. De jure (officially endorsed) standards can be developed in collaboration with large national health care systems as well as spokespeople for hearing care professionals and hearing device users. The roadmap covers mutual collaboration; data privacy, security, and ownership; compliance with current regulations; scalability and modularity; and the scope of interoperability and standards. We propose to join forces to pave the way to the interoperable Internet and audiology products, features, and services that the world needs.
Reflections on the role of open source in health information system interoperability.
Sfakianakis, S; Chronaki, C E; Chiarugi, F; Conforti, F; Katehakis, D G
2007-01-01
This paper reflects on the role of open source in health information system interoperability. Open source is a driving force in computer science research and the development of information systems. It facilitates the sharing of information and ideas, enables evolutionary development and open collaborative testing of code, and broadens the adoption of interoperability standards. In health care, information systems have been developed largely ad hoc following proprietary specifications and customized design. However, the wide deployment of integrated services such as Electronic Health Records (EHRs) over regional health information networks (RHINs) relies on interoperability of the underlying information systems and medical devices. This reflection is built on the experiences of the PICNIC project that developed shared software infrastructure components in open source for RHINs and the OpenECG network that offers open source components to lower the implementation cost of interoperability standards such as SCP-ECG, in electrocardiography. Open source components implementing standards and a community providing feedback from real-world use are key enablers of health care information system interoperability. Investing in open source is investing in interoperability and a vital aspect of a long term strategy towards comprehensive health services and clinical research.
Liaw, S T; Rahimi, A; Ray, P; Taggart, J; Dennis, S; de Lusignan, S; Jalaludin, B; Yeo, A E T; Talaei-Khoei, A
2013-01-01
Effective use of routine data to support integrated chronic disease management (CDM) and population health is dependent on underlying data quality (DQ) and, for cross system use of data, semantic interoperability. An ontological approach to DQ is a potential solution but research in this area is limited and fragmented. Identify mechanisms, including ontologies, to manage DQ in integrated CDM and whether improved DQ will better measure health outcomes. A realist review of English language studies (January 2001-March 2011) which addressed data quality, used ontology-based approaches and is relevant to CDM. We screened 245 papers, excluded 26 duplicates, 135 on abstract review and 31 on full-text review; leaving 61 papers for critical appraisal. Of the 33 papers that examined ontologies in chronic disease management, 13 defined data quality and 15 used ontologies for DQ. Most saw DQ as a multidimensional construct, the most used dimensions being completeness, accuracy, correctness, consistency and timeliness. The majority of studies reported tool design and development (80%), implementation (23%), and descriptive evaluations (15%). Ontological approaches were used to address semantic interoperability, decision support, flexibility of information management and integration/linkage, and complexity of information models. DQ lacks a consensus conceptual framework and definition. DQ and ontological research is relatively immature with little rigorous evaluation studies published. Ontology-based applications could support automated processes to address DQ and semantic interoperability in repositories of routinely collected data to deliver integrated CDM. We advocate moving to ontology-based design of information systems to enable more reliable use of routine data to measure health mechanisms and impacts. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
A distributed framework for health information exchange using smartphone technologies.
Abdulnabi, Mohamed; Al-Haiqi, Ahmed; Kiah, M L M; Zaidan, A A; Zaidan, B B; Hussain, Muzammil
2017-05-01
Nationwide health information exchange (NHIE) continues to be a persistent concern for government agencies, despite the many efforts and the conceived benefits of sharing patient data among healthcare providers. Difficulties in ensuring global connectivity, interoperability, and concerns on security have always hampered the government from successfully deploying NHIE. By looking at NHIE from a fresh perspective and bearing in mind the pervasiveness and power of modern mobile platforms, this paper proposes a new approach to NHIE that builds on the notion of consumer-mediated HIE, albeit without the focus on central health record banks. With the growing acceptance of smartphones as reliable, indispensable, and most personal devices, we suggest leveraging the concept of mobile personal health records (PHRs installed on smartphones) to the next level. We envision mPHRs that take the form of distributed storage units for health information, under the full control and direct possession of patients, who can have ready access to their personal data whenever needed. However, for the actual exchange of data with health information systems managed by healthcare providers, the latter have to be interoperable with patient-carried mPHRs. The computer industry long ago solved a similar problem of interoperability between peripheral devices and operating systems. We borrow from that solution the idea of providing special interfaces between mPHRs and provider systems. This interface enables the two entities to communicate with no change to either end. The design and operation of the proposed approach are explained. Additional pointers on potential implementations are provided, and issues that pertain to any solution to implement NHIE are discussed. Copyright © 2017 Elsevier Inc. All rights reserved.
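The driver-and-operating-system analogy suggests a thin, stable interface contract between patient-carried mPHRs and provider systems. The sketch below is a purely hypothetical illustration of what such a contract could look like; every class, method, and value in it is an assumption made for this example, not part of the paper's proposal.

# Hypothetical sketch of a driver-like interface between a patient-carried
# mPHR and a provider's health information system. All names are illustrative.
from abc import ABC, abstractmethod
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class HealthRecordEntry:
    code: str            # e.g. a terminology code for the observation
    value: str
    source_system: str
    metadata: Dict[str, str] = field(default_factory=dict)


class ProviderInterface(ABC):
    """Contract a provider system exposes so any conformant mPHR can plug in."""

    @abstractmethod
    def push_entries(self, patient_id: str, entries: List[HealthRecordEntry]) -> int:
        """Accept entries from the patient's mPHR; return how many were stored."""

    @abstractmethod
    def pull_entries(self, patient_id: str) -> List[HealthRecordEntry]:
        """Return the provider-held entries the patient is authorised to copy."""


class InMemoryProvider(ProviderInterface):
    """Toy provider backend used only to exercise the interface."""

    def __init__(self):
        self._store: Dict[str, List[HealthRecordEntry]] = {}

    def push_entries(self, patient_id, entries):
        self._store.setdefault(patient_id, []).extend(entries)
        return len(entries)

    def pull_entries(self, patient_id):
        return list(self._store.get(patient_id, []))


provider = InMemoryProvider()
provider.push_entries("patient-001", [HealthRecordEntry("8867-4", "72", "mPHR-app")])
print(provider.pull_entries("patient-001"))

The point of the design is that neither side changes when the other does: the mPHR only ever sees the abstract contract, and each provider supplies its own backend behind it.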
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-14
..., Reliability, and Interoperability Council AGENCY: Federal Communications Commission. ACTION: Notice of public... persons that the Federal Communications Commission's (FCC) Communications Security, Reliability, and... the security, reliability, and interoperability of communications systems. On March 19, 2011, the FCC...
Bridging the gap between Hydrologic and Atmospheric communities through a standard based framework
NASA Astrophysics Data System (ADS)
Boldrini, E.; Salas, F.; Maidment, D. R.; Mazzetti, P.; Santoro, M.; Nativi, S.; Domenico, B.
2012-04-01
Data interoperability in the study of Earth sciences is essential to performing interdisciplinary multi-scale multi-dimensional analyses (e.g. hydrologic impacts of global warming, regional urbanization, global population growth etc.). This research aims to bridge the existing gap between hydrologic and atmospheric communities both at semantic and technological levels. Within the context of hydrology, scientists are usually concerned with data organized as time series: a time series can be seen as a variable measured at a particular point in space over a period of time (e.g. the stream flow values as periodically measured by a buoy sensor in a river); atmospheric scientists instead usually organize their data as coverages: a coverage can be seen as a multidimensional data array (e.g. satellite images acquired through time). These differences make it non-trivial to set up a common framework to perform data discovery and access. A set of web services specifications and implementations is already in place in both scientific communities to allow data discovery and access in the different domains. The CUAHSI-Hydrologic Information System (HIS) service stack lists different service types and implementations:
- a metacatalog (implemented as a CSW) used to discover metadata services by distributing the query to a set of catalogs
- time series catalogs (implemented as CSW) used to discover datasets published by the feature services
- feature services (implemented as WFS) containing features with data access links
- sensor observation services (implemented as SOS) enabling access to the stream of acquisitions
Within the Unidata framework, there lies a similar service stack for atmospheric data:
- the broker service (implemented as a CSW) distributes a user query to a set of heterogeneous services (i.e. catalog services, but also inventory and access services)
- the catalog service (implemented as a CSW) is able to harvest the available metadata offered by THREDDS services, and executes complex queries against the available metadata
- the inventory service (implemented as THREDDS), able to hierarchically organize and publish a local collection of multi-dimensional arrays (e.g. NetCDF, GRIB files), as well as publish auxiliary standard services to realize the actual data access and visualization (e.g. WCS, OPeNDAP, WMS).
The approach followed in this research is to build on top of the existing standards and implementations, by setting up a standard-aware interoperable framework, able to deal with the existing heterogeneity in an organic way. As a methodology, interoperability tests against real services were performed; existing problems were thus highlighted and possibly solved. The use of flexible tools, able to deal with heterogeneity in a smart way, has proven to be successful; in particular, experiments were carried out with both the GI-cat broker and ESRI GeoPortal frameworks. The GI-cat discovery broker proved successful at implementing the CSW interface, as well as federating heterogeneous resources, such as THREDDS and WCS services published by Unidata, and HydroServer, WFS and SOS services published by CUAHSI. Experiments with ESRI GeoPortal were also successful: the GeoPortal was used to deploy a web interface able to distribute searches amongst catalog implementations from both the hydrologic and the atmospheric communities, including HydroServers and GI-cat, combining results from both domains in a seamless way.
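Both service stacks expose CSW catalogue interfaces, so a single client request pattern can reach either community. As a hedged illustration only, the sketch below issues a standard CSW 2.0.2 GetRecords request over HTTP key-value pairs; the endpoint URL is a placeholder, and individual servers (GI-cat, GeoPortal, pycsw, etc.) may expect slightly different constraint syntax or parameter spellings.

# Hedged sketch: query a CSW 2.0.2 catalogue for records mentioning
# "streamflow". The URL is a placeholder, not a service from the study,
# and parameter details can vary by server implementation.
import requests
import xml.etree.ElementTree as ET

CSW_ENDPOINT = "https://example.org/csw"  # placeholder endpoint

params = {
    "service": "CSW",
    "version": "2.0.2",
    "request": "GetRecords",
    "typeNames": "csw:Record",
    "resultType": "results",
    "elementSetName": "summary",
    "constraintLanguage": "CQL_TEXT",
    "constraint_language_version": "1.1.0",
    "constraint": "AnyText LIKE '%streamflow%'",
    "maxRecords": "10",
}

response = requests.get(CSW_ENDPOINT, params=params, timeout=30)
response.raise_for_status()

# Print the title of each returned summary record.
ns = {
    "csw": "http://www.opengis.net/cat/csw/2.0.2",
    "dc": "http://purl.org/dc/elements/1.1/",
}
root = ET.fromstring(response.content)
for record in root.findall(".//csw:SummaryRecord", ns):
    title = record.find("dc:title", ns)
    print(title.text if title is not None else "(untitled)")

A broker such as GI-cat would sit behind an interface like this and fan the same query out to THREDDS, HydroServer, WFS and SOS endpoints on the client's behalf.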
Provenance in Data Interoperability for Multi-Sensor Intercomparison
NASA Technical Reports Server (NTRS)
Lynnes, Chris; Leptoukh, Greg; Berrick, Steve; Shen, Suhung; Prados, Ana; Fox, Peter; Yang, Wenli; Min, Min; Holloway, Dan; Enloe, Yonsook
2008-01-01
As our inventory of Earth science data sets grows, the ability to compare, merge and fuse multiple datasets grows in importance. This requires a deeper data interoperability than we have now. Efforts such as Open Geospatial Consortium and OPeNDAP (Open-source Project for a Network Data Access Protocol) have broken down format barriers to interoperability; the next challenge is the semantic aspects of the data. Consider the issues when satellite data are merged, cross-calibrated, validated, inter-compared and fused. We must match up data sets that are related, yet different in significant ways: the phenomenon being measured, measurement technique, location in space-time or quality of the measurements. If subtle distinctions between similar measurements are not clear to the user, results can be meaningless or lead to an incorrect interpretation of the data. Most of these distinctions trace to how the data came to be: sensors, processing and quality assessment. For example, monthly averages of satellite-based aerosol measurements often show significant discrepancies, which might be due to differences in spatio-temporal aggregation, sampling issues, sensor biases, algorithm differences or calibration issues. Provenance information must be captured in a semantic framework that allows data inter-use tools to incorporate it and aid in the interpretation of comparison or merged products. Semantic web technology allows us to encode our knowledge of measurement characteristics, phenomena measured, space-time representation, and data quality attributes in a well-structured, machine-readable ontology and rulesets. An analysis tool can use this knowledge to show users the provenance-related distinctions between two variables, advising on options for further data processing and analysis. An additional problem for workflows distributed across heterogeneous systems is retrieval and transport of provenance. Provenance may be either embedded within the data payload, or transmitted from server to client in an out-of-band mechanism. The out-of-band mechanism is more flexible in the richness of provenance information that can be accommodated, but it relies on a persistent framework and can be difficult for legacy clients to use. We are prototyping the embedded model, incorporating provenance within metadata objects in the data payload. Thus, it always remains with the data. The downside is a limit to the size of provenance metadata that we can include, an issue that will eventually need resolution to encompass the richness of provenance information required for data intercomparison and merging.
UAS Integration in the NAS Project: DAA-TCAS Interoperability "mini" HITL Primary Results
NASA Technical Reports Server (NTRS)
Rorie, Conrad; Fern, Lisa; Shively, Jay; Santiago, Confesor
2016-01-01
At the May 2015 SC-228 meeting, requirements for TCAS II interoperability became elevated in priority. A TCAS interoperability workgroup was formed to identify and address key issues and questions. The workgroup came up with an initial list of questions and a plan to address them. As part of that plan, NASA proposed to run a mini HITL to address display, alerting and guidance issues. A TCAS Interoperability Workshop was held to determine potential display/alerting/guidance issues that could be explored in future NASA mini HITLs. The workshop produced consensus on the main functionality of DAA guidance when a TCAS II RA occurs, a prioritized list of independent variables for the experimental design, and a set of use cases to stress TCAS interoperability.
An Ontological Solution to Support Interoperability in the Textile Industry
NASA Astrophysics Data System (ADS)
Duque, Arantxa; Campos, Cristina; Jiménez-Ruiz, Ernesto; Chalmeta, Ricardo
Significant developments in information and communication technologies and challenging market conditions have forced enterprises to adapt their way of doing business. In this context, providing mechanisms to guarantee interoperability among heterogeneous organisations has become a critical issue. Even though prolific research has already been conducted in the area of enterprise interoperability, we have found that enterprises still struggle to introduce fully interoperable solutions, especially, in terms of the development and application of ontologies. Thus, the aim of this paper is to introduce basic ontology concepts in a simple manner and to explain the advantages of the use of ontologies to improve interoperability. We will also present a case study showing the implementation of an application ontology for an enterprise in the textile/clothing sector.
77 FR 37001 - Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-20
... of the Interoperability Services Layer, Attn: Ron Chen, 400 Gigling Road, Seaside, CA 93955. Title; Associated Form; and OMB Number: Interoperability Services Layer; OMB Control Number 0704-TBD. Needs and Uses... INFORMATION: Summary of Information Collection IoLS (Interoperability Layer Services) is an application in a...
He, Yongqun; Xiang, Zuoshuang; Zheng, Jie; Lin, Yu; Overton, James A; Ong, Edison
2018-01-12
Ontologies are critical to data/metadata and knowledge standardization, sharing, and analysis. With hundreds of biological and biomedical ontologies developed, it has become critical to ensure ontology interoperability and the usage of interoperable ontologies for standardized data representation and integration. The suite of web-based Ontoanimal tools (e.g., Ontofox, Ontorat, and Ontobee) supports different aspects of extensible ontology development. By summarizing the common features of Ontoanimal and other similar tools, we identified and proposed an "eXtensible Ontology Development" (XOD) strategy and its associated four principles. These XOD principles are to reuse existing terms and semantic relations from reliable ontologies, to develop and apply well-established ontology design patterns (ODPs), and to involve community efforts to support new ontology development, promoting standardized and interoperable data and knowledge representation and integration. The adoption of the XOD strategy, together with robust XOD tool development, will greatly support ontology interoperability and robust ontology applications to support data to be Findable, Accessible, Interoperable and Reusable (i.e., FAIR).
Achieving Interoperability in GEOSS - How Close Are We?
NASA Astrophysics Data System (ADS)
Arctur, D. K.; Khalsa, S. S.; Browdy, S. F.
2010-12-01
A primary goal of the Global Earth Observing System of Systems (GEOSS) is improving the interoperability between the observational, modelling, data assimilation, and prediction systems contributed by member countries. The GEOSS Common Infrastructure (GCI) comprises the elements designed to enable discovery and access to these diverse data and information sources. But to what degree can the mechanisms for accessing these data, and the data themselves, be considered interoperable? Will the separate efforts by Communities of Practice within GEO to build their own portals, such as for Energy, Biodiversity, and Air Quality, lead to fragmentation or synergy? What communication and leadership do we need with these communities to improve interoperability both within and across such communities? The Standards and Interoperability Forum (SIF) of GEO's Architecture and Data Committee has assessed progress towards achieving the goal of global interoperability and made recommendations regarding evolution of the architecture and overall data strategy to ensure fulfillment of the GEOSS vision. This presentation will highlight the results of this study, and directions for further work.
Personal Health Records: Is Rapid Adoption Hindering Interoperability?
Studeny, Jana; Coustasse, Alberto
2014-01-01
The establishment of the Meaningful Use criteria has created a critical need for robust interoperability of health records. A universal definition of a personal health record (PHR) has not been agreed upon. Standardized code sets have been built for specific entities, but integration between them has not been supported. The purpose of this research study was to explore the hindrance and promotion of interoperability standards in relationship to PHRs to describe interoperability progress in this area. The study was conducted following the basic principles of a systematic review, with 61 articles used in the study. Lagging interoperability has stemmed from slow adoption by patients, creation of disparate systems due to rapid development to meet requirements for the Meaningful Use stages, and rapid early development of PHRs prior to the mandate for integration among multiple systems. Findings of this study suggest that deadlines for implementation to capture Meaningful Use incentive payments are supporting the creation of PHR data silos, thereby hindering the goal of high-level interoperability. PMID:25214822
Key pillars of data interoperability in Earth Sciences - INSPIRE and beyond
NASA Astrophysics Data System (ADS)
Tomas, Robert; Lutz, Michael
2013-04-01
The well-known heterogeneity and fragmentation of data models, formats and controlled vocabularies of environmental data limit potential data users from utilising the wealth of environmental information available today across Europe. The main aim of INSPIRE is to improve this situation and give users the possibility to access, use and correctly interpret environmental data. Over the past years, a number of INSPIRE technical guidelines (TG) and implementing rules (IR) for interoperability have been developed, involving hundreds of domain experts from across Europe. The data interoperability specifications, which have been developed for all 34 INSPIRE spatial data themes, are the central component of the TG and IR. Several of these themes are related to the earth sciences, e.g. geology (including hydrogeology, geophysics and geomorphology), mineral and energy resources, soil science, natural hazards, meteorology, oceanography, hydrology and land cover. The following main pillars for data interoperability and harmonisation have been identified during the development of the specifications: Conceptual data models describe the spatial objects and their properties and relationships for the different spatial data themes. To achieve cross-domain harmonization, the data models for all themes are based on a common modelling framework (the INSPIRE Generic Conceptual Model) and managed in a common UML repository. Harmonised vocabularies (or code lists) are to be used in data exchange in order to overcome interoperability issues caused by heterogeneous free-text and/or multi-lingual content. Since a mapping to a harmonized vocabulary could be difficult, the INSPIRE data models typically allow the provision of more specific terms from local vocabularies in addition to the harmonized terms - utilizing either the extensibility options or additional terminological attributes. Encoding. Currently, specific XML profiles of the Geography Markup Language (GML) are promoted as the standard encoding. However, since the conceptual models are independent of concrete encodings, it is also possible to derive other encodings (e.g. based on RDF). Registers provide unique and persistent identifiers for a number of different types of information items (e.g. terms from a controlled vocabulary or units of measure) and allow their consistent management and versioning. By using these identifiers in data, references to specific information items can be made unique and unambiguous. It is important that these interoperability solutions are not developed in isolation - for Europe only. This has been identified from the beginning, and therefore, international standards have been taken into account and been widely referred to in INSPIRE. This mutual cooperation with international standardisation activities needs to be maintained or even extended. For example, where INSPIRE has gone beyond existing standards, the INSPIRE interoperability solutions should be introduced to the international standardisation initiatives. However, in some cases, it is difficult to choose the appropriate international organization or standardisation body (e.g. where there are several organizations overlapping in scope) or to achieve international agreements that accept European specifics. Furthermore, the development of the INSPIRE specifications (to be legally adopted in 2013) is only a beginning of the effort to make environmental data interoperable.
Their actual implementation by data providers across Europe, as well as the rapid development in the earth sciences (e.g. from new simulation models, scientific advances, etc.) and ICT technology will lead to requests for changes. It is therefore crucial to ensure the long-term sustainable maintenance and further development of the proposed infrastructure. This task cannot be achieved by the INSPIRE coordination team of the European Commission alone. It is therefore crucial to closely involve relevant (where possible, umbrella) organisations in the earth sciences, who can provide the necessary domain knowledge and expert networks.
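Of the pillars listed above, harmonised code lists referenced by persistent registry identifiers, together with the possibility of a non-GML (e.g. RDF) encoding, are the easiest to make concrete. The sketch below uses rdflib to attach both a registry code-list URI and a more specific local term to a fictitious spatial object. The data namespace, the property names, and the particular code-list value are illustrative assumptions, not content taken from the INSPIRE specifications.

# Illustrative sketch: encode a reference to a harmonised code-list value as
# RDF, as one possible non-GML encoding of an INSPIRE-style property.
# The subject, predicates, and code-list URI below are assumptions only.
from rdflib import Graph, Literal, Namespace

EX = Namespace("http://example.org/so/")           # fictitious data namespace
CODELIST = "http://inspire.ec.europa.eu/codelist"  # registry base, used illustratively

g = Graph()
borehole = EX["borehole/42"]

# Point at a harmonised term via its registry URI...
g.add((borehole, EX.purpose,
       Literal(f"{CODELIST}/BoreholePurposeValue/geotechnicalSurvey")))
# ...and keep the more specific local term alongside it, as the data models allow.
g.add((borehole, EX.localPurposeTerm,
       Literal("site investigation borehole", lang="en")))

print(g.serialize(format="turtle"))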
Organisational Interoperability: Evaluation and Further Development of the OIM Model
2003-06-01
an Organizational Interoperability Maturity Model (OIM) to evaluate interoperability at the organizational level. The OIM considers the human ... activity aspects of military operations, which are not covered in other models. This paper describes how the model has been used to identify problems and to
E-Services quality assessment framework for collaborative networks
NASA Astrophysics Data System (ADS)
Stegaru, Georgiana; Danila, Cristian; Sacala, Ioan Stefan; Moisescu, Mihnea; Mihai Stanescu, Aurelian
2015-08-01
In a globalised networked economy, collaborative networks (CNs) are formed to take advantage of new business opportunities. Collaboration involves shared resources and capabilities, such as e-Services that can be dynamically composed to automate CN participants' business processes. Quality is essential for the success of business process automation. Current approaches mostly focus on quality of service (QoS)-based service selection and ranking algorithms, overlooking the process of service composition which requires interoperable, adaptable and secure e-Services to ensure seamless collaboration, data confidentiality and integrity. Lack of assessment of these quality attributes can result in e-Service composition failure. The quality of e-Service composition relies on the quality of each e-Service and on the quality of the composition process. Therefore, there is the need for a framework that addresses quality from both views: product and process. We propose a quality of e-Service composition (QoESC) framework for quality assessment of e-Service composition for CNs which comprises of a quality model for e-Service evaluation and guidelines for quality of e-Service composition process. We implemented a prototype considering a simplified telemedicine use case which involves a CN in e-Healthcare domain. To validate the proposed quality-driven framework, we analysed service composition reliability with and without using the proposed framework.
Cross Support Transfer Service (CSTS) Framework Library
NASA Technical Reports Server (NTRS)
Ray, Timothy
2014-01-01
Within the Consultative Committee for Space Data Systems (CCSDS), there is an effort to standardize data transfer between ground stations and control centers. CCSDS plans to publish a collection of transfer services that will each address the transfer of a particular type of data (e.g., tracking data). These services will be called Cross Support Transfer Services (CSTSs). All of these services will make use of a common foundation that is called the CSTS Framework. This library implements the User side of the CSTS Framework. "User side" means that the library performs the role that is typically expected of the control center. This library was developed in support of the Goddard Data Standards program. This technology could be applicable for control centers, and possibly for use in control center simulators needed to test ground station capabilities. The main advantages of this implementation are its flexibility and simplicity. It provides the framework capabilities, while allowing the library user to provide a wrapper that adapts the library to any particular environment. The main purpose of this implementation was to support the inter-operability testing required by CCSDS. In addition, it is likely that the implementation will be useful within the Goddard mission community (for use in control centers).
Rattner, Alexander S.; Guillen, Donna Post; Joshi, Alark; ...
2016-03-17
Photo- and physically realistic techniques are often insufficient for visualization of fluid flow simulations, especially for 3D and time-varying studies. Substantial research effort has been dedicated to the development of non-photorealistic and illustration-inspired visualization techniques for compact and intuitive presentation of such complex datasets. However, a great deal of work has been reproduced in this field, as many research groups have developed specialized visualization software. Additionally, interoperability between illustrative visualization software is limited due to diverse processing and rendering architectures employed in different studies. In this investigation, a framework for illustrative visualization is proposed, and implemented in MarmotViz, a ParaView plug-in, enabling its use on a variety of computing platforms with various data file formats and mesh geometries. Region-of-interest identification and feature-tracking algorithms incorporated into this tool are described. Implementations of multiple illustrative effect algorithms are also presented to demonstrate the use and flexibility of this framework. Here, by providing an integrated framework for illustrative visualization of CFD data, MarmotViz can serve as a valuable asset for the interpretation of simulations of ever-growing scale.
Kase, Courtney; Hoover, Sharon; Boyd, Gina; West, Kristina D; Dubenitz, Joel; Trivedi, Pamala A; Peterson, Hilary J; Stein, Bradley D
2017-07-01
There is an unmet need for behavioral health support and services among children and adolescents, which school behavioral health has the potential to address. Existing reviews and meta-analyses document the behavioral health benefits of school behavioral health programs and frameworks, but few summaries of the academic benefits of such programs exist. We provide exemplars of the academic benefits of school behavioral health programs and frameworks. A literature review identified school behavioral health-related articles and reports. Articles for inclusion were restricted to those that were school-based programs and frameworks in the United States that included an empirical evaluation of intervention academic-related outcomes. Findings from 36 primary research, review, and meta-analysis articles from the past 17 years show the benefits of school behavioral health clinical interventions and targeted interventions on a range of academic outcomes for adolescents. Our findings are consistent with reports documenting health benefits of school behavioral health frameworks and programs and can facilitate further efforts to support school behavioral health for a range of stakeholders interested in the benefits of school behavioral health programs and frameworks on academic outcomes. © 2017, American School Health Association.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-13
..., Reliability, and Interoperability Council AGENCY: Federal Communications Commission. ACTION: Notice of public..., Reliability, and Interoperability Council (CSRIC) will hold its fifth meeting. The CSRIC will vote on... to the FCC regarding best practices and actions the FCC can take to ensure the security, reliability...
Evaluation of Interoperability Protocols in Repositories of Electronic Theses and Dissertations
ERIC Educational Resources Information Center
Hakimjavadi, Hesamedin; Masrek, Mohamad Noorman
2013-01-01
Purpose: The purpose of this study is to evaluate the status of eight interoperability protocols within repositories of electronic theses and dissertations (ETDs) as an introduction to further studies on feasibility of deploying these protocols in upcoming areas of interoperability. Design/methodology/approach: Three surveys of 266 ETD…
Examining the Relationship between Electronic Health Record Interoperability and Quality Management
ERIC Educational Resources Information Center
Purcell, Bernice M.
2013-01-01
A lack of interoperability impairs data quality among health care providers' electronic health record (EHR) systems. The problem is whether the International Organization for Standardization (ISO) 9000 principles relate to the problem of interoperability in implementation of EHR systems. The purpose of the nonexperimental quantitative research…
Interoperability of Demand Response Resources Demonstration in NY
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wellington, Andre
2014-03-31
The Interoperability of Demand Response Resources Demonstration in NY (Interoperability Project) was awarded to Con Edison in 2009. The objective of the project was to develop and demonstrate methodologies to enhance the ability of customer sited Demand Response resources to integrate more effectively with electric delivery companies and regional transmission organizations.
Watershed and Economic Data InterOperability (WEDO) is a system of information technologies designed to publish watershed modeling studies for reuse. WEDO facilitates three aspects of interoperability: discovery, evaluation and integration of data. This increased level of interop...
Reminiscing about 15 years of interoperability efforts
Van de Sompel, Herbert; Nelson, Michael L.
2015-11-01
Over the past fifteen years, our perspective on tackling information interoperability problems for web-based scholarship has evolved significantly. In this opinion piece, we look back at three efforts that we have been involved in that aptly illustrate this evolution: OAI-PMH, OAI-ORE, and Memento. Understanding that no interoperability specification is neutral, we attempt to characterize the perspectives and technical toolkits that provided the basis for these endeavors. In that regard, we consider repository-centric and web-centric interoperability perspectives, and the use of a Linked Data or a REST/HATEOAS technology stack, respectively. In addition, we lament the lack of interoperability across nodes that play a role in web-based scholarship, but end on a constructive note with some ideas regarding a possible path forward.
A wearable context aware system for ubiquitous healthcare.
Kang, Dong-Oh; Lee, Hyung-Jik; Ko, Eun-Jung; Kang, Kyuchang; Lee, Jeunwoo
2006-01-01
Recent developments in information technologies are leading to the era of ubiquitous healthcare, which means healthcare services at any time and in any place. Ubiquitous healthcare services need a wearable system for continual measurement of a user's biological signals, which provides information about the user from wearable sensors. In this paper, we propose a wearable context aware system for ubiquitous healthcare and a systematic design process for ubiquitous healthcare services. Some wearable sensor systems are introduced with Zigbee communication. We develop a context aware framework that sends information from wearable sensors to healthcare service entities, acting as middleware to solve the interoperability problem between sensor makers and healthcare service providers. We then propose a systematic process for designing ubiquitous healthcare services with the context aware framework. In order to show the feasibility of the proposed system, some application examples are given, covering remote monitoring and a self-check service.
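The middleware role described here, decoupling sensor makers from healthcare service providers, is essentially a publish/subscribe broker. The following is a hypothetical, much-simplified sketch of that idea; the class, the context type strings, and the sample event are all illustrative assumptions rather than the framework in the paper.

# Hypothetical sketch of a tiny context-aware middleware: wearable sensor
# adapters publish readings to a broker, and healthcare services subscribe
# to the context they care about. All names are illustrative assumptions.
from collections import defaultdict
from typing import Callable, Dict, List


class ContextBroker:
    """Decouples sensor producers from healthcare service consumers."""

    def __init__(self):
        self._subscribers: Dict[str, List[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, context_type: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[context_type].append(handler)

    def publish(self, context_type: str, event: dict) -> None:
        for handler in self._subscribers[context_type]:
            handler(event)


broker = ContextBroker()

# A remote-monitoring service registers interest in heart-rate context only.
broker.subscribe("heart_rate", lambda e: print("remote monitor received:", e))

# A Zigbee sensor adapter publishes a reading without knowing the consumers.
broker.publish("heart_rate", {"user": "u-17", "bpm": 72, "source": "zigbee-chest-strap"})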
A Systems Approach to Biometrics in the Military Domain.
Wilson, Lauren; Gahan, Michelle; Lennard, Chris; Robertson, James
2018-02-21
Forensic biometrics is the application of forensic science principles to physical and behavioral characteristics. Forensic biometrics is a secondary sub-system in the forensic science "system of systems," which describes forensic science as a sub-system in the larger criminal justice, law enforcement, intelligence, and military system. The purpose of this paper is to discuss biometrics in the military domain and integration into the wider forensic science system of systems. The holistic system thinking methodology was applied to the U.S. biometric system to map it to the system of systems framework. The U.S. biometric system is used as a case study to help guide other countries to develop military biometric systems that are integrated and interoperable at the whole-of-government level. The aim is to provide the system of systems framework for agencies to consider for proactive design of biometric systems. © 2018 American Academy of Forensic Sciences.
Bayne, Jay S
2008-06-01
In support of a generalization of systems theory, this paper introduces a new approach in modeling complex distributed systems. It offers an analytic framework for describing the behavior of interactive cyberphysical systems (CPSs), which are networked stationary or mobile information systems responsible for the real-time governance of physical processes whose behaviors unfold in cyberspace. The framework is predicated on a cyberspace-time reference model comprising three spatial dimensions plus time. The spatial domains include geospatial, infospatial, and sociospatial references, the latter describing relationships among sovereign enterprises (rational agents) that choose voluntarily to organize and interoperate for individual and mutual benefit through geospatial (physical) and infospatial (logical) transactions. Of particular relevance to CPSs are notions of timeliness and value, particularly as they relate to the real-time governance of physical processes and engagements with other cooperating CPS. Our overarching interest, as with celestial mechanics, is in the formation and evolution of clusters of cyberspatial objects and the federated systems they form.
Coding conventions and principles for a National Land-Change Modeling Framework
Donato, David I.
2017-07-14
This report establishes specific rules for writing computer source code for use with the National Land-Change Modeling Framework (NLCMF). These specific rules consist of conventions and principles for writing code primarily in the C and C++ programming languages. Collectively, these coding conventions and coding principles create an NLCMF programming style. In addition to detailed naming conventions, this report provides general coding conventions and principles intended to facilitate the development of high-performance software implemented with code that is extensible, flexible, and interoperable. Conventions for developing modular code are explained in general terms and also enabled and demonstrated through the appended templates for C++ base source-code and header files. The NLCMF limited-extern approach to module structure, code inclusion, and cross-module access to data is both explained in the text and then illustrated through the module templates. Advice on the use of global variables is provided.
Kinjo, Akira R.; Bekker, Gert-Jan; Suzuki, Hirofumi; Tsuchiya, Yuko; Kawabata, Takeshi; Ikegawa, Yasuyo; Nakamura, Haruki
2017-01-01
The Protein Data Bank Japan (PDBj, http://pdbj.org), a member of the worldwide Protein Data Bank (wwPDB), accepts and processes the deposited data of experimentally determined macromolecular structures. While maintaining the archive in collaboration with other wwPDB partners, PDBj also provides a wide range of services and tools for analyzing structures and functions of proteins. We herein outline the updated web user interfaces together with RESTful web services and the backend relational database that support the former. To enhance the interoperability of the PDB data, we have previously developed PDB/RDF, PDB data in the Resource Description Framework (RDF) format, which is now a wwPDB standard called wwPDB/RDF. We have enhanced the connectivity of the wwPDB/RDF data by incorporating various external data resources. Services for searching, comparing and analyzing the ever-increasing large structures determined by hybrid methods are also described. PMID:27789697
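As a hedged illustration of consuming RDF-formatted structure data such as wwPDB/RDF, the sketch below sends a deliberately schema-agnostic SPARQL query (it uses only standard RDF vocabulary) to a SPARQL endpoint. The endpoint URL is a placeholder assumption; the actual PDBj services should be consulted for the correct address and for the wwPDB/RDF schema itself.

# Hedged sketch: run a schema-agnostic SPARQL query against an RDF store
# holding wwPDB/RDF-style data. The endpoint URL is a placeholder.
from SPARQLWrapper import SPARQLWrapper, JSON

ENDPOINT = "https://example.org/sparql"  # placeholder, not an official PDBj URL

query = """
SELECT ?class (COUNT(?s) AS ?n)
WHERE { ?s a ?class }
GROUP BY ?class
ORDER BY DESC(?n)
LIMIT 10
"""

client = SPARQLWrapper(ENDPOINT)
client.setQuery(query)
client.setReturnFormat(JSON)
results = client.query().convert()

# List the ten most common RDF classes and how many subjects use each.
for row in results["results"]["bindings"]:
    print(row["class"]["value"], row["n"]["value"])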
Executing Medical Guidelines on the Web: Towards Next Generation Healthcare
NASA Astrophysics Data System (ADS)
Argüello, M.; Des, J.; Fernandez-Prieto, M. J.; Perez, R.; Paniagua, H.
There is still a lack of full integration between current Electronic Health Records (EHRs) and the medical guidelines that encapsulate evidence-based medicine. Thus, general practitioners (GPs) and specialised physicians still have to read document-based medical guidelines and decide among various options for managing common non-life-threatening conditions, where selecting the most appropriate therapeutic option for each individual patient can be a difficult task. This paper presents a simulation framework and computational test-bed, called the V.A.F. Framework, for supporting simulations of clinical situations. The framework boosts the integration between Health Level Seven (HL7) and Semantic Web technologies (OWL, SWRL, and OWL-S) to achieve content-layer interoperability between online clinical cases and medical guidelines. It thereby demonstrates that tighter integration between EHRs and evidence-based medicine can be accomplished, which could lead to a next generation of healthcare systems that provide more support to physicians and increase patients' safety.
NASA Astrophysics Data System (ADS)
Tobar, R. J.; von Brand, H.; Araya, M. A.; Juerges, T.
2010-12-01
The ALMA Common Software (ACS) framework lacks the real-time capabilities needed to control the antennas' instrumentation, as previous work has shown, which has led to non-portable workarounds. Indeed, the time service used in ACS, based on the Container/Component model, provides ample evidence confirming this limitation. This work addresses the problem of designing and integrating a real-time service for ACS, providing the framework with an implementation that allows control operations over the different instruments to be performed within real-time constraints. This implementation is compared with the current time service, showing the difference between the two systems when they are subjected to common scenarios. The new implementation also follows the POSIX specification, ensuring interoperability and portability across different operating systems.
Enhancing the many-to-many relations across IHE document sharing communities.
Ribeiro, Luís S; Costa, Carlos; Oliveira, José Luís
2012-01-01
The Integrating the Healthcare Enterprise (IHE) initiative is an ongoing project aiming to enable true inter-site interoperability in the health IT field. IHE is a work in progress, and many challenges need to be overcome before healthcare institutions can share patient clinical records transparently and effortlessly. Configuring, deploying and testing an IHE document sharing community requires a significant effort to plan and maintain the supporting IT infrastructure. With the new paradigm of cloud computing, it is now possible to launch software appliances on demand and pay according to usage. This paper presents a framework designed to expedite the creation of IHE document sharing communities. It provides semi-ready templates of sharing communities that can be customized according to the needs of each community. The framework acts as a meeting point for healthcare institutions, creating a favourable environment that may converge in new inter-institutional professional relationships and, eventually, the creation of new Affinity Domains.
The HDF Product Designer - Interoperability in the First Mile
NASA Astrophysics Data System (ADS)
Lee, H.; Jelenak, A.; Habermann, T.
2014-12-01
Interoperable data have been a long-time goal in many scientific communities. The recent growth in analysis, visualization and mash-up applications that expect data stored in a standardized manner has brought the interoperability issue to the fore. On the other hand, producing interoperable data is often regarded as a sideline task in a typical research team, for which resources are not readily available. The HDF Group is developing a software tool aimed at lessening the burden of creating data in standards-compliant, interoperable HDF5 files. The tool, named HDF Product Designer, lowers the threshold needed to design such files by providing a user interface that combines the rich HDF5 feature set with applicable metadata conventions. Users can quickly devise new HDF5 files while at the same time seamlessly incorporating the latest best practices and conventions from their community. That is what the term interoperability in the first mile means: enabling generation of interoperable data in HDF5 files from the onset of their production. The tool also incorporates collaborative features, allowing a team approach to file design as well as easy transfer of best practices as they are developed. The current state of the tool and plans for future development will be presented. Constructive input from interested parties is always welcome.
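The abstract does not include examples, but the general idea of embedding community conventions at file-creation time can be sketched with h5py. The convention name, group layout, and attribute choices below are illustrative assumptions, not output of the HDF Product Designer itself.

```python
# Minimal sketch: writing an HDF5 file that carries convention metadata at
# creation time. The convention, group layout and attribute names are
# illustrative assumptions, not files produced by HDF Product Designer.
import numpy as np
import h5py

with h5py.File("example_product.h5", "w") as f:
    # File-level metadata following a (hypothetical) community convention
    f.attrs["Conventions"] = "CF-1.8"
    f.attrs["title"] = "Example interoperable data product"
    f.attrs["institution"] = "Example research team"

    grp = f.create_group("observations")
    dset = grp.create_dataset(
        "brightness_temperature",
        data=np.random.rand(180, 360).astype("float32"),
        compression="gzip",
    )
    # Variable-level attributes that downstream tools commonly expect
    dset.attrs["units"] = "K"
    dset.attrs["long_name"] = "simulated brightness temperature"
    dset.attrs["_FillValue"] = np.float32(-9999.0)
```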
Potential interoperability problems facing multi-site radiation oncology centers in The Netherlands
NASA Astrophysics Data System (ADS)
Scheurleer, J.; Koken, Ph; Wessel, R.
2014-03-01
Aim: To identify potential interoperability problems facing multi-site Radiation Oncology (RO) departments in the Netherlands and solutions for unambiguous multi-system workflows. Specific challenges confronting the RO department of VUmc (RO-VUmc), which is soon to open a satellite department, were characterized. Methods: A nationwide questionnaire survey was conducted to identify possible interoperability problems and solutions. Further detailed information was obtained by in-depth interviews at 3 Dutch RO institutes that already operate in more than one site. Results: The survey had a 100% response rate (n=21). Altogether 95 interoperability problems were described. Most reported problems were on a strategic and semantic level. The majority were DICOM(-RT) and HL7 related (n=65), primarily between treatment planning and verification systems or between departmental and hospital systems. Seven were identified as being relevant for RO-VUmc. Departments have overcome interoperability problems with their own, or with tailor-made vendor solutions. There was little knowledge about or utilization of solutions developed by Integrating the Healthcare Enterprise Radiation Oncology (IHE-RO). Conclusions: Although interoperability problems are still common, solutions have been identified. Awareness of IHE-RO needs to be raised. No major new interoperability problems are predicted as RO-VUmc develops into a multi-site department.
Integration and visualization of systems biology data in context of the genome
2010-01-01
Background: High-density tiling arrays and new sequencing technologies are generating rapidly increasing volumes of transcriptome and protein-DNA interaction data. Visualization and exploration of this data is critical to understanding the regulatory logic encoded in the genome by which the cell dynamically affects its physiology and interacts with its environment. Results: The Gaggle Genome Browser is a cross-platform desktop program for interactively visualizing high-throughput data in the context of the genome. Important features include dynamic panning and zooming, keyword search and open interoperability through the Gaggle framework. Users may bookmark locations on the genome with descriptive annotations and share these bookmarks with other users. The program handles large sets of user-generated data using an in-process database and leverages the facilities of SQL and the R environment for importing and manipulating data. A key aspect of the Gaggle Genome Browser is interoperability. By connecting to the Gaggle framework, the genome browser joins a suite of interconnected bioinformatics tools for analysis and visualization with connectivity to major public repositories of sequences, interactions and pathways. To this flexible environment for exploring and combining data, the Gaggle Genome Browser adds the ability to visualize diverse types of data in relation to their coordinates on the genome. Conclusions: Genomic coordinates function as a common key by which disparate biological data types can be related to one another. In the Gaggle Genome Browser, heterogeneous data are joined by their location on the genome to create information-rich visualizations yielding insight into genome organization, transcription and its regulation and, ultimately, a better understanding of the mechanisms that enable the cell to dynamically respond to its environment. PMID:20642854
Overview of NASA MSFC IEC Federated Engineering Collaboration Capability
NASA Technical Reports Server (NTRS)
Moushon, Brian; McDuffee, Patrick
2005-01-01
The MSFC IEC federated engineering framework is being developed to provide a single collaborative engineering environment across independent NASA centers. The federated approach allows NASA centers to maintain diversity and uniqueness while providing interoperability; their systems are integrated in a federated framework without compromising individual center capabilities. MSFC IEC's Federation Framework will have a direct effect on how engineering data are managed across the Agency. The approach responds directly to Columbia Accident Investigation Board (CAIB) finding F7.4-11, which states that the Space Shuttle Program has a wealth of data tucked away in multiple databases without a convenient way to integrate and use the data for management, engineering, or safety decisions. IEC's federated capability is further supported by OneNASA recommendation 6, which identifies the need to enhance cross-Agency collaboration by putting in place common engineering and collaborative tools and databases, processes, and knowledge-sharing structures. MSFC's IEC Federated Framework is loosely connected to other engineering applications that can provide users with the integration needed to achieve an Agency view of the entire product definition and development process, while allowing work to be distributed across NASA Centers and contractors. The IEC DDMS federation framework eliminates the need to develop a single, enterprise-wide data model, since the goal of having a common data model shared between NASA centers and contractors is very difficult to achieve.
Ross, Simone J; Preston, Robyn; Lindemann, Iris C; Matte, Marie C; Samson, Rex; Tandinco, Filedito D; Larkins, Sarah L; Palsdottir, Bjorg; Neusy, Andre-Jacques
2014-01-01
The Training for Health Equity Network (THEnet), a group of diverse health professional schools aspiring toward social accountability, developed and pilot tested a comprehensive evaluation framework to assess progress toward socially accountable health professions education. The evaluation framework provides criteria for schools to assess their level of social accountability within their organization and planning; education, research and service delivery; and the direct and indirect impacts of the school and its graduates on the community and health system. This paper describes the pilot testing of the evaluation framework across five THEnet schools and examines whether the framework was practical and feasible across contexts for the purposes of critical reflection and continuous improvement in terms of progress towards social accountability. In this pilot study, schools applied the evaluation framework using a mixed-methods approach to data collection comprising workshops, qualitative interviews and focus group discussions, document review, and collation and analysis of existing quantitative data. The evaluation framework allowed each school to contextually gather evidence on how it was meeting the aspirational goals of social accountability across a range of school activities, and to identify strengths and areas for improvement and development. The evaluation framework pilot study demonstrated how social accountability can be assessed through a critically reflective and comprehensive process. As social accountability focuses on the relationship between health professions schools and health system and health population outcomes, each school was able to demonstrate to students, health professionals, governments, accrediting bodies, communities and other stakeholders how the current and future health care needs of populations are addressed in terms of education, research, and service learning.
High School Improvement: Indicators of Effectiveness and School-Level Benchmarks
ERIC Educational Resources Information Center
National High School Center, 2012
2012-01-01
The National High School Center's "Eight Elements of High School Improvement: A Mapping Framework" provides a cohesive high school improvement framework comprised of eight elements and related indicators of effectiveness. These indicators of effectiveness allow states, districts, and schools to identify strengths and weaknesses of their current…
NASA Astrophysics Data System (ADS)
Peckham, S. D.
2013-12-01
Model coupling frameworks like CSDMS (Community Surface Dynamics Modeling System) and ESMF (Earth System Modeling Framework) have developed mechanisms that allow heterogeneous sets of process models to be assembled in a plug-and-play manner to create composite "system models". These mechanisms facilitate code reuse, but must simultaneously satisfy many different design criteria. They must be able to mediate or compensate for differences between the process models, such as their different programming languages, computational grids, time-stepping schemes, variable names and variable units. However, they must achieve this interoperability in a way that: (1) is noninvasive, requiring only relatively small and isolated changes to the original source code, (2) does not significantly reduce performance, (3) is not time-consuming or confusing for a model developer to implement, (4) can very easily be updated to accommodate new versions of a given process model and (5) does not shift the burden of providing model interoperability to the model developers, e.g. by requiring them to provide their output in specific forms that meet the input requirements of other models. In tackling these design challenges, model framework developers have learned that the best solution is to provide each model with a simple, standardized interface, i.e. a set of standardized functions that make the model: (1) fully-controllable by a caller (e.g. a model framework) and (2) self-describing. Model control functions are separate functions that allow a caller to initialize the model, advance the model's state variables in time and finalize the model. Model description functions allow a caller to retrieve detailed information on the model's input and output variables, its computational grid and its timestepping scheme. If the caller is a modeling framework, it can compare the answers to these queries with similar answers from other process models in a collection and then automatically call framework service components as necessary to mediate the differences between the coupled models. This talk will first review two key products of the CSDMS project, namely a standardized model interface called the Basic Model Interface (BMI) and the CSDMS Standard Names. The standard names are used in conjunction with BMI to provide a semantic matching mechanism that allows output variables from one process model to be reliably used as input variables to other process models in a collection. They include not just a standardized naming scheme for model variables, but also a standardized set of terms for describing the attributes and assumptions of a given model. To illustrate the power of standardized model interfaces and metadata, a smart, light-weight modeling framework written in Python will be introduced that can automatically (without user intervention) couple a set of BMI-enabled hydrologic process components together to create a spatial hydrologic model. The same mechanisms could also be used to provide seamless integration (import/export) of data and models.
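To make the idea of a standardized model interface concrete, here is a minimal Python sketch in the spirit of BMI. The method names mirror the control and description functions described above, but the class and its variable name are a simplified illustration under assumed conventions, not the official BMI specification.

```python
# Minimal sketch of a BMI-style interface for a toy process model.
# Method names mirror the control/description functions discussed above;
# this is a simplified illustration, not the official BMI specification.
import numpy as np

class ToyInfiltrationModel:
    # --- control functions ---
    def initialize(self, cfg_file=None):
        self.dt = 3600.0                     # time step in seconds
        self.time = 0.0
        self.depth = np.zeros(100)           # water depth on a 1-D grid

    def update(self):
        # Advance the state one time step (placeholder physics).
        self.depth += 0.001
        self.time += self.dt

    def finalize(self):
        self.depth = None

    # --- description functions used by a coupling framework ---
    def get_output_var_names(self):
        return ["soil_water__depth"]         # standard-name-style label

    def get_var_units(self, name):
        return {"soil_water__depth": "m"}[name]

    def get_value(self, name):
        return self.depth

    def get_current_time(self):
        return self.time
```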
NASA Astrophysics Data System (ADS)
Peckham, Scott
2016-04-01
Over the last decade, model coupling frameworks like CSDMS (Community Surface Dynamics Modeling System) and ESMF (Earth System Modeling Framework) have developed mechanisms that make it much easier for modelers to connect heterogeneous sets of process models in a plug-and-play manner to create composite "system models". These mechanisms greatly simplify code reuse, but must simultaneously satisfy many different design criteria. They must be able to mediate or compensate for differences between the process models, such as their different programming languages, computational grids, time-stepping schemes, variable names and variable units. However, they must achieve this interoperability in a way that: (1) is noninvasive, requiring only relatively small and isolated changes to the original source code, (2) does not significantly reduce performance, (3) is not time-consuming or confusing for a model developer to implement, (4) can very easily be updated to accommodate new versions of a given process model and (5) does not shift the burden of providing model interoperability to the model developers. In tackling these design challenges, model framework developers have learned that the best solution is to provide each model with a simple, standardized interface, i.e. a set of standardized functions that make the model: (1) fully-controllable by a caller (e.g. a model framework) and (2) self-describing with standardized metadata. Model control functions are separate functions that allow a caller to initialize the model, advance the model's state variables in time and finalize the model. Model description functions allow a caller to retrieve detailed information on the model's input and output variables, its computational grid and its timestepping scheme. If the caller is a modeling framework, it can use the self description functions to learn about each process model in a collection to be coupled and then automatically call framework service components (e.g. regridders, time interpolators and unit converters) as necessary to mediate the differences between them so they can work together. This talk will first review two key products of the CSDMS project, namely a standardized model interface called the Basic Model Interface (BMI) and the CSDMS Standard Names. The standard names are used in conjunction with BMI to provide a semantic matching mechanism that allows output variables from one process model or data set to be reliably used as input variables to other process models in a collection. They include not just a standardized naming scheme for model variables, but also a standardized set of terms for describing the attributes and assumptions of a given model. Recent efforts to bring powerful uncertainty analysis and inverse modeling toolkits such as DAKOTA into modeling frameworks will also be described. This talk will conclude with an overview of several related modeling projects that have been funded by NSF's EarthCube initiative, namely the Earth System Bridge, OntoSoft and GeoSemantics projects.
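A framework that consumes such self-describing components can then mediate differences automatically. The sketch below shows one way a caller might match variables by standard name and convert units between two BMI-style components; the conversion table and the producer/consumer objects are assumptions for illustration, not CSDMS framework code.

```python
# Sketch of framework-side mediation between two BMI-style components:
# match a variable by its standard name, convert units, then pass the
# value along. The conversion table and components are illustrative.
UNIT_FACTORS = {("mm", "m"): 0.001, ("m", "mm"): 1000.0}

def convert(value, from_units, to_units):
    if from_units == to_units:
        return value
    return value * UNIT_FACTORS[(from_units, to_units)]

def couple_step(producer, consumer, standard_name):
    """Advance the producer, then feed its output into the consumer."""
    producer.update()
    value = producer.get_value(standard_name)
    value = convert(value,
                    producer.get_var_units(standard_name),
                    consumer.get_var_units(standard_name))
    consumer.set_value(standard_name, value)   # assumed setter on consumer
    consumer.update()
```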
Evaluating health-promoting schools in Hong Kong: development of a framework.
Lee, Albert; Cheng, Frances F K; St Leger, Lawry
2005-06-01
Health-promoting schools (HPS)/healthy schools have existed internationally for about 15 years. Yet there are few comprehensive evaluation frameworks available which enable the outcomes of HPS initiatives to be assessed. This paper identifies an evaluation framework developed in Hong Kong. The framework uses a range of approaches to explore what schools actually do in their health promotion and health education initiatives. The framework, which is based on the WHO (Western Pacific Regional Office) Guidelines for HPS, is described in detail. The appropriate instruments for data collection are described and their origins identified. The evaluation plan and protocol, which underpinned the very comprehensive evaluation in Hong Kong, are explained. Finally, a case is argued for evaluation of HPS to be more in line with the educational dynamics of schools and the research literature on effective schooling, rather than focusing primarily on health-related measures.
Connected Lighting System Interoperability Study Part 1: Application Programming Interfaces
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gaidon, Clement; Poplawski, Michael
This report, the first in a series of studies focusing on interoperability as realized by the use of Application Programming Interfaces (APIs), explores the diversity of such interfaces in several connected lighting systems; characterizes the extent of interoperability that they provide; and illustrates challenges, limitations, and tradeoffs encountered during this exploration.
Enabling model checking for collaborative process analysis: from BPMN to `Network of Timed Automata'
NASA Astrophysics Data System (ADS)
Mallek, Sihem; Daclin, Nicolas; Chapurlat, Vincent; Vallespir, Bruno
2015-04-01
Interoperability is a prerequisite for partners involved in a collaboration; consequently, the lack of interoperability is now considered a major obstacle. The research presented in this paper aims to develop an approach for specifying and verifying a set of interoperability requirements to be satisfied by each partner in a collaborative process, prior to process implementation. Enabling the verification of these interoperability requirements first requires generating a model of the targeted collaborative process; for this research effort, the standardised language BPMN 2.0 is used. A verification technique must then be introduced, and model checking is the preferred option herein. This paper focuses on the application of the model checker UPPAAL to verify interoperability requirements for the given collaborative process model. The first step entails translating the collaborative process model from BPMN into the UPPAAL modelling language, 'Network of Timed Automata'. Second, the interoperability requirements must be formalised as properties in the dedicated UPPAAL language, the temporal logic TCTL.
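As a hedged illustration of the last step, the snippet below generates UPPAAL-style TCTL query strings for two generic interoperability requirements. The automaton, location, and clock names (Partner1, Sent, Ack, x) are hypothetical placeholders, not names from the paper's process model.

```python
# Sketch: generating UPPAAL-style TCTL queries for generic interoperability
# requirements. Automaton, location and clock names are hypothetical, and
# the clock x is assumed to be reset when the sender location is entered.
def deadlock_freedom():
    return "A[] not deadlock"

def bounded_response(sender_loc, receiver_loc, clock, bound):
    # "Whenever the sender reaches sender_loc, the receiver reaches
    # receiver_loc within `bound` time units."
    return f"{sender_loc} --> ({receiver_loc} and {clock} <= {bound})"

queries = [
    deadlock_freedom(),
    bounded_response("Partner1.Sent", "Partner2.Ack", "x", 30),
]
print("\n".join(queries))
```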
BioNet Digital Communications Framework
NASA Technical Reports Server (NTRS)
Gifford, Kevin; Kuzminsky, Sebastian; Williams, Shea
2010-01-01
BioNet v2 is a peer-to-peer middleware that enables digital communication devices to talk to each other. It provides a software development framework, standardized application, network-transparent device integration services, a flexible messaging model, and network communications for distributed applications. BioNet is an implementation of the Constellation Program Command, Control, Communications and Information (C3I) Interoperability specification, given in CxP 70022-01. The system architecture provides the necessary infrastructure for the integration of heterogeneous wired and wireless sensing and control devices into a unified data system with a standardized application interface, providing plug-and-play operation for hardware and software systems. BioNet v2 features a naming schema for mobility and coarse-grained localization information, data normalization within a network-transparent device driver framework, enabling of network communications to non-IP devices, and fine-grained application control of data subscription bandwidth usage. BioNet directly integrates Disruption Tolerant Networking (DTN) as a communications technology, enabling networked communications with assets that are only intermittently connected, including orbiting relay satellites and planetary rover vehicles.
NASA Technical Reports Server (NTRS)
Stephens, J. Briscoe; Grider, Gary W.
1992-01-01
These Earth Science and Applications Division-Data and Information System (ESAD-DIS) interoperability requirements are designed to quantify the Earth Science and Applications Division's hardware and software requirements in terms of communications between personal and visualization workstations and mainframe computers. The electronic mail requirements and local area network (LAN) requirements are addressed. These interoperability requirements are top-level requirements framed around defining the existing ESAD-DIS interoperability and projecting known near-term requirements for both operational support and management planning. Detailed requirements will be submitted on a case-by-case basis. This document is also intended as an overview of ESAD-DIS interoperability for newcomers and for management not familiar with these activities, and as background documentation to support requests for resources and support requirements.
An open repositories network development for medical teaching resources.
Soula, Gérard; Darmoni, Stefan; Le Beux, Pierre; Renard, Jean-Marie; Dahamna, Badisse; Fieschi, Marius
2010-01-01
The lack of interoperability between repositories of heterogeneous and geographically widespread data is an obstacle to the diffusion, sharing and reuse of those data. We present the development of an open repositories network that addresses both the syntactic and semantic interoperability of the different repositories and is based on international standards in this field. The network is used by the medical community in France for the diffusion and sharing of digital teaching resources. The syntactic interoperability of the repositories is managed using the OAI-PMH protocol for the exchange of metadata describing the resources. Semantic interoperability is based, on the one hand, on the LOM standard for describing resources and on MeSH for indexing them and, on the other hand, on semantic interoperability management designed to optimize compliance with standards and the quality of the metadata.
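To illustrate the syntactic side of this set-up, the sketch below issues a standard OAI-PMH ListRecords request and extracts Dublin Core titles. The repository base URL is a placeholder, and resumption-token paging and error handling are omitted.

```python
# Minimal OAI-PMH harvesting sketch: fetch one page of records in Dublin
# Core format and print their titles. The base URL is a placeholder;
# resumption tokens and error handling are omitted for brevity.
import requests
import xml.etree.ElementTree as ET

BASE_URL = "https://repository.example.org/oai"   # placeholder endpoint
OAI_NS = "{http://www.openarchives.org/OAI/2.0/}"
DC_NS = "{http://purl.org/dc/elements/1.1/}"

def list_titles(base_url=BASE_URL):
    params = {"verb": "ListRecords", "metadataPrefix": "oai_dc"}
    response = requests.get(base_url, params=params, timeout=30)
    response.raise_for_status()
    root = ET.fromstring(response.content)
    for record in root.iter(OAI_NS + "record"):
        for title in record.iter(DC_NS + "title"):
            yield title.text

if __name__ == "__main__":
    for t in list_titles():
        print(t)
```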
ERIC Educational Resources Information Center
Fearnside, Rob
The Victorian school accountability framework is designed specifically for Victorian public schooling in the 1990s. These schools have three chief characteristics: (1) a high level of school autonomy in operational decisions about resource allocation, human-resource management, and staff selection; (2) a common framework for curriculum and…
Ensuring Sustainable Data Interoperability Across the Natural and Social Sciences
NASA Astrophysics Data System (ADS)
Downs, R. R.; Chen, R. S.
2015-12-01
Both the natural and social science data communities are attempting to address the long-term sustainability of their data infrastructures in rapidly changing research, technological, and policy environments. Many parts of these communities are also considering how to improve the interoperability and integration of their data and systems across natural, social, health, and other domains. However, these efforts have generally been undertaken in parallel, with little thought about how different sustainability approaches may impact long-term interoperability from scientific, legal, or economic perspectives, or vice versa, i.e., how improved interoperability could enhance—or threaten—infrastructure sustainability. Scientific progress depends substantially on the ability to learn from the legacy of previous work available for current and future scientists to study, often by integrating disparate data not previously assembled. Digital data are less likely than scientific publications to be usable in the future unless they are managed by science-oriented repositories that can support long-term data access with the documentation and services needed for future interoperability. We summarize recent discussions in the social and natural science communities on emerging approaches to sustainability and relevant interoperability activities, including efforts by the Belmont Forum E-Infrastructures project to address global change data infrastructure needs; the Group on Earth Observations to further implement data sharing and improve data management across diverse societal benefit areas; and the Research Data Alliance to develop legal interoperability principles and guidelines and to address challenges faced by domain repositories. We also examine emerging needs for data interoperability in the context of the post-2015 development agenda and the expected set of Sustainable Development Goals (SDGs), which set ambitious targets for sustainable development, poverty reduction, and environmental stewardship by 2030. These efforts suggest the need for a holistic approach towards improving and implementing strategies, policies, and practices that will ensure long-term sustainability and interoperability of scientific data repositories and networks across multiple scientific domains.
Groundwater data network interoperability
Brodaric, Boyan; Booth, Nathaniel; Boisvert, Eric; Lucido, Jessica M.
2016-01-01
Water data networks are increasingly being integrated to answer complex scientific questions that often span large geographical areas and cross political borders. Data heterogeneity is a major obstacle that impedes interoperability within and between such networks. It is resolved here for groundwater data at five levels of interoperability, within a Spatial Data Infrastructure architecture. The result is a pair of distinct national groundwater data networks for the United States and Canada, and a combined data network in which they are interoperable. This combined data network enables, for the first time, transparent public access to harmonized groundwater data from both sides of the shared international border.
Report on the Second Catalog Interoperability Workshop
NASA Technical Reports Server (NTRS)
Thieman, James R.; James, Mary E.
1988-01-01
The events, resolutions, and recommendations of the Second Catalog Interoperability Workshop, held at JPL in January 1988, are discussed. This workshop dealt with the issues of standardization and communication among directories, catalogs, and inventories in the earth and space science data management environment. The Directory Interchange Format, being constructed as a standard for the exchange of directory information among participating data systems, is discussed. Involvement in the interoperability effort by NASA, NOAA, USGS, and NSF is described, and plans for future interoperability are considered. The NASA Master Directory prototype is presented and critiqued, and options for additional capabilities are debated.
A logical approach to semantic interoperability in healthcare.
Bird, Linda; Brooks, Colleen; Cheong, Yu Chye; Tun, Nwe Ni
2011-01-01
Singapore is in the process of rolling out a number of national e-health initiatives, including the National Electronic Health Record (NEHR). A critical enabler in the journey towards semantic interoperability is a Logical Information Model (LIM) that harmonises the semantics of the information structure with the terminology. The Singapore LIM uses a combination of international standards, including ISO 13606-1 (a reference model for electronic health record communication), ISO 21090 (healthcare datatypes), and SNOMED CT (healthcare terminology). The LIM is accompanied by a logical design approach, used to generate interoperability artifacts, and incorporates mechanisms for achieving unidirectional and bidirectional semantic interoperability.
NASA Astrophysics Data System (ADS)
Foglini, Federica; Boero, Ferdinando; Guarino, Raffaele
2016-04-01
The EU's H2020 EVER-EST Project is dedicated to the realization of a Virtual Research Environment (VRE) for Earth Science researchers during 2015-2018. In this framework, Sea Monitoring is one of the four use-case Virtual Research Communities (VRCs) chosen to validate the EVER-EST e-infrastructure, which is intended to represent a wide and multidisciplinary Earth Science domain. The objective of the Sea Monitoring VRC is to provide useful and applicable contributions to the identification and definition of the variables indicated by the European Commission in the Marine Directive under the framework for Good Environmental Status (GES). The European Marine Strategy Framework Directive (MSFD, http://ec.europa.eu/environment/marine/index_en.htm) has defined the descriptors of Good Environmental Status in marine waters. The first descriptor is biodiversity and the second is the presence of non-indigenous species, while the remaining nine (even when they consider physical, chemical or geological variables) require proper functioning of the ecosystem, linked to a good state of biodiversity. The Sea Monitoring VRC aims to provide practical methods, procedures and protocols to support a coherent and widely accepted interpretation of Descriptors 1 (biodiversity), 2 (non-indigenous species), 4 (food webs) and 6 (seafloor integrity) identified in GES. In that context, the criteria and methodological standards already identified by the European Commission will be taken into account, together with the activities and projects in progress in the marine domain. This search for practical methods to estimate and measure GES parameters requires close cooperation among different disciplines, including biologists, geologists, geophysicists, oceanographers, Earth observation experts and others. It will also require a number of different types of scientific data and observations (e.g. biological, chemico-physical, etc.) from different inputs and sensors (e.g. remote sensing, on-site buoys, marine stations, administrations, citizen observations, etc.). Furthermore, different communities require support and guidance to be able to effectively interoperate and share practices, methods, standards and terminologies. The EVER-EST VRE will provide the Sea Monitoring VRC user community with an innovative framework aimed at enhancing their ability to interoperate and share knowledge, experience and methods for GES assessment and monitoring. Furthermore, the Sea Monitoring VRC will focus on implementing Research Objects (ROs, semantically rich aggregations of resources bringing together data, documents and methods in scientific investigations) for GES assessment, to be shared among the wider sea monitoring community for the first time.
Food Practices and School Connectedness: A Whole-School Approach
ERIC Educational Resources Information Center
Neely, Eva; Walton, Mat; Stephens, Christine
2016-01-01
Purpose: The health-promoting schools (HPSs) framework has emerged as a promising model for promoting school connectedness in the school setting. The purpose of this paper is to explore the potential for food practices to promote school connectedness within a HPSs framework. Design/methodology/approach: This study explores food practices within a…
ERIC Educational Resources Information Center
Smith, Laura; Beck, Katharine; Bernstein, Erinn; Dashtguard, Pasha
2014-01-01
The professional school counseling literature has proposed innovative frameworks for practice including social justice/multicultural approaches, school-wide counseling initiatives, and school-community partnerships. In this article, we propose a programmatic intervention that can be a vehicle for all three: the implementation of school-based youth…
Do health-promoting schools improve nutrition in China?
Wang, Dongxu; Stewart, Donald; Yuan, Yanfei; Chang, Chun
2015-06-01
To demonstrate the effectiveness of the health-promoting school (HPS) framework in promoting healthy eating behaviours and nutrition knowledge among Chinese middle school students, their parents and school staff. Three schools were randomly selected from 15 rural middle schools and then randomly assigned to either (i) a school using the HPS framework (HPS school), (ii) a school with improved health education only (HE school) or (iii) a school receiving no intervention (control school). Nutrition knowledge and eating behaviours were measured at baseline and 3 months after the interventions, using the same instrument. Students and parents in the HPS school had the largest improvement in nutrition knowledge, from 4.92 to 8.23 and from 4.84 to 7.74 respectively, followed by those in the HE school, from 4.98 to 8.09 and from 4.78 to 5.80. School staff in the HE school had the largest improvement in nutrition knowledge (from 4.40 to 8.45), followed by those in the HPS school (from 5.20 to 9.15). Students in the HPS school had the largest improvement in eating behaviours (from 3.16 to 4.13), followed by those in the HE school (from 2.78 to 3.54). There was a statistically significant difference in the improvement of nutrition knowledge for all target populations and of eating behaviours for students after the interventions across the three schools (p < 0.05). Both the HPS framework and health education can increase nutrition knowledge among Chinese middle school students, their parents and school staff. However, the HPS framework was more effective than health education only. Notably, the HPS framework had a positive impact on students' eating behaviours, which should be the subject of further research. © The Author (2013). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Marcos, Mar; Maldonado, Jose A; Martínez-Salvador, Begoña; Boscá, Diego; Robles, Montserrat
2013-08-01
Clinical decision-support systems (CDSSs) comprise systems as diverse as sophisticated platforms to store and manage clinical data, tools to alert clinicians to problematic situations, and decision-making tools to assist clinicians. Irrespective of the kind of decision-support task, CDSSs should be smoothly integrated within the clinical information system, interacting with other components, in particular with the electronic health record (EHR). However, despite decades of developments, most CDSSs lack interoperability features. We deal with the interoperability problem of CDSSs and EHRs by exploiting the dual-model methodology. This methodology distinguishes a reference model and archetypes. A reference model is represented by a stable and small object-oriented model that describes the generic properties of health record information. For their part, archetypes are reusable and domain-specific definitions of clinical concepts in the form of structured and constrained combinations of the entities of the reference model. We rely on archetypes to make the CDSS compatible with EHRs from different institutions. Concretely, we use archetypes for modelling the clinical concepts that the CDSS requires, in conjunction with a series of knowledge-intensive mappings relating the archetypes to the data sources (EHR and/or other archetypes) they depend on. We introduce a comprehensive approach, including a set of tools as well as methodological guidelines, to deal with the interoperability of CDSSs and EHRs based on archetypes. Archetypes are used to build a conceptual layer in the manner of a virtual health record (VHR) over the EHR whose contents need to be integrated and used in the CDSS, associating them with structural and terminology-based semantics. Subsequently, the archetypes are mapped to the EHR by means of an expressive mapping language and specific-purpose tools. We also describe a case study where the tools and methodology have been employed in a CDSS to support patient recruitment in the framework of a clinical trial for colorectal cancer screening. The utilisation of archetypes has not only proved satisfactory for achieving interoperability between CDSSs and EHRs but also offers various advantages, in particular from a data model perspective. First, the VHR/data models we work with are of a high level of abstraction and can incorporate semantic descriptions. Second, archetypes can potentially deal with different EHR architectures, due to their deliberate independence of the reference model. Third, the archetype instances we obtain are valid instances of the underlying reference model, which would enable e.g. feeding back the EHR with data derived by abstraction mechanisms. Lastly, the medical and technical validity of archetype models would be assured, since in principle clinicians should be the main actors in their development. Copyright © 2013 Elsevier Inc. All rights reserved.
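As a rough, framework-agnostic sketch of the mapping idea, the code below pulls values out of a raw EHR extract and populates a simplified archetype-like instance. The field names, the archetype structure, and the mapping rules are all assumptions for illustration, not the paper's mapping language or tools.

```python
# Sketch: mapping a raw EHR extract onto a simplified archetype-like
# structure consumed by a decision-support system. Field names, archetype
# layout and mapping rules are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class BloodPressureObservation:          # simplified "archetype" instance
    systolic_mmHg: float
    diastolic_mmHg: float
    recorded_at: str

def map_to_archetype(ehr_record: dict) -> BloodPressureObservation:
    """Knowledge-intensive mapping from source fields to archetype slots."""
    return BloodPressureObservation(
        systolic_mmHg=float(ehr_record["bp_sys"]),
        diastolic_mmHg=float(ehr_record["bp_dia"]),
        recorded_at=ehr_record["timestamp"],
    )

raw = {"bp_sys": "128", "bp_dia": "84", "timestamp": "2013-05-02T10:30"}
print(map_to_archetype(raw))
```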
A Successful Component Architecture for Interoperable and Evolvable Ground Data Systems
NASA Technical Reports Server (NTRS)
Smith, Danford S.; Bristow, John O.; Wilmot, Jonathan
2006-01-01
The National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC) has adopted an open architecture approach for satellite control centers and is now realizing benefits beyond those originally envisioned. The Goddard Mission Services Evolution Center (GMSEC) architecture utilizes standardized interfaces and a middleware software bus to allow functional components to be easily integrated. This paper presents the GMSEC architectural goals and concepts, the capabilities enabled and the benefits realized by adopting this framework approach. NASA experiences with applying the GMSEC architecture on multiple missions are discussed. The paper concludes with a summary of lessons learned, future directions for GMSEC and the possible applications beyond NASA GSFC.
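The GMSEC API itself is not reproduced here; the sketch below shows only the general publish/subscribe pattern that a standardized message bus enables, with a toy in-process bus standing in for the middleware and an illustrative subject string.

```python
# Toy publish/subscribe bus illustrating the component-integration pattern
# described above. This stands in for the middleware layer; it is not the
# GMSEC API, and the subject string is an illustrative placeholder.
from collections import defaultdict

class MessageBus:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, subject, callback):
        self._subscribers[subject].append(callback)

    def publish(self, subject, message):
        for callback in self._subscribers[subject]:
            callback(message)

bus = MessageBus()
bus.subscribe("MISSION.SAT1.TLM", lambda msg: print("archiver:", msg))
bus.subscribe("MISSION.SAT1.TLM", lambda msg: print("display :", msg))
bus.publish("MISSION.SAT1.TLM", {"battery_voltage": 28.1})
```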
Cardea: Dynamic Access Control in Distributed Systems
NASA Technical Reports Server (NTRS)
Lepro, Rebekah
2004-01-01
Modern authorization systems span domains of administration, rely on many different authentication sources, and manage complex attributes as part of the authorization process. This paper presents Cardea, a distributed system that facilitates dynamic access control, as a valuable piece of an interoperable authorization framework. First, the authorization model employed in Cardea and its functionality goals are examined. Next, critical features of the system architecture and its handling of the authorization process are examined. The SAML and XACML standards, as incorporated into the system, are then analyzed. Finally, the future directions of this project are outlined and connection points with general components of an authorization system are highlighted.
Wang, Anliang; Yan, Xiaolong; Wei, Zhijun
2018-04-27
This note presents the design of a scalable software package named ImagePy for analysing biological images. Our contribution focuses on facilitating the extensibility and interoperability of the software by decoupling the data model from the user interface. Especially with assistance from the Python ecosystem, this software framework makes modern computer algorithms easier to apply in bioimage analysis. ImagePy is free and open source software, with documentation and code available at https://github.com/Image-Py/imagepy under the BSD license. It has been tested on the Windows, Mac and Linux operating systems. Contact: wzjdlut@dlut.edu.cn or yxdragon@imagepy.org.
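The published note points to the repository rather than code; as a generic, hedged sketch of the decoupling idea (not ImagePy's actual API), an image-processing step can be written against a plain data model and registered by name, so that any front end can invoke it.

```python
# Generic sketch of decoupling the data model from the user interface:
# filters operate on plain NumPy arrays and are registered by name, so any
# front end (GUI, script, notebook) can invoke them uniformly. This is an
# illustration of the design idea, not ImagePy's actual API.
import numpy as np

FILTERS = {}

def register(name):
    def wrapper(func):
        FILTERS[name] = func
        return func
    return wrapper

@register("invert")
def invert(image: np.ndarray) -> np.ndarray:
    return image.max() - image

@register("threshold")
def threshold(image: np.ndarray, level: float = 0.5) -> np.ndarray:
    return (image > level).astype(np.uint8)

# Any user interface can now look up and chain filters by name:
img = np.random.rand(64, 64)
result = FILTERS["threshold"](FILTERS["invert"](img), level=0.3)
```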
Advances in simulation of wave interactions with extended MHD phenomena
NASA Astrophysics Data System (ADS)
Batchelor, D.; Abla, G.; D'Azevedo, E.; Bateman, G.; Bernholdt, D. E.; Berry, L.; Bonoli, P.; Bramley, R.; Breslau, J.; Chance, M.; Chen, J.; Choi, M.; Elwasif, W.; Foley, S.; Fu, G.; Harvey, R.; Jaeger, E.; Jardin, S.; Jenkins, T.; Keyes, D.; Klasky, S.; Kruger, S.; Ku, L.; Lynch, V.; McCune, D.; Ramos, J.; Schissel, D.; Schnack, D.; Wright, J.
2009-07-01
The Integrated Plasma Simulator (IPS) provides a framework within which some of the most advanced, massively parallel fusion modeling codes can be interoperated to provide a detailed picture of the multi-physics processes involved in fusion experiments. The presentation will cover four topics: 1) recent improvements to the IPS, 2) application of the IPS for very high resolution simulations of ITER scenarios, 3) studies of resistive and ideal MHD stability in tokamak discharges using IPS facilities, and 4) the application of RF power in the electron cyclotron range of frequencies to control slowly growing MHD modes in tokamaks, with initial evaluations of optimized locations for RF power deposition.
Open Source Live Distributions for Computer Forensics
NASA Astrophysics Data System (ADS)
Giustini, Giancarlo; Andreolini, Mauro; Colajanni, Michele
Current distributions of open source forensic software provide digital investigators with a large set of heterogeneous tools. Their use is not always focused on the target and requires high technical expertise. We present a new GNU/Linux live distribution, named CAINE (Computer Aided INvestigative Environment) that contains a collection of tools wrapped up into a user friendly environment. The CAINE forensic framework introduces novel important features, aimed at filling the interoperability gap across different forensic tools. Moreover, it provides a homogeneous graphical interface that drives digital investigators during the acquisition and analysis of electronic evidence, and it offers a semi-automatic mechanism for the creation of the final report.
School Culture and Teenage Substance Use: A Conceptual and Operational Framework
ERIC Educational Resources Information Center
Markham, Wolfgang A.
2015-01-01
This paper outlines a conceptual and operational framework for understanding the relationships between school culture and teenage substance use (smoking, drinking and illicit drug use). The framework draws upon Bernstein's theory of cultural transmission, a theory of health promoting schools and a frame for understanding the effects of place on…
Building a Framework for Engineering Design Experiences in High School
ERIC Educational Resources Information Center
Denson, Cameron D.; Lammi, Matthew
2014-01-01
In this article, Denson and Lammi put forth a conceptual framework that will help promote the successful infusion of engineering design experiences into high school settings. When considering a conceptual framework of engineering design in high school settings, it is important to consider the complex issue at hand. For the purposes of this…
ERIC Educational Resources Information Center
Rice, John; Huang, Min
2015-01-01
Clark County School District (CCSD) asked the Western Regional Education Laboratory (REL West) to examine the relationship between spending per pupil and Nevada School Performance Framework (NSPF) index scores in the district's schools. Data were examined from three school years (2011/12, 2012/13, 2013/14) and for three types of schools…
ERIC Educational Resources Information Center
Goodman-Scott, Emily
2014-01-01
School-Wide Positive Behavioral Interventions and Supports (PBIS) are school-wide, data-driven frameworks for promoting safe schools and student learning. This article explains PBIS and provides practical examples of PBIS implementation by describing a school counselor-run PBIS framework in one elementary school, as part of a larger, district-wide…
A Framework for School Change--The School Portfolio.
ERIC Educational Resources Information Center
Bernhardt, Victoria L.
The "Education for the Future Initiative" is a national, privately-funded program initially sponsored by the Telesis Foundation and recently joined by Arthur Andersen & Co. to help schools undertake continuous improvement. The Initiative has pioneered a framework for school improvement in the form of a school portfolio. This document offers a…
A cloud-based approach for interoperable electronic health records (EHRs).
Bahga, Arshdeep; Madisetti, Vijay K
2013-09-01
We present a cloud-based approach for the design of interoperable electronic health record (EHR) systems. Cloud computing environments provide several benefits to all the stakeholders in the healthcare ecosystem (patients, providers, payers, etc.). Lack of data interoperability standards and solutions has been a major obstacle in the exchange of healthcare data between different stakeholders. We propose an EHR system - cloud health information systems technology architecture (CHISTAR) - that achieves semantic interoperability through the use of a generic design methodology, which uses a reference model that defines a general-purpose set of data structures and an archetype model that defines the clinical data attributes. CHISTAR application components are designed using the cloud component model approach, which comprises loosely coupled components that communicate asynchronously. In this paper, we describe the high-level design of CHISTAR and the approaches for semantic interoperability, data integration, and security.
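As a hedged sketch of the "loosely coupled components that communicate asynchronously" idea, the snippet below wires two components together through a queue. The component names and message contents are assumptions for illustration, not CHISTAR's actual design.

```python
# Sketch of loosely coupled components communicating asynchronously through
# a queue. Component names and message contents are illustrative
# assumptions, not CHISTAR's actual design.
import queue
import threading

events = queue.Queue()

def ingestion_component():
    # Receives an (assumed) EHR update and posts it to the queue.
    events.put({"patient_id": "p-001", "observation": "hba1c", "value": 6.8})

def indexing_component():
    # Consumes events independently of the producer's pace.
    msg = events.get()
    print("indexed:", msg)
    events.task_done()

threading.Thread(target=ingestion_component).start()
threading.Thread(target=indexing_component).start()
events.join()   # wait until all posted events have been processed
```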
Reuse and Interoperability of Avionics for Space Systems
NASA Technical Reports Server (NTRS)
Hodson, Robert F.
2007-01-01
The space environment presents unique challenges for avionics. Launch survivability, thermal management, radiation protection, and other factors are important for successful space designs. Many existing avionics designs use custom hardware and software to meet the requirements of space systems. Although some space vendors have moved more towards a standard product-line approach to avionics, the space industry still lacks common standards and practices for avionics development. This lack of commonality manifests itself in limited reuse and a lack of interoperability. To address NASA's need for interoperable avionics that facilitate reuse, several hardware and software approaches are discussed. Experiences with existing space boards and the application of terrestrial standards are outlined. Enhancements and extensions to these standards are considered. A modular, stack-based approach to space avionics is presented. Software and reconfigurable logic cores are considered for extending interoperability and reuse. Finally, some of the issues associated with the design of reusable, interoperable avionics are discussed.
Maras, Melissa A; Weston, Karen J; Blacksmith, Jennifer; Brophy, Chelsey
2015-03-01
Schools must possess a variety of capacities to effectively support comprehensive and coordinated school health promotion activities, and researchers have developed a district-level capacity-building framework specific to school health promotion. State-level school health coalitions often support such capacity-building efforts and should embed this work within a data-based, decision-making model. However, there is a lack of guidance for state school health coalitions on how they should collect and use data. This article uses a district-level capacity-building framework to interpret findings from a statewide coordinated school health needs/resource assessment in order to examine statewide capacity for school health promotion. Participants included school personnel (N = 643) from one state. Descriptive statistics were calculated for survey items, with further examination of subgroup differences among school administrators and nurses. Results were then interpreted via a post hoc application of a district-level capacity-building framework. Findings across districts revealed statewide strengths and gaps with regard to leadership and management capacities, internal and external supports, and an indicator of global capacity. Findings support the utility of using a common framework across local and state levels to align efforts and embed capacity-building activities within a data-driven, continuous improvement model. © 2014 Society for Public Health Education.
Incorporating Brokers within Collaboration Environments
NASA Astrophysics Data System (ADS)
Rajasekar, A.; Moore, R.; de Torcy, A.
2013-12-01
A collaboration environment, such as the integrated Rule Oriented Data System (iRODS - http://irods.diceresearch.org), provides interoperability mechanisms for accessing storage systems, authentication systems, messaging systems, information catalogs, networks, and policy engines from a wide variety of clients. The interoperability mechanisms function as brokers, translating actions requested by clients to the protocol required by a specific technology. The iRODS data grid is used to enable collaborative research within hydrology, seismology, earth science, climate, oceanography, plant biology, astronomy, physics, and genomics disciplines. Although each domain has unique resources, data formats, semantics, and protocols, the iRODS system provides a generic framework that is capable of managing collaborative research initiatives that span multiple disciplines. Each interoperability mechanism (broker) is linked to a name space that enables unified access across the heterogeneous systems. The collaboration environment provides not only support for brokers, but also support for virtualization of name spaces for users, files, collections, storage systems, metadata, and policies. The broker enables access to data or information in a remote system using the appropriate protocol, while the collaboration environment provides a uniform naming convention for accessing and manipulating each object. Within the NSF DataNet Federation Consortium project (http://www.datafed.org), three basic types of interoperability mechanisms have been identified and applied: 1) drivers for managing manipulation at the remote resource (such as data subsetting), 2) micro-services that execute the protocol required by the remote resource, and 3) policies for controlling the execution. For example, drivers have been written for manipulating NetCDF and HDF formatted files within THREDDS servers. Micro-services have been written that manage interactions with the CUAHSI data repository, the DataONE information catalog, and the GeoBrain broker. Policies have been written that manage transfer of messages between an iRODS message queue and the Advanced Message Queuing Protocol. Examples of these brokering mechanisms will be presented. The DFC collaboration environment serves as the intermediary between community resources and compute grids, enabling reproducible data-driven research. It is possible to create an analysis workflow that retrieves data subsets from a remote server, assemble the required input files, automate the execution of the workflow, automatically track the provenance of the workflow, and share the input files, workflow, and output files. A collaborator can re-execute a shared workflow, compare results, change input files, and re-execute an analysis.
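The three mechanism types can be pictured with a small, hedged sketch (not iRODS code): a format driver that manipulates data at the resource, a micro-service that wraps a remote protocol call, and a policy that decides when the micro-service runs. All names and return values are illustrative.

```python
# Hedged sketch of the three interoperability mechanisms described above:
# a driver (data manipulation), a micro-service (remote protocol call) and
# a policy (execution control). This is illustrative, not iRODS code.

def netcdf_subset_driver(path, variable, bounds):
    """Driver: manipulate data at the remote resource (subsetting placeholder)."""
    return f"subset of {variable} from {path} within {bounds}"

def fetch_from_remote_repository(identifier):
    """Micro-service: execute the protocol required by a remote resource."""
    return {"id": identifier, "payload": "..."}

def replication_policy(dataset, site_count):
    """Policy: decide whether an action is required."""
    return site_count < 2   # replicate until two sites hold the data

if replication_policy({"id": "obs-42"}, site_count=1):
    record = fetch_from_remote_repository("obs-42")
    print(netcdf_subset_driver("obs-42.nc", "temperature", (0, 10)))
```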
Principles of data integration and interoperability in the GEO Biodiversity Observation Network
NASA Astrophysics Data System (ADS)
Saarenmaa, Hannu; Ó Tuama, Éamonn
2010-05-01
The goal of the Global Earth Observation System of Systems (GEOSS) is to link existing information systems into a global and flexible network to address nine areas of critical importance to society. One of these "societal benefit areas" is biodiversity and it will be supported by a GEOSS sub-system known as the GEO Biodiversity Observation Network (GEO BON). In planning the GEO BON, it was soon recognised that there are already a multitude of existing networks and initiatives in place worldwide. What has been lacking is a coordinated framework that allows for information sharing and exchange between the networks. Traversing across the various scales of biodiversity, in particular from the individual and species levels to the ecosystems level has long been a challenge. Furthermore, some of the major regions of the world have already taken steps to coordinate their efforts, but links between the regions have not been a priority until now. Linking biodiversity data to that of the other GEO societal benefit areas, in particular ecosystems, climate, and agriculture to produce useful information for the UN Conventions and other policy-making bodies is another need that calls for integration of information. Integration and interoperability are therefore a major theme of GEO BON, and a "system of systems" is very much needed. There are several approaches to integration that need to be considered. Data integration requires harmonising concepts, agreeing on vocabularies, and building ontologies. Semantic mediation of data using these building blocks is still not easy to achieve. Agreements on, or mappings between, the metadata standards that will be used across the networks are a major requirement that will need to be addressed early on. With interoperable metadata, service integration will be possible through registry of registries systems such as GBIF's forthcoming GBDRS and the GEO Clearinghouse. Chaining various services that build intermediate products using workflow systems will also help expedite the delivery of products and reports that are required for integrated assessment of data from many disciplines. Going beyond the Service Oriented Architectures which now are mainstream, these challenges have lately been addressed in the business world by adopting what is called a Semantic Enterprise Architecture. Semantic portals have been built, in particular, to address interoperability across domains, where users may not be familiar with concepts of all networks. We will discuss the applicability of these approaches for building the global GEO BON.
2008-08-01
facilitate the use of existing architecture descriptions in performing interoperability measurement. Noting that “everything in the world can be expressed as...biological, botanical, and genetic research, it has also been used with great success in the fields of ecology, medicine, the social sciences, the...appropriate for at least three reasons. First, systems perform different interoperations in different scenarios (i.e., they are used differently); second
Commanding Heterogeneous Multi-Robot Teams
2014-06-01
Coalition Battle Management Language (C-BML) Study Group Report. 2005 Fall Simulation Interoperability Workshop (05F-SIW-041), Orlando, FL, September...NMSG-085 CIG Land Operation Demonstration. 2013 Spring Simulation Interoperability Workshop (13S-SIW-031), San Diego, CA. April 2013. [4] K...Simulation Interoperability Workshop (10F-SIW-039), Orlando, FL, September 2010. [5] M. Langerwisch, M. Ax, S. Thamke, T. Remmersmann, A. Tiderko
Dandanell, G
1992-01-01
The interoperator distance between a synthetic operator Os and the deoP2O2-galK fusion was varied between 46 and 176 bp. The repression of deoP2-directed galK expression as a function of the interoperator distance (center-to-center) was measured in vivo in a single-copy system. The results show that the DeoR repressor can efficiently repress transcription at all the interoperator distances tested. The degree of repression depends very little on the spacing between the operators; however, a weak periodic dependency of 8-11 bp may exist. PMID:1437558
NASA Astrophysics Data System (ADS)
Tootle, G. A.; Gutenson, J. L.; Zhu, L.; Ernest, A. N. S.; Oubeidillah, A.; Zhang, X.
2015-12-01
The National Flood Interoperability Experiment (NFIE), held June 3-July 17, 2015 at the National Water Center (NWC) in Tuscaloosa, Alabama, sought to demonstrate an increase in flood predictive capacity for the coterminous United States (CONUS). Accordingly, NFIE-derived technologies and workflows offer the ability to forecast flood damage and economic consequence estimates that coincide with the hydrologic and hydraulic estimations these physics-based models generate. A model providing an accurate prediction of damage and economic consequences is a valuable asset when allocating funding for disaster response, recovery, and relief. Damage prediction and economic consequence assessment also offer an adaptation planning mechanism for defending particularly valuable or vulnerable structures. The NFIE, held at the NWC on The University of Alabama (UA) campus, led to the development of this large-scale flow and inundation forecasting framework. Currently, the system can produce 15-hour lead-time forecasts for the entire CONUS, a capability anticipated to become operational within the NWC as of May 2016. The processing of such a large-scale, fine-resolution model is accomplished in a parallel computing environment using large supercomputing clusters. Traditionally, flood damage and economic consequence assessment is calculated in a desktop computing environment with an assortment of meteorological, hydrologic, hydraulic, and damage assessment tools. In the United States, a range of flood damage and economic consequence assessment software packages is available to local, state, and federal emergency management agencies. Among the more commonly used and freely accessible models are the Hydrologic Engineering Center's Flood Damage Reduction Analysis (HEC-FDA), Flood Impact Assessment (HEC-FIA), and the Federal Emergency Management Agency's (FEMA's) United States Multi-Hazard (Hazus-MH), all of which exist only in a desktop environment. With this, the authors submit an initial framework for estimating flood damage and economic consequences using flow and inundation products from the NFIE framework. This adaptive system utilizes existing nationwide datasets describing the location and use of structures and can assimilate a range of data resolutions.
HuPSON: the human physiology simulation ontology
2013-01-01
Background Large biomedical simulation initiatives, such as the Virtual Physiological Human (VPH), are substantially dependent on controlled vocabularies to facilitate the exchange of information, of data and of models. Hindering these initiatives is a lack of a comprehensive ontology that covers the essential concepts of the simulation domain. Results We propose a first version of a newly constructed ontology, HuPSON, as a basis for shared semantics and interoperability of simulations, of models, of algorithms and of other resources in this domain. The ontology is based on the Basic Formal Ontology, and adheres to the MIREOT principles; the constructed ontology has been evaluated via structural features, competency questions and use case scenarios. The ontology is freely available at: http://www.scai.fraunhofer.de/en/business-research-areas/bioinformatics/downloads.html (owl files) and http://bishop.scai.fraunhofer.de/scaiview/ (browser). Conclusions HuPSON provides a framework for a) annotating simulation experiments, b) retrieving relevant information that is required for modelling, c) enabling interoperability of algorithmic approaches used in biomedical simulation, d) comparing simulation results and e) linking knowledge-based approaches to simulation-based approaches. It is meant to foster a more rapid uptake of semantic technologies in the modelling and simulation domain, with a particular focus on the VPH domain. PMID:24267822
NASA Astrophysics Data System (ADS)
Yu, Xu; Shao, Quanqin; Zhu, Yunhai; Deng, Yuejin; Yang, Haijun
2006-10-01
With the development of informationization and the separation between data management departments and application departments, spatial data sharing has become one of the most important objectives in the construction of spatial information infrastructure; spatial metadata management systems, data transmission security, and data compression are the key technologies for realizing spatial data sharing. This paper discusses the key technologies for metadata-based data interoperability, examines data compression algorithms such as adaptive Huffman coding and the LZ77 and LZ78 algorithms, and studies the application of digital signature techniques to spatial data, which can not only identify the transmitter of the data but also detect in a timely manner whether the data have been tampered with during network transmission. Based on an analysis of the symmetric encryption algorithms 3DES and AES and the asymmetric encryption algorithm RSA, combined with a hash algorithm, it presents an improved hybrid encryption method for spatial data. Digital signature technology and digital watermarking technology are also discussed. A new solution for spatial data network distribution is then put forward, which adopts a three-layer architecture. Based on this framework, we present a spatial data network distribution system that is efficient and secure, and demonstrate the feasibility and validity of the proposed solution.
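The abstract does not spell out the encryption scheme; the following minimal Python sketch, assuming the third-party cryptography package and illustrative key sizes, shows one common hybrid pattern along the lines described: AES for the bulk spatial payload, RSA to protect the session key, and an RSA signature over a SHA-256 digest for sender authentication and tamper detection.

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Hypothetical key pairs; a real system would load persisted, certified keys.
sender_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
receiver_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

def encrypt_and_sign(data: bytes):
    # Symmetric layer: encrypt the (potentially large) spatial payload with AES-GCM.
    session_key = AESGCM.generate_key(bit_length=256)
    nonce = os.urandom(12)
    ciphertext = AESGCM(session_key).encrypt(nonce, data, None)
    # Asymmetric layer: wrap the session key with the receiver's RSA public key.
    wrapped_key = receiver_key.public_key().encrypt(
        session_key,
        padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                     algorithm=hashes.SHA256(), label=None))
    # Signature: a SHA-256 digest of the ciphertext signed by the sender, so the
    # receiver can verify both the origin and the integrity of the transmission.
    signature = sender_key.sign(
        ciphertext,
        padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                    salt_length=padding.PSS.MAX_LENGTH),
        hashes.SHA256())
    return nonce, ciphertext, wrapped_key, signature
```

On the receiving side, the counterpart operations would unwrap the session key with the receiver's private key, verify the signature with the sender's public key, and only then decrypt the payload.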
The Microbial Resource Research Infrastructure MIRRI: Strength through Coordination
Stackebrandt, Erko; Schüngel, Manuela; Martin, Dunja; Smith, David
2015-01-01
Microbial resources have been recognized as essential raw materials for the advancement of health and later for biotechnology, agriculture, food technology and for research in the life sciences, as their enormous abundance and diversity offer an unparalleled source of unexplored solutions. Microbial domain biological resource centres (mBRC) provide live cultures and associated data to foster and support the development of basic and applied science in countries worldwide and especially in Europe, where the density of highly advanced mBRCs is high. The not-for-profit and distributed project MIRRI (Microbial Resource Research Infrastructure) aims to coordinate access to hitherto individually managed resources by developing a pan-European platform which takes the interoperability and accessibility of resources and data to a higher level. Providing a wealth of additional information and linking to datasets such as literature, environmental data, sequences and chemistry will enable researchers to select organisms suitable for their research and enable innovative solutions to be developed. The current independent policies and managed processes will be adapted by partner mBRCs to harmonize holdings, services, training, and accession policy and to share expertise. The infrastructure will improve access to enhanced quality microorganisms in an appropriate legal framework and to resource-associated data in a more interoperable way. PMID:27682123
Hosek, Susan D; Straus, Susan G
2013-01-01
The Military Health System (MHS) and the Veterans Health Administration (VHA) have been among the nation's leaders in health information technology (IT), including the development of health IT systems and electronic health records that summarize patients' care from multiple providers. Health IT interoperability within MHS and across MHS partners, including VHA, is one of ten goals in the current MHS Strategic Plan. As a step toward achieving improved interoperability, the MHS is seeking to develop a research roadmap to better coordinate health IT research efforts, address IT capability gaps, and reduce programmatic risk for its enterprise projects. This article contributes to that effort by identifying gaps in research, policy, and practice involving patient privacy, consent, and identity management that need to be addressed to bring about improved quality and efficiency of care through health information exchange. Major challenges include (1) designing a meaningful patient consent procedure, (2) recording patients' consent preferences and designing procedures to implement restrictions on disclosures of protected health information, and (3) advancing knowledge regarding the best technical approaches to performing patient identity matches and how best to monitor results over time. Using a sociotechnical framework, this article suggests steps for overcoming these challenges and topics for future research.
ERIC Educational Resources Information Center
Friend, Jennifer; Caruthers, Loyce
2015-01-01
This article advances a framework for educators to create pathways to elicit students' diverse perspectives as qualitative data sources in the process of urban school renewal. Elements of the framework are discussed in conjunction with relevant research and findings from videotaped interviews with elementary (n = 144) and secondary (n = 28)…
A framework for designing hand hygiene educational interventions in schools.
Appiah-Brempong, Emmanuel; Harris, Muriel J; Newton, Samuel; Gulis, Gabriel
2018-03-01
Hygiene education appears to be the commonest school-based intervention for preventing infectious diseases, especially in the developing world. Nevertheless, there remains a gap in the literature regarding a school-specific, theory-based framework for designing a hand hygiene educational intervention in schools. We sought to suggest a framework underpinned by psychosocial theories towards bridging this knowledge gap. Furthermore, we sought to propound a more comprehensive definition of hand hygiene which could guide the conceptualisation of hand hygiene interventions in varied settings. The literature search was guided by a standardized tool and literature was retrieved on the basis of predetermined inclusion criteria. Databases consulted include PubMed, ERIC, and EBSCOhost (Medline, CINAHL, PsycINFO, etc.). Evidence concerning a theoretical framework to aid the design of school-based hand hygiene educational interventions is summarized narratively. School-based hand hygiene educational interventions seeking to positively influence behavioural outcomes could consider enhancing psychosocial variables including behavioural capacity, attitudes and subjective norms (normative beliefs and motivation to comply). A framework underpinned by formalized psychosocial theories has relevance and could enhance the design of hand hygiene educational interventions, especially in schools.
Ovies-Bernal, Diana Paola; Agudelo-Londoño, Sandra M
2014-01-01
Identify shared criteria used throughout the world in the implementation of interoperable National Health Information Systems (NHIS) and provide validated scientific information on the dimensions affecting interoperability. This systematic review sought to identify primary articles on the implementation of interoperable NHIS published in scientific journals in English, Portuguese, or Spanish between 1990 and 2011 through a search of eight databases of electronic journals in the health sciences and informatics: MEDLINE (PubMed), Proquest, Ovid, EBSCO, MD Consult, Virtual Health Library, Metapress, and SciELO. The full texts of the articles were reviewed, and those that focused on technical computer aspects or on normative issues were excluded, as were those that did not meet the quality criteria for systematic reviews of interventions. Of 291 studies found and reviewed, only five met the inclusion criteria. These articles reported on the process of implementing an interoperable NHIS in Brazil, China, the United States, Turkey, and the Semiautonomous Region of Zanzibar, respectively. Five common basic criteria affecting implementation of the NHIS were identified: standards in place to govern the process, availability of trained human talent, financial and structural constraints, definition of standards, and assurance that the information is secure. Four dimensions affecting interoperability were defined: technical, semantic, legal, and organizational. The criteria identified have to be adapted to the actual situation in each country, and a proactive approach should be used to ensure that implementation of the interoperable NHIS is strategic, simple, and reliable.
Towards a Brokering Framework for Business Process Execution
NASA Astrophysics Data System (ADS)
Santoro, Mattia; Bigagli, Lorenzo; Roncella, Roberto; Mazzetti, Paolo; Nativi, Stefano
2013-04-01
Advancing our knowledge of environmental phenomena and their interconnections requires an intensive use of environmental models. Due to the complexity of the Earth system, the representation of complex environmental processes often requires the use of more than one model (often from different disciplines). The Group on Earth Observation (GEO) launched the Model Web initiative to increase the present accessibility and interoperability of environmental models, allowing their flexible composition into complex Business Processes (BPs). A few basic principles are at the base of the Model Web concept (Nativi et al.): (i) open access, (ii) minimal entry barriers, (iii) a service-driven approach, and (iv) scalability. This work proposes an architectural solution, based on the Brokering approach for multidisciplinary interoperability, aiming to contribute to the Model Web vision. The Brokering approach is currently adopted in the new GEOSS Common Infrastructure (GCI), as presented at the last GEO Plenary meeting in Istanbul, November 2011. We designed and prototyped a component called BP Broker. The high-level functionalities provided by the BP Broker are: • Discover the needed model implementations in an open, distributed and heterogeneous environment; • Check the I/O consistency of BPs and provide suggestions for resolving mismatches; • Publish the EBP (the compiled BP) as a standard model resource for re-use; • Submit the EBP to a WF-engine for execution. A BP Broker has the following features: • Support for multiple abstract BP specifications; • Support for encoding in multiple WF-engine languages. According to the Brokering principles, the designed system is flexible enough to support the use of multiple BP design (visual) tools, heterogeneous Web interfaces for model execution (e.g. OGC WPS, WSDL, etc.), and different workflow engines. The present implementation makes use of the BPMN 2.0 notation for BP design and the jBPM workflow engine for EBP execution; however, the strong decoupling that characterizes the design of the BP Broker readily allows other technologies to be supported. The main benefits of the proposed approach are: (i) no need for a composition infrastructure, (ii) alleviation from the technicalities of workflow definitions, (iii) support for incomplete BPs, and (iv) the reuse of existing BPs as atomic processes. The BP Broker was designed and prototyped in the EC-funded projects EuroGEOSS (http://www.eurogeoss.eu) and UncertWeb (http://www.uncertweb.org); the latter project also provided the use scenarios that were used to test the framework: the eHabitat scenario (calculation of habitat similarity likelihood) and the FERA scenario (impact of climate change on land use and crop yield). Three more scenarios are presently under development. The research leading to these results has received funding from the European Community's Seventh Framework Programme (FP7/2007-2013) under Grant Agreements n. 248488 and n. 226487. References: Nativi, S., Mazzetti, P., & Geller, G. (2012), "Environmental model access and interoperability: The GEO Model Web initiative", Environmental Modelling & Software, 1-15.
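The abstract does not specify how the I/O consistency check is implemented; the following minimal Python sketch, with hypothetical process descriptions and invented names, illustrates the kind of check a broker could perform when chaining model services: each step's required inputs must be produced by an earlier step (or supplied by the user), and anything missing is reported as a mismatch to resolve.

```python
from dataclasses import dataclass, field

@dataclass
class ProcessStep:
    """Hypothetical description of one model service in a business process."""
    name: str
    inputs: set = field(default_factory=set)
    outputs: set = field(default_factory=set)

def check_io_consistency(steps, user_inputs):
    """Report inputs of each step that no earlier step (or the user) provides."""
    available = set(user_inputs)
    mismatches = []
    for step in steps:
        missing = step.inputs - available
        if missing:
            mismatches.append((step.name, sorted(missing)))
        available |= step.outputs
    return mismatches

# Example: a two-step chain loosely resembling a habitat-similarity workflow
# (all names are invented for illustration).
chain = [
    ProcessStep("climate_subset", {"bounding_box"}, {"climate_grid"}),
    ProcessStep("habitat_similarity",
                {"climate_grid", "species_occurrences"}, {"similarity_map"}),
]
print(check_io_consistency(chain, {"bounding_box"}))
# -> [('habitat_similarity', ['species_occurrences'])], i.e. a mismatch to resolve
```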
The MED-SUV Multidisciplinary Interoperability Infrastructure
NASA Astrophysics Data System (ADS)
Mazzetti, Paolo; D'Auria, Luca; Reitano, Danilo; Papeschi, Fabrizio; Roncella, Roberto; Puglisi, Giuseppe; Nativi, Stefano
2016-04-01
In accordance with the international Supersite initiative concept, the MED-SUV (MEDiterranean SUpersite Volcanoes) European project (http://med-suv.eu/) aims to enable long-term monitoring experiments in two relevant geologically active regions of Europe prone to natural hazards: Mt. Vesuvio/Campi Flegrei and Mt. Etna. This objective requires the integration of existing components, such as monitoring systems and databases, with novel sensors for the measurement of volcanic parameters. Moreover, MED-SUV is also a direct contribution to the Global Earth Observation System of Systems (GEOSS), as one of the volcano Supersites recognized by the Group on Earth Observation (GEO). To achieve its goal, MED-SUV set up an advanced e-infrastructure allowing the discovery of and access to heterogeneous data for multidisciplinary applications, and the integration with external systems like GEOSS. The MED-SUV overall infrastructure is conceived as a three-layer architecture, with the lower layer (Data level) including the identified relevant data sources, the mid-tier (Supersite level) including components for mediation and harmonization, and the upper tier (Global level) composed of the systems that MED-SUV must serve, such as GEOSS and possibly other global/community systems. The Data level is mostly composed of existing data sources, such as space agencies' satellite data archives, the UNAVCO system, and the INGV-Rome data service. They share data according to different specifications for metadata, data and service interfaces, and cannot be changed. Thus, the only relevant MED-SUV activity at this level was the creation of a MED-SUV local repository based on Web Accessible Folder (WAF) technology, deployed at the INGV site in Catania, and hosting in-situ data and products collected and generated during the project. The Supersite level is at the core of the MED-SUV architecture, since it must mediate between the disparate data sources in the layer below and provide a harmonized view to the layer above. In order to address data and service heterogeneity, the MED-SUV infrastructure is based on the brokered architecture approach, implemented using the GI-Suite Brokering Framework for discovery and access. The GI-Suite Brokering Framework has been extended and configured to broker all the identified relevant data sources. It is also able to publish data according to several de-jure and de-facto standards, including OGC CSW and OpenSearch, facilitating the interconnection with external systems. At the Global level, MED-SUV identified the interconnection with GEOSS as the main requirement. Since the MED-SUV Supersite level is implemented based on the same technology adopted in the current GEOSS Common Infrastructure (GCI) by the GEO Discovery and Access Broker (GEO DAB), no major interoperability problem is foreseen. The MED-SUV Multidisciplinary Interoperability Infrastructure is complemented by a user portal providing human-to-machine interaction and enabling data discovery and access. The GI-Suite Brokering Framework APIs and JavaScript library support machine-to-machine interaction, enabling the creation of mobile and Web applications using information available through the MED-SUV Supersite.
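Since the abstract notes that the brokering layer publishes data through standard interfaces such as OGC CSW, a short sketch of the client side may help. The following minimal Python example, assuming the OWSLib package and a hypothetical broker endpoint URL, shows how an external system could discover volcano-related records through a CSW interface of this kind.

```python
from owslib.csw import CatalogueServiceWeb
from owslib.fes import PropertyIsLike

# Hypothetical CSW endpoint exposed by the brokering layer (URL invented).
csw = CatalogueServiceWeb("https://broker.example.org/csw")

# Full-text style constraint on any queryable text field.
query = PropertyIsLike("csw:AnyText", "%Etna%")
csw.getrecords2(constraints=[query], maxrecords=10)

for record_id, record in csw.records.items():
    # Each record carries Dublin Core style metadata harmonized by the broker.
    print(record_id, "-", record.title)
```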
Toxicology ontology perspectives.
Hardy, Barry; Apic, Gordana; Carthew, Philip; Clark, Dominic; Cook, David; Dix, Ian; Escher, Sylvia; Hastings, Janna; Heard, David J; Jeliazkova, Nina; Judson, Philip; Matis-Mitchell, Sherri; Mitic, Dragana; Myatt, Glenn; Shah, Imran; Spjuth, Ola; Tcheremenskaia, Olga; Toldo, Luca; Watson, David; White, Andrew; Yang, Chihae
2012-01-01
The field of predictive toxicology requires the development of open, public, computable, standardized toxicology vocabularies and ontologies to support the applications required by in silico, in vitro, and in vivo toxicology methods and related analysis and reporting activities. In this article we review ontology developments based on a set of perspectives showing how ontologies are being used in predictive toxicology initiatives and applications. Perspectives on resources and initiatives reviewed include OpenTox, eTOX, Pistoia Alliance, ToxWiz, Virtual Liver, EU-ADR, BEL, ToxML, and Bioclipse. We also review existing ontology developments in neighboring fields that can contribute to establishing an ontological framework for predictive toxicology. A significant set of resources is already available to provide a foundation for an ontological framework for 21st century mechanistic-based toxicology research. Ontologies such as ToxWiz provide a basis for application to toxicology investigations, whereas other ontologies under development in the biological, chemical, and biomedical communities could be incorporated in an extended future framework. OpenTox has provided a semantic web framework for the implementation of such ontologies into software applications and linked data resources. Bioclipse developers have shown the benefit of interoperability obtained through ontology by being able to link their workbench application with remote OpenTox web services. Although these developments are promising, an increased international coordination of efforts is greatly needed to develop a more unified, standardized, and open toxicology ontology framework.
LVC Architecture Roadmap Implementation - Results of the First Two Years
2012-03-01
NOTES Presented at the Simulation Interoperability Standards Organization's (SISO) Spring Simulation Interoperability Workshop (SIW), 26-30 March...presented at the semi-annual Simulation Interoperability Workshops (SIWs) and the annual Interservice/Industry Training, Simulation & Education Conference...(I/ITSEC), as well as other venues. For example, a full-day workshop on the initial progress of the effort was conducted at the 2010 Spring SIW [2
Jagodnik, Kathleen M; Koplev, Simon; Jenkins, Sherry L; Ohno-Machado, Lucila; Paten, Benedict; Schurer, Stephan C; Dumontier, Michel; Verborgh, Ruben; Bui, Alex; Ping, Peipei; McKenna, Neil J; Madduri, Ravi; Pillai, Ajay; Ma'ayan, Avi
2017-07-01
The volume and diversity of data in biomedical research have been rapidly increasing in recent years. While such data hold significant promise for accelerating discovery, their use entails many challenges including: the need for adequate computational infrastructure, secure processes for data sharing and access, tools that allow researchers to find and integrate diverse datasets, and standardized methods of analysis. These are just some elements of a complex ecosystem that needs to be built to support the rapid accumulation of these data. The NIH Big Data to Knowledge (BD2K) initiative aims to facilitate digitally enabled biomedical research. Within the BD2K framework, the Commons initiative is intended to establish a virtual environment that will facilitate the use, interoperability, and discoverability of shared digital objects used for research. The BD2K Commons Framework Pilots Working Group (CFPWG) was established to clarify goals and work on pilot projects that address existing gaps toward realizing the vision of the BD2K Commons. This report reviews highlights from a two-day meeting involving the BD2K CFPWG to provide insights on trends and considerations in advancing Big Data science for biomedical research in the United States. Copyright © 2017 Elsevier Inc. All rights reserved.
Big Data Discovery and Access Services through NOAA OneStop
NASA Astrophysics Data System (ADS)
Casey, K. S.; Neufeld, D.; Ritchey, N. A.; Relph, J.; Fischman, D.; Baldwin, R.
2017-12-01
The NOAA OneStop Project was created as a pathfinder effort to improve the discovery of, access to, and usability of NOAA's vast and diverse collection of big data. OneStop is led by the NOAA/NESDIS National Centers for Environmental Information (NCEI) and is seen as a key NESDIS contribution to NOAA's open data and data stewardship efforts. OneStop consists of an entire framework of services, from storage and interoperable access services at the base, through metadata and catalog services in the middle, to a modern user interface experience at the top. Importantly, it is an open framework where external tools and services can connect at whichever level is most appropriate. Since the beta release of the OneStop user interface at the 2016 Fall AGU meeting, significant progress has been made in improving and modernizing many NOAA data collections to optimize their use within the framework. In addition, OneStop has made progress implementing robust metadata management and catalog systems at the collection and granule level and improving the user experience with the web interface. This progress will be summarized, and the results of extensive user testing, including professional usability studies, will be reviewed. Key big data technologies supporting the framework will be presented, and community input will be sought on the future directions of the OneStop Project.
Eckart, J Dana; Sobral, Bruno W S
2003-01-01
The emergent needs of the bioinformatics community challenge current information systems. The pace of biological data generation far outstrips Moore's Law. Therefore, a gap continues to widen between the capability to produce biological (molecular and cell) data sets and the capability to manage and analyze these data sets. As a result, Federal investments in large data set generation produce diminishing returns in terms of the community's capability to understand biology and to leverage that understanding to make scientific and technological advances that improve society. We are building an open framework to address various data management issues including data and tool interoperability, nomenclature and data communication standardization, and database integration. PathPort, short for Pathogen Portal, employs a generic, web-services based framework to deal with some of the problems identified by the bioinformatics community. The motivating research goal of a scalable system to provide data management and analysis for key pathosystems, especially relating to molecular data, has resulted in a generic framework using two major components. On the server side, we employ web services. On the client side, a Java application called ToolBus acts as a client-side "bus" for contacting data and tools and viewing results through a single, consistent user interface.
Vision: A Conceptual Framework for School Counselors
ERIC Educational Resources Information Center
Watkinson, Jennifer Scaturo
2013-01-01
Vision is essential to the implementation of the American School Counselor Association (ASCA) National Model. Drawing from research in organizational leadership, this article provides a conceptual framework for how school counselors can incorporate vision as a strategy for implementing school counseling programs within the context of practice.…
The Maryland Youth Suicide Prevention School Program.
ERIC Educational Resources Information Center
Maryland State Dept. of Education, Baltimore.
The Maryland State Department of Education developed this framework for a suicide prevention program. The program framework addresses the following goals: (1) increase awareness among school personnel and community leaders of the incidence of teenage suicide; (2) train school personnel in individual…
Review of "Stuck Schools: A Framework for Identifying Schools Where Students Need Change--Now"
ERIC Educational Resources Information Center
Lee, Jaekyung
2010-01-01
The Education Trust research report "Stuck Schools" suggests a framework for identifying chronically low-performing schools in need of turnaround. The study uses Maryland and Indiana to show that some low-performing schools make progress while others remain stagnant. The report has four serious problems of reliability and validity,…
Local School Ideologies and Inclusion: The Case of Swedish Independent Schools
ERIC Educational Resources Information Center
Goransson, Kerstin; Malmqvist, Johan; Nilholm, Claes
2013-01-01
This paper reports on the development of a framework for the classification of local school ideologies in relation to inclusion that provides a tool for classifying the general educational direction as well as work with pupils in need of special support of individual schools. The framework defines different aspects of local school ideology in…
NEON: Contributing continental-scale long-term environmental data for the benefit of society
NASA Astrophysics Data System (ADS)
Wee, B.; Aulenbach, S.
2011-12-01
The National Ecological Observatory Network (NEON) is an NSF-funded national investment in physical and information infrastructure. Large-scale environmental changes pose challenges that straddle environmental, economic, and social boundaries. As we develop climate adaptation strategies at the Federal, state, local, and tribal levels, accessible and usable data are essential for implementing actions that are informed by the best available information. NEON's goal is to enable understanding and forecasting of the impacts of climate change, land use change and invasive species on continental-scale ecology by providing physical and information infrastructure. The NEON framework will take standardized, long-term, coordinated measurements of related environmental variables at each of its 62 sites across the nation. These observations, collected by automated instruments, field crews, and airborne instruments, will be processed into more than 700 data products that are provided freely over the web to support research, education, and environmental management. NEON is envisioned to be an integral component of an interoperable ecosystem of credible data and information sources. Other members of this information ecosystem include Federal, commercial, and non-profit entities. NEON is actively involved with the interoperability community via forums like the Foundation for Earth Science Information Partners and the USGS Community for Data Integration in a collective effort to identify the technical standards, best practices, and organizational principles that enable the emergence of such an information ecosystem. These forums have proven to be effective innovation engines for experimentation with new techniques that evolve into emergent standards. These standards are, for the most part, discipline agnostic. It is becoming increasingly evident that we need to include socio-economic and public health data sources in interoperability initiatives, because the dynamics of coupled natural-human systems cannot be understood in the absence of data about the human dimension. Another essential element is the community of tool and platform developers who create the infrastructure for scientists, educators, resource managers, and policy analysts to discover, analyze, and collaborate on problems using the diverse data that are required to address emerging large-scale environmental challenges. These challenges are very unlikely to be problems confined to this generation: they are urgent, compelling, and long-term problems that require a sustained effort to generate and curate data and information from observations, models, and experiments. NEON's long-term national physical and information infrastructure for environmental observation is one of the cornerstones of a framework that transforms science and information for the benefit of society.
Contributions of Ecological School Mental Health Services to Students' Academic Success
ERIC Educational Resources Information Center
Doll, Beth; Spies, Rob; Champion, Allison
2012-01-01
This article describes an ecological framework for school mental health services that differs in important ways from existing service delivery models. The model is based on research describing ecological frameworks underlying students' school success. Ecological characteristics of schools and classrooms that promote academic success are described…
The Organizational Configurations of Schools
ERIC Educational Resources Information Center
Lee, Linda C.
2010-01-01
The goal of this dissertation is twofold: 1) To develop a framework and method for simultaneously examining many different school organizations and their relation to patterns of school change and 2) To utilize that framework to obtain a district-wide understanding of the organizational functioning and change patterns of Chicago Public School (CPS)…
A Systems Approach to Rapid School Improvement
ERIC Educational Resources Information Center
McCauley, Carlas
2018-01-01
To support systemic thinking about school improvement, the Center on School Turnaround at WestEd developed a framework to assist states, districts, and schools in leading and managing rapid improvement efforts. The framework, which is presented in this article, has four domains that have proved central to rapid, significant improvement: (1)…
Eight Elements of High School Improvement: An Annotated Bibliography
ERIC Educational Resources Information Center
Jones, Wehmah
2009-01-01
This paper provides a table that contains a summary of resources available that support the National High School Center's "Eight Elements of High School Improvement: A Mapping Framework". This framework is based on eight common elements of systemic school reform and provides a structure for understanding, aligning, and prioritizing…
The Next Generation of Interoperability Agents in Healthcare
Cardoso, Luciana; Marins, Fernando; Portela, Filipe; Santos, Manuel; Abelha, António; Machado, José
2014-01-01
Interoperability in health information systems is increasingly a requirement rather than an option. Standards and technologies, such as multi-agent systems, have proven to be powerful tools in interoperability issues. In the last few years, the authors have worked on developing the Agency for Integration, Diffusion and Archive of Medical Information (AIDA), which is an intelligent, agent-based platform to ensure interoperability in healthcare units. It is increasingly important to ensure the high availability and reliability of systems. The functions provided by the systems that treat interoperability cannot fail. This paper shows the importance of monitoring and controlling intelligent agents as a tool to anticipate problems in health information systems. The interaction between humans and agents through an interface that allows the user to create new agents easily and to monitor their activities in real time is also an important feature, as health systems evolve by adopting more features and solving new problems. A module was installed in Centro Hospitalar do Porto, increasing the functionality and the overall usability of AIDA. PMID:24840351
The role of markup for enabling interoperability in health informatics.
McKeever, Steve; Johnson, David
2015-01-01
Interoperability is the faculty of making information systems work together. In this paper we will distinguish a number of different forms that interoperability can take and show how they are realized in a variety of physiological and health care use cases. The last 15 years have seen the rise of very cheap digital storage, both on and off site. With the advent of the Internet of Things, people's expectations are for greater interconnectivity and seamless interoperability. The potential impact these technologies have on healthcare is dramatic: from improved diagnoses through immediate access to a patient's electronic health record, to in silico modeling of organs and early-stage drug trials, to predictive medicine based on top-down modeling of disease progression and treatment. We will begin by looking at the underlying technology, classify the various kinds of interoperability that exist in the field, and discuss how they are realized. We conclude with a discussion on future possibilities that big data and further standardization will enable.
NASA Astrophysics Data System (ADS)
Kruger, Scott; Shasharina, S.; Vadlamani, S.; McCune, D.; Holland, C.; Jenkins, T. G.; Candy, J.; Cary, J. R.; Hakim, A.; Miah, M.; Pletzer, A.
2010-11-01
As various efforts to integrate fusion codes proceed worldwide, standards for sharing data have emerged. In the U.S., the SWIM project has pioneered the development of the Plasma State, which has a flat hierarchy and is dominated by its use within 1.5D transport codes. The European Integrated Tokamak Modeling effort has developed a more ambitious data interoperability effort organized around the concept of Consistent Physical Objects (CPOs). CPOs have deep hierarchies, as needed by an effort that seeks to encompass all of fusion computing. Here, we discuss ideas for implementing data interoperability that are complementary to both the Plasma State and CPOs. By making use of attributes within the netCDF and HDF5 binary file formats, the goals of data interoperability can be achieved with a more informal approach. In addition, a file can be interoperable with several standards at once. As an illustration of this approach, we discuss its application to the development of synthetic diagnostics that can be used by multiple codes.
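The abstract describes the attribute-based approach only in prose; the following minimal Python sketch, assuming the h5py package and invented attribute names, illustrates the idea: the same dataset carries metadata keyed to more than one naming convention, so tools written against either convention can locate it.

```python
import numpy as np
import h5py

# Write a synthetic electron-temperature profile with attributes serving two
# hypothetical conventions at once (all attribute names below are illustrative).
with h5py.File("profiles.h5", "w") as f:
    te = f.create_dataset("electron_temperature", data=np.linspace(3.0, 0.1, 50))
    te.attrs["units"] = "keV"
    # Convention A: flat, transport-code style name.
    te.attrs["plasma_state_name"] = "Te"
    # Convention B: deep, CPO-style path recorded as an attribute.
    te.attrs["cpo_path"] = "coreprof/te/value"

# A reader keyed to either convention can find and interpret the same data.
with h5py.File("profiles.h5", "r") as f:
    dset = f["electron_temperature"]
    print(dset.attrs["plasma_state_name"], dset.attrs["units"])
```

The same pattern applies to netCDF files, where global and per-variable attributes play the analogous role.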
NASA Technical Reports Server (NTRS)
Conroy, Mike; Gill, Paul; Ingalls, John; Bengtsson, Kjell
2014-01-01
No known system is in place to allow NASA technical data interoperability throughout the whole life cycle. Life Cycle Cost (LCC) will be higher on many developing programs if action isn't taken soon to join disparate systems efficiently. Disparate technical data also increases safety risks from poorly integrated elements. NASA requires interoperability and industry standards, but breaking legacy ways is a challenge.
Interacting with Multi-Robot Systems Using BML
2013-06-01
Pullen, U. Schade, J. Simonsen & R. Gomez-Veiga, NATO MSG-048 C-BML Final Report Summary. 2010 Fall Simulation Interoperability Workshop (10F-SIW-039...NATO MSG-085. 2012 Spring Simulation Interoperability Workshop (12S-SIW-045), Orlando, FL, March 2012. [3] T. Remmersmann, U. Schade, L. Khimeche...B. Grautreau & R. El Abdouni Khayari, Lessons Recognized: How to Combine BML and MSDL. 2012 Spring Simulation Interoperability Workshop (12S-SIW-012
A Linguistic Foundation for Communicating Geo-Information in the context of BML and geoBML
2010-03-23
BML Standard. 2009 Spring Simulation Interoperability Workshop (09S-SIW-046). San Diego, CA. Rein, K., Schade, U. & Hieb, M.R. (2009). Battle...Formalizing Battle Management Language: A Grammar for Specifying Orders. 2006 Spring Simulation Interoperability Workshop (06S-SIW-068). Huntsville...Hieb, M.R. (2007). Battle Management Language: A Grammar for Specifying Reports. 2007 Spring Simulation Interoperability Workshop (07S-SIW-036
Semantic and syntactic interoperability in online processing of big Earth observation data.
Sudmanns, Martin; Tiede, Dirk; Lang, Stefan; Baraldi, Andrea
2018-01-01
The challenge of enabling syntactic and semantic interoperability for comprehensive and reproducible online processing of big Earth observation (EO) data is still unsolved. Supporting both types of interoperability is one of the requirements for efficiently extracting valuable information from the large amount of available multi-temporal gridded data sets. The proposed system wraps world models (semantic interoperability) into OGC Web Processing Services (syntactic interoperability) for semantic online analyses. World models describe spatio-temporal entities and their relationships in a formal way. The proposed system serves as an enabler for (1) technical interoperability, using a standardised interface that can be used by all types of clients, and (2) allowing experts from different domains to develop complex analyses together as a collaborative effort. Users connect the world models online to the data, which are maintained in centralised storage as 3D spatio-temporal data cubes. This allows also non-experts to extract valuable information from EO data, because data management, low-level interactions and specific software issues can be ignored. We discuss the concept of the proposed system, provide a technical implementation example and describe three use cases for extracting changes from EO images, and demonstrate the usability also for non-EO, gridded, multi-temporal data sets (CORINE land cover).
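As an illustration of the syntactic side, the following minimal Python sketch, assuming the OWSLib package and an invented endpoint, process identifier, and input names, shows how a client could invoke a wrapped world model through the standard OGC WPS interface described in the abstract.

```python
from owslib.wps import WebProcessingService, monitorExecution

# Hypothetical WPS endpoint and process identifier (both invented for illustration).
wps = WebProcessingService("https://eo-analysis.example.org/wps")

# Literal inputs naming the world model to apply and the data-cube extent to analyse.
inputs = [
    ("world_model", "cloud_free_composite"),
    ("cube_extent", "2018-01-01/2018-12-31"),
]
execution = wps.execute("semantic_change_analysis", inputs, output="result")

# Poll the asynchronous job until it completes, then report its final status.
monitorExecution(execution)
print(execution.status)
```

Because the request and response follow the WPS specification, any standards-aware client can drive the same analysis without knowing how the world model is implemented.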
IHE based interoperability - benefits and challenges.
Wozak, Florian; Ammenwerth, Elske; Hörbst, Alexander; Sögner, Peter; Mair, Richard; Schabetsberger, Thomas
2008-01-01
Optimized workflows and communication between institutions involved in a patient's treatment process can lead to improved quality and efficiency in the healthcare sector. Electronic Health Records (EHRs) provide patient-centered access to clinical data across institutional boundaries, supporting the above-mentioned aspects. Interoperability is regarded as a vital success factor; however, a clear definition of interoperability does not exist. The aim of this work is to define and to assess interoperability criteria as required for EHRs. The definition and assessment of interoperability criteria is supported by the analysis of existing literature and personal experience, as well as by discussions with several domain experts. Criteria for interoperability address the following aspects: interfaces, semantics, legal and organizational aspects, and security. The Integrating the Healthcare Enterprise (IHE) profiles make a major contribution to these aspects, but they also raise new problems. Flexibility for adaptation to different organizational, regional or other specific conditions is missing. Regional or national initiatives should have the possibility to realize their specific needs within the boundaries of IHE profiles. Security is so far an optional element, which is one of IHE's greatest omissions. An integrated security approach seems preferable. Irrespective of the practical significance of the IHE profiles so far, it appears to be of great importance that the profiles are constantly checked against practical experience and continuously adapted.
Food product tracing technology capabilities and interoperability.
Bhatt, Tejas; Zhang, Jianrong Janet
2013-12-01
Despite the best efforts of food safety and food defense professionals, contaminated food continues to enter the food supply. It is imperative that contaminated food be removed from the supply chain as quickly as possible to protect public health and stabilize markets. To solve this problem, scores of technology companies purport to have the most effective, economical product tracing system. This study sought to compare and contrast the effectiveness of these systems at analyzing product tracing information to identify the contaminated ingredient and likely source, as well as distribution of the product. It also determined if these systems can work together to better secure the food supply (their interoperability). Institute of Food Technologists (IFT) hypothesized that when technology providers are given a full set of supply-chain data, even for a multi-ingredient product, their systems will generally be able to trace a contaminated product forward and backward through the supply chain. However, when provided with only a portion of supply-chain data, even for a product with a straightforward supply chain, it was expected that interoperability of the systems will be lacking and that there will be difficulty collaborating to identify sources and/or recipients of potentially contaminated product. IFT provided supply-chain data for one complex product to 9 product tracing technology providers, and then compared and contrasted their effectiveness at analyzing product tracing information to identify the contaminated ingredient and likely source, as well as distribution of the product. A vertically integrated foodservice restaurant agreed to work with IFT to secure data from its supply chain for both a multi-ingredient and a simpler product. Potential multi-ingredient products considered included canned tuna, supreme pizza, and beef tacos. IFT ensured that all supply-chain data collected did not include any proprietary information or information that would otherwise identify the supply-chain partner who provided the information prior to sharing this information with product tracing technology providers. The 9 traceability solution providers who agreed to participate in this project have their systems deployed in a wide range of sectors within the food industry including, but not limited to, livestock, dairy, produce, fruits, seafood, meat, and pork; as well as in pharmaceutical, automotive, retail, and other industries. Some have also been implemented across the globe including Canada, China, USA, Norway, and the EU, among others. This broad commercial use ensures that the findings of this work are applicable to a broad spectrum of the food system. Six of the 9 participants successfully completed the data entry phase of this test. To verify successful data entry for these 6, a demo or screenshots of the data set from each system's user interface was requested. Only 4 of the 6 were able to provide us with this evidence for verification. Of the 6 that completed data entry and moved on to the scenarios phase of the test, 5 were able to provide us with the responses to the scenarios. Time metrics were useful for evaluating the scalability and usability of each technology. Scalability was derived from the time it took to enter the nonstandardized data set into the system (ranges from 7 to 11 d). Usability was derived from the time it took to query the scenarios and provide the results (from a few hours to a week). 
The time was measured in days it took for the participants to respond after we supplied them all the information they would need to successfully execute each test/scenario. Two of the technology solution providers successfully implemented and participated in a proof-of-concept interoperable framework during Year 2 of this study. While not required, they also demonstrated this interoperability capability on the FSMA-mandated food product tracing pilots for the U.S. FDA. This has significant real-world impact since the demonstration of interoperability enables U.S. FDA to obtain evidence on the importance and impact of data-sharing moving forward. Another real-world accomplishment is the modification or upgrade of commercial technology solutions to enhance or implement interoperability. As these systems get deployed by clients in the food industry, interoperability will no longer be an afterthought but will be built into their traceability systems. In turn, industry and regulators will better understand the capabilities of the currently available technologies, and the technology provider community will identify ways in which their systems may be further developed to increase interoperability and utility. © 2013 Institute of Food Technologists®
Mission Control Technologies: A New Way of Designing and Evolving Mission Systems
NASA Technical Reports Server (NTRS)
Trimble, Jay; Walton, Joan; Saddler, Harry
2006-01-01
Current mission operations systems are built as a collection of monolithic software applications. Each application serves the needs of a specific user base associated with a discipline or functional role. Built to accomplish specific tasks, each application embodies specialized functional knowledge and has its own data storage, data models, programmatic interfaces, user interfaces, and customized business logic. In effect, each application creates its own walled-off environment. While individual applications are sometimes reused across multiple missions, it is expensive and time consuming to maintain these systems, and both costly and risky to upgrade them in the light of new requirements or modify them for new purposes. It is even more expensive to achieve new integrated activities across a set of monolithic applications. These problems impact the lifecycle cost (especially design, development, testing, training, maintenance, and integration) of each new mission operations system. They also inhibit system innovation and evolution. This in turn hinders NASA's ability to adopt new operations paradigms, including increasingly automated space systems, such as autonomous rovers, autonomous onboard crew systems, and integrated control of human and robotic missions. Hence, in order to achieve NASA's vision affordably and reliably, we need to consider and mature new ways to build mission control systems that overcome the problems inherent in systems of monolithic applications. The keys to the solution are modularity and interoperability. Modularity will increase extensibility (evolution), reusability, and maintainability. Interoperability will enable composition of larger systems out of smaller parts, and enable the construction of new integrated activities that tie together, at a deep level, the capabilities of many of the components. Modularity and interoperability together contribute to flexibility. The Mission Control Technologies (MCT) Project, a collaboration of multiple NASA Centers, led by NASA Ames Research Center, is building a framework to enable software to be assembled from flexible collections of components and services.
Argo: an integrative, interactive, text mining-based workbench supporting curation
Rak, Rafal; Rowley, Andrew; Black, William; Ananiadou, Sophia
2012-01-01
Curation of biomedical literature is often supported by the automatic analysis of textual content that generally involves a sequence of individual processing components. Text mining (TM) has been used to enhance the process of manual biocuration, but has been focused on specific databases and tasks rather than an environment integrating TM tools into the curation pipeline, catering for a variety of tasks, types of information and applications. Processing components usually come from different sources and often lack interoperability. The well established Unstructured Information Management Architecture is a framework that addresses interoperability by defining common data structures and interfaces. However, most of the efforts are targeted towards software developers and are not suitable for curators, or are otherwise inconvenient to use on a higher level of abstraction. To overcome these issues we introduce Argo, an interoperable, integrative, interactive and collaborative system for text analysis with a convenient graphic user interface to ease the development of processing workflows and boost productivity in labour-intensive manual curation. Robust, scalable text analytics follow a modular approach, adopting component modules for distinct levels of text analysis. The user interface is available entirely through a web browser that saves the user from going through often complicated and platform-dependent installation procedures. Argo comes with a predefined set of processing components commonly used in text analysis, while giving the users the ability to deposit their own components. The system accommodates various areas and levels of user expertise, from TM and computational linguistics to ontology-based curation. One of the key functionalities of Argo is its ability to seamlessly incorporate user-interactive components, such as manual annotation editors, into otherwise completely automatic pipelines. As a use case, we demonstrate the functionality of an in-built manual annotation editor that is well suited for in-text corpus annotation tasks. Database URL: http://www.nactem.ac.uk/Argo PMID:22434844
Academic Research Library as Broker in Addressing Interoperability Challenges for the Geosciences
NASA Astrophysics Data System (ADS)
Smith, P., II
2015-12-01
Data capture is an important process in the research lifecycle. Complete descriptive and representative information about the data or database is necessary during data collection, whether in the field or in the research lab. The National Science Foundation's (NSF) Public Access Plan (2015) mandates that federally funded projects make their research data more openly available. Developing, implementing, and integrating metadata workflows into the research process of the data lifecycle facilitates improved data access while also addressing interoperability challenges for the geosciences such as data description and representation. Lack of metadata or data curation can contribute to (1) semantic, (2) ontology, and (3) data integration issues within and across disciplinary domains and projects. Some researchers on EarthCube-funded projects have identified these issues as gaps. These gaps can contribute to interoperability issues in data access, discovery, and integration between domain-specific and general data repositories. Academic research libraries have expertise in providing long-term discovery and access through the use of metadata standards and the provision of access to research data, datasets, and publications via institutional repositories. Metadata crosswalks, open archival information systems (OAIS), trusted repositories, the Data Seal of Approval, persistent URLs, and the linking of data, objects, resources, and publications in institutional repositories and digital content management systems are common components in the library discipline. These components contribute to a library perspective on data access and discovery that can benefit the geosciences. The USGS Community for Data Integration (CDI) has developed the Science Support Framework (SSF) for data management and integration within its community of practice, contributing to improved understanding of the Earth's physical and biological systems. The USGS CDI SSF can be used as a reference model to map to EarthCube-funded projects, with academic research libraries facilitating the data and information assets components of the USGS CDI SSF via institutional repositories and/or digital content management. This session will explore the USGS CDI SSF for cross-discipline collaboration considerations from a library perspective.
A Descriptive Case Study of the Implementation of the Departmentalized Looping Team Model
ERIC Educational Resources Information Center
Miller, Cody R.
2011-01-01
The conceptual framework guiding this study focuses on local, state, and federal standards as well as demands on schools to improve performance of underserved student populations as impetuses for school structure changes. As related to the aforementioned framework, many schools have developed innovative school restructuring methods such as the…
Eliminating Disparities in School Discipline: A Framework for Intervention
ERIC Educational Resources Information Center
Gregory, Anne; Skiba, Russell J.; Mediratta, Kavitha
2017-01-01
Race and gender disparities in school discipline and associated harms have been well documented for decades. Suspension from school can reduce instructional time and impede academic progress for students who may already be lagging in their achievement. This chapter offers a research-based framework for increasing equity in school discipline. The…
ERIC Educational Resources Information Center
Swain-Bradway, Jessica; Pinkney, Christopher; Flannery, K. Brigid
2015-01-01
Schoolwide positive behavior interventions and supports (SWPBIS) are an increasingly popular framework for school improvement practices, but many high schools are still lagging behind their elementary counterparts. High school leadership teams can struggle with merging the SWPBIS framework with current operations, and there are limited examples of…
Modeling Successful STEM High Schools in the United States: An Ecology Framework
ERIC Educational Resources Information Center
Erdogan, Niyazi; Stuessy, Carol L.
2015-01-01
This study aims to generate a conceptual framework for specialized Science, Technology, Engineering, and Mathematics (STEM) schools. To do so, we focused on literature and found specialized STEM schools have existed for over 100 years and recently expanded nationwide. The current perception for these schools can be described as unique environments…
Study of Sharing Knowledge Resources in Business Schools
ERIC Educational Resources Information Center
Ranjan, Jayanthi
2011-01-01
Purpose: The purpose of this paper is to propose a common business school framework based on knowledge resources that are available in business schools. To support the arguments made based on review literature, the paper presents the holistic framework of knowledge resources in a business school and also provides a knowledge value chain in sharing…
Knowledge Management for School Leaders: An Ecological Framework for Thinking Schools.
ERIC Educational Resources Information Center
Petrides, Lisa A.; Guiney, Susan Zahra
2002-01-01
Using examples from schools, this paper illustrates how knowledge management can enable schools to examine the plethora of data they collect and how an ecological framework can be used to transform these data into meaningful information. The paper highlights: the history of management information systems; shifts from information management to…
Fostering School Belonging in Secondary Schools Using a Socio-Ecological Framework
ERIC Educational Resources Information Center
Allen, Kelly-Ann; Vella-Brodrick, Dianne; Waters, Lea
2016-01-01
The benefits of belonging and feeling connected to school for adolescent mental health and wellbeing are well documented, but how belonging is fostered is less understood. The present article puts forward a new conceptual framework of school belonging based on Bronfenbrenner's (1979) socio-ecological model of human development, using evidence from a…
ERIC Educational Resources Information Center
Wohlstetter, Priscilla; Mohrman, Susan Albers
This document presents findings of the Assessment of School-Based Management Study, which identified the conditions in schools that promote high performance through school-based management (SBM). The study's conceptual framework was based on Edward E. Lawler's (1986) model. The high-involvement framework posits that four resources must spread…
Empowering Chicana/o and Latina: A Framework for High School Counselors
ERIC Educational Resources Information Center
Padilla, Alejandro
2014-01-01
Using Hipolito-Delgado and Lee's empowerment theory for the professional school counselor as a framework, this qualitative study explored the techniques employed by school counselors to facilitate the empowerment of Chicana/o and Latina/o students in large California urban high schools. The qualitative methodology included in-depth interviews…
2016-07-13
ELECTRONIC HEALTH RECORDS: VA's Efforts Raise Concerns about Interoperability Goals and Measures, Duplication with DOD. ...Agencies, Committee on Appropriations, U.S. Senate, July 13, 2016. ...initiatives with the Department of Defense (DOD) that were intended to advance the ability of the two departments to share electronic health records, the...
Enabling Medical Device Interoperability for the Integrated Clinical Environment
2016-12-01
...else who is eager to work together to mature the healthcare technology ecosystem to enable the next generation of safe and intelligent medical device... Award Number: W81XWH-12-C-0154. Title: "Enabling Medical Device Interoperability for the Integrated Clinical Environment." Principal Investigator...
An HLA-Based Approach to Quantify Achievable Performance for Tactical Edge Applications
2011-05-01
...in: Proceedings of the 2002 Fall Simulation Interoperability Workshop, 02F-SIW-068, Nov 2002. [16] P. Knight, et al., "WBT RTI Independent...Benchmark Tests: Design, Implementation, and Updated Results", in: Proceedings of the 2002 Spring Simulation Interoperability Workshop, 02S-SIW-081, March... Interoperability Workshop, 98F-SIW-085, Nov 1998. [18] S. Ferenci and R. Fujimoto, "RTI Performance on Shared Memory and Message Passing Architectures", in...
Planetary Sciences Interoperability at VO Paris Data Centre
NASA Astrophysics Data System (ADS)
Le Sidaner, P.; Aboudarham, J.; Birlan, M.; Briot, D.; Bonnin, X.; Cecconi, B.; Chauvin, C.; Erard, S.; Henry, F.; Lamy, L.; Mancini, M.; Normand, J.; Popescu, F.; Roques, F.; Savalle, R.; Schneider, J.; Shih, A.; Thuillot, W.; Vinatier, S.
2015-10-01
The Astronomy community has been developing interoperability for more than 10 years by standardizing data access, data formats, and metadata. This international action is led by the International Virtual Observatory Alliance (IVOA). Observatoire de Paris is an active participant in this project. All actions on interoperability, data and service provision are centralized in and managed by VOParis Data Centre (VOPDC). VOPDC is a coordinated project of all scientific departments of Observatoire de Paris.
Interoperable and standard e-Health solution over Bluetooth.
Martinez, I; Del Valle, P; Munoz, P; Trigo, J D; Escayola, J; Martínez-Espronceda, M; Muñoz, A; Serrano, L; Garcia, J
2010-01-01
The new paradigm of e-Health demands open sensors and middleware components that permit transparent integration and end-to-end interoperability of new personal health devices. The use of standards seems to be the internationally adopted way to solve these problems. This paper presents the implementation of an end-to-end standards-based e-Health solution. This includes ISO/IEEE11073 standard for the interoperability of the medical devices in the patient environment and EN13606 standard for the interoperable exchange of the Electronic Healthcare Record. The design strictly fulfills all the technical features of the most recent versions of both standards. The implemented prototype has been tested in a laboratory environment to demonstrate its feasibility for its further transfer to the healthcare system.
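As a purely schematic illustration of the two-stage flow the abstract describes (device-level measurements in the patient environment that end up exchanged as EHR content), the Python below turns a fake device reading into a minimal XML "extract". The element names and structure are invented for illustration; they do not reproduce the ISO/IEEE 11073 or EN 13606 data models or encodings.

    import xml.etree.ElementTree as ET
    from datetime import datetime, timezone

    def device_measurement() -> dict:
        # Stand-in for an agent reporting over Bluetooth in the patient environment.
        return {"metric": "blood_pressure_systolic", "value": 122, "unit": "mmHg"}

    def to_ehr_extract(measurement: dict, patient_id: str) -> str:
        # Wrap the measurement in a minimal, invented XML "extract" for exchange.
        extract = ET.Element("extract", attrib={"patient": patient_id})
        obs = ET.SubElement(extract, "observation",
                            attrib={"time": datetime.now(timezone.utc).isoformat()})
        ET.SubElement(obs, "code").text = measurement["metric"]
        ET.SubElement(obs, "value", attrib={"unit": measurement["unit"]}).text = str(measurement["value"])
        return ET.tostring(extract, encoding="unicode")

    print(to_ehr_extract(device_measurement(), patient_id="demo-0001"))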
NASA Technical Reports Server (NTRS)
Fern, Lisa; Rorie, Conrad; Shively, Jay
2016-01-01
At the May 2015 SC-228 meeting, requirements for TCAS II interoperability became elevated in priority. A TCAS interoperability work group was formed to identify and address key issues and questions. The TCAS work group came up with an initial list of questions and a plan to address those questions. As part of that plan, NASA proposed to run a mini HITL to address display, alerting and guidance issues. A TCAS Interoperability Workshop was held to determine potential display, alerting and guidance issues that could be explored in future NASA mini HITLs. Outcomes included consensus on the main functionality of DAA guidance when a TCAS II RA occurs, a prioritized list of independent variables for experimental design, and a set of use cases to stress TCAS interoperability.
Connecting Hazard Analysts and Risk Managers to Sensor Information.
Le Cozannet, Gonéri; Hosford, Steven; Douglas, John; Serrano, Jean-Jacques; Coraboeuf, Damien; Comte, Jérémie
2008-06-11
Hazard analysts and risk managers of natural perils, such as earthquakes, landslides and floods, need to access information from sensor networks surveying their regions of interest. However, information about these networks is currently difficult to obtain and is available in varying formats, thereby restricting access and possibly leading to decision-making based on limited information. As a response to this issue, state-of-the-art interoperable catalogues are currently being developed within the framework of the Group on Earth Observations (GEO) workplan. This article provides an overview of the prototype catalogue that was developed to improve access to information about the sensor networks surveying geological hazards (geohazards), such as earthquakes, landslides and volcanoes.
VisIVO: A Tool for the Virtual Observatory and Grid Environment
NASA Astrophysics Data System (ADS)
Becciani, U.; Comparato, M.; Costa, A.; Larsson, B.; Gheller, C.; Pasian, F.; Smareglia, R.
2007-10-01
We present the new features of VisIVO, software for the visualization and analysis of astrophysical data that can be retrieved from the Virtual Observatory framework and used for cosmological simulations; it runs on both Windows and GNU/Linux platforms. VisIVO is VO standards compliant and supports the most important astronomical data formats such as FITS, HDF5 and VOTables. It is free software and can be downloaded from the web site http://visivo.cineca.it. VisIVO can interoperate with other astronomical VO compliant tools through PLASTIC (PLatform for AStronomical Tool InterConnection). This feature allows VisIVO to share data with many other astronomical packages to further analyze the loaded data.
Connecting Hazard Analysts and Risk Managers to Sensor Information
Le Cozannet, Gonéri; Hosford, Steven; Douglas, John; Serrano, Jean-Jacques; Coraboeuf, Damien; Comte, Jérémie
2008-01-01
Hazard analysts and risk managers of natural perils, such as earthquakes, landslides and floods, need to access information from sensor networks surveying their regions of interest. However, information about these networks is currently difficult to obtain and is available in varying formats, thereby restricting access and possibly leading to decision-making based on limited information. As a response to this issue, state-of-the-art interoperable catalogues are currently being developed within the framework of the Group on Earth Observations (GEO) workplan. This article provides an overview of the prototype catalogue that was developed to improve access to information about the sensor networks surveying geological hazards (geohazards), such as earthquakes, landslides and volcanoes. PMID:27879915
Integrating Space Communication Network Capabilities via Web Portal Technologies
NASA Technical Reports Server (NTRS)
Johnston, Mark D.; Lee, Carlyn-Ann; Lau, Chi-Wung; Cheung, Kar-Ming; Levesque, Michael; Carruth, Butch; Coffman, Adam; Wallace, Mike
2014-01-01
We have developed a service portal prototype as part of an investigation into the feasibility of using Java portlet technology as a means of providing integrated access to NASA communications network services. Portal servers provide an attractive platform for this role due to the various built-in collaboration applications they can provide, combined with the possibility to develop custom inter-operating portlets to extend their functionality while preserving common presentation and behavior. This paper describes various options for integration of network services related to planning and scheduling, and results based on use of a popular open-source portal framework. Plans are underway to develop an operational SCaN Service Portal, building on the experiences reported here.
Hoelzer, Simon; Schweiger, Ralf K; Liu, Raymond; Rudolf, Dirk; Rieger, Joerg; Dudeck, Joachim
2005-01-01
With the introduction of the ICD-10 as the standard for diagnosis, the development of an electronic representation of its complete content, inherent semantics and coding rules is necessary. Our concept refers to current efforts of the CEN/TC 251 to establish a European standard for hierarchical classification systems in healthcare. We have developed an electronic representation of the ICD-10 with the extensible Markup Language (XML) that facilitates the integration in current information systems or coding software taking into account different languages and versions. In this context, XML offers a complete framework of related technologies and standard tools for processing that helps to develop interoperable applications.
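The idea of an electronic, hierarchical representation of ICD-10 can be made concrete with a few lines of Python: encode chapters, blocks and categories as nested XML elements and let coding software look codes up anywhere in the tree. The element and attribute names below are the editor's invention for illustration, not the CEN/TC 251 schema or the official ICD-10 distribution format; the codes shown are real ICD-10 examples used only as sample content.

    import xml.etree.ElementTree as ET

    # Build a tiny, illustrative slice of a hierarchical classification.
    root = ET.Element("classification", attrib={"name": "ICD-10", "language": "en"})
    chapter = ET.SubElement(root, "chapter", attrib={"code": "IX",
                                                     "title": "Diseases of the circulatory system"})
    block = ET.SubElement(chapter, "block", attrib={"code": "I10-I15",
                                                    "title": "Hypertensive diseases"})
    ET.SubElement(block, "category", attrib={"code": "I10"}).text = "Essential (primary) hypertension"

    def find_category(tree: ET.Element, code: str):
        """Look a code up anywhere in the hierarchy, as coding software would."""
        return tree.find(f".//category[@code='{code}']")

    hit = find_category(root, "I10")
    print(hit.attrib["code"], "->", hit.text)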
Open architecture design and approach for the Integrated Sensor Architecture (ISA)
NASA Astrophysics Data System (ADS)
Moulton, Christine L.; Krzywicki, Alan T.; Hepp, Jared J.; Harrell, John; Kogut, Michael
2015-05-01
Integrated Sensor Architecture (ISA) is designed in response to stovepiped integration approaches. The design, based on the principles of Service Oriented Architectures (SOA) and Open Architectures, addresses the problem of integration, and is not designed for specific sensors or systems. The use of SOA and Open Architecture approaches has led to a flexible, extensible architecture. Using these approaches, and supported with common data formats, open protocol specifications, and Department of Defense Architecture Framework (DoDAF) system architecture documents, an integration-focused architecture has been developed. ISA can help move the Department of Defense (DoD) from costly stovepipe solutions to a more cost-effective plug-and-play design to support interoperability.
NASA Technical Reports Server (NTRS)
Afjeh, Abdollah A.; Reed, John A.
2003-01-01
This research is aimed at developing a new and advanced simulation framework that will significantly improve the overall efficiency of aerospace systems design and development. This objective will be accomplished through an innovative integration of object-oriented and Web-based technologies with both new and proven simulation methodologies. The basic approach involves three major areas of research: aerospace system and component representation using a hierarchical object-oriented component model which enables the use of multimodels and enforces component interoperability; a collaborative software environment that streamlines the process of developing, sharing and integrating aerospace design and analysis models; and development of a distributed infrastructure which enables Web-based exchange of models to simplify the collaborative design process and to support computationally intensive aerospace design and analysis processes. Research for the first year dealt with the design of the basic architecture and supporting infrastructure, an initial implementation of that design, and a demonstration of its application to an example aircraft engine system simulation.
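The first research area listed, a hierarchical object-oriented component model that enforces component interoperability, can be sketched as a uniform component interface plus assemblies built from other components. The Python below is a generic illustration of that idea with toy numbers, not the project's actual framework; all names and constants are hypothetical.

    from abc import ABC, abstractmethod

    class Component(ABC):
        """Common interface every simulation component exposes."""
        def __init__(self, name: str):
            self.name = name

        @abstractmethod
        def compute(self, inputs: dict) -> dict:
            ...

    class Compressor(Component):
        def compute(self, inputs: dict) -> dict:
            # Toy constants standing in for a real compressor sub-model.
            return {"p_out": inputs["p_in"] * 8.0, "t_out": inputs["t_in"] * 1.9}

    class Burner(Component):
        def compute(self, inputs: dict) -> dict:
            return {"p_out": inputs["p_in"] * 0.95, "t_out": inputs["t_in"] + 900.0}

    class Assembly(Component):
        """A component built from other components: the hierarchical part of the model."""
        def __init__(self, name: str, children):
            super().__init__(name)
            self.children = children

        def compute(self, inputs: dict) -> dict:
            out = inputs
            for child in self.children:
                out = child.compute({"p_in": out.get("p_out", out.get("p_in")),
                                     "t_in": out.get("t_out", out.get("t_in"))})
            return out

    engine_core = Assembly("core", [Compressor("hpc"), Burner("burner")])
    print(engine_core.compute({"p_in": 101.3, "t_in": 288.0}))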
Aligning vocabulary for interoperability of ISR assets using authoritative sources
NASA Astrophysics Data System (ADS)
Hookway, Steve; Patten, Terry; Gorman, Joe
2017-05-01
The growing arsenal of network-centric sensor platforms shows great potential to enhance situational awareness capabilities. Non-traditional sensors collect a diverse range of data that can provide a more accurate and comprehensive common operational picture when combined with conventional intelligence, surveillance, and reconnaissance (ISR) products. One of the integration challenges is mediating differences in terminology that different data providers use to describe the data they have extracted. A data consumer should be able to reference information using the vocabulary that they are familiar with and rely on the framework to handle the mediation; for example, it should be up to the framework to identify that two different terms are synonyms for the same concept. In this paper we present an approach for automatically performing this alignment using authoritative sources such as Wikipedia (a stand-in for the Intellipedia wiki), and present experimental results that demonstrate that this approach is able to align a large number of concepts between different terminologies.
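The mediation step described, recognising that two providers' terms denote the same concept by consulting an authoritative source, reduces to mapping every term to a canonical concept before comparing. In the sketch below a small alias dictionary stands in for redirect-style lookups against a source such as Wikipedia; the terms and names are invented for illustration, and a real system would query the source and tolerate fuzzier matches.

    # Alias -> canonical concept, standing in for redirects in an authoritative source.
    AUTHORITATIVE_ALIASES = {
        "uav": "unmanned aerial vehicle",
        "drone": "unmanned aerial vehicle",
        "ied": "improvised explosive device",
        "roadside bomb": "improvised explosive device",
    }

    def canonicalize(term: str) -> str:
        key = term.strip().lower()
        return AUTHORITATIVE_ALIASES.get(key, key)

    def align(producer_terms, consumer_term):
        """Return the producer terms that denote the same concept as the consumer's term."""
        target = canonicalize(consumer_term)
        return [t for t in producer_terms if canonicalize(t) == target]

    print(align(["Drone", "IED", "convoy"], "unmanned aerial vehicle"))  # -> ['Drone']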
An approach for formalising the supply chain operations
NASA Astrophysics Data System (ADS)
Zdravković, Milan; Panetto, Hervé; Trajanović, Miroslav; Aubry, Alexis
2011-11-01
Reference models play an important role in the knowledge management of various complex collaboration domains (such as supply chain networks). However, they often lack semantic precision and are sometimes incomplete. In this article, we present an approach to overcome semantic inconsistencies and incompleteness of the Supply Chain Operations Reference (SCOR) model and hence improve its usefulness and expand its application domain. First, we describe a literal web ontology language (OWL) specification of SCOR concepts (and related tools) built with the intention of preserving the original approach to the classification of process reference model entities, and hence enabling effective usage in the original contexts. Next, we demonstrate the system's exploitation, specifically tools for SCOR framework browsing and rapid supply chain process configuration. Then, we describe the SCOR-Full ontology and its relations with relevant domain ontologies, and show how it can be exploited to improve the competence of the SCOR ontological framework. Finally, we elaborate on the potential impact of the presented approach on the interoperability of systems in supply chain networks.
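To give a flavour of what an OWL specification of SCOR concepts looks like in practice, the snippet below declares two process classes and a specialisation with the rdflib library. The namespace, class names and the single label are illustrative stand-ins chosen by the editor, not the ontology described in the article.

    from rdflib import Graph, Namespace, Literal
    from rdflib.namespace import OWL, RDF, RDFS

    SCOR = Namespace("http://example.org/scor#")  # hypothetical namespace
    g = Graph()
    g.bind("scor", SCOR)

    # Two process classes and a specialisation, in the spirit of SCOR's Source processes.
    g.add((SCOR.Source, RDF.type, OWL.Class))
    g.add((SCOR.SourceStockedProduct, RDF.type, OWL.Class))
    g.add((SCOR.SourceStockedProduct, RDFS.subClassOf, SCOR.Source))
    g.add((SCOR.SourceStockedProduct, RDFS.label, Literal("S1 Source Stocked Product")))

    print(g.serialize(format="turtle"))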
ImTK: an open source multi-center information management toolkit
NASA Astrophysics Data System (ADS)
Alaoui, Adil; Ingeholm, Mary Lou; Padh, Shilpa; Dorobantu, Mihai; Desai, Mihir; Cleary, Kevin; Mun, Seong K.
2008-03-01
The Information Management Toolkit (ImTK) Consortium is an open source initiative to develop robust, freely available tools related to the information management needs of basic, clinical, and translational research. An open source framework and agile programming methodology can enable distributed software development, while an open architecture will encourage interoperability across different environments. The ISIS Center has conceptualized a prototype data sharing network that simulates a multi-center environment based on a federated data access model. This model includes the development of software tools to enable efficient exchange, sharing, management, and analysis of multimedia medical information such as clinical information, images, and bioinformatics data from multiple data sources. The envisioned ImTK data environment will include an open architecture and data model implementation that complies with existing standards such as Digital Imaging and Communications in Medicine (DICOM), Health Level 7 (HL7), and the technical framework and workflow defined by the Integrating the Healthcare Enterprise (IHE) Information Technology Infrastructure initiative, mainly the Cross Enterprise Document Sharing (XDS) specifications.
Simulating human behavior for national security human interactions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bernard, Michael Lewis; Hart, Dereck H.; Verzi, Stephen J.
2007-01-01
This 3-year research and development effort focused on what we believe is a significant technical gap in existing modeling and simulation capabilities: the representation of plausible human cognition and behaviors within a dynamic, simulated environment. Specifically, the intent of the "Simulating Human Behavior for National Security Human Interactions" project was to demonstrate an initial simulated human modeling capability that realistically represents intra- and inter-group interaction behaviors between simulated humans and human-controlled avatars as they respond to their environment. Significant progress was made towards simulating human behaviors through the development of a framework that produces realistic characteristics and movement. The simulated humans were created from models designed to be psychologically plausible by being based on robust psychological research and theory. Progress was also made towards enhancing Sandia National Laboratories' existing cognitive models to support culturally plausible behaviors that are important in representing group interactions. These models were implemented in the modular, interoperable, and commercially supported Umbra® simulation framework.
Fertility transitions and schooling: from micro- to macro-level associations.
Eloundou-Enyegue, Parfait M; Giroux, Sarah C
2012-11-01
Research on the schooling implications of fertility transitions often faces an aggregation problem: despite policy interest in macro-level outcomes, empirical studies usually focus on the micro-level effects of sibsize on schooling. This article proposes an aggregation framework for moving from micro- to macro-level associations between fertility and schooling. The proposed framework is an improvement over previous aggregation methods in that it considers concurrent changes in the effects of sibsize, socioeconomic context, and family structure. The framework is illustrated with data from six sub-Saharan countries. Possible extensions are discussed.
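As a generic illustration of the kind of aggregation the article formalises (and explicitly not the authors' own formula), a change in mean schooling between two periods can be decomposed across sibsize groups g with population shares p_g and group-specific mean schooling mu_g:

    \Delta\bar{S} \;=\; \sum_{g}\mu_{g}^{(0)}\,\Delta p_{g} \;+\; \sum_{g}p_{g}^{(0)}\,\Delta\mu_{g} \;+\; \sum_{g}\Delta p_{g}\,\Delta\mu_{g}

The three terms separate compositional change in sibsize, change in group-specific schooling (the changing micro-level association), and their interaction, which is the sense in which a macro-level association need not mirror the micro-level effect.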
NASA Astrophysics Data System (ADS)
Xue, L.; Firl, G.; Zhang, M.; Jimenez, P. A.; Gill, D.; Carson, L.; Bernardet, L.; Brown, T.; Dudhia, J.; Nance, L. B.; Stark, D. R.
2017-12-01
The Global Model Test Bed (GMTB) has been established to support the evolution of atmospheric physical parameterizations in NCEP global modeling applications. To accelerate the transition to the Next Generation Global Prediction System (NGGPS), a collaborative model development framework known as the Common Community Physics Package (CCPP) has been created within the GMTB to facilitate engagement from the broad community on physics experimentation and development. A key component of this research-to-operations (R2O) software framework is the Interoperable Physics Driver (IPD), which connects the physics parameterizations on one end to the dynamical cores on the other with minimal implementation effort. To initiate the CCPP, scientists and engineers from the GMTB separated and refactored the GFS physics. This exercise demonstrated the process of creating IPD-compliant code and can serve as an example for other physics schemes to do the same and be considered for inclusion in the CCPP. Further benefits of this process include run-time physics suite configuration and considerably reduced effort for testing modifications to physics suites through GMTB's physics test harness. The implementation will be described and preliminary results will be presented at the conference.
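The interoperability mechanism named here, an Interoperable Physics Driver that any compliant scheme can plug into, amounts to a single agreed calling convention plus a shared state container that the dynamical core hands to the driver. The Python below shows that pattern only as an analogy; the actual CCPP/IPD code base is Fortran with metadata, and every name and tendency value here is hypothetical.

    from typing import Callable, Dict, List

    State = Dict[str, float]                   # shared model state crossing the interface
    Scheme = Callable[[State, float], State]   # every compliant scheme: (state, dt) -> state

    def boundary_layer(state: State, dt: float) -> State:
        state["temperature"] += 0.001 * dt     # toy tendency
        return state

    def shallow_convection(state: State, dt: float) -> State:
        state["humidity"] -= 1.0e-6 * dt       # toy tendency
        return state

    def run_suite(state: State, suite: List[Scheme], dt: float) -> State:
        """The 'driver': the dynamical core calls only this, never a scheme directly."""
        for scheme in suite:
            state = scheme(state, dt)
        return state

    # Run-time suite configuration: swapping schemes requires no change to the core.
    suite = [boundary_layer, shallow_convection]
    print(run_suite({"temperature": 288.0, "humidity": 0.012}, suite, dt=600.0))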
Framework of sensor-based monitoring for pervasive patient care.
Triantafyllidis, Andreas K; Koutkias, Vassilis G; Chouvarda, Ioanna; Adami, Ilia; Kouroubali, Angelina; Maglaveras, Nicos
2016-09-01
Sensor-based health systems can often become difficult to use, extend and sustain. The authors propose a framework for designing sensor-based health monitoring systems aiming to provide extensible and usable monitoring services in the scope of pervasive patient care. The authors' approach relies on a distributed system for monitoring the patient health status anytime-anywhere and detecting potential health complications, for which healthcare professionals and patients are notified accordingly. Portable or wearable sensing devices measure the patient's physiological parameters, a smart mobile device collects and analyses the sensor data, a Medical Center system receives notifications on the detected health condition, and a Health Professional Platform is used by formal caregivers in order to review the patient condition and configure monitoring schemas. A Service-oriented architecture is utilised to provide extensible functional components and interoperable interactions among the diversified system components. The framework was applied within the REMOTE ambient-assisted living project in which a prototype system was developed, utilising Bluetooth to communicate with the sensors and Web services for data exchange. A scenario of using the REMOTE system and preliminary usability results show the applicability, usefulness and virtue of our approach.
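The architecture just described (wearable sensors, a smart mobile device that collects and analyses readings, and notification of a medical centre when a complication is detected) can be reduced to a small service-style sketch. The Python below shows only the flow; the thresholds, parameter names and classes are the editor's inventions, not the REMOTE prototype.

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Reading:
        parameter: str   # e.g. "heart_rate"
        value: float

    class Analyzer:
        """Runs on the smart mobile device: collects readings and flags complications."""
        def __init__(self, limits: dict):
            self.limits = limits  # parameter -> (low, high), illustrative thresholds

        def detect(self, readings: List[Reading]) -> List[str]:
            alerts = []
            for r in readings:
                low, high = self.limits.get(r.parameter, (float("-inf"), float("inf")))
                if not (low <= r.value <= high):
                    alerts.append(f"{r.parameter}={r.value} outside [{low}, {high}]")
            return alerts

    class MedicalCenterService:
        """Stand-in for the remote service notified over a web-service interface."""
        def notify(self, patient_id: str, alerts: List[str]) -> None:
            for alert in alerts:
                print(f"notify({patient_id}): {alert}")

    readings = [Reading("heart_rate", 134.0), Reading("spo2", 97.0)]
    alerts = Analyzer({"heart_rate": (50, 120), "spo2": (92, 100)}).detect(readings)
    MedicalCenterService().notify("patient-42", alerts)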
Framework of sensor-based monitoring for pervasive patient care
Koutkias, Vassilis G.; Chouvarda, Ioanna; Adami, Ilia; Kouroubali, Angelina; Maglaveras, Nicos
2016-01-01
Sensor-based health systems can often become difficult to use, extend and sustain. The authors propose a framework for designing sensor-based health monitoring systems aiming to provide extensible and usable monitoring services in the scope of pervasive patient care. The authors’ approach relies on a distributed system for monitoring the patient health status anytime-anywhere and detecting potential health complications, for which healthcare professionals and patients are notified accordingly. Portable or wearable sensing devices measure the patient's physiological parameters, a smart mobile device collects and analyses the sensor data, a Medical Center system receives notifications on the detected health condition, and a Health Professional Platform is used by formal caregivers in order to review the patient condition and configure monitoring schemas. A Service-oriented architecture is utilised to provide extensible functional components and interoperable interactions among the diversified system components. The framework was applied within the REMOTE ambient-assisted living project in which a prototype system was developed, utilising Bluetooth to communicate with the sensors and Web services for data exchange. A scenario of using the REMOTE system and preliminary usability results show the applicability, usefulness and virtue of our approach. PMID:27733920
A framework for investigation into extended enterprise resilience
NASA Astrophysics Data System (ADS)
Erol, Ozgur; Sauser, Brian J.; Mansouri, Mo
2010-05-01
This article proposes a framework for investigation into 'extended enterprise resilience' based on the key attributes of enterprise resilience in the context of extended enterprises. Such attributes, namely agility, flexibility, adaptability and connectivity, are frequently defined as supporting attributes of enterprise resilience, but the issue is how they can be more effectively applied to extended enterprises. The role of information technology in assisting connectivity and collaboration is frequently recognised as contributing to resilience on all levels, and will likewise be employed on the level of extended enterprise systems. The proposed framework is based on the expanded application of two primary enablers of enterprise resilience: (i) the capability of an enterprise to connect systems, people, processes and information in a way that allows enterprise to become more connected and responsive to the dynamics of its environment, stakeholders and competitors; (ii) the alignment of information technology with business goals. The former requires inter- and intra-level interoperability and integration within the extended enterprises, and the latter requires modelling of the underlying technology infrastructure and creation of a consolidated view of, and access to, all available resources in the extended enterprises that can be attained by well-defined enterprise architecture.
School and District Intervention: A Decision-Making Framework for Policymakers.
ERIC Educational Resources Information Center
Bowles, Susan A.; Churchill, Andrew M.; Effrat, Andrew; McDermott, Kathryn A.
This paper seeks to help state policymakers understand their relatively new role in improving the academic performance of local schools and districts. The first section, "Intervention Decision-Making Framework," focuses on the intervention decision making framework model, performance criteria, strategic criteria, diagnostic…
Data Modeling Challenges of Advanced Interoperability.
Blobel, Bernd; Oemig, Frank; Ruotsalainen, Pekka
2018-01-01
Progressive health paradigms, involving many different disciplines and combining multiple policy domains, require advanced interoperability solutions. This results in special challenges for modeling health systems. The paper discusses classification systems for data models and enterprise business architectures and compares them with the ISO Reference Architecture. On that basis, existing definitions, specifications and standards of data models for interoperability are evaluated and their limitations are discussed. Amendments to correctly use those models and to better meet the aforementioned challenges are offered.
Semantic interoperability--HL7 Version 3 compared to advanced architecture standards.
Blobel, B G M E; Engel, K; Pharow, P
2006-01-01
To meet the challenge of high-quality and efficient care, highly specialized and distributed healthcare establishments have to communicate and co-operate in a semantically interoperable way. Information and communication technology must be open, flexible, scalable, knowledge-based and service-oriented as well as secure and safe. For enabling semantic interoperability, a unified process for defining and implementing the architecture, i.e. the structure and functions of the cooperating systems' components, as well as the approach to knowledge representation, i.e. the information used and its interpretation, algorithms, etc., has to be defined in a harmonized way. Deploying the Generic Component Model, systems and their components, underlying concepts and applied constraints must be formally modeled, strictly separating platform-independent from platform-specific models. As HL7 Version 3 claims to represent the most successful standard for semantic interoperability, HL7 has been analyzed regarding the requirements for model-driven, service-oriented design of semantically interoperable information systems, thereby moving from a communication to an architecture paradigm. The approach is compared with advanced architectural approaches for information systems such as OMG's CORBA 3 or EHR systems such as GEHR/openEHR and CEN EN 13606 Electronic Health Record Communication. HL7 Version 3 is maturing towards an architectural approach for semantic interoperability. Despite current differences, there is close collaboration between the teams involved, guaranteeing convergence between the competing approaches.
Khan, Wajahat Ali; Khattak, Asad Masood; Hussain, Maqbool; Amin, Muhammad Bilal; Afzal, Muhammad; Nugent, Christopher; Lee, Sungyoung
2014-08-01
Heterogeneity in the management of complex medical data obstructs the attainment of data-level interoperability among Health Information Systems (HIS). This diversity is dependent on the compliance of HISs with different healthcare standards. Its solution demands a mediation system for the accurate interpretation of data in different heterogeneous formats to achieve data interoperability. We propose an adaptive AdapteR Interoperability ENgine mediation system called ARIEN, which arbitrates between HISs compliant with different healthcare standards for accurate and seamless information exchange to achieve data interoperability. ARIEN stores the semantic mapping information between different standards in the Mediation Bridge Ontology (MBO) using ontology matching techniques. These mappings are provided by our System for Parallel Heterogeneity (SPHeRe) matching system and the Personalized-Detailed Clinical Model (P-DCM) approach to guarantee the accuracy of the mappings. The effectiveness of the mappings stored in the MBO is realized by evaluating the accuracy of the transformation process among different standard formats. We evaluated our proposed system on the transformation of medical records between the Clinical Document Architecture (CDA) and Virtual Medical Record (vMR) standards. The transformation process achieved over 90% accuracy in the conversion between the CDA and vMR standards using a pattern-oriented approach based on the MBO. The proposed mediation system improves the overall communication process between HISs. It provides accurate and seamless medical information exchange to ensure data interoperability and timely healthcare services to patients.
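ARIEN's central operation, rewriting a record from one standard's format into another by consulting stored mappings, can be illustrated with a toy mediation function. The dotted field paths and the mapping table below are invented for illustration; real CDA and vMR documents are much richer XML structures, and the MBO holds ontology-level mappings rather than a flat dictionary.

    # Toy mapping table standing in for mappings stored in a mediation ontology.
    SOURCE_TO_TARGET = {
        "patient.name": "subject.fullName",
        "observation.code": "clinicalStatement.code",
        "observation.value": "clinicalStatement.value",
    }

    def set_path(record: dict, dotted: str, value) -> None:
        keys = dotted.split(".")
        node = record
        for key in keys[:-1]:
            node = node.setdefault(key, {})
        node[keys[-1]] = value

    def get_path(record: dict, dotted: str):
        node = record
        for key in dotted.split("."):
            node = node[key]
        return node

    def mediate(source: dict, mapping: dict = SOURCE_TO_TARGET) -> dict:
        """Rewrite a source-format record into the target format via the mapping table."""
        target: dict = {}
        for src_path, tgt_path in mapping.items():
            set_path(target, tgt_path, get_path(source, src_path))
        return target

    source_doc = {"patient": {"name": "Jane Doe"},
                  "observation": {"code": "8480-6", "value": "122 mmHg"}}
    print(mediate(source_doc))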
Resourcing the National Goals for Schooling: An Agreed Framework of Principles for Funding Schools
ERIC Educational Resources Information Center
Ministerial Council on Education, Employment, Training and Youth Affairs (NJ1), 2012
2012-01-01
Funding for school education in Australia should be on the basis of clear and agreed policy principles for achieving effectiveness, efficiency, equity and a socially and culturally cohesive society. On the basis of these principles a national framework for funding schools will be supported by complementary State and Commonwealth models for funding…
ERIC Educational Resources Information Center
Hemmings, Annette
2012-01-01
A framework for urban public high school reform is presented for managing site-based change through re-envisioning, reculturation, restructuring, and remoralization. The four Rs for reform framework is elucidated through a qualitative study of a low-performing urban public high school that was transformed into a new more successful school. The…
School Processes That Can Drive Scaling-Up of an Innovation or Contribute to Its Abandonment
ERIC Educational Resources Information Center
Newman, Denis; Zacamy, Jenna; Lazarev, Valeriy; Lin, Li
2017-01-01
This five-year study focused on school processes that promoted the scaling-up of a high school academic literacy framework, Reading Apprenticeship, developed by WestEd's Strategic Literacy Initiative (SLI). Implementing an innovative strategy for scaling-up involving school-based cross-disciplinary teacher teams, SLI brought the framework to 274…
ERIC Educational Resources Information Center
Frumkin, Peter; Manno, Bruno V.; Edgington, Nell
2011-01-01
"The Strategic Management of Charter Schools" addresses the challenges facing such schools by mapping out, in straightforward and highly pragmatic terms, a management framework for them. The first charter school law in the United States was enacted in Minnesota in 1991. In the twenty years since that modest beginning, the movement has burgeoned…
The BACnet Campus Challenge - Part 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Masica, Ken; Tom, Steve
Here, the BACnet protocol was designed to achieve interoperability among building automation vendors and evolve over time to include new functionality as well as support new communication technologies such as the Ethernet and IP protocols as they became prevalent and economical in the market place. For large multi-building, multi-vendor campus environments, standardizing on the BACnet protocol as an implementation strategy can be a key component in meeting the challenge of an interoperable, flexible, and scalable building automation system. The interoperability of BACnet is especially important when large campuses with legacy equipment have DDC upgrades to facilities performed over different time frames and use different contractors that install equipment from different vendors under the guidance of different campus HVAC project managers. In these circumstances, BACnet can serve as a common foundation for interoperability when potential variability exists in approaches to the design-build process by numerous parties over time. Likewise, BACnet support for a range of networking protocols and technologies can be a key strategy for achieving flexible and scalable automation systems as campuses and enterprises expand networking infrastructures using standard interoperable protocols like IP and Ethernet.
Study and validation of tools interoperability in JPSEC
NASA Astrophysics Data System (ADS)
Conan, V.; Sadourny, Y.; Jean-Marie, K.; Chan, C.; Wee, S.; Apostolopoulos, J.
2005-08-01
Digital imagery is important in many applications today, and the security of digital imagery is important today and is likely to gain in importance in the near future. The emerging international standard ISO/IEC JPEG-2000 Security (JPSEC) is designed to provide security for digital imagery, and in particular digital imagery coded with the JPEG-2000 image coding standard. One of the primary goals of a standard is to ensure interoperability between creators and consumers produced by different manufacturers. The JPSEC standard, similar to the popular JPEG and MPEG family of standards, specifies only the bitstream syntax and the receiver's processing, and not how the bitstream is created or the details of how it is consumed. This paper examines the interoperability for the JPSEC standard, and presents an example JPSEC consumption process which can provide insights in the design of JPSEC consumers. Initial interoperability tests between different groups with independently created implementations of JPSEC creators and consumers have been successful in providing the JPSEC security services of confidentiality (via encryption) and authentication (via message authentication codes, or MACs). Further interoperability work is on-going.
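The two JPSEC security services exercised in these interoperability tests, confidentiality via encryption and authentication via MACs, can be illustrated independently of the JPSEC bitstream syntax. The Python below protects an arbitrary byte string with symmetric encryption and a keyed MAC using the standard library plus the widely used cryptography package; it is a generic encrypt-then-MAC sketch and says nothing about how JPSEC actually embeds protection data in a JPEG-2000 codestream.

    import hmac, hashlib, os
    from cryptography.fernet import Fernet  # third-party 'cryptography' package

    codestream = b"\xff\x4f\xff\x51 example JPEG-2000 payload bytes"

    # Confidentiality: symmetric encryption of the payload.
    enc_key = Fernet.generate_key()
    ciphertext = Fernet(enc_key).encrypt(codestream)

    # Authentication: a MAC over the ciphertext (encrypt-then-MAC).
    mac_key = os.urandom(32)
    tag = hmac.new(mac_key, ciphertext, hashlib.sha256).digest()

    # Consumer side: verify the MAC before decrypting.
    assert hmac.compare_digest(tag, hmac.new(mac_key, ciphertext, hashlib.sha256).digest())
    recovered = Fernet(enc_key).decrypt(ciphertext)
    assert recovered == codestream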
NASA Astrophysics Data System (ADS)
Yang, Gongping; Zhou, Guang-Tong; Yin, Yilong; Yang, Xiukun
2010-12-01
A critical step in an automatic fingerprint recognition system is the segmentation of fingerprint images. Existing methods are usually designed to segment fingerprint images originated from a certain sensor. Thus their performances are significantly affected when dealing with fingerprints collected by different sensors. This work studies the sensor interoperability of fingerprint segmentation algorithms, which refers to the algorithm's ability to adapt to the raw fingerprints obtained from different sensors. We empirically analyze the sensor interoperability problem, and effectively address the issue by proposing a k-means based segmentation method called SKI. SKI clusters foreground and background blocks of a fingerprint image based on the k-means algorithm, where a fingerprint block is represented by a 3-dimensional feature vector consisting of block-wise coherence, mean, and variance (abbreviated as CMV). SKI also employs morphological postprocessing to achieve favorable segmentation results. We perform SKI on each fingerprint to ensure sensor interoperability. The interoperability and robustness of our method are validated by experiments performed on a number of fingerprint databases which are obtained from various sensors.
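The SKI pipeline is concrete enough to sketch: split the image into blocks, describe each block by its coherence, mean and variance (CMV), and cluster the blocks into two groups with k-means. The code below is a simplified reconstruction using NumPy and scikit-learn, without the morphological post-processing; the block size, the exact coherence formula and the rule for picking the foreground cluster are the editor's assumptions rather than details taken from the paper.

    import numpy as np
    from sklearn.cluster import KMeans

    def block_features(img: np.ndarray, block: int = 16) -> tuple:
        """Return per-block CMV features (coherence, mean, variance) and the block grid shape."""
        gy, gx = np.gradient(img.astype(float))
        h, w = img.shape[0] // block, img.shape[1] // block
        feats = []
        for i in range(h):
            for j in range(w):
                sl = (slice(i * block, (i + 1) * block), slice(j * block, (j + 1) * block))
                bx, by, b = gx[sl], gy[sl], img[sl].astype(float)
                gxx, gyy, gxy = (bx * bx).sum(), (by * by).sum(), (bx * by).sum()
                coherence = np.sqrt((gxx - gyy) ** 2 + 4 * gxy ** 2) / (gxx + gyy + 1e-9)
                feats.append([coherence, b.mean(), b.var()])
        return np.array(feats), (h, w)

    def segment(img: np.ndarray, block: int = 16) -> np.ndarray:
        feats, grid = block_features(img, block)
        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(feats)
        # Assume the cluster with higher mean coherence is the (ridge-bearing) foreground.
        fg = int(feats[labels == 1, 0].mean() > feats[labels == 0, 0].mean())
        return (labels == fg).reshape(grid)

    mask = segment(np.random.randint(0, 256, (256, 256)))  # random stand-in for a fingerprint
    print(mask.shape, mask.sum(), "foreground blocks")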
Identity Management Systems in Healthcare: The Issue of Patient Identifiers
NASA Astrophysics Data System (ADS)
Soenens, Els
According to a recent recommendation of the European Commission, now is the time for Europe to enhance interoperability in eHealth. Although interoperability of patient identifiers seems promising for matters of patient mobility, patient empowerment and effective access to care, we see that today there is indeed a considerable lack of interoperability in the field of patient identification. Looking from a socio-technical rather than a merely technical point of view, one can understand that the development and implementation of an identity management system in a specific healthcare context is influenced by particular social practices, affected by socio-economic history and the political climate, and regulated by specific data protection legislation. Consequently, the process of making patient identification in Europe more interoperable is a development beyond semantic and syntactic levels. In this paper, we give some examples of today’s patient identifier systems in Europe, discuss the issue of interoperability of (unique) patient identifiers from a socio-technical point of view and try not to ignore the ‘privacy side’ of the story.
The BACnet Campus Challenge - Part 1
Masica, Ken; Tom, Steve
2015-12-01
Here, the BACnet protocol was designed to achieve interoperability among building automation vendors and evolve over time to include new functionality as well as support new communication technologies such as the Ethernet and IP protocols as they became prevalent and economical in the market place. For large multi-building, multi-vendor campus environments, standardizing on the BACnet protocol as an implementation strategy can be a key component in meeting the challenge of an interoperable, flexible, and scalable building automation system. The interoperability of BACnet is especially important when large campuses with legacy equipment have DDC upgrades to facilities performed over different time frames and use different contractors that install equipment from different vendors under the guidance of different campus HVAC project managers. In these circumstances, BACnet can serve as a common foundation for interoperability when potential variability exists in approaches to the design-build process by numerous parties over time. Likewise, BACnet support for a range of networking protocols and technologies can be a key strategy for achieving flexible and scalable automation systems as campuses and enterprises expand networking infrastructures using standard interoperable protocols like IP and Ethernet.
Smart Grid Interoperability Maturity Model Beta Version
DOE Office of Scientific and Technical Information (OSTI.GOV)
Widergren, Steven E.; Drummond, R.; Giroti, Tony
The GridWise Architecture Council was formed by the U.S. Department of Energy to promote and enable interoperability among the many entities that interact with the electric power system. This balanced team of industry representatives proposes principles for the development of interoperability concepts and standards. The Council provides industry guidance and tools that make it an available resource for smart grid implementations. In the spirit of advancing interoperability of an ecosystem of smart grid devices and systems, this document presents a model for evaluating the maturity of the artifacts and processes that specify the agreement of parties to collaborate across an information exchange interface. You are expected to have a solid understanding of large, complex system integration concepts and experience in dealing with software component interoperation. Those without this technical background should read the Executive Summary for a description of the purpose and contents of the document. Other documents, such as checklists, guides, and whitepapers, exist for targeted purposes and audiences. Please see the www.gridwiseac.org website for more products of the Council that may be of interest to you.
Health information technology and the medical school curriculum.
Triola, Marc M; Friedman, Erica; Cimino, Christopher; Geyer, Enid M; Wiederhorn, Jo; Mainiero, Crystal
2010-12-01
Medical schools must teach core biomedical informatics competencies that address health information technology (HIT), including explaining electronic medical record systems and computerized provider order entry systems and their role in patient safety; describing the research uses and limitations of a clinical data warehouse; understanding the concepts and importance of information system interoperability; explaining the difference between biomedical informatics and HIT; and explaining the ways clinical information systems can fail. Barriers to including these topics in the curricula include lack of teachers; the perception that informatics competencies are not applicable during preclinical courses and there is no place in the clerkships to teach them; and the legal and policy issues that conflict with students' need to develop skills. However, curricular reform efforts are creating opportunities to teach these topics with new emphasis on patient safety, team-based medical practice, and evidence-based care. Overarching HIT competencies empower our students to be lifelong technology learners.
Toward a Framework for Translational Research in School Psychology
ERIC Educational Resources Information Center
Edwards, Oliver W.
2017-01-01
This article addresses a translational research framework for school psychology. Translational research uses outcomes of basic and applied science to enhance the overall well-being of persons. This transdisciplinary framework connects disciplines and uses their resources, capacities, systems, and procedures to advance prevention, intervention, and…
Framing the Framework: Discourses in Australia's National Values Education Policy
ERIC Educational Resources Information Center
Jones, Tiffany Mary
2009-01-01
In the past, many Australian state schools avoided teaching about values explicitly. However, the Australian government released Australia's first official values education policy in 2005: the "National Framework for Values Education in Australian Schools" (NFVEAS). This framework represents a local manifestation of the recent…