Sample records for database interoperability interagency

  1. Exploring the Lack of Interoperability of Databases within Department of Homeland Security Interagency Environment Concerning Maritime Port Security

    DTIC Science & Technology

    2009-03-01

    Figure 8 New Information Sharing Model from United States Intelligence Community Information Sharing...PRIDE while the Coast Guard has MISSLE and the newly constructed WATCHKEEPER. All these databases contain intelligence on incoming vessels...decision making. Experts rely heavily on future projections as hallmarks of skilled performance." (Endsley et al. 2006) The SA model above

  2. CCSDS SM and C Mission Operations Interoperability Prototype

    NASA Technical Reports Server (NTRS)

    Lucord, Steven A.

    2010-01-01

    This slide presentation reviews the prototype of the Spacecraft Monitor and Control (SM&C) Operations for interoperability among space agencies. This particular prototype involves the German Aerospace Center (DLR) to test ideas for interagency coordination.

  3. Message Received How to Bridge the Communication Gap and Save Lives

    DTIC Science & Technology

    2004-03-01

    safety during an emergency depend on the ability of first responders to talk via radio, directly, without dispatch and in real time. Many technologies are...Words interoperability Coast Guard first responders procedures interagency communications policies 18...communication interoperability for public safety first responders entails far more than finding and emplacing a technology and training the operators. The

  4. Inter-agency Working Group for Airborne Data and Telemetry Systems (IWGADTS)

    NASA Technical Reports Server (NTRS)

    Webster, Chris; Freudinger, Lawrence; Sorenson, Carl; Myers, Jeff; Sullivan, Don; Oolman, Larry

    2009-01-01

    The Interagency Coordinating Committee for Airborne Geosciences Research and Applications (ICCAGRA) was established to improve cooperation and communication among agencies sponsoring airborne platforms and instruments for research and applications, and to serve as a resource for senior level management on airborne geosciences issues. The Interagency Working Group for Airborne Data and Telecommunications Systems (IWGADTS) is a subgroup to ICCAGRA for the purpose of developing recommendations leading to increased interoperability among airborne platforms and instrument payloads, producing increased synergy among research programs with similar goals, and enabling the suborbital layer of the Global Earth Observing System of Systems.

  5. Watershed and Economic Data InterOperability (WEDO)

    EPA Science Inventory

    The annual public meeting of the Federal Interagency Steering Committee on Multimedia Environmental Modeling (ISCMEM) will convene to discuss some of the latest developments in environmental modeling applications, tools and frameworks, as well as new operational initiatives for F...

  6. Chile's National Center for Health Information Systems: A Public-Private Partnership to Foster Health Care Information Interoperability.

    PubMed

    Capurro, Daniel; Echeverry, Aisen; Figueroa, Rosa; Guiñez, Sergio; Taramasco, Carla; Galindo, César; Avendaño, Angélica; García, Alejandra; Härtel, Steffen

    2017-01-01

    Despite the continuous technical advancements around health information standards, a critical component to their widespread adoption involves political agreement between a diverse set of stakeholders. Countries that have addressed this issue have used diverse strategies. In this vision paper we present the path that Chile is taking to establish a national program to implement health information standards and achieve interoperability. The Chilean government established an inter-agency program to define the current interoperability situation, existing gaps, barriers, and facilitators for interoperable health information systems. As an answer to the identified issues, the government decided to fund a consortium of Chilean universities to create the National Center for Health Information Systems. This consortium should encourage the interaction between all health care stakeholders, both public and private, to advance the selection of national standards and define certification procedures for software and human resources in health information technologies.

  7. Creating an Assured Joint DOD and Interagency Interoperable Net-Centric Enterprise. Report of the Defense Science Board Task Force on Achieving Interoperability in a Net-Centric Environment

    DTIC Science & Technology

    2009-03-01

    policy, elliptic curve public key cryptography using the 256-bit prime modulus elliptic curve as specified in FIPS-186-2 and SHA-256 are appropriate for...publications/fips/fips186-2/fips186-2-change1.pdf 76 I P ART I . CH A PT E R 5 Hashing via the Secure Hash Algorithm (using SHA-256 and...lithography and processing techniques. Field programmable gate arrays (FPGAs) are a chip design of interest. These devices are extensively used in

  8. 78 FR 63964 - Request for Comments on Draft NIST Interagency Report (NISTIR) 7628 Rev. 1, Guidelines for Smart...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-25

    ... Grid Cyber Security AGENCY: National Institute of Standards and Technology (NIST), Department of... and Technology (NIST) seeks comments on draft NISTIR 7628 Rev. 1, Guidelines for Smart Grid Cyber... (formerly the Cyber Security Working Group) of the Smart Grid Interoperability Panel. The document has been...

  9. A Dynamic Approach to Make CDS/ISIS Databases Interoperable over the Internet Using the OAI Protocol

    ERIC Educational Resources Information Center

    Jayakanth, F.; Maly, K.; Zubair, M.; Aswath, L.

    2006-01-01

    Purpose: A dynamic approach to making legacy databases, like CDS/ISIS, interoperable with OAI-compliant digital libraries (DLs). Design/methodology/approach: There are many bibliographic databases that are being maintained using legacy database systems. CDS/ISIS is one such legacy database system. It was designed and developed specifically for…
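The record above describes exposing a legacy database through the OAI protocol. As an illustrative sketch only (the sample record and identifiers below are invented, not taken from the paper), a gateway over a CDS/ISIS database would emit OAI-PMH XML that harvesters parse like this:

```python
# Illustrative sketch: parsing an OAI-PMH GetRecord response, the kind of
# XML a gateway over a legacy CDS/ISIS database would emit. The sample
# record below is invented for demonstration purposes.
import xml.etree.ElementTree as ET

OAI = "{http://www.openarchives.org/OAI/2.0/}"
DC = "{http://purl.org/dc/elements/1.1/}"

SAMPLE = """<?xml version="1.0"?>
<OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/">
  <GetRecord>
    <record>
      <header><identifier>oai:example:rec-001</identifier></header>
      <metadata>
        <oai_dc:dc xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/"
                   xmlns:dc="http://purl.org/dc/elements/1.1/">
          <dc:title>Sample bibliographic record</dc:title>
          <dc:creator>Doe, J.</dc:creator>
        </oai_dc:dc>
      </metadata>
    </record>
  </GetRecord>
</OAI-PMH>"""

def parse_record(xml_text):
    """Extract the OAI identifier and Dublin Core titles from a GetRecord reply."""
    root = ET.fromstring(xml_text)
    identifier = root.find(f".//{OAI}identifier").text
    titles = [e.text for e in root.iter(f"{DC}title")]
    return identifier, titles

ident, titles = parse_record(SAMPLE)
print(ident, titles)  # oai:example:rec-001 ['Sample bibliographic record']
```

Because OAI-PMH fixes both the envelope and the Dublin Core payload, any compliant harvester can consume records regardless of the legacy system behind the gateway.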

  10. Interoperability, Data Control and Battlespace Visualization using XML, XSLT and X3D

    DTIC Science & Technology

    2003-09-01

    26 Rosenthal, Arnon, Seligman , Len and Costello, Roger, XML, Databases, and Interoperability, Federal Database Colloquium, AFCEA, San Diego...79 Rosenthal, Arnon, Seligman , Len and Costello, Roger, “XML, Databases, and Interoperability”, Federal Database Colloquium, AFCEA, San Diego, 1999... Linda , Mastering XML, Premium Edition, SYBEX, 2001 Wooldridge, Michael , An Introduction to MultiAgent Systems, Wiley, 2002 PAPERS Abernathy, M

  11. The key to enabling biosurveillance is cooperative technology development.

    PubMed

    Emanuel, Peter; Jones, Franca; Smith, Michael; Huff, William; Jaffe, Richard; Roos, Jason

    2011-12-01

    The world population will continue to face biological threats, whether they are naturally occurring or intentional events. The speed with which diseases can emerge and spread presents serious challenges, because the impact on public health, the economy, and development can be huge. The U.S. government recognizes that global public health can also have an impact on national security. This global perspective manifests itself in U.S. policy documents that clearly articulate the importance of biosurveillance in providing early warning, detection, and situational awareness of infectious disease threats in order to mount a rapid response and save lives. In this commentary, we suggest that early recognition of infectious disease threats, whether naturally occurring or man-made, requires a globally distributed array of interoperable hardware and software fielded in sufficient numbers to create a network of linked collection nodes. We argue that achievement of this end state will require a degree of cooperation that does not exist at this time, either across the U.S. federal government or among our global partners. Successful fielding of a family of interoperable technologies will require interagency research, development, and purchase ("acquisition") of biosurveillance systems through cooperative ventures that likely will involve our strategic allies and public-private partnerships. To this end, we propose leveraging an existing federal interagency group to integrate the acquisition of technologies to enable global biosurveillance. © Mary Ann Liebert, Inc.

  12. Interoperable Risk Management in a Joint Interagency Multinational Environment

    DTIC Science & Technology

    2007-08-01

    is then examined against the approach mandated by the Treasury Board. A review conducted by Chief Review Services (2004...regarding the DND/CF's understanding of risk management. In addition, a review conducted by Chief Review Services (2004) has...to support further improvement. The Treasury Board of Canada Secretariat’s framework for risk management is also reflected in a companion

  13. Latest developments for the IAGOS database: Interoperability and metadata

    NASA Astrophysics Data System (ADS)

    Boulanger, Damien; Gautron, Benoit; Thouret, Valérie; Schultz, Martin; van Velthoven, Peter; Broetz, Bjoern; Rauthe-Schöch, Armin; Brissebrat, Guillaume

    2014-05-01

    In-service Aircraft for a Global Observing System (IAGOS, http://www.iagos.org) aims at the provision of long-term, frequent, regular, accurate, and spatially resolved in situ observations of the atmospheric composition. IAGOS observation systems are deployed on a fleet of commercial aircraft. The IAGOS database is an essential part of the global atmospheric monitoring network. Data access is handled by open access policy based on the submission of research requests which are reviewed by the PIs. Users can access the data through the following web sites: http://www.iagos.fr or http://www.pole-ether.fr as the IAGOS database is part of the French atmospheric chemistry data centre ETHER (CNES and CNRS). The database is in continuous development and improvement. In the framework of the IGAS project (IAGOS for GMES/COPERNICUS Atmospheric Service), major achievements will be reached, such as metadata and format standardisation in order to interoperate with international portals and other databases, QA/QC procedures and traceability, CARIBIC (Civil Aircraft for the Regular Investigation of the Atmosphere Based on an Instrument Container) data integration within the central database, and the real-time data transmission. IGAS work package 2 aims at providing the IAGOS data to users in a standardized format including the necessary metadata and information on data processing, data quality and uncertainties. We are currently redefining and standardizing the IAGOS metadata for interoperable use within GMES/Copernicus. The metadata are compliant with the ISO 19115, INSPIRE and NetCDF-CF conventions. IAGOS data will be provided to users in NetCDF or NASA Ames format. 
We are also implementing interoperability between all the involved IAGOS data services, including the central IAGOS database, the former MOZAIC and CARIBIC databases, the DLR Aircraft Research database, and the Jülich WCS web application JOIN (Jülich OWS Interface), which combines model outputs with in situ data for intercomparison. The optimal data transfer protocol is being investigated to ensure interoperability. To facilitate satellite and model validation, tools will be made available for co-location and comparison with IAGOS data. We will enhance the JOIN application to properly display aircraft data as vertical profiles and along individual flight tracks, and to allow graphical comparison with model results that are accessible through interoperable web services, such as the daily products from the GMES/Copernicus atmospheric service.

  14. Conceptual Model Formalization in a Semantic Interoperability Service Framework: Transforming Relational Database Schemas to OWL.

    PubMed

    Bravo, Carlos; Suarez, Carlos; González, Carolina; López, Diego; Blobel, Bernd

    2014-01-01

    Healthcare information is distributed through multiple heterogeneous and autonomous systems. Access to, and sharing of, distributed information sources are a challenging task. To contribute to meeting this challenge, this paper presents a formal, complete and semi-automatic transformation service from relational databases to the Web Ontology Language (OWL). The proposed service makes use of an algorithm that can transform data models from several different domains, deploying mainly inheritance rules. The paper emphasizes the relevance of integrating the proposed approach into an ontology-based interoperability service to achieve semantic interoperability.
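The paper's exact algorithm is not reproduced here, but a common rule set for this kind of transformation maps tables to OWL classes, plain columns to datatype properties, and foreign keys to object properties. The following sketch (with an invented toy schema) illustrates that rule set:

```python
# Minimal sketch of a common RDB-to-OWL mapping rule set: table -> owl:Class,
# column -> owl:DatatypeProperty, foreign key -> owl:ObjectProperty. The toy
# schema and prefixes below are illustrative, not the paper's algorithm.
def schema_to_owl(tables):
    """tables: {name: {"columns": [...], "fks": {column: target_table}}} -> Turtle text."""
    lines = [
        "@prefix : <http://example.org/onto#> .",
        "@prefix owl: <http://www.w3.org/2002/07/owl#> .",
        "@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .",
    ]
    for name, spec in tables.items():
        lines.append(f":{name} a owl:Class .")
        for col in spec["columns"]:
            if col in spec.get("fks", {}):
                # A foreign key becomes a relationship between two classes.
                target = spec["fks"][col]
                lines.append(f":{name}_{col} a owl:ObjectProperty ; "
                             f"rdfs:domain :{name} ; rdfs:range :{target} .")
            else:
                # An ordinary column becomes a literal-valued property.
                lines.append(f":{name}_{col} a owl:DatatypeProperty ; "
                             f"rdfs:domain :{name} .")
    return "\n".join(lines)

schema = {
    "Patient": {"columns": ["id", "name"], "fks": {}},
    "Visit": {"columns": ["id", "patient_id"], "fks": {"patient_id": "Patient"}},
}
print(schema_to_owl(schema))
```

Once the schema is lifted into OWL, reasoners can align the resulting classes with domain ontologies, which is what enables the semantic interoperability the abstract describes.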

  15. Recent advances on terrain database correlation testing

    NASA Astrophysics Data System (ADS)

    Sakude, Milton T.; Schiavone, Guy A.; Morelos-Borja, Hector; Martin, Glenn; Cortes, Art

    1998-08-01

    Terrain database correlation is a major requirement for interoperability in distributed simulation. There are numerous situations in which terrain database correlation problems can occur that, in turn, lead to a lack of interoperability in distributed training simulations. Examples are the use of different run-time terrain databases derived from inconsistent source data, the use of different resolutions, and the use of different data models between databases for both terrain and culture data. IST has been developing a suite of software tools, named ZCAP, to address terrain database interoperability issues. In this paper we discuss recent enhancements made to this suite, including improved algorithms for sampling and calculating line-of-sight, an improved method for measuring terrain roughness, and the application of a sparse matrix method to the terrain remediation solution developed at the Visual Systems Lab of the Institute for Simulation and Training. We review the application of some of these new algorithms to the terrain correlation measurement processes. The application of these new algorithms improves our support for very large terrain databases, and provides the capability for performing test replications to estimate the sampling error of the tests. With this set of tools, a user can quantitatively assess the degree of correlation between large terrain databases.
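ZCAP's actual metrics are more elaborate, but the core idea of quantitative correlation testing can be sketched with two toy measures: a roughness proxy over one elevation grid and the worst per-cell disagreement between two run-time databases. All names, grids, and values below are invented for illustration:

```python
# Illustrative sketch of simple terrain-correlation checks between two
# run-time elevation grids. These metrics are invented stand-ins for the
# more elaborate measures implemented in tools like ZCAP.
def roughness(grid):
    """Mean absolute elevation difference between horizontally adjacent cells."""
    diffs = [abs(row[i + 1] - row[i]) for row in grid for i in range(len(row) - 1)]
    return sum(diffs) / len(diffs)

def max_disparity(grid_a, grid_b):
    """Largest per-cell elevation disagreement between two terrain databases."""
    return max(abs(a - b)
               for row_a, row_b in zip(grid_a, grid_b)
               for a, b in zip(row_a, row_b))

# Two small elevation grids (metres) for the same area, from different sources.
a = [[10.0, 12.0], [11.0, 15.0]]
b = [[10.0, 12.5], [11.0, 14.0]]
print(roughness(a))        # 3.0
print(max_disparity(a, b))  # 1.0
```

If `max_disparity` exceeds a scenario's tolerance (say, the height of a vehicle), the two simulators will disagree about line-of-sight, which is exactly the interoperability failure the paper targets.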

  16. Secure and interoperable communication infrastructures for PPDR organisations

    NASA Astrophysics Data System (ADS)

    Müller, Wilmuth; Marques, Hugo; Pereira, Luis; Rodriguez, Jonathan; Brouwer, Frank; Bouwers, Bert; Politis, Ilias; Lykourgiotis, Asimakis; Ladas, Alexandros; Adigun, Olayinka; Jelenc, David

    2016-05-01

    The growing number of events affecting public safety and security (PS&S) on a regional scale, with the potential to grow into large-scale cross-border disasters, puts increased pressure on agencies and organisations responsible for PS&S. In order to respond timely and in an adequate manner to such events, Public Protection and Disaster Relief (PPDR) organisations need to cooperate, align their procedures and activities, share the needed information and be interoperable. Existing PPDR/PMR technologies such as TETRA, TETRAPOL or P25 do not currently provide broadband capability, nor are such technologies expected to be upgraded in the future. This presents a major limitation in supporting new services and information flows. Furthermore, there is no known standard that addresses interoperability of these technologies. In this contribution, the design of a next-generation communication infrastructure for PPDR organisations which fulfills the requirements of secure and seamless end-to-end communication and interoperable information exchange within the deployed communication networks is presented. Based on the Enterprise Architecture of PPDR organisations, a next-generation PPDR network that is backward compatible with legacy communication technologies is designed and implemented, capable of providing security, privacy, seamless mobility, QoS and reliability support for mission-critical Private Mobile Radio (PMR) voice and broadband data services. The designed solution provides a robust, reliable, and secure mobile broadband communications system for a wide variety of PMR applications and services on PPDR broadband networks, including the ability of inter-system, interagency and cross-border operations with emphasis on interoperability between users in PMR and LTE.

  17. IRIS Toxicological Review of Tert-Butyl Alcohol (Tert-Butanol) ...

    EPA Pesticide Factsheets

    On April 29, 2016, the Toxicological Review of tert-Butyl Alcohol (tert-Butanol) (Public Comment Draft) was released for public comment. The draft Toxicological Review and charge were reviewed internally by EPA and by other federal agencies and the Executive Office of the President during Step 3 (Interagency Science Consultation) before public release. As part of the IRIS process, all written interagency comments on IRIS assessments will be made publicly available. Accordingly, interagency comments with EPA's response and the interagency science consultation drafts of the IRIS Toxicological Review of tert-Butanol and charge to external peer reviewers are posted on this site. EPA is undertaking a new health assessment for t-butyl alcohol (tert-butanol) for the Integrated Risk Information System (IRIS). The outcome of this project will be a Toxicological Review and IRIS Summary of TBA that will be entered on the IRIS database. IRIS is an EPA database containing Agency scientific positions on potential adverse human health effects that may result from chronic (or lifetime) exposure to chemicals in the environment. IRIS contains chemical-specific summaries of qualitative and quantitative health information to evaluate potential public health risks associated with environmental contaminants. The IRIS database is relied on for the development of risk assessments, site-specific environmental decisions, and rule making.

  18. Semantically Interoperable XML Data

    PubMed Central

    Vergara-Niedermayr, Cristobal; Wang, Fusheng; Pan, Tony; Kurc, Tahsin; Saltz, Joel

    2013-01-01

    XML is ubiquitously used as an information exchange platform for web-based applications in healthcare, life sciences, and many other domains. Proliferating XML data are now managed with the latest native XML database technologies. XML data sources conforming to common XML schemas can be shared and integrated with syntactic interoperability. Semantic interoperability can be achieved through semantic annotations of data models using common data elements linked to concepts from ontologies. In this paper, we present a framework and software system to support the development of semantically interoperable XML-based data sources that can be shared through a Grid infrastructure. We also present our work on supporting semantically validated XML data through semantic annotations for XML Schema, semantic validation, and semantic authoring of XML data. We demonstrate the use of the system for a biomedical database of medical image annotations and markups. PMID:25298789
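The central idea above, that two schemas interoperate semantically when their elements are annotated with the same ontology concepts, can be sketched in a few lines. The element names and concept identifiers below are invented examples, not the system's actual data model:

```python
# Hedged sketch of semantic matching between two XML schemas: elements are
# annotated with ontology concept IDs, and elements sharing a concept are
# considered semantically equivalent. All names and IDs are invented.
annotations_a = {"patientAge": "ncit:C25150", "tumorGrade": "ncit:C28076"}
annotations_b = {"age": "ncit:C25150", "grade": "ncit:C28076", "notes": None}

def semantic_matches(ann_a, ann_b):
    """Pair elements from two schemas that are annotated with the same concept."""
    by_concept = {c: e for e, c in ann_b.items() if c is not None}
    return {elem_a: by_concept[c]
            for elem_a, c in ann_a.items() if c in by_concept}

print(semantic_matches(annotations_a, annotations_b))
# {'patientAge': 'age', 'tumorGrade': 'grade'}
```

Elements with no shared concept (like the unannotated `notes` field) simply drop out of the mapping, which is how such systems flag where integration needs human review.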

  19. USGEO Common Framework For Earth Observation Data

    NASA Astrophysics Data System (ADS)

    Walter, J.; de la Beaujardiere, J.; Bristol, S.

    2015-12-01

    The United States Group on Earth Observations (USGEO) Data Management Working Group (DMWG) is an interagency body established by the White House Office of Science and Technology Policy (OSTP). The primary purpose of this group is to foster interagency cooperation and collaboration for improving the life cycle data management practices and interoperability of federally held earth observation data consistent with White House documents including the National Strategy for Civil Earth Observations, the National Plan for Civil Earth Observations, and the May 2013 Executive Order on Open Data (M-13-13). The members of the USGEO DMWG are working on developing a Common Framework for Earth Observation Data that consists of recommended standards and approaches for realizing these goals as well as improving the discoverability, accessibility, and usability of federally held earth observation data. These recommendations will also guide work being performed under the Big Earth Data Initiative (BEDI). This talk will summarize the Common Framework, the philosophy behind it, and next steps forward.

  20. Designing learning management system interoperability in semantic web

    NASA Astrophysics Data System (ADS)

    Anistyasari, Y.; Sarno, R.; Rochmawati, N.

    2018-01-01

    The extensive adoption of learning management systems (LMS) has put the focus on the interoperability requirement. Interoperability is the ability of different computer systems, applications or services to communicate, share and exchange data, information, and knowledge in a precise, effective and consistent way. Semantic web technology and the use of ontologies are able to provide the required computational semantics and interoperability for the automation of tasks in an LMS. The purpose of this study is to design learning management system interoperability in the semantic web, which has not yet been investigated deeply. Moodle is utilized to design the interoperability: several of its database tables are enhanced and some features are added. Semantic web interoperability is provided by exploiting an ontology for content materials. The ontology is further utilized as a search tool to match users’ queries with available courses. It is concluded that LMS interoperability in the semantic web is feasible.
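One way to read "the ontology is utilized as a search tool" is query expansion: a query term is broadened with related concepts from the ontology before matching course titles. The tiny ontology and course list below are invented for illustration, not taken from the study:

```python
# Illustrative sketch of ontology-assisted course search in an LMS: a query
# term is expanded with narrower concepts from a small hand-made ontology
# before matching course titles. The ontology content is invented.
ONTOLOGY = {  # concept -> narrower concepts
    "programming": ["python", "java"],
    "databases": ["sql", "nosql"],
}

COURSES = ["Intro to Python", "SQL Fundamentals", "History of Art"]

def search(query):
    """Return courses whose titles match the query or any narrower concept."""
    terms = {query.lower(), *ONTOLOGY.get(query.lower(), [])}
    return [c for c in COURSES if any(t in c.lower() for t in terms)]

print(search("programming"))  # ['Intro to Python']
print(search("databases"))    # ['SQL Fundamentals']
```

A plain keyword search for "programming" would miss both courses; the ontology supplies the semantics that connect the query to the content.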

  1. How to ensure sustainable interoperability in heterogeneous distributed systems through architectural approach.

    PubMed

    Pape-Haugaard, Louise; Frank, Lars

    2011-01-01

    A major obstacle in ensuring ubiquitous information is the utilization of heterogeneous systems in eHealth. The objective in this paper is to illustrate how an architecture for distributed eHealth databases can be designed without lacking the characteristic features of traditional sustainable databases. The approach is firstly to explain traditional architecture in central and homogeneous distributed database computing, followed by a possible approach to use an architectural framework to obtain sustainability across disparate systems i.e. heterogeneous databases, concluded with a discussion. It is seen that through a method of using relaxed ACID properties on a service-oriented architecture it is possible to achieve data consistency which is essential when ensuring sustainable interoperability.
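The relaxed-ACID idea above can be sketched as a cross-database write split into local transactions plus a compensating action: atomicity is given up, but a failed replication is undone so the stores converge. Everything below (class names, the failure flag) is an invented minimal model, not the paper's architecture:

```python
# Minimal sketch of relaxed ACID over heterogeneous stores: a cross-database
# update becomes two local transactions plus a compensating action, trading
# atomicity for eventual consistency. All names here are illustrative.
class LocalDB:
    """Stand-in for one autonomous eHealth database."""
    def __init__(self):
        self.rows = {}
    def put(self, key, value):
        self.rows[key] = value
    def delete(self, key):
        self.rows.pop(key, None)

def replicate(source, target, key, value, fail=False):
    """Write to both stores; on failure, compensate the first write."""
    source.put(key, value)
    try:
        if fail:  # simulate the target store being unreachable
            raise ConnectionError("target unreachable")
        target.put(key, value)
        return True
    except ConnectionError:
        source.delete(key)  # compensating transaction restores consistency
        return False

a, b = LocalDB(), LocalDB()
replicate(a, b, "pt1", "record")             # succeeds on both stores
replicate(a, b, "pt2", "record", fail=True)  # fails, then compensates
print(a.rows == b.rows)  # True: the stores converge despite the failure
```

In a real service-oriented deployment the compensation would itself be a queued, retryable service call; the point of the sketch is only that consistency is restored after the fact rather than guaranteed atomically.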

  2. IRIS Toxicological Review of Benzo[a]pyrene (Interagency ...

    EPA Pesticide Factsheets

    In January 2017, EPA finalized the IRIS assessment of Benzo[a]pyrene. The Toxicological Review was reviewed internally by EPA and by other federal agencies and White House Offices before public release. Consistent with the May 2009 IRIS assessment development process, all written comments on IRIS assessments submitted by other federal agencies and White House Offices are made publicly available. Accordingly, interagency comments and the interagency science discussion materials provided to other agencies, including interagency review drafts of the IRIS Toxicological Review of Benzo[a]pyrene are posted on this site. EPA is undertaking an update of the Integrated Risk Information System (IRIS) health assessment for benzo[a]pyrene (BaP). The outcome of this project is an updated Toxicological Review and IRIS Summary for BaP that will be entered into the IRIS database.

  3. Putting the School Interoperability Framework to the Test

    ERIC Educational Resources Information Center

    Mercurius, Neil; Burton, Glenn; Hopkins, Bill; Larsen, Hans

    2004-01-01

    The Jurupa Unified School District in Southern California recently partnered with Microsoft, Dell and the Zone Integration Group for the implementation of a School Interoperability Framework (SIF) database repository model throughout the district (Magner 2002). A two-week project--the Integrated District Education Applications System, better known…

  4. IRIS Toxicological Review of Biphenyl (Interagency Science ...

    EPA Pesticide Factsheets

    On September 30, 2011, the draft Toxicological Review of Biphenyl and the charge to external peer reviewers were released for external peer review and public comment. The Toxicological Review and charge were reviewed internally by EPA and by other federal agencies and White House Offices before public release. In the new IRIS process (May 2009), introduced by the EPA Administrator, all written comments on IRIS assessments submitted by other federal agencies and White House Offices will be made publicly available. Accordingly, interagency comments and the interagency science consultation draft of the IRIS Toxicological Review of Biphenyl and the charge to external peer reviewers are posted on this site. EPA is undertaking a new health assessment for biphenyl for the Integrated Risk Information System (IRIS). The outcome of this project will be a Toxicological Review and IRIS Summary of biphenyl that will be entered on the IRIS database. IRIS is an EPA database containing Agency scientific positions on potential adverse human health effects that may result from chronic (or lifetime) exposure to chemicals in the environment. IRIS contains chemical-specific summaries of qualitative and quantitative health information to evaluate potential public health risks associated with environmental contaminants. The IRIS database is relied on for the development of risk assessments, site-specific environmental decisions, and rule making.

  5. IRIS Toxicological Review of Benzo[a]pyrene (Interagency ...

    EPA Pesticide Factsheets

    On August 21, 2013, the draft Toxicological Review of Benzo[a]pyrene and the draft charge to external peer reviewers were released for public review and comment. The Toxicological Review and charge were reviewed internally by EPA and by other federal agencies and White House Offices before public release. Consistent with the May 2009 IRIS assessment development process, all written comments on IRIS assessments submitted by other federal agencies and White House Offices are made publicly available. Accordingly, interagency comments and the interagency science consultation materials provided to other agencies, including interagency review drafts of the IRIS Toxicological Review of Benzo[a]pyrene and the charge to external peer reviewers, are posted on this site. EPA is undertaking an update of the Integrated Risk Information System (IRIS) health assessment for benzo[a]pyrene (BaP). The outcome of this project is an updated Toxicological Review and IRIS Summary for BaP that will be entered into the IRIS database.

  6. A generic method for improving the spatial interoperability of medical and ecological databases.

    PubMed

    Ghenassia, A; Beuscart, J B; Ficheur, G; Occelli, F; Babykina, E; Chazard, E; Genin, M

    2017-10-03

    The availability of big data in healthcare and the intensive development of data reuse and georeferencing have opened up perspectives for health spatial analysis. However, fine-scale spatial studies of ecological and medical databases are limited by the change of support problem and thus a lack of spatial unit interoperability. The use of spatial disaggregation methods to solve this problem introduces errors into the spatial estimations. Here, we present a generic, two-step method for merging medical and ecological databases that avoids the use of spatial disaggregation methods, while maximizing the spatial resolution. Firstly, a mapping table is created after one or more transition matrices have been defined. The latter link the spatial units of the original databases to the spatial units of the final database. Secondly, the mapping table is validated by (1) comparing the covariates contained in the two original databases, and (2) checking the spatial validity with a spatial continuity criterion and a spatial resolution index. We used our novel method to merge a medical database (the French national diagnosis-related group database, containing 5644 spatial units) with an ecological database (produced by the French National Institute of Statistics and Economic Studies, and containing 36,594 spatial units). The mapping table yielded 5632 final spatial units. The mapping table's validity was evaluated by comparing the number of births in the medical database and the ecological database in each final spatial unit. The median [interquartile range] relative difference was 2.3% [0; 5.7]. The spatial continuity criterion was low (2.4%), and the spatial resolution index was greater than for most French administrative areas. Our innovative approach improves interoperability between medical and ecological databases and facilitates fine-scale spatial analyses.
We have shown that disaggregation models and large aggregation techniques are not necessarily the best ways to tackle the change of support problem.
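The two-step method above can be sketched concretely: chain transition mappings to build a table linking medical-database units to ecological-database units, then validate it by the relative difference of a count observed in both databases. The unit codes and counts below are invented, not the French data from the study:

```python
# Hedged sketch of the two-step merge: (1) chain transition matrices into a
# mapping table from medical units to ecological units via shared common
# units, (2) validate with a relative-difference check. Data are invented.
med_to_common = {"M1": "C1", "M2": "C1", "M3": "C2"}  # transition matrix 1
eco_to_common = {"E1": "C1", "E2": "C2", "E3": "C2"}  # transition matrix 2

def mapping_table(t1, t2):
    """Link each medical unit to every ecological unit sharing its common unit."""
    common = {}
    for eco_unit, c in t2.items():
        common.setdefault(c, []).append(eco_unit)
    return {med_unit: common[c] for med_unit, c in t1.items() if c in common}

def relative_difference(n_med, n_eco):
    """Validation criterion: relative gap between counts of the same event
    (e.g. births) observed in the two databases for one final unit."""
    return abs(n_med - n_eco) / n_med

table = mapping_table(med_to_common, eco_to_common)
print(table)  # {'M1': ['E1'], 'M2': ['E1'], 'M3': ['E2', 'E3']}
print(relative_difference(100, 98))  # 0.02
```

Because the table only aggregates units upward to a shared support, no disaggregation model is needed, which is the source of the error reduction the abstract claims.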

  7. ECOTOX knowledgebase: New tools for data visualization and database interoperability

    EPA Science Inventory

    The ECOTOXicology knowledgebase (ECOTOX) is a comprehensive, curated database that summarizes toxicology data fromsingle chemical exposure studies to terrestrial and aquatic organisms. The ECOTOX Knowledgebase provides risk assessors and researchers consistent information on toxi...

  8. IRIS Toxicological Review of 2-Hexanone (Interagency ...

    EPA Pesticide Factsheets

    On September 25, 2009, the IRIS Summary and Toxicological Review of 2-hexanone was finalized and loaded onto the IRIS database. The Toxicological Review of 2-hexanone was reviewed internally by EPA, by other federal agencies and White House Offices, by expert external peer reviewers, and by the public. In the new IRIS process, introduced by the EPA Administrator, all written comments on IRIS assessments submitted by other federal agencies and White House Offices will be made publicly available. Accordingly, interagency comments and the interagency draft of the 2-hexanone IRIS assessment are posted on this site. The draft Toxicological Review of 2-hexanone provides scientific support and rationale for the hazard identification and dose-response assessment pertaining to chronic exposure to 2-hexanone.

  9. The DBCLS BioHackathon: standardization and interoperability for bioinformatics web services and workflows. The DBCLS BioHackathon Consortium*.

    PubMed

    Katayama, Toshiaki; Arakawa, Kazuharu; Nakao, Mitsuteru; Ono, Keiichiro; Aoki-Kinoshita, Kiyoko F; Yamamoto, Yasunori; Yamaguchi, Atsuko; Kawashima, Shuichi; Chun, Hong-Woo; Aerts, Jan; Aranda, Bruno; Barboza, Lord Hendrix; Bonnal, Raoul Jp; Bruskiewich, Richard; Bryne, Jan C; Fernández, José M; Funahashi, Akira; Gordon, Paul Mk; Goto, Naohisa; Groscurth, Andreas; Gutteridge, Alex; Holland, Richard; Kano, Yoshinobu; Kawas, Edward A; Kerhornou, Arnaud; Kibukawa, Eri; Kinjo, Akira R; Kuhn, Michael; Lapp, Hilmar; Lehvaslaiho, Heikki; Nakamura, Hiroyuki; Nakamura, Yasukazu; Nishizawa, Tatsuya; Nobata, Chikashi; Noguchi, Tamotsu; Oinn, Thomas M; Okamoto, Shinobu; Owen, Stuart; Pafilis, Evangelos; Pocock, Matthew; Prins, Pjotr; Ranzinger, René; Reisinger, Florian; Salwinski, Lukasz; Schreiber, Mark; Senger, Martin; Shigemoto, Yasumasa; Standley, Daron M; Sugawara, Hideaki; Tashiro, Toshiyuki; Trelles, Oswaldo; Vos, Rutger A; Wilkinson, Mark D; York, William; Zmasek, Christian M; Asai, Kiyoshi; Takagi, Toshihisa

    2010-08-21

    Web services have become a key technology for bioinformatics, since life science databases are globally decentralized and the exponential increase in the amount of available data demands efficient systems without the need to transfer entire databases for every step of an analysis. However, various incompatibilities among database resources and analysis services make it difficult to connect and integrate these into interoperable workflows. To resolve this situation, we invited domain specialists from web service providers, client software developers, Open Bio* projects, the BioMoby project and researchers from emerging areas where a standard exchange data format is not well established, for an intensive collaboration entitled the BioHackathon 2008. The meeting was hosted by the Database Center for Life Science (DBCLS) and Computational Biology Research Center (CBRC) and was held in Tokyo from February 11th to 15th, 2008. In this report we highlight the work accomplished and the common issues that arose from this event, including the standardization of data exchange formats and services in the emerging fields of glycoinformatics, biological interaction networks, text mining, and phyloinformatics. In addition, common shared object development based on BioSQL, as well as technical challenges in large data management, asynchronous services, and security are discussed. Consequently, we improved the interoperability of web services in several fields; however, further cooperation among major database centers and continued collaborative efforts between service providers and software developers are still necessary for an effective advance in bioinformatics web service technologies.

  10. The DBCLS BioHackathon: standardization and interoperability for bioinformatics web services and workflows. The DBCLS BioHackathon Consortium*

    PubMed Central

    2010-01-01

    Web services have become a key technology for bioinformatics, since life science databases are globally decentralized and the exponential increase in the amount of available data demands efficient systems that do not require transferring entire databases for every step of an analysis. However, various incompatibilities among database resources and analysis services make it difficult to connect and integrate these into interoperable workflows. To resolve this situation, we invited domain specialists from web service providers, client software developers, Open Bio* projects, the BioMoby project, and researchers in emerging areas where a standard exchange data format is not yet well established to an intensive collaboration entitled the BioHackathon 2008. The meeting was hosted by the Database Center for Life Science (DBCLS) and the Computational Biology Research Center (CBRC) and was held in Tokyo from February 11th to 15th, 2008. In this report we highlight the work accomplished and the common issues arising from this event, including the standardization of data exchange formats and services in the emerging fields of glycoinformatics, biological interaction networks, text mining, and phyloinformatics. In addition, common shared object development based on BioSQL, as well as technical challenges in large data management, asynchronous services, and security, are discussed. Consequently, we improved the interoperability of web services in several fields; however, further cooperation among major database centers and continued collaborative efforts between service providers and software developers are still necessary for effective advances in bioinformatics web service technologies. PMID:20727200

  11. ECOTOX Knowledgebase: New tools for data visualization and database interoperability -Poster

    EPA Science Inventory

    The ECOTOXicology knowledgebase (ECOTOX) is a comprehensive, curated database that summarizes toxicology data from single chemical exposure studies to terrestrial and aquatic organisms. The ECOTOX Knowledgebase provides risk assessors and researchers consistent information on tox...

  12. ECOTOX Knowledgebase: New tools for data visualization and database interoperability (poster)

    EPA Science Inventory

    The ECOTOXicology knowledgebase (ECOTOX) is a comprehensive, curated database that summarizes toxicology data from single chemical exposure studies to terrestrial and aquatic organisms. The ECOTOX Knowledgebase provides risk assessors and researchers consistent information on tox...

  13. Hearing Device Manufacturers Call for Interoperability and Standardization of Internet and Audiology.

    PubMed

    Laplante-Lévesque, Ariane; Abrams, Harvey; Bülow, Maja; Lunner, Thomas; Nelson, John; Riis, Søren Kamaric; Vanpoucke, Filiep

    2016-10-01

    This article describes the perspectives of hearing device manufacturers regarding the exciting developments that the Internet makes possible. Specifically, it proposes to join forces toward interoperability and standardization of Internet and audiology. A summary of why such a collaborative effort is required is provided from historical and scientific perspectives. A roadmap toward interoperability and standardization is proposed. Information and communication technologies improve the flow of health care data and pave the way to better health care. However, hearing-related products, features, and services are notoriously heterogeneous and incompatible with other health care systems (no interoperability). Standardization is the process of developing and implementing technical standards (e.g., Noah hearing database). All parties involved in interoperability and standardization realize mutual gains by making mutually consistent decisions. De jure (officially endorsed) standards can be developed in collaboration with large national health care systems as well as spokespeople for hearing care professionals and hearing device users. The roadmap covers mutual collaboration; data privacy, security, and ownership; compliance with current regulations; scalability and modularity; and the scope of interoperability and standards. We propose to join forces to pave the way to the interoperable Internet and audiology products, features, and services that the world needs.

  14. Current Abstracts Nuclear Reactors and Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bales, J.D.; Hicks, S.C.

    1993-01-01

    This publication, Nuclear Reactors and Technology (NRT), announces on a monthly basis the current worldwide information available from the open literature on nuclear reactors and technology, including all aspects of power reactors, components and accessories, fuel elements, control systems, and materials. This publication contains the abstracts of DOE reports, journal articles, conference papers, patents, theses, and monographs added to the Energy Science and Technology Database during the past month. Also included are US information obtained through acquisition programs or interagency agreements and international information obtained through the International Energy Agency's Energy Technology Data Exchange or government-to-government agreements. The digests in NRT and other citations to information on nuclear reactors back to 1948 are available for online searching and retrieval on the Energy Science and Technology Database and Nuclear Science Abstracts (NSA) database. Current information, added daily to the Energy Science and Technology Database, is available to DOE and its contractors through the DOE Integrated Technical Information System. Customized profiles can be developed to provide current information to meet each user's needs.

  15. EUnetHTA information management system: development and lessons learned.

    PubMed

    Chalon, Patrice X; Kraemer, Peter

    2014-11-01

    The aim of this study was to describe the techniques used in achieving consensus on common standards to be implemented in the EUnetHTA Information Management System (IMS), and to describe how interoperability between tools was explored. Three face-to-face meetings were organized to identify and agree on common standards for the development of online tools. Two tools were created to demonstrate the added value of implementing interoperability standards at local levels. Developers of tools outside EUnetHTA were identified and contacted. Four common standards have been agreed on by consensus, and consequently all EUnetHTA tools have been modified or designed accordingly. RDF Site Summary (RSS) has demonstrated good potential to support rapid dissemination of HTA information. Contacts outside EUnetHTA resulted in direct collaboration (HTA glossary, HTAi Vortal), evaluation of options for interoperability between tools (CRD HTA database), or a formal framework to prepare cooperation on concrete projects (INAHTA projects database). Although entitled a project on IT infrastructure, the work program was also about people. When having to agree on complex topics, fostering a cohesive group dynamic and hosting face-to-face meetings brings added value and enhances understanding between partners. The adoption of widespread standards enhanced the homogeneity of the EUnetHTA tools and should thus contribute to their wider use and, therefore, to the general objective of EUnetHTA. The initiatives on interoperability of systems need to be developed further to support a general interoperable information system that could benefit the whole HTA community.
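    To illustrate the RSS mechanism mentioned above, the sketch below builds and reads a minimal feed with Python's standard library. Note that the abstract's "RDF Site Summary" names the RDF-based RSS 1.0 branch; the simpler RSS 2.0 dialect is used here for brevity, and the feed's titles and URLs are invented placeholders.

```python
import xml.etree.ElementTree as ET

# A minimal RSS 2.0 feed; channel and item content are invented examples.
FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>HTA reports</title>
    <link>https://example.org/hta</link>
    <description>Newly published assessments</description>
    <item>
      <title>Assessment A published</title>
      <link>https://example.org/hta/a</link>
      <pubDate>Mon, 03 Nov 2014 09:00:00 GMT</pubDate>
    </item>
  </channel>
</rss>"""

def items(feed_xml):
    """Yield each feed item as a dict of tag name -> text."""
    root = ET.fromstring(feed_xml)
    for item in root.iterfind("./channel/item"):
        yield {child.tag: child.text for child in item}

for entry in items(FEED):
    print(entry["title"], entry["link"])
```

    A consumer only needs this much code to watch several HTA publishers at once, which is what makes RSS attractive for rapid dissemination.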

  16. Interoperability Trends in Extravehicular Activity (EVA) Space Operations for the 21st Century

    NASA Technical Reports Server (NTRS)

    Miller, Gerald E.

    1999-01-01

    No other space operations in the 21st century more comprehensively embody the challenges and dependencies of interoperability than EVA. This discipline is already functioning at an unparalleled level of interagency, inter-organizational and international cooperation. This trend will only increase as space programs endeavor to expand in the face of shrinking budgets. Among the topics examined in this paper are hardware-oriented issues. Differences in design standards among various space participants dictate differences in the EVA tools that must be manufactured, flown and maintained on-orbit. Presently only two types of functional space suits exist in the world. However, three versions of functional airlocks are in operation. Of the three airlocks, only the International Space Station (ISS) Joint Airlock can accommodate both types of suits. Due to functional differences in the suits, completely different operating protocols are required for each. Should additional space suit or airlock designs become available, the complexity will increase. The lessons learned as a result of designing and operating within such a system are explored. This paper also examines the non-hardware challenges presented by interoperability for a discipline that is as uniquely dependent upon the individual as EVA. Operation of space suits (essentially single-person spacecraft) by persons whose native language is not that of the suits' designers is explored. The intricacies of shared mission planning, shared control and shared execution of joint EVAs are explained. For example, once ISS is fully functional, the potential exists for two crewmembers of different nationality to be wearing suits manufactured and controlled by a third nation, while operating within an airlock manufactured and controlled by a fourth nation, in an effort to perform tasks upon hardware belonging to a fifth nation. 
Everything from training issues, to procedures development and writing, to real-time operations is addressed. Finally, this paper looks to the management challenges presented by interoperability in general. With budgets being reduced among all space-faring nations, the need to expand cooperation in the highly expensive field of human space operations is only going to intensify. The question facing management is not if the trend toward interoperation will continue, but how to best facilitate its doing so. Real-world EVA interoperability experience throughout the Shuttle/Mir and ISS Programs is discussed to illustrate the challenges and

  17. An HL7/CDA Framework for the Design and Deployment of Telemedicine Services

    DTIC Science & Technology

    2001-10-25

    schemes and prescription databases. Furthermore, interoperability with the Electronic Health Record (EHR) facilitates automatic retrieval of relevant...local EHR system or the integrated electronic health record (I-EHR) [9], which indexes all medical contacts of a patient in the regional network...suspected medical problem. Interoperability with middleware services of the HII and other data sources such as the local EHR system affects

  18. Refining the GPS Space Service Volume (SSV) and Building a Multi-GNSS SSV

    NASA Technical Reports Server (NTRS)

    Parker, Joel J. K.

    2017-01-01

    The GPS (Global Positioning System) Space Service Volume (SSV) was first defined to protect the GPS main lobe signals from changes from block to block. First developed as a concept by NASA in 2000, it has been adopted for the GPS III block of satellites, and is being used well beyond the current specification to enable increased navigation performance for key missions like GOES-R. NASA has engaged the US IFOR (Interagency Forum for Operational Requirements) process to adopt a revised requirement to protect this increased and emerging use. Also, NASA is working through the UN International Committee on GNSS (Global Navigation Satellite System) to develop an interoperable multi-GNSS SSV in partnership with all of the foreign GNSS providers.

  19. System architecture of communication infrastructures for PPDR organisations

    NASA Astrophysics Data System (ADS)

    Müller, Wilmuth

    2017-04-01

    The growing number of events affecting public safety and security (PS and S) on a regional scale, with the potential to grow into large-scale cross-border disasters, puts increased pressure on organizations responsible for PS and S. To respond to such events in a timely and adequate manner, Public Protection and Disaster Relief (PPDR) organizations need to cooperate, align their procedures and activities, share the needed information, and be interoperable. Existing PPDR/PMR technologies do not provide broadband capability, which is a major limitation in supporting new services and hence new information flows, and they currently have no successor. There is also no known standard that addresses the interoperability of these technologies. The paper at hand tackles the above-mentioned aspects by defining an Enterprise Architecture (EA) of PPDR organizations and a System Architecture of next-generation PPDR communication networks for a variety of applications and services on broadband networks, including the ability of inter-system, inter-agency and cross-border operations. The Open Safety and Security Architecture Framework (OSSAF) provides a framework and approach to coordinate the perspectives of different types of stakeholders within a PS and S organization. It aims at bridging the silos in the chains of command and at leveraging interoperability between PPDR organizations. The framework incorporates concepts of several mature enterprise architecture frameworks, including the NATO Architecture Framework (NAF). However, OSSAF does not provide details on how NAF should be used for describing the OSSAF perspectives and views. In this contribution a mapping of the NAF elements to the OSSAF views is provided. Based on this mapping, an EA of PPDR organizations with a focus on communication infrastructure related capabilities is presented. 
Following the capability modeling, a system architecture for secure and interoperable communication infrastructures for PPDR organizations is presented. This architecture was implemented within a project sponsored by the European Union and successfully demonstrated in a live validation exercise in June 2016.

  20. Parents in adult psychiatric care and their children: a call for more interagency collaboration with social services and child and adolescent psychiatry.

    PubMed

    Afzelius, Maria; Östman, Margareta; Råstam, Maria; Priebe, Gisela

    2018-01-01

    A parental mental illness affects all family members and should warrant a need for support. To investigate the extent to which psychiatric patients with underage children are the recipients of child-focused interventions and involved in interagency collaboration. Data were retrieved from a psychiatric services medical record database consisting of data regarding 29,972 individuals in southern Sweden and indicating the patients' main diagnoses, comorbidity, children below the age of 18, and child-focused interventions. Among the patients surveyed, 12.9% had registered underage children. One-fourth of the patients received child-focused interventions from adult psychiatry, and out of these 30.7% were involved in interagency collaboration as compared to 7.7% without child-focused interventions. Overall, collaboration with child and adolescent psychiatric services was low for all main diagnoses. If a patient received child-focused interventions from psychiatric services, the likelihood of being involved in interagency collaboration was five times greater as compared to patients receiving no child-focused intervention when controlled for gender, main diagnosis, and inpatient care. Psychiatric services play a significant role in identifying the need for and initiating child-focused interventions in families with a parental mental illness, and need to develop and support strategies to enhance interagency collaboration with other welfare services.
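    The reported "five times greater" likelihood is consistent with an unadjusted odds ratio computed directly from the percentages above (30.7% vs. 7.7% collaboration). The paper's own figure comes from a model controlled for gender, diagnosis, and inpatient care, so the calculation below is only a rough plausibility check, not a reproduction of their analysis.

```python
def odds(p):
    """Convert a proportion to odds, p / (1 - p)."""
    return p / (1 - p)

p_with = 0.307     # collaboration rate, patients WITH child-focused interventions
p_without = 0.077  # collaboration rate, patients WITHOUT such interventions

or_unadjusted = odds(p_with) / odds(p_without)
print(round(or_unadjusted, 2))  # 5.31 -- consistent with "five times greater"
```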

  2. Report of the Interagency biological methods workshop

    USGS Publications Warehouse

    Gurtz, Martin E.; Muir, Thomas A.

    1994-01-01

    The U.S. Geological Survey hosted the Interagency Biological Methods Workshop in Reston, Virginia, during June 22-23, 1993. The purposes of the workshop were to (1) promote better communication among Federal agencies that are using or developing biological methods in water-quality assessment programs for streams and rivers, and (2) facilitate the sharing of data and interagency collaboration. The workshop was attended by 45 biologists representing numerous Federal agencies and programs, and a few regional and State programs that were selected to provide additional perspectives. The focus of the workshop was community assessment methods for fish, invertebrates, and algae; physical habitat characterization; and chemical analyses of biological tissues. Charts comparing program objectives, design features, and sampling methods were compiled from materials that were provided by participating agencies prior to the workshop and formed the basis for small workgroup discussions. Participants noted that differences in methods among programs were often necessitated by differences in program objectives. However, participants agreed that where programs have identified similar data needs, the use of common methods is beneficial. Opportunities discussed for improving data compatibility and information sharing included (1) modifying existing methods, (2) adding parameters, (3) improving access to data through shared databases (potentially with common database structures), and (4) future collaborative efforts that range from research on selected protocol questions to followup meetings and continued discussions.

  3. IRIS Toxicological Review of Ethyl Tertiary Butyl Ether (Etbe) ...

    EPA Pesticide Factsheets

    In September 2016, EPA released the draft IRIS Toxicological Review of Ethyl Tertiary Butyl Ether (ETBE) for public comment and discussion. The draft assessment was reviewed internally by EPA and by other federal agencies and White House Offices before public release. Consistent with the May 2009 IRIS assessment development process, all written comments on IRIS assessments submitted by other federal agencies and White House Offices are made publicly available. Accordingly, interagency comments and the interagency science consultation materials provided to other agencies, including interagency review drafts of the IRIS Toxicological Review of Ethyl Tertiary Butyl Ether, are posted on this site. EPA is undertaking a new health assessment for ethyl tertiary butyl ether (ETBE) for the Integrated Risk Information System (IRIS). The outcome of this project will be a Toxicological Review and IRIS Summary of ETBE that will be entered into the IRIS database. IRIS is an EPA database containing Agency scientific positions on potential adverse human health effects that may result from chronic (or lifetime) exposure to chemicals in the environment. IRIS contains chemical-specific summaries of qualitative and quantitative health information in support of two steps of the risk assessment process, i.e., hazard identification and dose-response evaluation. IRIS assessments are used nationally and internationally in combination with specific situational exposure assessment infor

  4. SenseLab

    PubMed Central

    Crasto, Chiquito J.; Marenco, Luis N.; Liu, Nian; Morse, Thomas M.; Cheung, Kei-Hoi; Lai, Peter C.; Bahl, Gautam; Masiar, Peter; Lam, Hugo Y.K.; Lim, Ernest; Chen, Huajin; Nadkarni, Prakash; Migliore, Michele; Miller, Perry L.; Shepherd, Gordon M.

    2009-01-01

    This article presents the latest developments in neuroscience information dissemination through the SenseLab suite of databases: NeuronDB, CellPropDB, ORDB, OdorDB, OdorMapDB, ModelDB and BrainPharm. These databases include information related to: (i) neuronal membrane properties and neuronal models, and (ii) genetics, genomics, proteomics and imaging studies of the olfactory system. We describe here: the new features for each database, the evolution of SenseLab’s unifying database architecture and instances of SenseLab database interoperation with other neuroscience online resources. PMID:17510162

  5. K-Means Based Fingerprint Segmentation with Sensor Interoperability

    NASA Astrophysics Data System (ADS)

    Yang, Gongping; Zhou, Guang-Tong; Yin, Yilong; Yang, Xiukun

    2010-12-01

    A critical step in an automatic fingerprint recognition system is the segmentation of fingerprint images. Existing methods are usually designed to segment fingerprint images originated from a certain sensor. Thus their performances are significantly affected when dealing with fingerprints collected by different sensors. This work studies the sensor interoperability of fingerprint segmentation algorithms, which refers to the algorithm's ability to adapt to the raw fingerprints obtained from different sensors. We empirically analyze the sensor interoperability problem, and effectively address the issue by proposing a K-means based segmentation method called SKI. SKI clusters foreground and background blocks of a fingerprint image based on the K-means algorithm, where a fingerprint block is represented by a 3-dimensional feature vector consisting of block-wise coherence, mean, and variance (abbreviated as CMV). SKI also employs morphological postprocessing to achieve favorable segmentation results. We perform SKI on each fingerprint to ensure sensor interoperability. The interoperability and robustness of our method are validated by experiments performed on a number of fingerprint databases which are obtained from various sensors.
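    A minimal sketch of the CMV-plus-k-means idea is given below, using a toy image in place of a real fingerprint. The coherence feature is approximated here by mean gradient magnitude, and the deterministic initialization is my own choice for reproducibility; neither detail is taken from the SKI paper, which uses a gradient-covariance coherence measure and adds morphological postprocessing.

```python
import numpy as np

def block_features(img, b=8):
    """Per-block CMV-style features: a crude coherence proxy (mean
    gradient magnitude), block mean, and block variance."""
    gy, gx = np.gradient(img.astype(float))
    gmag = np.hypot(gx, gy)
    feats = []
    for i in range(0, img.shape[0] - b + 1, b):
        for j in range(0, img.shape[1] - b + 1, b):
            blk = img[i:i + b, j:j + b]
            feats.append([gmag[i:i + b, j:j + b].mean(), blk.mean(), blk.var()])
    return np.asarray(feats)

def kmeans(X, k=2, iters=20):
    """Plain Lloyd's algorithm with deterministic initialization:
    initial centers are spread along the variance feature."""
    order = np.argsort(X[:, 2])
    centers = X[order[np.linspace(0, len(X) - 1, k).astype(int)]].copy()
    for _ in range(iters):
        labels = ((X[:, None, :] - centers[None]) ** 2).sum(-1).argmin(1)
        for c in range(k):
            if np.any(labels == c):
                centers[c] = X[labels == c].mean(0)
    return labels

# Toy image: flat "background" on the left, oscillating ridge-like
# "foreground" with noise on the right.
rng = np.random.default_rng(1)
img = np.full((32, 32), 200.0)
img[:, 16:] = 100 + 50 * np.sin(np.arange(16) * 1.5) + rng.normal(0, 5, (32, 16))

labels = kmeans(block_features(img))
print(labels.reshape(4, 4))  # left-half and right-half blocks get different labels
```

    The variance feature alone separates this toy case; on real fingerprints the three features jointly distinguish low-contrast ridge regions from smooth background, which is why the paper keeps all of CMV.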

  6. Improving agricultural knowledge management: The AgTrials experience

    PubMed Central

    Hyman, Glenn; Espinosa, Herlin; Camargo, Paola; Abreu, David; Devare, Medha; Arnaud, Elizabeth; Porter, Cheryl; Mwanzia, Leroy; Sonder, Kai; Traore, Sibiry

    2017-01-01

    Background: Opportunities to use data and information to address challenges in international agricultural research and development are expanding rapidly. The use of agricultural trial and evaluation data has enormous potential to improve crops and management practices. However, for a number of reasons, this potential has yet to be realized. This paper reports on the experience of the AgTrials initiative, an effort to build an online database of agricultural trials applying principles of interoperability and open access. Methods: Our analysis evaluates what worked and what did not work in the development of the AgTrials information resource. We analyzed data on our users and their interaction with the platform. We also surveyed our users to gauge their perceptions of the utility of the online database. Results: The study revealed barriers to participation and impediments to interaction, opportunities for improving agricultural knowledge management and a large potential for the use of trial and evaluation data. Conclusions: Technical and logistical mechanisms for developing interoperable online databases are well advanced. More effort will be needed to advance organizational and institutional work for these types of databases to realize their potential. PMID:28580127

  7. IRIS Toxicological Review of Ammonia Noncancer Inhalation ...

    EPA Pesticide Factsheets

    In September 2016, EPA finalized the IRIS assessment of Ammonia (Noncancer Inhalation). The Toxicological Review was reviewed internally by EPA and by other federal agencies and White House Offices before public release in June 2016. Consistent with the May 2009 IRIS assessment development process, all written comments on IRIS assessments submitted by other federal agencies and White House Offices are made publicly available. Accordingly, interagency comments and the interagency science discussion materials provided to other agencies, including interagency review drafts of the IRIS Toxicological Review of Ammonia (Noncancer Inhalation) are posted on this site. Note: No major science comments were received on the Interagency Science Discussion Draft. EPA is undertaking an Integrated Risk Information System (IRIS) health assessment for ammonia. IRIS is an EPA database containing Agency scientific positions on potential adverse human health effects that may result from chronic (or lifetime) exposure to chemicals in the environment. IRIS contains chemical-specific summaries of qualitative and quantitative health information in support of two steps of the risk assessment paradigm, i.e., hazard identification and dose-response evaluation. IRIS assessments are used in combination with specific situational exposure assessment information to evaluate potential public health risk associated with environmental contaminants.

  8. IRIS Toxicological Review of Ammonia (Interagency Science ...

    EPA Pesticide Factsheets

    On June 1, 2012, the draft Toxicological Review of Ammonia and the draft charge to external peer reviewers were released for external peer review and public comment. The Toxicological Review and charge were reviewed internally by EPA and by other federal agencies and White House Offices before public release. Consistent with the May 2009 IRIS assessment development process, all written comments on IRIS assessments submitted by other federal agencies and White House Offices are made publicly available. Accordingly, interagency comments and the interagency science consultation materials provided to other agencies, including interagency review drafts of the IRIS Toxicological Review of Ammonia and the charge to external peer reviewers, are posted on this site. EPA is undertaking an Integrated Risk Information System (IRIS) health assessment for ammonia. IRIS is an EPA database containing Agency scientific positions on potential adverse human health effects that may result from chronic (or lifetime) exposure to chemicals in the environment. IRIS contains chemical-specific summaries of qualitative and quantitative health information in support of two steps of the risk assessment paradigm, i.e., hazard identification and dose-response evaluation. IRIS assessments are used in combination with specific situational exposure assessment information to evaluate potential public health risk associated with environmental contaminants.

  9. [Lessons learned in the implementation of interoperable National Health Information Systems: a systematic review].

    PubMed

    Ovies-Bernal, Diana Paola; Agudelo-Londoño, Sandra M

    2014-01-01

    Identify shared criteria used throughout the world in the implementation of interoperable National Health Information Systems (NHIS) and provide validated scientific information on the dimensions affecting interoperability. This systematic review sought to identify primary articles on the implementation of interoperable NHIS published in scientific journals in English, Portuguese, or Spanish between 1990 and 2011 through a search of eight databases of electronic journals in the health sciences and informatics: MEDLINE (PubMed), Proquest, Ovid, EBSCO, MD Consult, Virtual Health Library, Metapress, and SciELO. The full texts of the articles were reviewed, and those that focused on technical computer aspects or on normative issues were excluded, as were those that did not meet the quality criteria for systematic reviews of interventions. Of 291 studies found and reviewed, only five met the inclusion criteria. These articles reported on the process of implementing an interoperable NHIS in Brazil, China, the United States, Turkey, and the Semiautonomous Region of Zanzibar, respectively. Five common basic criteria affecting implementation of the NHIS were identified: standards in place to govern the process, availability of trained human talent, financial and structural constraints, definition of standards, and assurance that the information is secure. Four dimensions affecting interoperability were defined: technical, semantic, legal, and organizational. The criteria identified have to be adapted to the actual situation in each country, and a proactive approach should be used to ensure that implementation of the interoperable NHIS is strategic, simple, and reliable.

  10. Archive interoperability in the Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Genova, Françoise

    2003-02-01

    Main goals of Virtual Observatory projects are to build interoperability between astronomical on-line services, observatory archives, databases and results published in journals, and to develop tools permitting the best scientific use of the very large data sets stored in observatory archives and produced by large surveys. The different Virtual Observatory projects collaborate to define common exchange standards, which are the key to a truly International Virtual Observatory: for instance, their first common milestone has been a standard allowing the exchange of tabular data, called VOTable. The Interoperability Work Area of the European Astrophysical Virtual Observatory project aims at networking European archives, by building a prototype using the CDS VizieR and Aladin tools, and at defining basic rules to help archive providers implement interoperability. The prototype is accessible for scientific usage, to get user feedback (and science results!) at an early stage of the project. The ISO archive participates very actively in this endeavour, and more generally in information networking. The on-going inclusion of the ISO log in SIMBAD will allow higher-level links for users.
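    To make the VOTable milestone concrete, the sketch below parses a small hand-written VOTable 1.3 document with Python's standard library. The table content is invented for illustration, and production code would normally use a dedicated parser such as astropy.io.votable rather than raw XML handling.

```python
import xml.etree.ElementTree as ET

# A minimal VOTable 1.3 document; structure follows the IVOA standard,
# but the catalogue rows are invented examples.
VOTABLE = """<?xml version="1.0"?>
<VOTABLE version="1.3" xmlns="http://www.ivoa.net/xml/VOTable/v1.3">
 <RESOURCE>
  <TABLE name="sources">
   <FIELD name="id" datatype="char" arraysize="*"/>
   <FIELD name="ra" datatype="double" unit="deg"/>
   <FIELD name="dec" datatype="double" unit="deg"/>
   <DATA><TABLEDATA>
    <TR><TD>HD 1</TD><TD>10.5</TD><TD>-22.1</TD></TR>
    <TR><TD>HD 2</TD><TD>11.0</TD><TD>-21.9</TD></TR>
   </TABLEDATA></DATA>
  </TABLE>
 </RESOURCE>
</VOTABLE>"""

NS = {"vo": "http://www.ivoa.net/xml/VOTable/v1.3"}

def read_votable(doc):
    """Return the table as a list of dicts keyed by FIELD name
    (cell values are left as strings; datatype conversion omitted)."""
    root = ET.fromstring(doc)
    fields = [f.get("name") for f in root.iterfind(".//vo:FIELD", NS)]
    rows = [[td.text for td in tr.iterfind("vo:TD", NS)]
            for tr in root.iterfind(".//vo:TR", NS)]
    return [dict(zip(fields, row)) for row in rows]

for row in read_votable(VOTABLE):
    print(row)
```

    Because every VO service emits the same structure, a client written once against FIELD/TABLEDATA works against any compliant archive, which is exactly the interoperability the standard buys.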

  11. Special issue on enabling open and interoperable access to Planetary Science and Heliophysics databases and tools

    NASA Astrophysics Data System (ADS)

    2018-01-01

    The large amount of data generated by modern space missions calls for a change of organization of data distribution and access procedures. Although long term archives exist for telescopic and space-borne observations, high-level functions need to be developed on top of these repositories to make Planetary Science and Heliophysics data more accessible and to favor interoperability. Results of simulations and reference laboratory data also need to be integrated to support and interpret the observations. Interoperable software and interfaces have recently been developed in many scientific domains. The Virtual Observatory (VO) interoperable standards developed for Astronomy by the International Virtual Observatory Alliance (IVOA) can be adapted to Planetary Sciences, as demonstrated by the VESPA (Virtual European Solar and Planetary Access) team within the Europlanet-H2020-RI project. Other communities have developed their own standards: GIS (Geographic Information System) for Earth and planetary surface tools, SPASE (Space Physics Archive Search and Extract) for space plasma, PDS4 (NASA Planetary Data System, version 4) and IPDA (International Planetary Data Alliance) for planetary mission archives, etc., and an effort to make them all interoperable is starting, including automated workflows to process related data from different sources.

  12. Implications of Multilingual Interoperability of Speech Technology for Military Use (Les implications de l’interoperabilite multilingue des technologies vocales pour applications militaires)

    DTIC Science & Technology

    2004-09-01

    Databases 2-2 2.3.1 Translanguage English Database 2-2 2.3.2 Australian National Database of Spoken Language 2-3 2.3.3 Strange Corpus 2-3 2.3.4...some relevance to speech technology research. 2.3.1 Translanguage English Database In a daring plan Joseph Mariani, then at LIMSI-CNRS, proposed to...native speakers. The database is known as the ‘ Translanguage English Database’ but is often referred to as the ‘terrible English database.’ About 28

  13. Virtualization of open-source secure web services to support data exchange in a pediatric critical care research network

    PubMed Central

    Sward, Katherine A; Newth, Christopher JL; Khemani, Robinder G; Cryer, Martin E; Thelen, Julie L; Enriquez, Rene; Shaoyu, Su; Pollack, Murray M; Harrison, Rick E; Meert, Kathleen L; Berg, Robert A; Wessel, David L; Shanley, Thomas P; Dalton, Heidi; Carcillo, Joseph; Jenkins, Tammara L; Dean, J Michael

    2015-01-01

    Objectives To examine the feasibility of deploying a virtual web service for sharing data within a research network, and to evaluate the impact on data consistency and quality. Material and Methods Virtual machines (VMs) encapsulated an open-source, semantically and syntactically interoperable secure web service infrastructure along with a shadow database. The VMs were deployed to 8 Collaborative Pediatric Critical Care Research Network Clinical Centers. Results Virtual web services could be deployed in hours. The interoperability of the web services reduced format misalignment from 56% to 1%: 99% of the data transferred consistently using the data dictionary, and the remaining 1% needed human curation. Conclusions Use of virtualized open-source secure web service technology could enable direct electronic abstraction of data from hospital databases for research purposes. PMID:25796596
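The format-misalignment reduction reported above rests on every site validating records against one shared data dictionary before transfer. A minimal sketch of that idea; the field names, types, and range rules below are invented for illustration, not the network's actual dictionary:

```python
# Hypothetical shared data dictionary: field -> (expected type, range check).
DATA_DICTIONARY = {
    "patient_id": (str, lambda v: len(v) > 0),
    "age_months": (int, lambda v: 0 <= v <= 216),
    "fio2":       (float, lambda v: 0.21 <= v <= 1.0),
}

def validate(record):
    """Return a list of problems; an empty list means the record conforms."""
    problems = []
    for field, (ftype, check) in DATA_DICTIONARY.items():
        if field not in record:
            problems.append(f"missing field: {field}")
        elif not isinstance(record[field], ftype) or not check(record[field]):
            problems.append(f"bad value for {field}: {record[field]!r}")
    return problems

good = {"patient_id": "A-001", "age_months": 18, "fio2": 0.40}
bad  = {"patient_id": "A-002", "age_months": 18, "fio2": 40.0}  # percent, not fraction
print(validate(good))  # []
print(validate(bad))   # flags fio2 as a format misalignment
```

Records that fail such checks are exactly the residue that, per the abstract, still needs human curation.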

  14. Working Group Reports: Working Group 1 - Software Systems Design and Implementation for Environmental Modeling

    EPA Science Inventory

    The purpose of the Interagency Steering Committee on Multimedia Environmental Modeling (ISCMEM) is to foster the exchange of information about environmental modeling tools, modeling frameworks, and environmental monitoring databases that are all in the public domain. It is compos...

  15. Some significant wildlife strikes to civil aircraft in the United States, January 1990 - September 2008

    DOT National Transportation Integrated Search

    2008-10-23

    The U.S. Department of Agriculture, through an interagency agreement with the Federal Aviation Administration, : compiles a database of all reported wildlife strikes to U.S. civil aircraft and to foreign carriers experiencing strikes : in the USA. We...

  16. Workshop report: Identifying opportunities for global integration of toxicogenomics databases, 26-27 June 2013, Research Triangle Park, NC, USA.

    PubMed

    Hendrickx, Diana M; Boyles, Rebecca R; Kleinjans, Jos C S; Dearry, Allen

    2014-12-01

    A joint US-EU workshop on enhancing data sharing and exchange in toxicogenomics was held at the National Institute for Environmental Health Sciences. Currently, efficient reuse of data is hampered by problems related to public data availability, data quality, database interoperability (the ability to exchange information), standardization and sustainability. At the workshop, experts from universities and research institutes presented databases, studies, organizations and tools that attempt to deal with these problems. Furthermore, a case study showing that combining toxicogenomics data from multiple resources leads to more accurate predictions in risk assessment was presented. All participants agreed that there is a need for a web portal describing the diverse, heterogeneous data resources relevant for toxicogenomics research. Furthermore, there was agreement that linking more data resources would improve toxicogenomics data analysis. To outline a roadmap to enhance interoperability between data resources, the participants recommend collecting user stories from the toxicogenomics research community on barriers in data sharing and exchange currently hampering answering to certain research questions. These user stories may guide the prioritization of steps to be taken for enhancing integration of toxicogenomics databases.

  17. IRIS Toxicological Review of Hexahydro-1,3,5-Trinitro-1,3,5 ...

    EPA Pesticide Factsheets

    On March 10, 2016, the public comment draft Toxicological Review of Hexahydro-1,3,5-trinitro-1,3,5-triazine and the draft charge to external peer reviewers were released for public review and comment. The Toxicological Review and charge were reviewed internally by EPA and by other federal agencies and White House Offices before public release. Consistent with the May 2009 IRIS assessment development process, all written comments on IRIS assessments submitted by other federal agencies and White House Offices are made publicly available. Accordingly, interagency comments and the interagency science consultation materials provided to other agencies, including interagency review drafts of the IRIS Toxicological Review of Hexahydro-1,3,5-trinitro-1,3,5-triazine and the charge to external peer reviewers, are posted on this site. EPA is undertaking an update of the Integrated Risk Information System (IRIS) health assessment for RDX. The outcome of this project is an updated Toxicological Review and IRIS Summary for RDX that will be entered into the IRIS database.

  18. WikiPathways: a multifaceted pathway database bridging metabolomics to other omics research.

    PubMed

    Slenter, Denise N; Kutmon, Martina; Hanspers, Kristina; Riutta, Anders; Windsor, Jacob; Nunes, Nuno; Mélius, Jonathan; Cirillo, Elisa; Coort, Susan L; Digles, Daniela; Ehrhart, Friederike; Giesbertz, Pieter; Kalafati, Marianthi; Martens, Marvin; Miller, Ryan; Nishida, Kozo; Rieswijk, Linda; Waagmeester, Andra; Eijssen, Lars M T; Evelo, Chris T; Pico, Alexander R; Willighagen, Egon L

    2018-01-04

    WikiPathways (wikipathways.org) captures the collective knowledge represented in biological pathways. By providing this content in a curated, machine-readable database, it enables omics data analysis and visualization. WikiPathways and other pathway databases are used to analyze experimental data by research groups in many fields. Due to the open and collaborative nature of the WikiPathways platform, our content keeps growing and becoming more accurate, making WikiPathways a reliable and rich pathway database. Previously, however, the focus was primarily on genes and proteins, leaving many metabolites with only limited annotation. Recent curation efforts focused on improving the annotation of metabolism and metabolic pathways by associating unmapped metabolites with database identifiers and providing more detailed interaction knowledge. Here, we report the outcomes of the continued growth and curation efforts, such as a doubling of the number of annotated metabolite nodes in WikiPathways. Furthermore, we introduce OpenAPI documentation of our web services and the FAIR (Findable, Accessible, Interoperable and Reusable) annotation of resources to increase the interoperability of the knowledge encoded in these pathways and experimental omics data. New search options, monthly downloads, more links to metabolite databases, and new portals make pathway knowledge more readily accessible to individual researchers and research communities. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  19. MECP2 variation in Rett syndrome-An overview of current coverage of genetic and phenotype data within existing databases.

    PubMed

    Townend, Gillian S; Ehrhart, Friederike; van Kranen, Henk J; Wilkinson, Mark; Jacobsen, Annika; Roos, Marco; Willighagen, Egon L; van Enckevort, David; Evelo, Chris T; Curfs, Leopold M G

    2018-04-27

    Rett syndrome (RTT) is a monogenic rare disorder that causes severe neurological problems. In most cases, it results from a loss-of-function mutation in the gene encoding methyl-CpG-binding protein 2 (MECP2). Currently, about 900 unique MECP2 variations (benign and pathogenic) have been identified, and it is suspected that different mutations contribute to different levels of disease severity. For researchers and clinicians, it is important that genotype-phenotype information is available to identify disease-causing mutations for diagnosis, to aid in clinical management of the disorder, and to provide counseling for parents. In this study, 13 genotype-phenotype databases were surveyed for their general functionality and availability of RTT-specific MECP2 variation data. For each database, we investigated findability and interoperability alongside practical user functionality, and the type and amount of genetic and phenotype data. The main conclusions are that these databases, and the specific MECP2 variants they hold, are challenging to find, and that interoperability is as yet poorly developed, so searching across databases requires considerable effort. Nevertheless, we found several thousand online database entries for MECP2 variations and their associated phenotypes, diagnoses, or predicted variant effects, which is a good starting point for researchers and clinicians who want to provide, annotate, and use the data. © 2018 The Authors. Human Mutation published by Wiley Periodicals, Inc.

  20. Towards BioDBcore: a community-defined information specification for biological databases

    PubMed Central

    Gaudet, Pascale; Bairoch, Amos; Field, Dawn; Sansone, Susanna-Assunta; Taylor, Chris; Attwood, Teresa K.; Bateman, Alex; Blake, Judith A.; Bult, Carol J.; Cherry, J. Michael; Chisholm, Rex L.; Cochrane, Guy; Cook, Charles E.; Eppig, Janan T.; Galperin, Michael Y.; Gentleman, Robert; Goble, Carole A.; Gojobori, Takashi; Hancock, John M.; Howe, Douglas G.; Imanishi, Tadashi; Kelso, Janet; Landsman, David; Lewis, Suzanna E.; Mizrachi, Ilene Karsch; Orchard, Sandra; Ouellette, B. F. Francis; Ranganathan, Shoba; Richardson, Lorna; Rocca-Serra, Philippe; Schofield, Paul N.; Smedley, Damian; Southan, Christopher; Tan, Tin Wee; Tatusova, Tatiana; Whetzel, Patricia L.; White, Owen; Yamasaki, Chisato

    2011-01-01

    The present article proposes the adoption of a community-defined, uniform, generic description of the core attributes of biological databases, BioDBCore. The goals of these attributes are to provide a general overview of the database landscape, to encourage consistency and interoperability between resources and to promote the use of semantic and syntactic standards. BioDBCore will make it easier for users to evaluate the scope and relevance of available resources. This new resource will increase the collective impact of the information present in biological databases. PMID:21097465

  1. Towards BioDBcore: a community-defined information specification for biological databases

    PubMed Central

    Gaudet, Pascale; Bairoch, Amos; Field, Dawn; Sansone, Susanna-Assunta; Taylor, Chris; Attwood, Teresa K.; Bateman, Alex; Blake, Judith A.; Bult, Carol J.; Cherry, J. Michael; Chisholm, Rex L.; Cochrane, Guy; Cook, Charles E.; Eppig, Janan T.; Galperin, Michael Y.; Gentleman, Robert; Goble, Carole A.; Gojobori, Takashi; Hancock, John M.; Howe, Douglas G.; Imanishi, Tadashi; Kelso, Janet; Landsman, David; Lewis, Suzanna E.; Karsch Mizrachi, Ilene; Orchard, Sandra; Ouellette, B.F. Francis; Ranganathan, Shoba; Richardson, Lorna; Rocca-Serra, Philippe; Schofield, Paul N.; Smedley, Damian; Southan, Christopher; Tan, Tin W.; Tatusova, Tatiana; Whetzel, Patricia L.; White, Owen; Yamasaki, Chisato

    2011-01-01

    The present article proposes the adoption of a community-defined, uniform, generic description of the core attributes of biological databases, BioDBCore. The goals of these attributes are to provide a general overview of the database landscape, to encourage consistency and interoperability between resources, and to promote the use of semantic and syntactic standards. BioDBCore will make it easier for users to evaluate the scope and relevance of available resources. This new resource will increase the collective impact of the information present in biological databases. PMID:21205783

  2. Some significant wildlife strikes to civil aircraft in the United States, January 1990 - September 2010

    DOT National Transportation Integrated Search

    2010-11-10

    The U.S. Department of Agriculture, through an interagency agreement with the Federal Aviation Administration, compiles a database of all reported wildlife strikes to U.S. civil aircraft and to foreign carriers experiencing strikes in the USA. We hav...

  3. IRIS Toxicological Review of 2-Hexanone (Interagency Science Discussion Draft)

    EPA Science Inventory

    On September 25, 2009, the IRIS Summary and Toxicological Review of 2-hexanone was finalized and loaded onto the IRIS database. The Toxicological Review of 2-hexanone was reviewed internally by EPA, by other federal agencies and White House Offices, by expert external peer revie...

  4. Summary of the Workshop on The Power of Aggregated Toxicity Data

    EPA Science Inventory

    In April 2007, a workshop on Development of Federal Interagency Exposure Toxicity Database was held in conjunction with the Toxicology and Risk Assessment Conference in Cincinnati, OH to discuss the potential to develop a shared data resource for dose-response toxicity ...

  5. On the feasibility of interoperable schemes in hand biometrics.

    PubMed

    Morales, Aythami; González, Ester; Ferrer, Miguel A

    2012-01-01

    Personal recognition through hand-based biometrics has attracted the interest of many researchers in the last twenty years. A significant number of proposals based on different procedures and acquisition devices have been published in the literature. However, comparisons between devices and their interoperability have not been thoroughly studied. This paper tries to fill this gap by proposing procedures to improve the interoperability among different hand biometric schemes. The experiments were conducted on a database made up of 8,320 hand images acquired from six different hand biometric schemes, including a flat scanner, webcams at different wavelengths, high quality cameras, and contactless devices. Acquisitions on both sides of the hand were included. Our experiment includes four feature extraction methods which determine the best performance among the different scenarios for two of the most popular hand biometrics: hand shape and palm print. We propose smoothing techniques at the image and feature levels to reduce interdevice variability. Results suggest that comparative hand shape offers better performance in terms of interoperability than palm prints, but palm prints can be more effective when using similar sensors.
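The feature-level smoothing the authors propose can be illustrated with a simple moving average over a geometric feature vector. The window size and the two "device" vectors below are invented for illustration, not data from the paper's 8,320-image database:

```python
def smooth_features(features, window=3):
    """Moving-average smoothing of a 1-D feature vector, intended to damp
    sensor-specific noise before cross-device matching."""
    half = window // 2
    smoothed = []
    for i in range(len(features)):
        lo, hi = max(0, i - half), min(len(features), i + half + 1)
        smoothed.append(sum(features[lo:hi]) / (hi - lo))
    return smoothed

def distance(u, v):
    """Euclidean distance between two feature vectors."""
    return sum((a - b) ** 2 for a, b in zip(u, v)) ** 0.5

# Hypothetical measurements of the same hand from two different sensors.
device_a = [10.2, 10.9, 10.1, 11.0, 10.3]
device_b = [10.5, 10.4, 10.6, 10.5, 10.6]

raw = distance(device_a, device_b)
smoothed = distance(smooth_features(device_a), smooth_features(device_b))
print(raw > smoothed)  # here, smoothing shrinks the interdevice gap
```

The same idea extends to the image level, where a low-pass filter is applied before features are extracted at all.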

  6. On the Feasibility of Interoperable Schemes in Hand Biometrics

    PubMed Central

    Morales, Aythami; González, Ester; Ferrer, Miguel A.

    2012-01-01

    Personal recognition through hand-based biometrics has attracted the interest of many researchers in the last twenty years. A significant number of proposals based on different procedures and acquisition devices have been published in the literature. However, comparisons between devices and their interoperability have not been thoroughly studied. This paper tries to fill this gap by proposing procedures to improve the interoperability among different hand biometric schemes. The experiments were conducted on a database made up of 8,320 hand images acquired from six different hand biometric schemes, including a flat scanner, webcams at different wavelengths, high quality cameras, and contactless devices. Acquisitions on both sides of the hand were included. Our experiment includes four feature extraction methods which determine the best performance among the different scenarios for two of the most popular hand biometrics: hand shape and palm print. We propose smoothing techniques at the image and feature levels to reduce interdevice variability. Results suggest that comparative hand shape offers better performance in terms of interoperability than palm prints, but palm prints can be more effective when using similar sensors. PMID:22438714

  7. An Information System for European culture collections: the way forward.

    PubMed

    Casaregola, Serge; Vasilenko, Alexander; Romano, Paolo; Robert, Vincent; Ozerskaya, Svetlana; Kopf, Anna; Glöckner, Frank O; Smith, David

    2016-01-01

    Culture collections contain indispensable information about the microorganisms preserved in their repositories, such as taxonomical descriptions, origins, physiological and biochemical characteristics, bibliographic references, etc. However, the information currently accessible in databases rarely adheres to common standard protocols. The resulting heterogeneity between culture collections, in terms of both content and format, notably hampers microorganism-based research and development (R&D). Optimized exploitation of these resources thus requires standardized, and simplified, access to the associated information. To this end, and in the interest of supporting R&D in the fields of agriculture, health and biotechnology, a pan-European distributed research infrastructure, MIRRI, including over 40 public culture collections and research institutes from 19 European countries, was established. A prime objective of MIRRI is to unite, and provide universal access to, the fragmented and untapped resources, information and expertise available in European public collections of microorganisms; a key component of which is to develop a dynamic Information System. For the first time, both culture collection curators and their users have been consulted, and their feedback concerning the needs and requirements for collection databases and data accessibility has been utilised. Users primarily noted that databases were not interoperable, thus rendering a global search of multiple databases impossible. Unreliable or out-of-date and, in particular, non-homogeneous taxonomic information was also considered a major obstacle to searching microbial data efficiently. Moreover, complex searches are rarely possible in online databases, thus limiting the extent of search queries. Curators also consider that overall harmonization, including Standard Operating Procedures, data structure, and software tools, is necessary to facilitate their work and to make high-quality data easily accessible to their users. Clearly, the needs of culture collection curators coincide with those of users on the crucial point of database interoperability. In this regard, and in order to design an appropriate Information System, important aspects on which the culture collection community should focus include: the interoperability of data sets with the ontologies to be used; setting best practice in data management; and the definition of an appropriate data standard.

  8. THE SOUTHWEST REGIONAL GAP PROJECT: A DATABASE MODEL FOR REGIONAL LANDSCAPE ASSESSMENT, RESOURCE PLANNING, AND VULNERABILITY ANALYSIS

    EPA Science Inventory

    The Gap Analysis Program (GAP) is a national interagency program that maps the distribution of plant communities and selected animal species and compares these distributions with land stewardship to identify biotic elements at potential risk of endangerment. Acquisition of primar...

  9. Implementation of Goal Attainment Scaling in Community Intellectual Disability Services

    ERIC Educational Resources Information Center

    Chapman, Melanie; Burton, Mark; Hunt, Victoria; Reeves, David

    2006-01-01

    The authors describe the evaluation of the implementation of an outcome measurement system (Goal Attainment Scaling-GAS) within the context of an interdisciplinary and interagency intellectual disability services setting. The GAS database allowed analysis of follow-up goals and indicated the extent of implementation, while a rater study evaluated…

  10. IRIS Toxicological Review of 1,2,3-trichloropropane (Interagency Science Discussion Draft)

    EPA Science Inventory

    On September 30, 2009, the IRIS Summary and Toxicological Review of 1,2,3-trichloropropane (TCP) was finalized and loaded onto the IRIS database. The Toxicological Review of TCP was reviewed internally by EPA, by other federal agencies and White House Offices, by expert external...

  11. A SMART groundwater portal: An OGC web services orchestration framework for hydrology to improve data access and visualisation in New Zealand

    NASA Astrophysics Data System (ADS)

    Klug, Hermann; Kmoch, Alexander

    2014-08-01

    Transboundary and cross-catchment access to hydrological data is the key to designing successful environmental policies and activities. Electronic maps based on distributed databases are fundamental for planning and decision making in all regions and for all spatial and temporal scales. Freshwater is an essential asset in New Zealand (and globally) and the availability as well as accessibility of hydrological information held by or held for public authorities and businesses are becoming a crucial management factor. Access to and visual representation of environmental information for the public is essential for attracting greater awareness of water quality and quantity matters. Detailed interdisciplinary knowledge about the environment is required to ensure that the environmental policy-making community of New Zealand considers regional and local differences of hydrological statuses, while assessing the overall national situation. However, cross-regional and inter-agency sharing of environmental spatial data is complex and challenging. In this article, we firstly provide an overview of the state of the art standard compliant techniques and methodologies for the practical implementation of simple, measurable, achievable, repeatable, and time-based (SMART) hydrological data management principles. Secondly, we contrast international state of the art data management developments with the present status for groundwater information in New Zealand. Finally, for the topics (i) data access and harmonisation, (ii) sensor web enablement and (iii) metadata, we summarise our findings, provide recommendations on future developments and highlight the specific advantages resulting from a seamless view, discovery, access, and analysis of interoperable hydrological information and metadata for decision making.
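OGC-standard services of the kind surveyed above are typically invoked through key-value web requests. A sketch of building a Sensor Observation Service request URL with the standard library; the endpoint and identifiers are placeholders, not a real New Zealand service, and the parameter names follow the general SOS key-value convention:

```python
from urllib.parse import urlencode, urlsplit, parse_qs

# Hypothetical SOS endpoint for a groundwater observation network.
ENDPOINT = "https://example.org/sos"

params = {
    "service": "SOS",
    "version": "2.0.0",
    "request": "GetObservation",
    "observedProperty": "groundwater-level",
    "featureOfInterest": "well-123",
    "temporalFilter": "om:phenomenonTime,2014-01-01/2014-08-31",
}
url = ENDPOINT + "?" + urlencode(params)
print(url)

# Because the request is standardized, any compliant client or portal can
# decode the query unambiguously, which is what enables cross-agency reuse.
decoded = parse_qs(urlsplit(url).query)
print(decoded["request"])  # ['GetObservation']
```

Standardizing the request layer in this way is what lets a single portal orchestrate data held by many regional authorities.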

  12. An incremental database access method for autonomous interoperable databases

    NASA Technical Reports Server (NTRS)

    Roussopoulos, Nicholas; Sellis, Timos

    1994-01-01

    We investigated a number of design and performance issues of interoperable database management systems (DBMS's). The major results of our investigation were obtained in the areas of client-server database architectures for heterogeneous DBMS's, incremental computation models, buffer management techniques, and query optimization. We finished a prototype of an advanced client-server workstation-based DBMS which allows access to multiple heterogeneous commercial DBMS's. Experiments and simulations were then run to compare its performance with the standard client-server architectures. The focus of this research was on adaptive optimization methods for heterogeneous database systems. Adaptive buffer management accounts for the random and object-oriented access methods for which no known characterization of the access patterns exists. Adaptive query optimization means that value distributions and selectivities, which play the most significant role in query plan evaluation, are continuously refined to reflect actual values, as opposed to static ones computed off-line. Query feedback is a concept that was first introduced to the literature by our group. We employed query feedback both for adaptive buffer management and for computing value distributions and selectivities. For adaptive buffer management, we use the page faults of prior executions to achieve more 'informed' management decisions. For the estimation of the distributions and selectivities, we use curve-fitting techniques, such as least squares and splines, for regressing on these values.
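The query-feedback idea can be sketched as follows: (predicate bound, observed selectivity) pairs reported by prior executions are regressed, and the fitted curve replaces the static estimate in the optimizer. A pure-Python least-squares line fit, with invented feedback data (the paper also mentions splines; a straight line is the simplest case):

```python
def least_squares_fit(xs, ys):
    """Ordinary least-squares fit of y = a*x + b over observed pairs."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    b = mean_y - a * mean_x
    return a, b

# Query feedback: for range predicates "attr < x", the executor reported
# the true fraction of qualifying tuples (hypothetical numbers).
observed_x = [10, 20, 30, 40]
observed_sel = [0.11, 0.19, 0.32, 0.38]

a, b = least_squares_fit(observed_x, observed_sel)

def estimate_selectivity(x):
    """Refined estimate the optimizer would use for a new predicate,
    clamped to the valid [0, 1] range."""
    return min(1.0, max(0.0, a * x + b))

print(round(estimate_selectivity(25), 3))
```

Each new execution appends another feedback pair, so the fit keeps tracking the actual data distribution without a separate off-line statistics pass.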

  13. End-to-end interoperability and workflows from building architecture design to one or more simulations

    DOEpatents

    Chao, Tian-Jy; Kim, Younghun

    2015-02-10

    End-to-end interoperability and workflows from building architecture design to one or more simulations may, in one aspect, comprise establishing a BIM enablement platform architecture. A data model defines data entities and entity relationships for enabling the interoperability and workflows. A data definition language may be implemented that defines and creates a table schema of a database associated with the data model. Data management services and/or application programming interfaces may be implemented for interacting with the data model. Web services may also be provided for interacting with the data model via the Web. A user interface may be implemented that communicates with users and uses the BIM enablement platform architecture, the data model, the data definition language, the data management services, and the application programming interfaces to provide functions that allow users to perform work related to building information management.
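The data-definition step described above, deriving a table schema from the data model and layering data-management services on top, can be sketched with SQLite from the Python standard library. The entity and column names are invented for illustration, not taken from the patent:

```python
import sqlite3

# Hypothetical data model: building entities related to simulation runs.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE building (
        id      INTEGER PRIMARY KEY,
        name    TEXT NOT NULL
    );
    CREATE TABLE simulation_run (
        id          INTEGER PRIMARY KEY,
        building_id INTEGER NOT NULL REFERENCES building(id),
        tool        TEXT NOT NULL,      -- e.g. an energy simulator
        started_at  TEXT NOT NULL
    );
""")

# A thin data-management service sitting on top of the generated schema.
def register_run(building_name, tool, started_at):
    cur = conn.execute("INSERT INTO building(name) VALUES (?)", (building_name,))
    conn.execute(
        "INSERT INTO simulation_run(building_id, tool, started_at) VALUES (?,?,?)",
        (cur.lastrowid, tool, started_at),
    )

register_run("HQ Tower", "energy-sim", "2015-02-10")
rows = conn.execute(
    "SELECT b.name, r.tool FROM simulation_run r "
    "JOIN building b ON b.id = r.building_id"
).fetchall()
print(rows)  # [('HQ Tower', 'energy-sim')]
```

In the patented architecture the same pattern would be exposed further up the stack through web services and a user interface rather than direct SQL.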

  14. Successful integration efforts in water quality from the integrated Ocean Observing System Regional Associations and the National Water Quality Monitoring Network

    USGS Publications Warehouse

    Ragsdale, R.; Vowinkel, E.; Porter, D.; Hamilton, P.; Morrison, R.; Kohut, J.; Connell, B.; Kelsey, H.; Trowbridge, P.

    2011-01-01

    The Integrated Ocean Observing System (IOOS®) Regional Associations and Interagency Partners hosted a water quality workshop in January 2010 to discuss issues of nutrient enrichment and dissolved oxygen depletion (hypoxia), harmful algal blooms (HABs), and beach water quality. In 2007, the National Water Quality Monitoring Council piloted demonstration projects as part of the National Water Quality Monitoring Network (Network) for U.S. Coastal Waters and their Tributaries in three IOOS Regional Associations, and these projects are ongoing. Examples of integrated science-based solutions to water quality issues of major concern from the IOOS regions and Network demonstration projects are explored in this article. These examples illustrate instances where management decisions have benefited from decision-support tools that make use of interoperable data. Gaps, challenges, and outcomes are identified, and a proposal is made for future work toward a multiregional water quality project for beach water quality.

  15. Water quality success stories: Integrated assessments from the IOOS regional associations and national water quality monitoring network

    USGS Publications Warehouse

    Ragsdale, Rob; Vowinkel, Eric; Porter, Dwayne; Hamilton, Pixie; Morrison, Ru; Kohut, Josh; Connell, Bob; Kelsey, Heath; Trowbridge, Phil

    2011-01-01

    The Integrated Ocean Observing System (IOOS®) Regional Associations and Interagency Partners hosted a water quality workshop in January 2010 to discuss issues of nutrient enrichment and dissolved oxygen depletion (hypoxia), harmful algal blooms (HABs), and beach water quality. In 2007, the National Water Quality Monitoring Council piloted demonstration projects as part of the National Water Quality Monitoring Network (Network) for U.S. Coastal Waters and their Tributaries in three IOOS Regional Associations, and these projects are ongoing. Examples of integrated science-based solutions to water quality issues of major concern from the IOOS regions and Network demonstration projects are explored in this article. These examples illustrate instances where management decisions have benefited from decision-support tools that make use of interoperable data. Gaps, challenges, and outcomes are identified, and a proposal is made for future work toward a multiregional water quality project for beach water quality.

  16. Moving Toward Space Internetworking via DTN: Its Operational Challenges, Benefits, and Management

    NASA Technical Reports Server (NTRS)

    Barkley, Erik; Burleigh, Scott; Gladden, Roy; Malhotra, Shan; Shames, Peter

    2010-01-01

    The international space community has begun to recognize that the established model for management of communications with spacecraft - commanded data transmission over individual pair-wise contacts - is operationally unwieldy and will not scale in support of increasingly complex and sophisticated missions such as NASA's Constellation project. Accordingly, the international Inter-Agency Operations Advisory Group (IOAG) chartered a Space Internetworking Strategy Group (SISG), which released its initial recommendations in a November 2008 report. The report includes a recommendation that the space flight community adopt Delay-Tolerant Networking (DTN) to address the problem of interoperability and communication scaling, especially in mission environments where there are multiple spacecraft operating in concert. This paper explores some of the issues that must be addressed in implementing, deploying, and operating DTN as part of a multi-mission, multi-agency space internetwork, as well as benefits and future operational scenarios afforded by DTN-based space internetworking.

  17. IRIS Toxicological Review of 1,2,3-trichloropropane ...

    EPA Pesticide Factsheets

    On September 30, 2009, the IRIS Summary and Toxicological Review of 1,2,3-trichloropropane (TCP) was finalized and loaded onto the IRIS database. The Toxicological Review of TCP was reviewed internally by EPA, by other federal agencies and White House Offices, by expert external peer reviewers, and by the public. In the new IRIS process, introduced by the EPA Administrator, all written comments on IRIS assessments submitted by other federal agencies and White House Offices will be made publicly available. Accordingly, interagency comments and the interagency draft of the TCP IRIS assessment are posted on this site. This Tox Review provides scientific support and rationale for the hazard and dose-response assessment pertaining to chronic exposure to 1,2,3-trichloropropane.

  18. Use of FIA plot data in the LANDFIRE project

    Treesearch

    Chris Toney; Matthew Rollins; Karen Short; Tracey Frescino; Ronald Tymcio; Birgit Peterson

    2007-01-01

    LANDFIRE is an interagency project that will generate consistent maps and data describing vegetation, fire, and fuel characteristics across the United States within a 5-year timeframe. Modeling and mapping in LANDFIRE depend extensively on a large database of georeferenced field measurements describing vegetation, site characteristics, and fuel. The LANDFIRE Reference...

  19. 75 FR 60085 - NOAA Proposed Policy on Prohibited and Authorized Uses of the Asset Forfeiture Fund

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-29

    ... NOAA's enforcement and legal systems and databases; Annual interagency agreement and contract costs for... with applicable legal authority and that will help assure those regulated that all fines and penalties... supported as a key component of supporting legal fishers and the American public through barring illegal...

  20. MRML: an extensible communication protocol for interoperability and benchmarking of multimedia information retrieval systems

    NASA Astrophysics Data System (ADS)

    Mueller, Wolfgang; Mueller, Henning; Marchand-Maillet, Stephane; Pun, Thierry; Squire, David M.; Pecenovic, Zoran; Giess, Christoph; de Vries, Arjen P.

    2000-10-01

    While in the area of relational databases interoperability is ensured by common communication protocols (e.g., ODBC/JDBC using SQL), Content-Based Image Retrieval Systems (CBIRS) and other multimedia retrieval systems lack both a common query language and a common communication protocol. Besides its obvious short-term convenience, interoperability of systems is crucial for the exchange and analysis of user data. In this paper, we present and describe an extensible XML-based query markup language, called MRML (Multimedia Retrieval Markup Language). MRML is primarily designed to ensure interoperability between different content-based multimedia retrieval systems. Further, MRML allows researchers to preserve their freedom in extending their systems as needed. MRML encapsulates multimedia queries in a way that enables multimedia (MM) query languages, MM content descriptions, MM query engines, and MM user interfaces to grow independently from each other, reaching a maximum of interoperability while ensuring a maximum of freedom for the developer. To benefit from this, only a few simple design principles have to be respected when extending MRML for one's private needs. The design of extensions within the MRML framework is described in detail in the paper. MRML has been implemented and tested for the CBIRS Viper, using the user interface Snake Charmer. Both are part of the GNU project and can be downloaded at our site.
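An MRML-style exchange can be illustrated with the standard library: the retrieval query travels inside a fixed XML envelope that any engine can parse, while unknown extensions are simply ignored. The element and attribute names below are loose stand-ins inspired by the paper's description, not the actual MRML schema:

```python
import xml.etree.ElementTree as ET

def build_query(session_id, example_image_urls):
    """Wrap a query-by-example request in an MRML-like XML envelope.
    Element names here are illustrative only."""
    root = ET.Element("mrml", {"session-id": session_id})
    query = ET.SubElement(root, "query-step", {"type": "query-by-example"})
    for url in example_image_urls:
        ET.SubElement(query, "user-relevance-element",
                      {"image-location": url, "relevance": "1"})
    return ET.tostring(root, encoding="unicode")

message = build_query("s42", ["http://example.org/img/001.jpg"])
print(message)

# The receiving engine parses the envelope; elements it does not know
# (engine-specific extensions) can safely be skipped.
parsed = ET.fromstring(message)
examples = [e.get("image-location")
            for e in parsed.iter("user-relevance-element")]
print(examples)  # ['http://example.org/img/001.jpg']
```

Keeping the envelope fixed while letting the payload grow is what allows query languages, engines, and user interfaces to evolve independently, as the abstract argues.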

  1. Data Publication and Interoperability for Long Tail Researchers via the Open Data Repository's (ODR) Data Publisher.

    NASA Astrophysics Data System (ADS)

    Stone, N.; Lafuente, B.; Bristow, T.; Keller, R.; Downs, R. T.; Blake, D. F.; Fonda, M.; Pires, A.

    2016-12-01

    Working primarily with astrobiology researchers at NASA Ames, the Open Data Repository (ODR) has been conducting a software pilot to meet the varying needs of this multidisciplinary community. Astrobiology researchers often have small communities or operate individually with unique data sets that don't easily fit into existing database structures. The ODR constructed its Data Publisher software to allow researchers to create databases with common metadata structures and subsequently extend them to meet their individual needs and data requirements. The software accomplishes these tasks through a web-based interface that allows collaborative creation and revision of common metadata templates and individual extensions to these templates for custom data sets. This allows researchers to search disparate datasets based on common metadata established through the metadata tools, but still facilitates distinct analyses and data that may be stored alongside the required common metadata. The software produces web pages that can be made publicly available at the researcher's discretion so that users may search and browse the data in an effort to make interoperability and data discovery a human-friendly task while also providing semantic data for machine-based discovery. Once relevant data has been identified, researchers can utilize the built-in application programming interface (API) that exposes the data for machine-based consumption and integration with existing data analysis tools (e.g. R, MATLAB, Project Jupyter - http://jupyter.org). The current evolution of the project has created the Astrobiology Habitable Environments Database (AHED)[1] which provides an interface to databases connected through a common metadata core. 
In the next project phase, the goal is for small research teams and groups to be self-sufficient in publishing their research data to meet funding mandates and academic requirements as well as fostering increased data discovery and interoperability through human-readable and machine-readable interfaces. This project is supported by the Science-Enabling Research Activity (SERA) and NASA NNX11AP82A, MSL. [1] B. Lafuente et al. (2016) AGU, submitted.
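
    Machine-based consumption via the API described above might look like the sketch below: a client filters records on a common-metadata field before pulling them into an analysis tool. The endpoint payload and all field names are assumptions for illustration, not the actual ODR API.

```python
import json

# A hypothetical JSON payload such as a Data Publisher search endpoint
# might return; record IDs and field names are invented.
response_text = json.dumps({
    "records": [
        {"id": "odr-0001", "mineral": "olivine", "SiO2_wt_pct": 41.2},
        {"id": "odr-0002", "mineral": "augite",  "SiO2_wt_pct": 52.7},
    ]
})

def records_with_min_silica(payload, threshold):
    """Filter records by a shared-metadata field before local analysis."""
    data = json.loads(payload)
    return [r["id"] for r in data["records"] if r["SiO2_wt_pct"] >= threshold]

ids = records_with_min_silica(response_text, 50.0)
```

    The same filtered ID list could then drive downloads into R, MATLAB, or a Jupyter notebook, as the abstract suggests.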

  2. Breaking barriers to interoperability: assigning spatially and temporally unique identifiers to spaces and buildings.

    PubMed

    Pyke, Christopher R; Madan, Isaac

    2013-08-01

    The real estate industry routinely uses specialized information systems for functions including design, construction, facilities management, brokerage, tax assessment, and utilities. These systems are mature and effective within vertically integrated market segments. However, new questions are reaching across these traditional information silos. For example, buyers may be interested in evaluating the design, energy efficiency characteristics, and operational performance of a commercial building. This requires the integration of information across multiple databases held by different institutions. Today, this type of data integration is difficult to automate and prone to errors due, in part, to the lack of generally accepted building and space identifiers. Moving forward, the real estate industry needs a new mechanism to assign identifiers to whole buildings and interior spaces for the purposes of interoperability, data exchange, and integration. This paper describes a systematic process to identify activities occurring at a building or within interior spaces to provide a foundation for exchange and interoperability. We demonstrate the application of the approach with a prototype Web application. This concept and demonstration illustrate the elements of a practical interoperability framework that can increase productivity, create new business opportunities, and reduce errors, waste, and redundancy. © 2013 New York Academy of Sciences.
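
    One way a spatially and temporally unique identifier could be derived is sketched below: hash a namespaced building/space path together with a validity timestamp. This scheme is a sketch of the paper's idea, not its actual algorithm, and the identifier layout is an assumption.

```python
import hashlib
from datetime import datetime, timezone

def space_id(building_id, floor, room, valid_from):
    """Derive a stable identifier for an interior space.

    Combines a hierarchical location path (spatial uniqueness) with a
    validity timestamp (temporal uniqueness) and a short hash for
    cross-system matching. The URN scheme is hypothetical.
    """
    path = f"urn:bldg:{building_id}/floor:{floor}/room:{room}"
    stamp = valid_from.strftime("%Y%m%dT%H%M%SZ")
    digest = hashlib.sha256(f"{path}#{stamp}".encode()).hexdigest()[:16]
    return f"{path}#{stamp}:{digest}"

sid = space_id("us-dc-0042", 3, "301B",
               datetime(2013, 8, 1, tzinfo=timezone.utc))
```

    Because the identifier is deterministic, two independent systems describing the same space at the same validity date derive the same key, which is what enables record linkage across silos.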

  3. FHIR Healthcare Directories: Adopting Shared Interfaces to Achieve Interoperable Medical Device Data Integration.

    PubMed

    Tyndall, Timothy; Tyndall, Ayami

    2018-01-01

    Healthcare directories are vital for interoperability among healthcare providers, researchers and patients. Past efforts at directory services have not provided the tools to allow integration of diverse data sources. Many are overly strict, incompatible with legacy databases, and do not provide Data Provenance. A more architecture-independent system is needed to enable secure, GDPR-compatible service discovery across organizational boundaries. We review our development of a portable Data Provenance Toolkit supporting provenance within Health Information Exchange (HIE) systems. The Toolkit has been integrated with client software and successfully leveraged in clinical data integration. The Toolkit validates provenance stored in a Blockchain or Directory record and creates provenance signatures, providing standardized provenance that moves with the data. This healthcare directory suite implements discovery of healthcare data by HIE and EHR systems via FHIR. Shortcomings of past directory efforts include the inability to map complex datasets and to enable interoperability via exchange-endpoint discovery. By delivering data without dictating how it is stored, we improve exchange and facilitate discovery on a multi-national level through open source, fully interoperable tools. With the development of Data Provenance resources we enhance exchange and improve security and usability throughout the health data continuum.
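
    Exchange-endpoint discovery via FHIR, as described above, centers on directory-published Endpoint resources. The sketch below shows a minimal FHIR R4-style Endpoint and a filter over a directory; only a few core fields are shown, and the organization and URL are made up.

```python
# A minimal FHIR R4 Endpoint resource as a directory might publish it
# for exchange-endpoint discovery. The address is a placeholder.
endpoint = {
    "resourceType": "Endpoint",
    "status": "active",
    "connectionType": {
        "system": "http://terminology.hl7.org/CodeSystem/endpoint-connection-type",
        "code": "hl7-fhir-rest",
    },
    "address": "https://hie.example.org/fhir",
}

def discover(directory, connection_code):
    """Return addresses of active endpoints with the wanted connection type."""
    return [e["address"] for e in directory
            if e["status"] == "active"
            and e["connectionType"]["code"] == connection_code]

addresses = discover([endpoint], "hl7-fhir-rest")
```

    An HIE or EHR system would then open a FHIR client session against each discovered address, which is how discovery decouples exchange from how each partner stores its data.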

  4. Rollout Strategy to Implement Interoperable Traceability in the Seafood Industry.

    PubMed

    Gooch, Martin; Dent, Benjamin; Sylvia, Gilbert; Cusack, Christopher

    2017-08-01

    Verifying the accuracy and rigor of data exchanged within and between businesses for the purposes of traceability rests on the existence of effective and efficient interoperable information systems that meet users' needs. Interoperability, particularly given the complexities intrinsic to the seafood industry, requires that the systems used by businesses operating along the supply chain share a common technology architecture that is robust, resilient, and evolves as industry needs change. Technology architectures are developed through engaging industry stakeholders in understanding why an architecture is required, the benefits provided to the industry and individual businesses and supply chains, and how the architecture will translate into practical results. This article begins by reiterating the benefits that the global seafood industry can capture by implementing interoperable chain-length traceability and the reason for basing the architecture on a peer-to-peer networked database concept versus more traditional centralized or linear approaches. A summary of capabilities that already exist within the seafood industry that the proposed architecture uses is discussed; and a strategy for implementing the architecture is presented. The 6-step strategy is presented in the form of a critical path. © 2017 Institute of Food Technologists®.

  5. A review on digital ECG formats and the relationships between them.

    PubMed

    Trigo, Jesús Daniel; Alesanco, Alvaro; Martínez, Ignacio; García, José

    2012-05-01

    A plethora of digital ECG formats have been proposed and implemented. This heterogeneity hinders the design and development of interoperable systems and entails critical integration issues for healthcare information systems. This paper aims to provide a comprehensive overview of the current state of affairs of the interoperable exchange of digital ECG signals. This includes 1) a review of existing digital ECG formats, 2) a collection of applications and cardiology settings using such formats, 3) a compilation of the relationships between such formats, and 4) a reflection on the current situation and foreseeable future of the interoperable exchange of digital ECG signals. These objectives have been approached by completing and updating previous reviews on the topic through appropriate database mining. 39 digital ECG formats, 56 applications, tools or implementation experiences, 47 mappings/converters, and 6 relationships between such formats have been found in the literature. The creation and generalization of a single standardized ECG format is a desirable goal. However, this unification requires political commitment and international cooperation among different standardization bodies. Ongoing ontology-based approaches covering the ECG domain have recently emerged as a promising alternative for reaching fully fledged ECG interoperability in the near future.
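
    With 39 formats and 47 pairwise converters, exchanging an ECG often means chaining converters. The registry of direct converters below is invented for illustration (the format names themselves, SCP-ECG, HL7 aECG and DICOM-ECG, are real), and a breadth-first search finds a conversion chain.

```python
from collections import deque

# Toy registry of direct converters between digital ECG formats.
# Which converters exist here is an assumption for illustration.
converters = {("SCP-ECG", "HL7 aECG"), ("HL7 aECG", "DICOM-ECG")}

def conversion_path(src, dst):
    """Breadth-first search for a chain of converters from src to dst."""
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for a, b in converters:
            if a == path[-1] and b not in seen:
                seen.add(b)
                queue.append(path + [b])
    return None  # no chain of registered converters reaches dst

path = conversion_path("SCP-ECG", "DICOM-ECG")
```

    Each hop in such a chain risks information loss, which is one reason the review argues for a single standardized format rather than ever more pairwise mappings.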

  6. Databases for multilevel biophysiology research available at Physiome.jp.

    PubMed

    Asai, Yoshiyuki; Abe, Takeshi; Li, Li; Oka, Hideki; Nomura, Taishin; Kitano, Hiroaki

    2015-01-01

    Physiome.jp (http://physiome.jp) is a portal site inaugurated in 2007 to support model-based research in physiome and systems biology. At Physiome.jp, several tools and databases are available to support construction of physiological, multi-hierarchical, large-scale models. There are three databases in Physiome.jp, housing mathematical models, morphological data, and time-series data. In late 2013, the site was fully renovated, and in May 2015, new functions were implemented to provide information infrastructure to support collaborative activities for developing models and performing simulations within the database framework. This article describes updates to the databases implemented since 2013, including cooperation among the three databases, interactive model browsing, user management, version management of models, management of parameter sets, and interoperability with applications.

  7. Web services-based text-mining demonstrates broad impacts for interoperability and process simplification.

    PubMed

    Wiegers, Thomas C; Davis, Allan Peter; Mattingly, Carolyn J

    2014-01-01

    The Critical Assessment of Information Extraction systems in Biology (BioCreAtIvE) challenge evaluation tasks collectively represent a community-wide effort to evaluate a variety of text-mining and information extraction systems applied to the biological domain. The BioCreative IV Workshop included five independent subject areas, including Track 3, which focused on named-entity recognition (NER) for the Comparative Toxicogenomics Database (CTD; http://ctdbase.org). Previously, CTD had organized document ranking and NER-related tasks for the BioCreative Workshop 2012; a key finding of that effort was that interoperability and integration complexity were major impediments to the direct application of the systems to CTD's text-mining pipeline. This underscored a prevailing problem with software integration efforts. Major interoperability-related issues included lack of process modularity, operating system incompatibility, tool configuration complexity and lack of standardization of high-level inter-process communications. One approach to potentially mitigate interoperability and general integration issues is the use of Web services to abstract implementation details; rather than integrating NER tools directly, HTTP-based calls from CTD's asynchronous, batch-oriented text-mining pipeline could be made to remote NER Web services for recognition of specific biological terms using BioC (an emerging family of XML formats) for inter-process communications. To test this concept, participating groups developed Representational State Transfer (REST)/BioC-compliant Web services tailored to CTD's NER requirements. Participants were provided with a comprehensive set of training materials. CTD evaluated results obtained from the remote Web service-based URLs against a test data set of 510 manually curated scientific articles. Twelve groups participated in the challenge. Recall, precision, balanced F-scores and response times were calculated. 
Top balanced F-scores for gene, chemical and disease NER were 61, 74 and 51%, respectively. Response times ranged from fractions-of-a-second to over a minute per article. We present a description of the challenge and summary of results, demonstrating how curation groups can effectively use interoperable NER technologies to simplify text-mining pipeline implementation. Database URL: http://ctdbase.org/ © The Author(s) 2014. Published by Oxford University Press.
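
    The BioC envelope used for inter-process communication above can be sketched as follows. Only a skeleton of the BioC XML structure (collection, document, passage with offset and text) is shown, and the document ID and sentence are made up.

```python
import xml.etree.ElementTree as ET

def bioc_collection(doc_id, text):
    """Wrap one article passage in a minimal BioC collection.

    This is a skeleton of the BioC XML family, not a complete,
    DTD-validated document.
    """
    coll = ET.Element("collection")
    doc = ET.SubElement(coll, "document")
    ET.SubElement(doc, "id").text = doc_id
    passage = ET.SubElement(doc, "passage")
    ET.SubElement(passage, "offset").text = "0"
    ET.SubElement(passage, "text").text = text
    return ET.tostring(coll, encoding="unicode")

payload = bioc_collection("PMID-12345",
                          "Arsenic exposure alters TP53 expression.")
# In the challenge setup, a payload like this would be POSTed over HTTP
# to each group's REST/BioC NER service, which would return the same
# collection with annotation elements added.
```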

  9. NETMARK

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; Koga, Dennis (Technical Monitor)

    2002-01-01

    This presentation discusses NASA's proposed NETMARK knowledge management tool, which aims 'to control and interoperate with every block in a document, email, spreadsheet, power point, database, etc. across the lifecycle'. Topics covered include: system software requirements and hardware requirements, seamless information systems, computer architecture issues, and potential benefits to NETMARK users.

  10. Test of US Federal Life Cycle Inventory Data Interoperability

    EPA Science Inventory

    Life cycle assessment practitioners must gather data from a variety of sources. For modeling activities in the US, practitioners may wish to use life cycle inventory data from public databases and libraries provided by US government entities. An exercise was conducted to test if ...

  11. DbMap: improving database interoperability issues in medical software using a simple, Java-Xml based solution.

    PubMed Central

    Karadimas, H.; Hemery, F.; Roland, P.; Lepage, E.

    2000-01-01

    In medical software development, the use of databases plays a central role. However, most of the databases have heterogeneous encoding and data models. To deal with these variations in the application code directly is error-prone and reduces the potential reuse of the produced software. Several approaches to overcome these limitations have been proposed in the medical database literature, which will be presented. We present a simple solution, based on a Java library, and a central Metadata description file in XML. This development approach presents several benefits in software design and development cycles, the main one being the simplicity in maintenance. PMID:11079915

  12. Virtualization of open-source secure web services to support data exchange in a pediatric critical care research network.

    PubMed

    Frey, Lewis J; Sward, Katherine A; Newth, Christopher J L; Khemani, Robinder G; Cryer, Martin E; Thelen, Julie L; Enriquez, Rene; Shaoyu, Su; Pollack, Murray M; Harrison, Rick E; Meert, Kathleen L; Berg, Robert A; Wessel, David L; Shanley, Thomas P; Dalton, Heidi; Carcillo, Joseph; Jenkins, Tammara L; Dean, J Michael

    2015-11-01

    To examine the feasibility of deploying a virtual web service for sharing data within a research network, and to evaluate the impact on data consistency and quality. Virtual machines (VMs) encapsulated an open-source, semantically and syntactically interoperable secure web service infrastructure along with a shadow database. The VMs were deployed to 8 Collaborative Pediatric Critical Care Research Network Clinical Centers. Virtual web services could be deployed in hours. The interoperability of the web services reduced format misalignment from 56% to 1% and demonstrated that 99% of the data consistently transferred using the data dictionary and 1% needed human curation. Use of virtualized open-source secure web service technology could enable direct electronic abstraction of data from hospital databases for research purposes. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
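
    The reported drop in format misalignment from 56% to 1% rests on validating records against a shared data dictionary before transfer. A minimal sketch of such a check is below; the field names and declared types are invented, not taken from the network's actual dictionary.

```python
# Hypothetical shared data dictionary: field name -> required Python type.
data_dictionary = {
    "admit_weight_kg": float,
    "pim3_score": float,
    "ventilated": bool,
}

def misaligned_fields(record):
    """Return fields that are missing or of the wrong declared type,
    i.e. candidates for human curation before transfer."""
    bad = []
    for field, ftype in data_dictionary.items():
        if field not in record or not isinstance(record[field], ftype):
            bad.append(field)
    return bad

bad = misaligned_fields({"admit_weight_kg": 12.4, "pim3_score": "high",
                         "ventilated": True})
```

    In the deployed system this kind of check would run inside each site's virtual machine, so every center transmits syntactically and semantically consistent records.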

  13. FAIR principles and the IEDB: short-term improvements and a long-term vision of OBO-foundry mediated machine-actionable interoperability

    PubMed Central

    Vita, Randi; Overton, James A; Mungall, Christopher J; Sette, Alessandro

    2018-01-01

    Abstract The Immune Epitope Database (IEDB), at www.iedb.org, has the mission to make published experimental data relating to the recognition of immune epitopes easily available to the scientific public. By presenting curated data in a searchable database, we have liberated it from the tables and figures of journal articles, making it more accessible and usable by immunologists. Recently, the principles of Findability, Accessibility, Interoperability and Reusability have been formulated as goals that data repositories should meet to enhance the usefulness of their data holdings. We here examine how the IEDB complies with these principles and identify broad areas of success, but also areas for improvement. We describe short-term improvements to the IEDB that are being implemented now, as well as a long-term vision of true ‘machine-actionable interoperability’, which we believe will require community agreement on standardization of knowledge representation that can be built on top of the shared use of ontologies. PMID:29688354

  14. Interoperability Outlook in the Big Data Future

    NASA Astrophysics Data System (ADS)

    Kuo, K. S.; Ramachandran, R.

    2015-12-01

    The establishment of distributed active archive centers (DAACs) as data warehouses and the standardization of file format by NASA's Earth Observing System Data Information System (EOSDIS) doubtlessly propelled interoperability of NASA Earth science data to unprecedented heights in the 1990s. However, we still find it wanting two decades later. We believe the inadequate interoperability we experience is a result of the current practice that data are first packaged into files before distribution, and only the metadata of these files are cataloged into databases and become searchable. Data therefore cannot be efficiently filtered. Any extensive study thus requires downloading large volumes of data files to a local system for processing and analysis. The need to download data not only creates duplication and inefficiency but also further impedes interoperability, because the analysis has to be performed locally by individual researchers in individual institutions. Each institution or researcher often has its/his/her own preference in the choice of data management practice as well as programming languages. Analysis results (derived data) so produced are thus subject to the differences of these practices, which later form formidable barriers to interoperability. A number of Big Data technologies are currently being examined and tested to address Big Earth Data issues. These technologies share one common characteristic: exploiting compute and storage affinity to more efficiently analyze large volumes and great varieties of data. Distributed active "archive" centers are likely to evolve into distributed active "analysis" centers, which not only archive data but also provide analysis service right where the data reside. "Analysis" will become the more visible function of these centers. It is thus reasonable to expect interoperability to improve because analysis, in addition to data, becomes more centralized. 
Within a "distributed active analysis center" interoperability is almost guaranteed because data, analysis, and results all can be readily shared and reused. Effectively, with the establishment of "distributed active analysis centers", interoperation turns from a many-to-many problem into a less complicated few-to-few problem and becomes easier to solve.

  15. A VO-Driven Astronomical Data Grid in China

    NASA Astrophysics Data System (ADS)

    Cui, C.; He, B.; Yang, Y.; Zhao, Y.

    2010-12-01

    With the implementation of many ambitious observation projects, including LAMOST, FAST, and the Antarctic observatory at Dome A, observational astronomy in China is stepping into a brand new era with an emerging data avalanche. In the era of e-Science, both these cutting-edge projects and traditional astronomy research need much more powerful data management, sharing and interoperability. Based on the data-grid concept and taking advantage of IVOA interoperability technologies, China-VO is developing a VO-driven astronomical data grid environment to enable multi-wavelength science and large database science. In the paper, the latest progress and data flow of LAMOST, the architecture of the data grid, and its support for the VO are discussed.

  16. IRIS Toxicological Review of n-Butanol (Interagency Science ...

    EPA Pesticide Factsheets

    On September 8, 2011, the Toxicological Review of n-Butanol (External Review Draft) was released for external peer review and public comment. The Toxicological Review and charge were reviewed internally by EPA and by other federal agencies and White House Offices before public release. In the new IRIS process, introduced by the EPA Administrator, all written comments on IRIS assessments submitted by other federal agencies and White House Offices will be made publicly available. Accordingly, interagency comments with EPA's response and the interagency science consultation draft of the IRIS Toxicological Review of n-Butanol and the charge to external peer reviewers are posted on this site. EPA is undertaking an Integrated Risk Information System (IRIS) health assessment for n-butanol. IRIS is an EPA database containing Agency scientific positions on potential adverse human health effects that may result from chronic (or lifetime) exposure to chemicals in the environment. IRIS contains chemical-specific summaries of qualitative and quantitative health information in support of two steps of the risk assessment paradigm, i.e., hazard identification and dose-response evaluation. IRIS assessments are used in combination with specific situational exposure assessment information to evaluate potential public health risk associated with environmental contaminants.

  17. Self-Assembling Texts & Courses of Study.

    ERIC Educational Resources Information Center

    Gibson, David

    This paper describes the development of an interoperable meta-database system--a system of applications using metadata--that is intended to facilitate learner-centered collaboration, access to learning resources, and the fitness of channels of information to the emerging needs of learners at both individual and group levels. Highlights include:…

  18. IDA Publications on Irregular Warfare: A Bibliography 2000 - Fall 2008

    DTIC Science & Technology

    2008-12-01

    Index terms from the DTIC record: Affairs, Insurgency, Quds Force, Civil Military, Insurrection, Radical Islam, Civil Services, Interagency, Reconciliation, Coalition, Intifada, Reconstruction. While the reports have not been formally released, the database has been shared on a regular basis with other agencies and Services.

  19. VIEWCACHE: An incremental pointer-based access method for autonomous interoperable databases

    NASA Technical Reports Server (NTRS)

    Roussopoulos, N.; Sellis, Timos

    1992-01-01

    One of the biggest problems facing NASA today is providing scientists with efficient access to a large number of distributed databases. Our pointer-based incremental database access method, VIEWCACHE, provides such an interface for accessing distributed data sets and directories. VIEWCACHE allows database browsing and searching, performing inter-database cross-referencing with no actual data movement between database sites. This organization and processing is especially suitable for managing Astrophysics databases which are physically distributed all over the world. Once the search is complete, the set of collected pointers pointing to the desired data is cached. VIEWCACHE includes spatial access methods for accessing image data sets, which provide much easier query formulation by referring directly to the image and very efficient search for objects contained within a two-dimensional window. We will develop and optimize a VIEWCACHE External Gateway Access to database management systems to facilitate distributed database search.
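
    The core VIEWCACHE idea, as summarized above, is that a cross-database search collects and caches pointers to matching records rather than moving the data itself. The toy sketch below illustrates that idea only; the site layout, record fields, and match predicate are all invented.

```python
# Two hypothetical remote archives, each mapping record keys to records.
sites = {
    "archive_a": {"n1": {"ra": 10.2, "mag": 7.1},
                  "n2": {"ra": 11.0, "mag": 9.8}},
    "archive_b": {"m7": {"ra": 10.4, "mag": 6.5}},
}

def search_pointers(predicate):
    """Cross-reference all sites, returning (site, key) pointers to
    matching records; no record data is copied or moved."""
    return [(site, key) for site, table in sites.items()
            for key, rec in table.items() if predicate(rec)]

# Cache pointers to all objects brighter than magnitude 8.
cache = search_pointers(lambda r: r["mag"] < 8.0)

def fetch(pointer):
    """Dereference a cached pointer only when the data is actually needed."""
    site, key = pointer
    return sites[site][key]
```

    Deferring `fetch` until the data is needed is what makes the access incremental: the expensive transfer happens per record actually used, not per record matched.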

  20. Modelling and approaching pragmatic interoperability of distributed geoscience data

    NASA Astrophysics Data System (ADS)

    Ma, Xiaogang

    2010-05-01

    Interoperability of geodata, which is essential for sharing information and discovering insights within a cyberinfrastructure, is receiving increasing attention. A key requirement of interoperability in the context of geodata sharing is that data provided by local sources can be accessed, decoded, understood and appropriately used by external users. Various researchers have identified four levels of data interoperability issues: system, syntax, schematics and semantics, which relate to the platform, encoding, structure and meaning of geodata, respectively. Ontology-driven approaches to the schematic and semantic interoperability of geodata have been studied extensively over the last decade. Ontologies come in different types (e.g. top-level ontologies, domain ontologies and application ontologies) and display forms (e.g. glossaries, thesauri, conceptual schemas and logical theories). Many geodata providers maintain their own local application ontologies in order to drive standardization in local databases. However, semantic heterogeneities often exist between these local ontologies, even when they derive from equivalent disciplines. In contrast, common ontologies are being studied in different geoscience disciplines (e.g., NAMD, SWEET, etc.) as a standardization procedure to coordinate diverse local ontologies. Semantic mediation, e.g. mapping between local ontologies, or mapping local ontologies to common ontologies, has been studied as an effective way of achieving semantic interoperability between local ontologies and thus reconciling semantic heterogeneities in multi-source geodata. Nevertheless, confusion still exists in the research field of semantic interoperability. One problem is caused by eliminating elements of local pragmatic contexts in semantic mediation. 
    Compared to the context-independent character of a common domain ontology, local application ontologies are closely related to elements (e.g., people, time, location, intention, procedure, consequence) of local pragmatic contexts and are thus context-dependent. Eliminating these elements inevitably leads to information loss in semantic mediation between local ontologies. Correspondingly, the understanding and effect of exchanged data in a new context may differ from those in its original context. Another problem is the dilemma of how to find a balance between flexibility and standardization of local ontologies, because ontologies are not fixed but continuously evolving. It is commonly recognized that we cannot use a unified ontology to replace all local ontologies, because they are context-dependent and need flexibility. However, without coordination through standards, freely developed local ontologies and databases would require enormous mediation work between them. Finding a balance between standardization and flexibility for evolving ontologies, in a practical sense, requires negotiations (i.e. conversations, agreements and collaborations) between different local pragmatic contexts. The purpose of this work is to set up a computer-friendly model representing local pragmatic contexts (i.e. geodata sources), and to propose a practical semantic negotiation procedure for approaching pragmatic interoperability between local pragmatic contexts. Information agents, objective facts and subjective dimensions are reviewed as elements of a conceptual model for representing pragmatic contexts. The author uses them to outline a practical semantic negotiation procedure for approaching pragmatic interoperability of distributed geodata. 
The proposed conceptual model and semantic negotiation procedure were encoded with Description Logic, and then applied to analyze and manipulate semantic negotiations between different local ontologies within the National Mineral Resources Assessment (NMRA) project of China, which involves multi-source and multi-subject geodata sharing.
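
    The mediation step described above (mapping local ontologies through a common ontology) can be sketched minimally as follows. All term names and the two-site layout are invented for illustration; the actual NMRA work uses Description Logic, not a lookup table.

```python
# Each site's local application ontology maps its own terms onto a
# shared common-ontology concept. Terms here are made up.
to_common = {
    "site_a": {"Cu-porphyry": "porphyry_copper_deposit"},
    "site_b": {"porphyry Cu": "porphyry_copper_deposit"},
}

def translate(term, src, dst):
    """Map a local term to the common ontology, then back down to the
    target site's local vocabulary; None means no agreed mapping exists
    and a semantic negotiation between the sites would be needed."""
    common = to_common[src].get(term)
    if common is None:
        return None
    for local, concept in to_common[dst].items():
        if concept == common:
            return local
    return None

t = translate("Cu-porphyry", "site_a", "site_b")
```

    The `None` cases are exactly where the paper's negotiation procedure comes in: rather than silently dropping context, the sites converse and agree on a new mapping before data exchange proceeds.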

  1. IRIS Toxicological Review of Cerium Oxide and Cerium ...

    EPA Pesticide Factsheets

    On September 29, 2009, the IRIS Summary and Toxicological Review of Cerium Oxide and Cerium Compounds was finalized and loaded onto the IRIS database. The Toxicological Review of Cerium Oxide and Cerium Compounds was reviewed internally by EPA, by other federal agencies and White House Offices, by expert external peer reviewers, and by the public. In the new IRIS process, introduced by the EPA Administrator, all written comments on IRIS assessments submitted by other federal agencies and White House Offices will be made publicly available. Accordingly, interagency comments and the interagency draft of the Cerium Oxide and Cerium Compounds IRIS assessment are posted on this site. The draft Toxicological Review of Cerium Oxide and Cerium Compounds provides scientific support and rationale for the hazard identification and dose-response assessment pertaining to chronic exposure to cerium oxide and cerium compounds.

  2. The IAGOS information system

    NASA Astrophysics Data System (ADS)

    Boulanger, Damien; Gautron, Benoit; Schultz, Martin; Brötz, Björn; Rauthe-Schöch, Armin; Thouret, Valérie

    2015-04-01

IAGOS (In-service Aircraft for a Global Observing System) aims at the provision of long-term, frequent, regular, accurate, and spatially resolved in situ observations of the atmospheric composition. IAGOS observation systems are deployed on a fleet of commercial aircraft. The IAGOS database is an essential part of the global atmospheric monitoring network. Data access is handled by an open-access policy based on the submission of research requests which are reviewed by the PIs. The IAGOS database (http://www.iagos.fr, damien.boulanger@obs-mip.fr) is part of the French atmospheric chemistry data centre Ether (CNES and CNRS). In the framework of the IGAS project (IAGOS for Copernicus Atmospheric Service) interoperability with international portals or other databases is implemented in order to improve IAGOS data discovery. The IGAS data network is composed of three data centres: the IAGOS database in Toulouse including IAGOS-core data and IAGOS-CARIBIC (Civil Aircraft for the Regular Investigation of the Atmosphere Based on an Instrument Container) data since January 2015; the HALO research aircraft database at DLR (https://halo-db.pa.op.dlr.de); and the MACC data centre in Jülich (http://join.iek.fz-juelich.de). The MACC (Monitoring Atmospheric Composition and Climate) project is a prominent user of the IGAS data network. In June 2015 a new version of the IAGOS database will be released, providing improved services such as download in NetCDF or NASA Ames formats; graphical tools (maps, scatter plots, etc.); standardized metadata (ISO 19115); and better user management. The link with the MACC data centre, through JOIN (Jülich OWS Interface), will make it possible to combine model outputs with IAGOS data for intercomparison. Interoperability within the IGAS data network, implemented through a set of web services, will extend the functionality of each data centre's web interface.
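The standardized ISO 19115 metadata mentioned above can be illustrated with a toy record builder. This is a simplified sketch only: the element names below are shorthand, not the real ISO 19115/19139 schema, and the title and organisation strings are invented examples.

```python
# Simplified, illustrative ISO 19115-style metadata record builder;
# element names are shorthand, not the actual ISO 19139 XML schema.
import xml.etree.ElementTree as ET

def make_metadata(title, organisation, bbox):
    """Build a minimal metadata record with a geographic bounding box
    (west, east, south, north)."""
    root = ET.Element("MD_Metadata")
    ET.SubElement(root, "title").text = title
    ET.SubElement(root, "organisation").text = organisation
    extent = ET.SubElement(root, "geographicExtent")
    for name, value in zip(("west", "east", "south", "north"), bbox):
        ET.SubElement(extent, name).text = str(value)
    return ET.tostring(root, encoding="unicode")

record = make_metadata("IAGOS-core ozone profiles",
                       "CNRS / Ether data centre",
                       (-180, 180, -90, 90))
```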

  3. DECADE web portal: toward the integration of MaGa, EarthChem and VOTW data systems to further the knowledge on Earth degassing

    NASA Astrophysics Data System (ADS)

    Cardellini, Carlo; Frigeri, Alessandro; Lehnert, Kerstin; Ash, Jason; McCormick, Brendan; Chiodini, Giovanni; Fischer, Tobias; Cottrell, Elizabeth

    2015-04-01

The release of volatiles from the Earth's interior takes place in both volcanic and non-volcanic areas of the planet. Understanding this complex process and improving current estimates of global carbon emissions will greatly benefit from the integration of geochemical, petrological and volcanological data. At present, major online data repositories relevant to studies of degassing are not linked and interoperable. In the framework of the Deep Earth Carbon Degassing (DECADE) initiative of the Deep Carbon Observatory (DCO), we are developing interoperability between three data systems that will make their data accessible via the DECADE portal: (1) the Smithsonian Institution's Global Volcanism Program database (VOTW) of volcanic activity data, (2) EarthChem databases for geochemical and geochronological data of rocks and melt inclusions, and (3) the MaGa database (Mapping Gas emissions), which contains compositional and flux data of gases released at volcanic and non-volcanic degassing sites. The DECADE web portal will create a powerful search engine of these databases from a single entry point and will return comprehensive multi-component datasets. A user will be able, for example, to obtain for a specific volcano the compositions of emitted gases, the compositions and ages of the erupted products, and coincident activity. This level of capability requires a complete synergy between the databases, including availability of standard-based web services (WMS, WFS) at all data systems. Data and metadata can thus be extracted from each system without interfering with each database's local schema or being replicated to achieve integration at the DECADE web portal. The DECADE portal will enable new synoptic perspectives on the Earth degassing process, allowing exploration of Earth degassing datasets over previously unexplored spatial or temporal ranges.
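A portal built on the standard web services mentioned above would typically issue OGC WFS requests to each member system. The sketch below composes a WFS 2.0 GetFeature request URL; the endpoint address and feature type name are placeholders, not real DECADE or MaGa service addresses.

```python
# Hedged sketch: composing an OGC WFS 2.0 GetFeature request of the
# kind a federating portal might send. Endpoint and typeNames below
# are invented placeholders, not real services.
from urllib.parse import urlencode

def wfs_getfeature_url(endpoint, typename, bbox):
    """Build a GetFeature KVP request restricted to a bounding box
    (min_lon, min_lat, max_lon, max_lat)."""
    params = {
        "service": "WFS",
        "version": "2.0.0",
        "request": "GetFeature",
        "typeNames": typename,
        "bbox": ",".join(str(v) for v in bbox),
    }
    return endpoint + "?" + urlencode(params)

url = wfs_getfeature_url("https://example.org/wfs",
                         "maga:gas_emission",
                         (14.0, 40.8, 15.0, 41.0))
```

Because each data system answers such standard requests against its own schema, the portal never needs to replicate or restructure the underlying databases, which is the integration property the abstract describes.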

  4. The Human Phenotype Ontology project: linking molecular biology and disease through phenotype data

    PubMed Central

Köhler, Sebastian; Doelken, Sandra C.; Mungall, Christopher J.; Bauer, Sebastian; Firth, Helen V.; Bailleul-Forestier, Isabelle; Black, Graeme C. M.; Brown, Danielle L.; Brudno, Michael; Campbell, Jennifer; FitzPatrick, David R.; Eppig, Janan T.; Jackson, Andrew P.; Freson, Kathleen; Girdea, Marta; Helbig, Ingo; Hurst, Jane A.; Jähn, Johanna; Jackson, Laird G.; Kelly, Anne M.; Ledbetter, David H.; Mansour, Sahar; Martin, Christa L.; Moss, Celia; Mumford, Andrew; Ouwehand, Willem H.; Park, Soo-Mi; Riggs, Erin Rooney; Scott, Richard H.; Sisodiya, Sanjay; Vooren, Steven Van; Wapner, Ronald J.; Wilkie, Andrew O. M.; Wright, Caroline F.; Vulto-van Silfhout, Anneke T.; de Leeuw, Nicole; de Vries, Bert B. A.; Washington, Nicole L.; Smith, Cynthia L.; Westerfield, Monte; Schofield, Paul; Ruef, Barbara J.; Gkoutos, Georgios V.; Haendel, Melissa; Smedley, Damian; Lewis, Suzanna E.; Robinson, Peter N.

    2014-01-01

    The Human Phenotype Ontology (HPO) project, available at http://www.human-phenotype-ontology.org, provides a structured, comprehensive and well-defined set of 10,088 classes (terms) describing human phenotypic abnormalities and 13,326 subclass relations between the HPO classes. In addition we have developed logical definitions for 46% of all HPO classes using terms from ontologies for anatomy, cell types, function, embryology, pathology and other domains. This allows interoperability with several resources, especially those containing phenotype information on model organisms such as mouse and zebrafish. Here we describe the updated HPO database, which provides annotations of 7,278 human hereditary syndromes listed in OMIM, Orphanet and DECIPHER to classes of the HPO. Various meta-attributes such as frequency, references and negations are associated with each annotation. Several large-scale projects worldwide utilize the HPO for describing phenotype information in their datasets. We have therefore generated equivalence mappings to other phenotype vocabularies such as LDDB, Orphanet, MedDRA, UMLS and phenoDB, allowing integration of existing datasets and interoperability with multiple biomedical resources. We have created various ways to access the HPO database content using flat files, a MySQL database, and Web-based tools. All data and documentation on the HPO project can be found online. PMID:24217912
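The flat-file access route mentioned above can be illustrated with a minimal parser for the OBO format in which the HPO is distributed. This is a deliberately tiny sketch handling only `[Term]` stanzas with `id` and `name` tags; real OBO files carry many more tag types (definitions, synonyms, `is_a` relations).

```python
# Illustrative parser for a tiny OBO-format fragment of the kind the
# HPO distributes; only id/name tags are handled in this sketch.
OBO = """\
[Term]
id: HP:0001627
name: Abnormal heart morphology

[Term]
id: HP:0000118
name: Phenotypic abnormality
"""

def parse_obo(text):
    """Map each term id to its name, one [Term] stanza at a time."""
    terms = {}
    current = {}
    for line in text.splitlines():
        if line.strip() == "[Term]":
            current = {}            # start a fresh stanza
        elif ": " in line:
            key, value = line.split(": ", 1)
            current[key] = value
            if key == "name":
                terms[current["id"]] = value
    return terms

terms = parse_obo(OBO)
```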

  5. VIEWCACHE: An incremental pointer-based access method for autonomous interoperable databases

    NASA Technical Reports Server (NTRS)

    Roussopoulos, N.; Sellis, Timos

    1993-01-01

One of the biggest problems facing NASA today is to provide scientists efficient access to a large number of distributed databases. Our pointer-based incremental database access method, VIEWCACHE, provides such an interface for accessing distributed datasets and directories. VIEWCACHE allows database browsing and search, performing inter-database cross-referencing with no actual data movement between database sites. This organization and processing is especially suitable for managing astrophysics databases which are physically distributed all over the world. Once the search is complete, the set of collected pointers to the desired data is cached. VIEWCACHE includes spatial access methods for accessing image datasets, which provide much easier query formulation by referring directly to the image and very efficient search for objects contained within a two-dimensional window. We will develop and optimize a VIEWCACHE External Gateway Access to database management systems to facilitate database search.
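The core idea, caching pointers to remote records rather than moving the records themselves, can be sketched as a toy. This is not the VIEWCACHE implementation; the two in-memory "sites" and all names below are invented for illustration.

```python
# Toy sketch of the pointer-based caching idea: a search resolves names
# to record identifiers (pointers) and caches those, deferring any data
# movement until the pointer is actually dereferenced. Invented data.
SITE_A = {1: {"ra": 10.7, "dec": 41.3}, 2: {"ra": 83.8, "dec": -5.4}}
SITE_B = {"m31": 1, "m42": 2}   # directory site: names -> record ids

class ViewCache:
    def __init__(self):
        self._cache = {}

    def search(self, name):
        """Resolve a name to a pointer once; later calls hit the cache,
        so cross-referencing moves no actual data between sites."""
        if name not in self._cache:
            self._cache[name] = SITE_B[name]   # pointer only, no data
        return self._cache[name]

    def fetch(self, name):
        """Dereference the cached pointer only when data is needed."""
        return SITE_A[self.search(name)]

vc = ViewCache()
```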

  6. Cracking the humanitarian logistic coordination challenge: lessons from the urban search and rescue community.

    PubMed

    Tatham, Peter; Spens, Karen

    2016-04-01

    The challenges of achieving successful inter-agency logistic coordination in preparing for and responding to natural disasters and complex emergencies are both well understood and well documented. However, although many of these challenges remain unresolved, the literature reveals that the organisations that form the urban search and rescue (USAR) community have attained a high level of coherence and interoperability that results in a highly efficient and effective response. Therefore, this paper uses the idea of 'borrowing' from other fields as it explores how the processes and procedures used by the USAR community might be applied to improve humanitarian logistic operations. The paper analyses the USAR model and explores how the resultant challenges might be addressed in a humanitarian logistic context. The paper recommends that further research be undertaken in order to develop a modified USAR model that could be operationalised by the international community of humanitarian logisticians. © 2016 The Author(s). Disasters © Overseas Development Institute, 2016.

  7. NOAA Marine and Arctic Monitoring Using UASs

    NASA Astrophysics Data System (ADS)

    Jacobs, T.; Coffey, J. J.; Hood, R. E.; Hall, P.; Adler, J.

    2014-12-01

Unmanned systems have the potential to efficiently, effectively, economically and safely bridge critical observation requirements in an environmentally friendly manner. As the United States' marine and Arctic areas of interest expand to include hard-to-reach regions of the Earth (such as the Arctic and remote oceanic areas), optimizing unmanned capabilities will be needed to advance the United States' science, technology and security efforts. Through increased multi-mission and multi-agency operations using improved interoperable and autonomous unmanned systems, the research and operations communities will better collect environmental intelligence and better protect our country against hazardous weather and environmental, marine and polar hazards. This presentation will examine NOAA's Marine and Arctic Monitoring UAS strategies, which include developing a coordinated effort to maximize the efficiency and capabilities of unmanned systems across the federal government and research partners. Numerous intra- and inter-agency operational demonstrations and assessments have been made to verify and validate these strategies. The presentation will also discuss the requisite sUAS capabilities and our experience in using them.

  8. US Army Research Laboratory Joint Interagency Field Experimentation 15-2 Final Report

    DTIC Science & Technology

    2015-12-01

    February 2015, at Alameda Island, California. Advanced text analytics capabilities were demonstrated in a logically coherent workflow pipeline that... text processing capabilities allowed the targeted use of a persistent imagery sensor for rapid detection of mission- critical events. The creation of...a very large text database from open source data provides a relevant and unclassified foundation for continued development of text -processing

  9. A Web-Based Database for Nurse Led Outreach Teams (NLOT) in Toronto.

    PubMed

    Li, Shirley; Kuo, Mu-Hsing; Ryan, David

    2016-01-01

    A web-based system can provide access to real-time data and information. Healthcare is moving towards digitizing patients' medical information and securely exchanging it through web-based systems. In one of Ontario's health regions, Nurse Led Outreach Teams (NLOT) provide emergency mobile nursing services to help reduce unnecessary transfers from long-term care homes to emergency departments. Currently the NLOT team uses a Microsoft Access database to keep track of the health information on the residents that they serve. The Access database lacks scalability, portability, and interoperability. The objective of this study is the development of a web-based database using Oracle Application Express that is easily accessible from mobile devices. The web-based database will allow NLOT nurses to enter and access resident information anytime and from anywhere.

  10. Interagency collaboration models for people with mental ill health in contact with the police: a systematic scoping review

    PubMed Central

    Scantlebury, Arabella; Booth, Alison; MacBryde, Jillian Catherine; Scott, William J; Wright, Kath

    2018-01-01

Objective To identify existing evidence on interagency collaboration between law enforcement, emergency services, statutory services and third sector agencies regarding people with mental ill health. Design Systematic scoping review. Scoping reviews map particular research areas to identify research gaps. Data sources and eligibility ASSIA, CENTRAL, the Cochrane Library databases, Criminal Justice Abstracts, ERIC, Embase, MEDLINE, PsycINFO, PROSPERO and Social Care Online and Social Sciences Citation Index were searched up to 2017, as were grey literature and hand searches. Eligible articles were empirical evaluations or descriptions of models of interagency collaboration between the police and other agencies. Study appraisal and synthesis Screening and data extraction were undertaken independently by two researchers. Arksey's framework was used to collate and map included studies. Results One hundred and twenty-five studies were included. The majority of articles were descriptions of models (28%), mixed methods evaluations of models (18%) and single service evaluations (14%). The most frequently reported outcomes (52%) were 'organisational or service level outcomes' (eg, arrest rates). Most articles (53%) focused on adults with mental ill health, whereas others focused on adult offenders with mental ill health (17.4%). Thirteen models of interagency collaboration were described, each involving between 2 and 13 agencies. Frequently reported models were 'prearrest diversion' of people with mental ill health (34%), 'coresponse' involving joint response by police officers paired with mental health professionals (28.6%) and 'jail diversion' following arrest (23.8%). Conclusions We identified 13 different interagency collaboration models catering for a range of mental health-related interactions. All but one of these models involved the police and mental health services or professionals.
Several models have sufficient literature to warrant full systematic reviews of their effectiveness, whereas others need robust evaluation, by randomised controlled trial where appropriate. Future evaluations should focus on health-related outcomes and the impact on key stakeholders. PMID:29588323

  11. Interagency collaboration models for people with mental ill health in contact with the police: a systematic scoping review.

    PubMed

    Parker, Adwoa; Scantlebury, Arabella; Booth, Alison; MacBryde, Jillian Catherine; Scott, William J; Wright, Kath; McDaid, Catriona

    2018-03-27

To identify existing evidence on interagency collaboration between law enforcement, emergency services, statutory services and third sector agencies regarding people with mental ill health. Systematic scoping review. Scoping reviews map particular research areas to identify research gaps. ASSIA, CENTRAL, the Cochrane Library databases, Criminal Justice Abstracts, ERIC, Embase, MEDLINE, PsycINFO, PROSPERO and Social Care Online and Social Sciences Citation Index were searched up to 2017, as were grey literature and hand searches. Eligible articles were empirical evaluations or descriptions of models of interagency collaboration between the police and other agencies. Screening and data extraction were undertaken independently by two researchers. Arksey's framework was used to collate and map included studies. One hundred and twenty-five studies were included. The majority of articles were descriptions of models (28%), mixed methods evaluations of models (18%) and single service evaluations (14%). The most frequently reported outcomes (52%) were 'organisational or service level outcomes' (eg, arrest rates). Most articles (53%) focused on adults with mental ill health, whereas others focused on adult offenders with mental ill health (17.4%). Thirteen models of interagency collaboration were described, each involving between 2 and 13 agencies. Frequently reported models were 'prearrest diversion' of people with mental ill health (34%), 'coresponse' involving joint response by police officers paired with mental health professionals (28.6%) and 'jail diversion' following arrest (23.8%). We identified 13 different interagency collaboration models catering for a range of mental health-related interactions. All but one of these models involved the police and mental health services or professionals.
Several models have sufficient literature to warrant full systematic reviews of their effectiveness, whereas others need robust evaluation, by randomised controlled trial where appropriate. Future evaluations should focus on health-related outcomes and the impact on key stakeholders. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  12. A spatial database of wildfires in the United States, 1992-2011

    NASA Astrophysics Data System (ADS)

    Short, K. C.

    2013-07-01

    The statistical analysis of wildfire activity is a critical component of national wildfire planning, operations, and research in the United States (US). However, there are multiple federal, state, and local entities with wildfire protection and reporting responsibilities in the US, and no single, unified system of wildfire record-keeping exists. To conduct even the most rudimentary interagency analyses of wildfire numbers and area burned from the authoritative systems of record, one must harvest records from dozens of disparate databases with inconsistent information content. The onus is then on the user to check for and purge redundant records of the same fire (i.e. multijurisdictional incidents with responses reported by several agencies or departments) after pooling data from different sources. Here we describe our efforts to acquire, standardize, error-check, compile, scrub, and evaluate the completeness of US federal, state, and local wildfire records from 1992-2011 for the national, interagency Fire Program Analysis (FPA) application. The resulting FPA Fire-occurrence Database (FPA FOD) includes nearly 1.6 million records from the 20 yr period, with values for at least the following core data elements: location at least as precise as a Public Land Survey System section (2.6 km2 grid), discovery date, and final fire size. The FPA FOD is publicly available from the Research Data Archive of the US Department of Agriculture, Forest Service (doi:10.2737/RDS-2013-0009). While necessarily incomplete in some aspects, the database is intended to facilitate fairly high-resolution geospatial analysis of US wildfire activity over the past two decades, based on available information from the authoritative systems of record.
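The pooling and de-duplication step described above can be shown in miniature with SQL. This is a hypothetical toy, not the actual FPA FOD workflow: two agencies report the same multijurisdictional fire, one record per fire is kept, and area burned is totalled per year. Schema and values are invented.

```python
# Hypothetical miniature of an interagency fire-record compilation:
# pool reports, drop duplicate records of the same fire, aggregate.
# Table layout and data are invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE fires
                (fire_id TEXT, agency TEXT, year INT, acres REAL)""")
conn.executemany("INSERT INTO fires VALUES (?, ?, ?, ?)", [
    ("F001", "USFS",  1992, 120.0),
    ("F001", "STATE", 1992, 120.0),  # duplicate multijurisdictional report
    ("F002", "BLM",   1993, 35.5),
])
# Keep one record per fire_id, then total the area burned by year.
rows = conn.execute("""
    SELECT year, SUM(acres) FROM
        (SELECT fire_id, MIN(agency) AS agency, year, acres
         FROM fires GROUP BY fire_id)
    GROUP BY year ORDER BY year""").fetchall()
```

Without the inner de-duplication step, fire F001 would be counted twice and the 1992 total would be inflated, which is precisely the redundancy problem the abstract describes.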

  13. A spatial database of wildfires in the United States, 1992-2011

    NASA Astrophysics Data System (ADS)

    Short, K. C.

    2014-01-01

    The statistical analysis of wildfire activity is a critical component of national wildfire planning, operations, and research in the United States (US). However, there are multiple federal, state, and local entities with wildfire protection and reporting responsibilities in the US, and no single, unified system of wildfire record keeping exists. To conduct even the most rudimentary interagency analyses of wildfire numbers and area burned from the authoritative systems of record, one must harvest records from dozens of disparate databases with inconsistent information content. The onus is then on the user to check for and purge redundant records of the same fire (i.e., multijurisdictional incidents with responses reported by several agencies or departments) after pooling data from different sources. Here we describe our efforts to acquire, standardize, error-check, compile, scrub, and evaluate the completeness of US federal, state, and local wildfire records from 1992-2011 for the national, interagency Fire Program Analysis (FPA) application. The resulting FPA Fire-Occurrence Database (FPA FOD) includes nearly 1.6 million records from the 20 yr period, with values for at least the following core data elements: location, at least as precise as a Public Land Survey System section (2.6 km2 grid), discovery date, and final fire size. The FPA FOD is publicly available from the Research Data Archive of the US Department of Agriculture, Forest Service (doi:10.2737/RDS-2013-0009). While necessarily incomplete in some aspects, the database is intended to facilitate fairly high-resolution geospatial analysis of US wildfire activity over the past two decades, based on available information from the authoritative systems of record.

  14. Querying XML Data with SPARQL

    NASA Astrophysics Data System (ADS)

    Bikakis, Nikos; Gioldasis, Nektarios; Tsinaraki, Chrisa; Christodoulakis, Stavros

SPARQL is today the standard access language for Semantic Web data. In recent years, XML databases have also acquired industrial importance due to the widespread applicability of XML in the Web. In this paper we present a framework that bridges the heterogeneity gap and creates an interoperable environment where SPARQL queries are used to access XML databases. Our approach assumes that fairly generic mappings between ontology constructs and XML Schema constructs have been automatically derived or manually specified. The mappings are used to automatically translate SPARQL queries to semantically equivalent XQuery queries which are used to access the XML databases. We present the algorithms and the implementation of the SPARQL2XQuery framework, which is used for answering SPARQL queries over XML databases.
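The translation idea can be illustrated at its smallest scale. The sketch below is emphatically not the SPARQL2XQuery framework's API: it hardcodes an invented ontology-to-XML mapping and rewrites a single basic triple pattern into an XQuery FLWOR expression, whereas the real framework handles full SPARQL queries.

```python
# Deliberately tiny illustration of the SPARQL-to-XQuery idea: rewrite
# one triple pattern via a fixed ontology-property-to-XML-path mapping.
# The mapping, document name, and prefixes are invented examples.
MAPPING = {"ex:title": "book/title", "ex:author": "book/author"}

def triple_to_xquery(subject_var, predicate, object_var):
    """Turn a pattern like (?b ex:title ?t) into an XQuery expression
    that selects the mapped XML path's text nodes."""
    path = MAPPING[predicate]
    return (f"for ${object_var} in doc('lib.xml')//{path}/text() "
            f"return ${object_var}")

xq = triple_to_xquery("b", "ex:title", "t")
```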

  15. Nanopublications for exposing experimental data in the life-sciences: a Huntington's Disease case study.

    PubMed

Mina, Eleni; Thompson, Mark; Kaliyaperumal, Rajaram; Zhao, Jun; van der Horst, Eelke; Tatum, Zuotian; Hettne, Kristina M; Schultes, Erik A; Mons, Barend; Roos, Marco

    2015-01-01

Data from high throughput experiments often produce far more results than can ever appear in the main text or tables of a single research article. In these cases, the majority of new associations are often archived either as supplemental information in an arbitrary format or in publisher-independent databases that can be difficult to find. These data are not only lost from scientific discourse, but are also elusive to automated search, retrieval and processing. Here, we use the nanopublication model to make scientific assertions that were concluded from a workflow analysis of Huntington's Disease data machine-readable, interoperable, and citable. We followed the nanopublication guidelines to semantically model our assertions as well as their provenance metadata and authorship. We demonstrate interoperability by linking nanopublication provenance to the Research Object model. These results indicate that nanopublications can provide an incentive for researchers to expose data that is interoperable and machine-readable for future use and preservation, for which they can receive credit for their effort. Nanopublications can play a leading role in hypothesis generation, offering opportunities to produce large-scale data integration.

  16. Clinical data integration model. Core interoperability ontology for research using primary care data.

    PubMed

    Ethier, J-F; Curcin, V; Barton, A; McGilchrist, M M; Bastiaens, H; Andreasson, A; Rossiter, J; Zhao, L; Arvanitis, T N; Taweel, A; Delaney, B C; Burgun, A

    2015-01-01

This article is part of the Focus Theme of Methods of Information in Medicine on "Managing Interoperability and Complexity in Health Systems". Primary care data is the single richest source of routine health care data. However its use, both in research and clinical work, often requires data from multiple clinical sites, clinical trials databases and registries. Data integration and interoperability are therefore of utmost importance. TRANSFoRm's general approach relies on a unified interoperability framework, described in a previous paper. We developed a core ontology for an interoperability framework based on data mediation. This article presents how such an ontology, the Clinical Data Integration Model (CDIM), can be designed to support, in conjunction with appropriate terminologies, biomedical data federation within TRANSFoRm, an EU FP7 project that aims to develop the digital infrastructure for a learning healthcare system in European primary care. TRANSFoRm utilizes a unified structural/terminological interoperability framework, based on the local-as-view mediation paradigm. Such an approach mandates the global information model to describe the domain of interest independently of the data sources to be explored. Following a requirement analysis process, no ontology focusing on primary care research was identified; we therefore designed a realist ontology based on Basic Formal Ontology to support our framework in collaboration with various terminologies used in primary care. The resulting ontology has 549 classes and 82 object properties and is used to support data integration for TRANSFoRm's use cases. Concepts identified by researchers were successfully expressed in queries using CDIM and pertinent terminologies. As an example, we illustrate how, in TRANSFoRm, the Query Formulation Workbench can capture eligibility criteria in a computable representation, which is based on CDIM.
A unified mediation approach to semantic interoperability provides a flexible and extensible framework for all types of interaction between health record systems and research systems. CDIM, as core ontology of such an approach, enables simplicity and consistency of design across the heterogeneous software landscape and can support the specific needs of EHR-driven phenotyping research using primary care data.

  17. Community-Supported Data Repositories in Paleobiology: A 'Middle Tail' Between the Geoscientific and Informatics Communities

    NASA Astrophysics Data System (ADS)

    Williams, J. W.; Ashworth, A. C.; Betancourt, J. L.; Bills, B.; Blois, J.; Booth, R.; Buckland, P.; Charles, D.; Curry, B. B.; Goring, S. J.; Davis, E.; Grimm, E. C.; Graham, R. W.; Smith, A. J.

    2015-12-01

Community-supported data repositories (CSDRs) in paleoecology and paleoclimatology have a decades-long tradition and serve multiple critical scientific needs. CSDRs facilitate synthetic large-scale scientific research by providing open-access and curated data that employ community-supported metadata and data standards. CSDRs serve as a 'middle tail' or boundary organization between information scientists and the long-tail community of individual geoscientists collecting and analyzing paleoecological data. Over the past decades, a distributed network of CSDRs has emerged, each serving a particular suite of data and research communities, e.g. Neotoma Paleoecology Database, Paleobiology Database, International Tree Ring Database, NOAA NCEI for Paleoclimatology, Morphobank, iDigPaleo, and Integrated Earth Data Alliance. Recently, these groups have organized into a common Paleobiology Data Consortium dedicated to improving interoperability and sharing best practices and protocols. The Neotoma Paleoecology Database offers one example of an active and growing CSDR, designed to facilitate research into ecological and evolutionary dynamics during recent past global change. Neotoma combines a centralized database structure with distributed scientific governance via multiple virtual constituent data working groups. The Neotoma data model is flexible and can accommodate a variety of paleoecological proxies from many depositional contexts. Data input into Neotoma is done by trained Data Stewards, drawn from their communities. Neotoma data can be searched, viewed, and returned to users through multiple interfaces, including the interactive Neotoma Explorer map interface, RESTful Application Programming Interfaces (APIs), the neotoma R package, and the Tilia stratigraphic software. Neotoma is governed by geoscientists and provides community engagement through training workshops for data contributors, stewards, and users.
Neotoma is engaged in the Paleobiological Data Consortium and other efforts to improve interoperability among cyberinfrastructure in the paleogeosciences.

  18. Electronic Resources in a Next-Generation Catalog: The Case of WorldCat Local

    ERIC Educational Resources Information Center

    Shadle, Steve

    2009-01-01

    In April 2007, the University of Washington Libraries debuted WorldCat Local (WCL), a localized version of the WorldCat database that interoperates with a library's integrated library system and fulfillment services to provide a single-search interface for a library's physical and electronic content. This brief will describe how WCL incorporates a…

  19. Integrating Technologies, Methodologies, and Databases into a Comprehensive Terminology Management Environment to Support Interoperability among Clinical Information Systems

    ERIC Educational Resources Information Center

    Shakib, Shaun Cameron

    2013-01-01

    Controlled clinical terminologies are essential to realizing the benefits of electronic health record systems. However, implementing consistent and sustainable use of terminology has proven to be both intellectually and practically challenging. First, this project derives a conceptual understanding of the scope and intricacies of the challenge by…

  20. Migration of legacy mumps applications to relational database servers.

    PubMed

    O'Kane, K C

    2001-07-01

An extended implementation of the Mumps language is described that facilitates vendor-neutral migration of legacy Mumps applications to SQL-based relational database servers. Implemented as a compiler, this system translates Mumps programs to operating-system-independent, standard C code for subsequent compilation to fully stand-alone, binary executables. Added built-in functions and support modules extend the native hierarchical Mumps database with access to industry-standard, networked, relational database management servers (RDBMS), thus freeing Mumps applications from dependence upon vendor-specific, proprietary, unstandardized database models. Unlike Mumps systems that have added captive, proprietary RDBMS access, the programs generated by this development environment can be used with any RDBMS system that supports common network access protocols. Additional features include a built-in web server interface and the ability to interoperate directly with programs and functions written in other languages.

  1. A future-proof architecture for telemedicine using loose-coupled modules and HL7 FHIR.

    PubMed

    Gøeg, Kirstine Rosenbeck; Rasmussen, Rune Kongsgaard; Jensen, Lasse; Wollesen, Christian Møller; Larsen, Søren; Pape-Haugaard, Louise Bilenberg

    2018-07-01

Most telemedicine solutions are proprietary and disease-specific, which causes a heterogeneous and silo-oriented system landscape with limited interoperability. Solving the interoperability problem would require a strong focus on data integration and standardization in telemedicine infrastructures. Our objective was to suggest a future-proof architecture that consisted of small loose-coupled modules, to allow flexible integration with new and existing services, and the use of international standards, to allow high re-usability of modules and interoperability in the health IT landscape. We identified the core features of our future-proof architecture as follows. (1) To provide extended functionality, the system should be designed as a core with modules. Database handling and implementation of security protocols are modules, to improve flexibility compared to other frameworks. (2) To ensure loosely coupled modules, the system should implement an inversion of control mechanism. (3) For ease of implementation, the system should use HL7 FHIR (Fast Healthcare Interoperability Resources) as the primary standard because it is based on web technologies. We evaluated the feasibility of our architecture by developing an open source implementation of the system called ORDS. ORDS is written in TypeScript and makes use of the Express framework and HL7 FHIR DSTU2. The code is distributed on GitHub. All modules have been unit tested, but end-to-end testing awaits our first clinical example implementations. Our study showed that highly adaptable and yet interoperable core frameworks for telemedicine can be designed and implemented. Future work includes implementation of a clinical use case and evaluation. Copyright © 2018 Elsevier B.V. All rights reserved.
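The core-with-modules and inversion-of-control points can be sketched compactly. Note the transposition: ORDS itself is written in TypeScript on Express, while this illustration uses Python, and every class and method name below is invented for the sketch.

```python
# Illustrative inversion-of-control sketch (not the ORDS codebase):
# the core receives its database and security modules through
# injection instead of constructing them itself, so either module
# can be swapped without touching the core.
class Core:
    def __init__(self, database, security):
        self.database = database    # injected module
        self.security = security    # injected module

    def store_observation(self, token, resource):
        """Check the request with the security module, then persist
        the FHIR-style resource with the database module."""
        if not self.security.allowed(token):
            raise PermissionError("rejected by security module")
        return self.database.save(resource)

class InMemoryDB:
    """Stand-in database module; a real one might wrap a FHIR store."""
    def __init__(self):
        self.rows = []
    def save(self, resource):
        self.rows.append(resource)
        return len(self.rows) - 1   # record index as a handle

class AllowAll:
    """Stand-in security module that accepts every token."""
    def allowed(self, token):
        return True

core = Core(InMemoryDB(), AllowAll())
rid = core.store_observation("token-123", {"resourceType": "Observation"})
```

Swapping `InMemoryDB` for a persistent store, or `AllowAll` for a real authorization module, requires no change to `Core`, which is the loose-coupling property the architecture aims for.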

  2. Design and implementation of a health data interoperability mediator.

    PubMed

    Kuo, Mu-Hsing; Kushniruk, Andre William; Borycki, Elizabeth Marie

    2010-01-01

    The objective of this study is to design and implement a common-gateway oriented mediator to solve the health data interoperability problems that exist among heterogeneous health information systems. The proposed mediator has three main components: (1) a Synonym Dictionary (SD) that stores a set of global metadata and terminologies to serve as the mapping intermediary, (2) a Semantic Mapping Engine (SME) that can be used to map metadata and instance semantics, and (3) a DB-to-XML module that translates source health data stored in a database into XML format and back. A routine admission notification data exchange scenario is used to test the efficiency and feasibility of the proposed mediator. The study results show that the proposed mediator can make health information exchange more efficient.
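    The mediator's pipeline — synonym-dictionary lookup, semantic mapping, then DB-to-XML translation — can be illustrated with a minimal sketch (the field names and synonym table below are hypothetical, not taken from the paper):

```python
import xml.etree.ElementTree as ET

# Illustrative Synonym Dictionary: local field names -> global terms.
SYNONYMS = {"pat_name": "patientName", "adm_date": "admissionDate",
            "name": "patientName", "admitted": "admissionDate"}

def map_record(record):
    """Semantic Mapping Engine (sketch): rewrite local keys to global terms."""
    return {SYNONYMS.get(k, k): v for k, v in record.items()}

def to_xml(record, root_tag="AdmissionNotification"):
    """DB-to-XML module (sketch): serialize a mapped row as XML."""
    root = ET.Element(root_tag)
    for key, value in map_record(record).items():
        ET.SubElement(root, key).text = str(value)
    return ET.tostring(root, encoding="unicode")

# Two systems with different local schemas produce identical XML.
a = to_xml({"pat_name": "Doe", "adm_date": "2010-01-01"})
b = to_xml({"name": "Doe", "admitted": "2010-01-01"})
print(a == b)  # True
```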

  3. Feasibility of Including Green Tea Products for an Analytically Verified Dietary Supplement Database

    PubMed Central

    Saldanha, Leila; Dwyer, Johanna; Andrews, Karen; Betz, Joseph; Harnely, James; Pehrsson, Pamela; Rimmer, Catherine; Savarala, Sushma

    2015-01-01

    The Dietary Supplement Ingredient Database (DSID) is a federally funded, publicly accessible dietary supplement database that currently contains analytically-derived information on micronutrients in selected adult and children’s multivitamin and mineral (MVM) supplements. Other constituents in dietary supplement products such as botanicals are also of interest and thus are being considered for inclusion in the DSID. Thirty-eight constituents, mainly botanicals, were identified and prioritized by a federal interagency committee. Green tea was selected from this list as the botanical for expansion of the DSID. This paper describes the process for prioritizing dietary ingredients in the DSID. It also discusses the criteria for inclusion of these ingredients, and the approach for selecting and testing products for the green tea pilot study. PMID:25817236

  4. Could inter-agency working reduce emergency department attendances due to alcohol consumption?

    PubMed

    Benger, J; Carter, R

    2008-06-01

    Excess alcohol consumption and associated harms in terms of health, crime and disorder have been highlighted by the government and media, causing considerable public concern. This study quantified the number of patient attendances at an urban adult and children's emergency department (ED) directly attributable to alcohol intoxication, and investigated ways in which the inter-agency sharing of anonymised information could be used to design, implement and monitor interventions to reduce these harms. Intoxicated patients attending either the adult or children's ED were prospectively identified by qualified nursing staff and anonymised data were collected by a dedicated researcher. Collaboration and data sharing between health, police, social services, university experts and local authorities were achieved through the establishment of steering and operational groups with agreed objectives and the formation of a shared anonymised database. The proportion of patients attending the ED as a result of alcohol intoxication was 4% in adults and <1% in children. 70% of patients were male, with a mean age of 30 years, and 72% attended between 20.00 and 08.00 h. The most common reason for ED attendance was accident (34%), followed closely by assault (30%). 27% of patients had done most of their drinking at home, 36% in a pub and 16% in a nightclub. Inter-agency collaboration proved highly successful: pooling of anonymised data created a much clearer picture of the extent of the problem and immediately suggested strategies for intervention. The initiative to achieve inter-agency collaboration and data sharing was highly successful, with clear potential for the development and implementation of interventions that will reduce ED attendance due to excess alcohol consumption.

  5. A Chado case study: an ontology-based modular schema for representing genome-associated biological information.

    PubMed

    Mungall, Christopher J; Emmert, David B

    2007-07-01

    A few years ago, FlyBase undertook to design a new database schema to store Drosophila data. It would fully integrate genomic sequence and annotation data with bibliographic, genetic, phenotypic and molecular data from the literature representing a distillation of the first 100 years of research on this major animal model system. In developing this new integrated schema, FlyBase also made a commitment to ensure that its design was generic, extensible and available as open source, so that it could be employed as the core schema of any model organism data repository, thereby avoiding redundant software development and potentially increasing interoperability. Our question was whether we could create a relational database schema that would be successfully reused. Chado is a relational database schema now being used to manage biological knowledge for a wide variety of organisms, from human to pathogens, especially the classes of information that directly or indirectly can be associated with genome sequences or the primary RNA and protein products encoded by a genome. Biological databases that conform to this schema can interoperate with one another, and with application software from the Generic Model Organism Database (GMOD) toolkit. Chado is distinctive because its design is driven by ontologies. The use of ontologies (or controlled vocabularies) is ubiquitous across the schema, as they are used as a means of typing entities. The Chado schema is partitioned into integrated subschemas (modules), each encapsulating a different biological domain, and each described using representations in appropriate ontologies. To illustrate this methodology, we describe here the Chado modules used for describing genomic sequences. GMOD is a collaboration of several model organism database groups, including FlyBase, to develop a set of open-source software for managing model organism data. 
The Chado schema is freely distributed under the terms of the Artistic License (http://www.opensource.org/licenses/artistic-license.php) from GMOD (www.gmod.org).
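    The ontology-driven typing pattern described above — entities typed by references into a controlled-vocabulary table — can be sketched with a minimal SQL example (run here via Python's sqlite3; the table names follow Chado's feature/cvterm convention, but the columns are reduced to a bare minimum):

```python
import sqlite3

# A minimal slice of the ontology-driven pattern: rows in "feature"
# carry a type_id pointing into a controlled-vocabulary (cvterm) table,
# so the same table can hold genes, mRNAs, exons, and so on.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE cvterm (cvterm_id INTEGER PRIMARY KEY, name TEXT UNIQUE);
CREATE TABLE feature (
  feature_id INTEGER PRIMARY KEY,
  uniquename TEXT,
  type_id INTEGER REFERENCES cvterm(cvterm_id)
);
""")
conn.executemany("INSERT INTO cvterm (cvterm_id, name) VALUES (?, ?)",
                 [(1, "gene"), (2, "mRNA")])
conn.executemany("INSERT INTO feature (uniquename, type_id) VALUES (?, ?)",
                 [("dpp", 1), ("dpp-RA", 2)])

rows = conn.execute("""
  SELECT f.uniquename, t.name FROM feature f
  JOIN cvterm t ON f.type_id = t.cvterm_id ORDER BY f.feature_id
""").fetchall()
print(rows)  # [('dpp', 'gene'), ('dpp-RA', 'mRNA')]
```

    Extending the vocabulary is then a data change (a new cvterm row), not a schema change, which is what makes the schema generic across organisms.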

  6. The Earth System Grid Federation (ESGF) Project

    NASA Astrophysics Data System (ADS)

    Carenton-Madiec, Nicolas; Denvil, Sébastien; Greenslade, Mark

    2015-04-01

    The Earth System Grid Federation (ESGF) Peer-to-Peer (P2P) enterprise system is a collaboration that develops, deploys and maintains software infrastructure for the management, dissemination, and analysis of model output and observational data. ESGF's primary goal is to facilitate advancements in Earth System Science. It is an interagency and international effort led by the US Department of Energy (DOE), and co-funded by the National Aeronautics and Space Administration (NASA), National Oceanic and Atmospheric Administration (NOAA), National Science Foundation (NSF), Infrastructure for the European Network of Earth System Modelling (IS-ENES) and international laboratories such as the Max Planck Institute for Meteorology (MPI-M), the German Climate Computing Centre (DKRZ), the Australian National University (ANU) National Computational Infrastructure (NCI), Institut Pierre-Simon Laplace (IPSL), and the British Atmospheric Data Centre (BADC). Its main mission is to support current CMIP5 activities and prepare for future assessments. The ESGF architecture is based on a system of autonomous and distributed nodes, which interoperate through common acceptance of federation protocols and trust agreements. Data is stored at multiple nodes around the world, and served through local data and metadata services. Nodes exchange information about their data holdings and services, and trust each other to register users and establish access-control decisions. The net result is that a user can use a web browser, connect to any node, and seamlessly find and access data throughout the federation. This collaborative working organization and distributed architecture highlighted the need to define integration and testing processes that ensure the quality of software releases and interoperability. This presentation will introduce the ESGF project and demonstrate the range of tools and processes that have been set up to support release management activities.

  7. Interoperability challenges in river discharge modelling: A cross domain application scenario

    NASA Astrophysics Data System (ADS)

    Santoro, Mattia; Andres, Volker; Jirka, Simon; Koike, Toshio; Looser, Ulrich; Nativi, Stefano; Pappenberger, Florian; Schlummer, Manuela; Strauch, Adrian; Utech, Michael; Zsoter, Ervin

    2018-06-01

    River discharge is a critical water cycle variable, as it integrates all the processes (e.g. runoff and evapotranspiration) occurring within a river basin and provides a hydrological output variable that can be readily measured. Its prediction is of invaluable help for many water-related tasks including water resources assessment and management, flood protection, and disaster mitigation. Observations of river discharge are important to calibrate and validate hydrological or coupled land, atmosphere and ocean models. This requires using datasets from different scientific domains (Water, Weather, etc.). Typically, such datasets are provided using different technological solutions. This complicates the integration of new hydrological data sources into application systems, so considerable effort is often spent on data access issues instead of the actual scientific question. This paper describes the work performed to address multidisciplinary interoperability challenges related to river discharge modeling and validation. This includes the definition and standardization of domain-specific interoperability standards for hydrological data sharing, and their support in global frameworks such as the Global Earth Observation System of Systems (GEOSS). The research was developed in the context of the EU FP7-funded project GEOWOW (GEOSS Interoperability for Weather, Ocean and Water), which implemented a "River Discharge" application scenario. This scenario demonstrates the combination of river discharge observations from the Global Runoff Data Centre (GRDC) database with model outputs produced by the European Centre for Medium-Range Weather Forecasts (ECMWF), predicting river discharge from weather forecast information in the context of GEOSS.
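    Validating model output against gauge observations, as in the GRDC/ECMWF scenario above, typically ends in a goodness-of-fit statistic. As one common example (not named in the abstract), the Nash-Sutcliffe efficiency can be computed like this (the discharge values are invented):

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 means the model
    does no better than the mean of the observations, negative is worse."""
    mean_obs = sum(observed) / len(observed)
    num = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    den = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - num / den

obs = [120.0, 150.0, 200.0, 180.0, 130.0]  # e.g. gauge observations (m3/s)
sim = [110.0, 160.0, 190.0, 185.0, 140.0]  # e.g. forecast-driven model output
print(round(nash_sutcliffe(obs, sim), 3))  # 0.906
```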

  8. A common layer of interoperability for biomedical ontologies based on OWL EL.

    PubMed

    Hoehndorf, Robert; Dumontier, Michel; Oellrich, Anika; Wimalaratne, Sarala; Rebholz-Schuhmann, Dietrich; Schofield, Paul; Gkoutos, Georgios V

    2011-04-01

    Ontologies are essential in biomedical research due to their ability to semantically integrate content from different scientific databases and resources. Their application improves capabilities for querying and mining biological knowledge. An increasing number of ontologies is being developed for this purpose, and considerable effort is invested into formally defining them in order to represent their semantics explicitly. However, current biomedical ontologies do not facilitate data integration and interoperability yet, since reasoning over these ontologies is very complex and cannot be performed efficiently or is even impossible. We propose the use of less expressive subsets of ontology representation languages to enable efficient reasoning and achieve the goal of genuine interoperability between ontologies. We present and evaluate EL Vira, a framework that transforms OWL ontologies into the OWL EL subset, thereby enabling the use of tractable reasoning. We illustrate which OWL constructs and inferences are kept and lost following the conversion and demonstrate the performance gain of reasoning indicated by the significant reduction of processing time. We applied EL Vira to the open biomedical ontologies and provide a repository of ontologies resulting from this conversion. EL Vira creates a common layer of ontological interoperability that, for the first time, enables the creation of software solutions that can employ biomedical ontologies to perform inferences and answer complex queries to support scientific analyses. The EL Vira software is available from http://el-vira.googlecode.com and converted OBO ontologies and their mappings are available from http://bioonto.gen.cam.ac.uk/el-ont.
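    The conversion idea — keep the tractable axioms, drop the rest — can be sketched as a filter over axioms tagged with their OWL constructs (a toy representation; EL Vira itself operates on real OWL ontologies):

```python
# OWL EL permits, e.g., existential restrictions and intersections, but
# not universal restrictions, unions, negation, or max-cardinality.
# A converter in the spirit of EL Vira keeps the EL-safe axioms.
EL_UNSAFE = {"allValuesFrom", "unionOf", "complementOf", "maxCardinality"}

def to_el_subset(axioms):
    """Keep only axioms whose constructs are all EL-safe."""
    return [ax for ax in axioms
            if not (set(ax["constructs"]) & EL_UNSAFE)]

ontology = [
    {"id": "A1", "constructs": ["subClassOf", "someValuesFrom"]},
    {"id": "A2", "constructs": ["subClassOf", "allValuesFrom"]},
    {"id": "A3", "constructs": ["equivalentClass", "intersectionOf"]},
]
kept = [ax["id"] for ax in to_el_subset(ontology)]
print(kept)  # ['A1', 'A3']
```

    The trade-off the paper evaluates is exactly this: axiom A2 is lost, but reasoning over what remains becomes tractable.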

  9. DECADE Web Portal: Integrating MaGa, EarthChem and GVP Will Further Our Knowledge on Earth Degassing

    NASA Astrophysics Data System (ADS)

    Cardellini, C.; Frigeri, A.; Lehnert, K. A.; Ash, J.; McCormick, B.; Chiodini, G.; Fischer, T. P.; Cottrell, E.

    2014-12-01

    The release of gases from the Earth's interior to the exosphere takes place in both volcanic and non-volcanic areas of the planet. Fully understanding this complex process requires the integration of geochemical, petrological and volcanological data. At present, major online data repositories relevant to studies of degassing are not linked and interoperable. We are developing interoperability between three of those, which will support more powerful synoptic studies of degassing. The three data systems that will make their data accessible via the DECADE portal are: (1) the Smithsonian Institution's Global Volcanism Program database (GVP) of volcanic activity data, (2) EarthChem databases for geochemical and geochronological data of rocks and melt inclusions, and (3) the MaGa database (Mapping Gas emissions) which contains compositional and flux data of gases released at volcanic and non-volcanic degassing sites. These databases are developed and maintained by institutions or groups of experts in a specific field, and data are archived in formats specific to these databases. In the framework of the Deep Earth Carbon Degassing (DECADE) initiative of the Deep Carbon Observatory (DCO), we are developing a web portal that will create a powerful search engine of these databases from a single entry point. The portal will return comprehensive multi-component datasets, based on the search criteria selected by the user. For example, a single geographic or temporal search will return data relating to compositions of emitted gases and erupted products, the age of the erupted products, and coincident activity at the volcano. The development of this level of capability for the DECADE Portal requires complete synergy between these databases, including availability of standard-based web services (WMS, WFS) at all data systems. 
Data and metadata can thus be extracted from each system without interfering with each database's local schema or being replicated to achieve integration at the DECADE web portal. The DECADE portal will enable new synoptic perspectives on the Earth degassing process. Other data systems can be easily plugged in using the existing framework. Our vision is to explore Earth degassing related datasets over previously unexplored spatial or temporal ranges.
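    The single-entry-point search can be pictured as a fan-out/merge over the three data systems (the query functions and returned fields below are invented stand-ins for the GVP, EarthChem and MaGa web services):

```python
# Hypothetical per-database query functions standing in for web-service
# calls; each system keeps its own local schema and only exposes results.
def query_gvp(volcano):
    return {"last_eruption": 2021} if volcano == "Etna" else {}

def query_earthchem(volcano):
    return {"rock_samples": 342} if volcano == "Etna" else {}

def query_maga(volcano):
    return {"co2_flux_t_per_day": 5000} if volcano == "Etna" else {}

def portal_search(volcano):
    """Fan a single geographic search out to every source and merge."""
    merged = {"volcano": volcano}
    for source in (query_gvp, query_earthchem, query_maga):
        merged.update(source(volcano))
    return merged

print(portal_search("Etna"))
```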

  10. Building a Dynamic Spectrum Access Smart Radio with Application to Public Safety Disaster Communications

    DTIC Science & Technology

    2009-08-13

    [Figure residue from the original report. Recoverable content: a Figure 4.2 architecture diagram showing a user interface, Master Controller (MC), composite XML configuration and events, a Universal Radio Framework with USRP, 802.11 and Bluetooth adapters and databases to be evaluated in a network testbed, plus memory, adaptability, policy control and interoperability components; and a UDP jitter plot comparing 802.3 wired, 802.11 wireless, Bluetooth, and GNU Radio/USRP links over time.]

  11. Software Application Profile: Opal and Mica: open-source software solutions for epidemiological data management, harmonization and dissemination

    PubMed Central

    Doiron, Dany; Marcon, Yannick; Fortier, Isabel; Burton, Paul; Ferretti, Vincent

    2017-01-01

    Abstract Motivation Improving the dissemination of information on existing epidemiological studies and facilitating the interoperability of study databases are essential to maximizing the use of resources and accelerating improvements in health. To address this, Maelstrom Research proposes Opal and Mica, two interoperable open-source software packages providing out-of-the-box solutions for epidemiological data management, harmonization and dissemination. Implementation Opal and Mica are two standalone but interoperable web applications written in Java, JavaScript and PHP. They provide web services and modern user interfaces to access them. General features Opal allows users to import, manage, annotate and harmonize study data. Mica is used to build searchable web portals disseminating study and variable metadata. When used conjointly, Mica users can securely query and retrieve summary statistics on geographically dispersed Opal servers in real-time. Integration with the DataSHIELD approach allows conducting more complex federated analyses involving statistical models. Availability Opal and Mica are open-source and freely available at www.obiba.org under a General Public License (GPL) version 3, and the metadata models and taxonomies that accompany them are available under a Creative Commons licence. PMID:29025122
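    The federated-query idea — retrieving only summary statistics from dispersed servers, in the spirit of DataSHIELD — can be sketched as follows (a toy illustration, not the Opal/Mica API):

```python
# Each server returns only summary statistics, never row-level data;
# the client pools the summaries into a federation-wide estimate.
def server_summary(rows):
    return {"n": len(rows), "sum": sum(rows)}

def pooled_mean(summaries):
    n = sum(s["n"] for s in summaries)
    return sum(s["sum"] for s in summaries) / n

site_a = [61.2, 75.0, 80.1]   # values held only at site A
site_b = [70.4, 66.3]         # values held only at site B
print(round(pooled_mean([server_summary(site_a),
                         server_summary(site_b)]), 2))  # 70.6
```

    The privacy property is structural: the client computes the correct pooled mean while individual-level values never leave their site.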

  12. Interoperability of GADU in using heterogeneous Grid resources for bioinformatics applications.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sulakhe, D.; Rodriguez, A.; Wilde, M.

    2008-03-01

    Bioinformatics tools used for efficient and computationally intensive analysis of genetic sequences require large-scale computational resources to accommodate the growing data. Grid computational resources such as the Open Science Grid and TeraGrid have proved useful for scientific discovery. The genome analysis and database update system (GADU) is a high-throughput computational system developed to automate the steps involved in accessing Grid resources for running bioinformatics applications. This paper describes the requirements for building an automated, scalable system such as GADU that can run jobs on different Grids. It describes the resource-independent configuration of GADU using the Pegasus-based virtual data system that makes high-throughput computational tools interoperable on heterogeneous Grid resources, and highlights the features implemented to make GADU a gateway to computationally intensive bioinformatics applications on the Grid. The paper does not go into the details of the problems involved or the lessons learned in using individual Grid resources, as these have already been published in our paper on the genome analysis research environment (GNARE); it focuses primarily on the architecture that makes GADU resource-independent and interoperable across heterogeneous Grid resources.
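    Resource independence of the kind GADU provides usually comes down to a uniform submission interface over per-grid adapters, which can be sketched like this (the class names and commands are illustrative, not the actual GADU internals):

```python
# Adapter pattern: the pipeline talks to one interface; each adapter
# hides the submission mechanics of a particular grid.
class GridAdapter:
    def submit(self, job):
        raise NotImplementedError

class OSGAdapter(GridAdapter):
    def submit(self, job):
        return f"condor_submit {job}"      # Condor-style submission

class TeraGridAdapter(GridAdapter):
    def submit(self, job):
        return f"qsub {job}"               # PBS-style submission

def run_pipeline(job, adapters):
    # The pipeline never names a specific grid, so new resources are
    # added by writing a new adapter, not by changing the pipeline.
    return [a.submit(job) for a in adapters]

print(run_pipeline("blast_genome.sh", [OSGAdapter(), TeraGridAdapter()]))
```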

  13. 75 FR 45606 - Interagency Ocean Policy Task Force-Final Recommendations of the Interagency Ocean Policy Task Force

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-03

    ... COUNCIL ON ENVIRONMENTAL QUALITY Interagency Ocean Policy Task Force--Final Recommendations of the Interagency Ocean Policy Task Force AGENCY: Council on Environmental Quality. ACTION: Notice of Availability.

  14. Lessons from collaborative governance and sociobiology theories for reinforcing sustained cooperation: a government food security case study.

    PubMed

    Montoya, L A; Montoya, I; Sánchez González, O D

    2015-07-01

    This research aimed to understand how cooperation and collaboration work in interagency arrangements, using a case study of the public management of food security and nutrition in Bogotá, Colombia. This study explored the available scientific literature on Collaborative Governance within the Public Management body of knowledge and the literature on Cooperation from the Sociobiology field. Proposals were then developed and tested on the ground through an action-research effort that was documented as a case study. Finally, observations were used to test the proposals and some analytical generalizations were developed. To document the case study, several personal interviews, file reviews and normative reviews were conducted to generate a case study database. Collaboration and cooperation within the framework of interagency public management can be understood as a shared desirable outcome that unites different agencies in committing efforts and resources to the accomplishment of a common goal for society, as seen in obtaining food and nutrition security for a specific territory. Collaboration emerges when the following conditions exist and decreases when they are absent: (1) a strong sponsorship, which may come from a central government policy or from a distributed interagency consensus; (2) a clear definition of the participating agencies; (3) stability of the staff assigned to the coordination system; and (4) a fitness function for the staff, i.e. some mechanism to reward or punish the level of collaboration of each individual in the interagency effort. As this research investigated only one case study, the findings must be taken with care and any generalization made from this study needs to be analytical in nature; additional research is needed before these results can be accepted universally. Food security and nutrition efforts are interagency in nature.
    For collaboration between agencies to emerge, a minimum set of conditions, established by merging the public management and sociobiology fields of knowledge and validated by means of a case study, must be met. Copyright © 2015 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.

  15. Solar-Terrestrial Ontology Development

    NASA Astrophysics Data System (ADS)

    McGuinness, D.; Fox, P.; Middleton, D.; Garcia, J.; Cinquni, L.; West, P.; Darnell, J. A.; Benedict, J.

    2005-12-01

    The development of an interdisciplinary virtual observatory (the Virtual Solar-Terrestrial Observatory; VSTO) as a scalable environment for searching, integrating, and analyzing databases distributed over the Internet requires a higher level of semantic interoperability than heretofore required by most (if not all) distributed data systems or discipline-specific virtual observatories. The formalization of semantics using ontologies and their encodings for the internet (e.g. OWL, the Web Ontology Language), as well as the use of accompanying tools such as reasoning, inference and explanation, opens up both a substantial leap in options for interoperability and a need for formal development principles to guide ontology development and use within modern, multi-tiered network data environments. In this presentation, we outline the formal methodologies we utilize in the VSTO project, the currently developed use cases and ontologies, and their relation to existing ontologies (such as SWEET).

  16. Software support in automation of medicinal product evaluations.

    PubMed

    Juric, Radmila; Shojanoori, Reza; Slevin, Lindi; Williams, Stephen

    2005-01-01

    Medicinal product evaluation is one of the most important tasks undertaken by government health departments and their regulatory authorities in every country in the world. Automating it with adequate software support is critical and can improve the efficiency and interoperation of regulatory systems across the world. In this paper we propose a software solution that supports the automation of (i) the submission of licensing applications, and (ii) the evaluation of submitted licensing applications according to regulatory authorities' procedures. The novelty of our solution is in allowing licensing applications to be submitted in any country in the world and evaluated according to any evaluation procedure (which can be chosen by either regulatory authorities or pharmaceutical companies). Consequently, submission and evaluation procedures become interoperable, and the associated data repositories/databases can be shared between various countries and regulatory authorities.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chao, Tian-Jy; Kim, Younghun

    An end-to-end interoperability and workflows from building architecture design to one or more simulations, in one aspect, may comprise establishing a BIM enablement platform architecture. A data model defines data entities and entity relationships for enabling the interoperability and workflows. A data definition language may be implemented that defines and creates a table schema of a database associated with the data model. Data management services and/or application programming interfaces may be implemented for interacting with the data model. Web services may also be provided for interacting with the data model via the Web. A user interface may be implemented that communicates with users and uses the BIM enablement platform architecture, the data model, the data definition language, data management services and application programming interfaces to provide functions to the users to perform work related to building information management.
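    The data-definition-language step — deriving a table schema from a declarative data model — can be sketched as follows (the entity and column names are invented for illustration; the abstract does not disclose the platform's actual schema):

```python
# Declarative data model: entity names mapped to their fields and types.
MODEL = {
    "Building": {"id": "INTEGER PRIMARY KEY", "name": "TEXT"},
    "Simulation": {"id": "INTEGER PRIMARY KEY",
                   "building_id": "INTEGER REFERENCES Building(id)"},
}

def to_ddl(model):
    """Generate one CREATE TABLE statement per entity in the model."""
    stmts = []
    for table, cols in model.items():
        body = ", ".join(f"{c} {t}" for c, t in cols.items())
        stmts.append(f"CREATE TABLE {table} ({body});")
    return stmts

for stmt in to_ddl(MODEL):
    print(stmt)
```

    Keeping the model declarative means the same description can drive the DDL, the data management services, and the web-service layer described above.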

  18. Using Web Ontology Language to Integrate Heterogeneous Databases in the Neurosciences

    PubMed Central

    Lam, Hugo Y.K.; Marenco, Luis; Shepherd, Gordon M.; Miller, Perry L.; Cheung, Kei-Hoi

    2006-01-01

    Integrative neuroscience involves the integration and analysis of diverse types of neuroscience data involving many different experimental techniques. This data will increasingly be distributed across many heterogeneous databases that are web-accessible. Currently, these databases do not expose their schemas (database structures) and their contents to web applications/agents in a standardized, machine-friendly way. This limits database interoperation. To address this problem, we describe a pilot project that illustrates how neuroscience databases can be expressed using the Web Ontology Language, which is a semantically-rich ontological language, as a common data representation language to facilitate complex cross-database queries. In this pilot project, an existing tool called “D2RQ” was used to translate two neuroscience databases (NeuronDB and CoCoDat) into OWL, and the resulting OWL ontologies were then merged. An OWL-based reasoner (Racer) was then used to provide a sophisticated query language (nRQL) to perform integrated queries across the two databases based on the merged ontology. This pilot project is one step toward exploring the use of semantic web technologies in the neurosciences. PMID:17238384
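    The D2RQ-style translation and merge can be pictured as turning relational rows into subject-predicate-object triples and taking their union (a toy sketch; the real pipeline emits OWL and answers queries through the Racer reasoner's nRQL language):

```python
# Relational rows become triples; merging two databases is set union,
# and an integrated query walks the combined graph.
def rows_to_triples(table, rows):
    triples = set()
    for row in rows:
        subject = f"{table}:{row['id']}"
        for key, value in row.items():
            if key != "id":
                triples.add((subject, key, value))
    return triples

neurondb = rows_to_triples("neuron", [{"id": 1, "name": "mitral cell"}])
cocodat = rows_to_triples("neuron", [{"id": 1, "region": "olfactory bulb"}])
merged = neurondb | cocodat

# Integrated query: everything known about neuron:1 across both sources.
facts = sorted((p, o) for s, p, o in merged if s == "neuron:1")
print(facts)  # [('name', 'mitral cell'), ('region', 'olfactory bulb')]
```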

  19. An open platform for promoting interoperability in solar system sciences

    NASA Astrophysics Data System (ADS)

    Csillaghy, André; Aboudarham, Jean; Berghmans, David; Jacquey, Christian

    2013-04-01

    The European coordination project CASSIS is promoting the creation of an integrated data space that will facilitate science across community boundaries in solar system sciences. Many disciplines may need to use the same data set to support scientific research, although the way they are used may depend on the project and on the particular piece of science. Often, access is hindered because of differences in the way the different communities describe, store their data, as well as how they make them accessible. Working towards this goal, we have set up an open collaboration platform, www.explorespace.eu, that can serve as a hub for discovering and developing interoperability resources in the communities involved. The platform is independent of the project and will be maintained well after the end of the funding. As a first step, we have captured the description of services already provided by the community. The openness of the collaboration platform should allow to discuss with all stakeholders ways to make key types of metadata and derived products more complete and coherent and thus more usable across the domain boundaries. Furthermore, software resources and discussions should help facilitating the development of interoperable services. The platform, along with the database of services, address the following questions, which we consider crucial for promoting interoperability: • Current extent of the data space coverage: What part of the common data space is already covered by the existing interoperable services in terms of data access. In other words, what data, from catalogues as well as from raw data, can be reached by an application through standard protocols today? • Needed extension of the data space coverage: What would be needed to extend the data space coverage? In other words, how can the currently accessible data space be extended by adding services? • Missing services: What applications / services are still missing and need to be developed? 
This is not a trivial question, as the generation of the common data space in itself creates new requirements on overarching applications that might be necessary to provide a unified access to all the services. As an example, one particular aspect discussed in the platform is the design of web services. Applications of today are mainly human centred while interoperability must happen one level below and the back ends (databases) must be generic, i.e. independent from the applications. We intent our effort to provide to developers resources that disentangle user interfaces from data services. Many activities are challenging and we hope they will be discussed on our platform. In particular, the quality of the services, the data space and the needs of interdisciplinary approaches are serious concerns for instruments such as ATST and EST or the ones onboard SDO and, in the future, Solar Orbiter. We believe that our platform might be useful as a kind of guide that would allow groups of not having to reinvent the wheel for each new instrument.

  20. From the Battlefield to the Bedside: Supporting Warfighter and Civilian Health With the "ART" of Whole Genome Sequencing for Antibiotic Resistance and Outbreak Investigations.

    PubMed

    Lesho, Emil; Lin, Xiaoxu; Clifford, Robert; Snesrud, Erik; Onmus-Leone, Fatma; Appalla, Lakshmi; Ong, Ana; Maybank, Rosslyn; Nielsen, Lindsey; Kwak, Yoon; Hinkle, Mary; Turco, John; Marin, Juan A; Hooks, Sally; Matthews, Stacy; Hyland, Stephen; Little, Jered; Waterman, Paige; McGann, Patrick

    2016-07-01

    Awareness, responsiveness, and throughput characterize an approach for enhancing the clinical impact of whole genome sequencing for austere environments and for large geographically dispersed health systems. This Department of Defense approach is informing interagency efforts linking antibiograms of multidrug-resistant organisms to their genome sequences in a public database. Reprint & Copyright © 2016 Association of Military Surgeons of the U.S.

  1. Interoperable Solar Data and Metadata via LISIRD 3

    NASA Astrophysics Data System (ADS)

    Wilson, A.; Lindholm, D. M.; Pankratz, C. K.; Snow, M. A.; Woods, T. N.

    2015-12-01

    LISIRD 3 is a major upgrade of the LASP Interactive Solar Irradiance Data Center (LISIRD), which serves several dozen space based solar irradiance and related data products to the public. Through interactive plots, LISIRD 3 provides data browsing supported by data subsetting and aggregation. Incorporating a semantically enabled metadata repository, LISIRD 3 users see current, vetted, consistent information about the datasets offered. Users can now also search for datasets based on metadata fields such as dataset type and/or spectral or temporal range. This semantic database enables metadata browsing, so users can discover the relationships between datasets, instruments, spacecraft, missions and PIs. It also enables creation and publication of metadata records in a variety of formats, such as SPASE or ISO, making these datasets more discoverable, and opens the possibility of a public SPARQL endpoint, making the metadata browsable in an automated fashion. LISIRD 3's data access middleware, LaTiS, provides dynamic, on demand reformatting of data and timestamps, subsetting and aggregation, and other server side functionality via a RESTful OPeNDAP compliant API, enabling interoperability between LASP datasets and many common tools. LISIRD 3's templated front end design, coupled with the uniform data interface offered by LaTiS, allows easy integration of new datasets. Consequently the number and variety of datasets offered by LISIRD has grown to encompass several dozen, with many more to come. This poster will discuss design and implementation of LISIRD 3, including tools used, capabilities enabled, and issues encountered.
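    Server-side subsetting and aggregation of the kind LaTiS performs can be sketched in a few lines (the time series values are invented; the real service exposes these operations through an OPeNDAP-compliant REST API):

```python
# Apply subsetting and aggregation before data leave the server, so
# clients receive only what the request asked for.
series = [(2002, 1360.5), (2003, 1360.8), (2004, 1361.0), (2005, 1360.7)]

def subset(data, start, end):
    """Keep only samples whose time falls in [start, end]."""
    return [(t, v) for t, v in data if start <= t <= end]

def aggregate_mean(data):
    """Collapse a time series to its mean value."""
    return sum(v for _, v in data) / len(data)

window = subset(series, 2003, 2005)
print(window)
print(round(aggregate_mean(window), 2))  # 1360.83
```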

  2. Hybrid knowledge systems

    NASA Technical Reports Server (NTRS)

    Subrahmanian, V. S.

    1994-01-01

    An architecture called the hybrid knowledge system (HKS) is described. It supports interoperation among: a specification of the control laws describing a physical system; a collection of databases, knowledge bases and/or other data structures reflecting information about the world in which the controlled physical system resides; observations (e.g., sensor information) from the external world; and actions that must be taken in response to external observations.

  3. 48 CFR 51.204 - Use of interagency fleet management system (IFMS) vehicles and related services.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Contractor Use of Interagency Fleet Management System (IFMS) 51.204 Use of interagency fleet management system (IFMS) vehicles and related services. Contractors authorized to use interagency fleet management... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Use of interagency fleet...

  4. A Toolkit for Active Object-Oriented Databases with Application to Interoperability

    NASA Technical Reports Server (NTRS)

    King, Roger

    1996-01-01

    In our original proposal we stated that our research would 'develop a novel technology that provides a foundation for collaborative information processing.' The essential ingredient of this technology is the notion of 'deltas,' which are first-class values representing collections of proposed updates to a database. The Heraclitus framework provides a variety of algebraic operators for building up, combining, inspecting, and comparing deltas. Deltas can be directly applied to the database to yield a new state, or used 'hypothetically' in queries against the state that would arise if the delta were applied. The central point here is that the step of elevating deltas to 'first-class' citizens in database programming languages will yield tremendous leverage on the problem of supporting updates in collaborative information processing. In short, our original intention was to develop the theoretical and practical foundation for a technology based on deltas in an object-oriented database context, develop a toolkit for active object-oriented databases, and apply this toward collaborative information processing.
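
    A minimal sketch of the delta idea described above, assuming a toy key-value database: deltas are first-class values that can be combined, applied to yield a new state, or queried "hypothetically" without mutating the original. The class and method names are illustrative; the actual Heraclitus algebra is considerably richer.

```python
class Delta:
    """A first-class value holding proposed updates to a key-value 'database'."""

    def __init__(self, updates=None):
        self.updates = dict(updates or {})   # key -> proposed new value

    def merge(self, other):
        # Combine two deltas; on conflict, the right-hand delta wins.
        combined = dict(self.updates)
        combined.update(other.updates)
        return Delta(combined)

    def apply(self, state):
        # Produce the new state that results from applying this delta.
        new_state = dict(state)
        new_state.update(self.updates)
        return new_state

def hypothetical_query(state, delta, key):
    # Query the state that *would* arise if the delta were applied.
    return delta.apply(state)[key]

db = {"x": 1, "y": 2}
d = Delta({"x": 10}).merge(Delta({"y": 20}))
assert hypothetical_query(db, d, "x") == 10   # hypothetical view
assert db == {"x": 1, "y": 2}                 # original state untouched
```

    The key point the abstract makes is visible here: because deltas are values, they can be passed around, combined, and inspected before anyone commits them to the database.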

  6. An Object-Relational Ifc Storage Model Based on Oracle Database

    NASA Astrophysics Data System (ADS)

    Li, Hang; Liu, Hua; Liu, Yong; Wang, Yuan

    2016-06-01

    As building models become increasingly complicated, the level of collaboration across professionals is attracting more attention in the architecture, engineering and construction (AEC) industry. To accommodate this change, buildingSMART developed the Industry Foundation Classes (IFC) to facilitate interoperability between software platforms. However, IFC data are currently shared as text files, which has drawbacks. In this paper, considering the object-based inheritance hierarchy of IFC and the storage features of different database management systems (DBMS), we propose a novel object-relational storage model that uses an Oracle database to store IFC data. Firstly, we establish the mapping rules between data types in the IFC specification and the Oracle database. Secondly, we design the IFC database according to the relationships among IFC entities. Thirdly, we parse the IFC file and extract the IFC data. Lastly, we store the IFC data into the corresponding tables in the IFC database. In experiments, three different building models are selected to demonstrate the effectiveness of our storage model. A comparison of the experimental statistics shows that IFC data are lossless during data exchange.
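
    The parse-and-store steps of such a workflow can be sketched as below, using Python's sqlite3 as a stand-in for Oracle and a toy regular-expression parser for STEP data lines. Real IFC parsing against the full EXPRESS schema is far more involved; the entity attributes shown are abbreviated for illustration.

```python
import re
import sqlite3

# A STEP data line looks like:  #12=IFCWALL('guid',...,...);
STEP_LINE = re.compile(r"#(\d+)\s*=\s*(IFC\w+)\s*\((.*)\);")

def parse_ifc(lines):
    """Extract (id, entity type, raw attribute string) from STEP data lines."""
    rows = []
    for line in lines:
        m = STEP_LINE.match(line.strip())
        if m:
            rows.append((int(m.group(1)), m.group(2), m.group(3)))
    return rows

def store(rows):
    """Load parsed entities into a relational table (sqlite3 as a stand-in)."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE ifc_entity "
                "(id INTEGER PRIMARY KEY, type TEXT, attrs TEXT)")
    con.executemany("INSERT INTO ifc_entity VALUES (?, ?, ?)", rows)
    return con

sample = ["#1=IFCWALL('2O2Fr$t4X7Zf8NOew3FLOH',$,$,'Wall-001',$,$,$,$,$);",
          "#2=IFCDOOR('0DWgwt6o1FOx7466fPk$jl',$,$,'Door-001',$,$,$,$,$,$,$);"]
con = store(parse_ifc(sample))
count = con.execute("SELECT COUNT(*) FROM ifc_entity").fetchone()[0]
```

    A full object-relational mapping would give each entity class its own typed table rather than a raw attribute string, following the mapping rules between IFC data types and database columns.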

  7. Archetype-based electronic health records: a literature review and evaluation of their applicability to health data interoperability and access.

    PubMed

    Wollersheim, Dennis; Sari, Anny; Rahayu, Wenny

    Health Information Managers (HIMs) are responsible for overseeing health information. The change management necessary during the transition to electronic health records (EHRs) is substantial and ongoing. Archetype-based EHRs are a core health information system component that solves many of the problems arising during this period of change. Archetypes are models of clinical content, and they have many beneficial properties. They are interoperable, both between settings and through time. They are more amenable to change than conventional paradigms, and their design is congruent with clinical practice. This paper is an overview of the current archetype literature relevant to Health Information Managers. The literature was sourced from the English-language sections of ScienceDirect, IEEE Xplore, PubMed, Google Scholar, the ACM Digital Library and other databases on the usage of archetypes for electronic health record storage, looking at the current areas of archetype research, appropriate usage, and future research. We also used reference lists from the cited papers, papers referenced by the openEHR website, and the recommendations of experts in the area. Criteria for inclusion were that studies (a) covered archetype research and (b) were studies of archetype use, archetype system design, or archetype effectiveness. The 47 papers included show wide and increasing worldwide archetype usage in a variety of medical domains. Most of the papers noted that archetypes are an appropriate solution for future-proof and interoperable medical data storage. We conclude that archetypes are a suitable solution for the complex problem of electronic health record storage and interoperability.

  8. Electronic health records and cardiac implantable electronic devices: new paradigms and efficiencies.

    PubMed

    Slotwiner, David J

    2016-10-01

    The anticipated advantages of electronic health records (EHRs)-improved efficiency and the ability to share information across the healthcare enterprise-have so far failed to materialize. There is growing recognition that interoperability holds the key to unlocking the greatest value of EHRs. Health information technology (HIT) systems including EHRs must be able to share data and be able to interpret the shared data. This requires a controlled vocabulary with explicit definitions (data elements) as well as protocols to communicate the context in which each data element is being used (syntactic structure). Cardiac implantable electronic devices (CIEDs) provide a clear example of the challenges faced by clinicians when data is not interoperable. The proprietary data formats created by each CIED manufacturer, as well as the multiple sources of data generated by CIEDs (hospital, office, remote monitoring, acute care setting), make it challenging to aggregate even a single patient's data into an EHR. The Heart Rhythm Society and CIED manufacturers have collaborated to develop and implement international standard-based specifications for interoperability that provide an end-to-end solution, enabling structured data to be communicated from CIED to a report generation system, EHR, research database, referring physician, registry, patient portal, and beyond. EHR and other health information technology vendors have been slow to implement these tools, in large part, because there have been no financial incentives for them to do so. It is incumbent upon us, as clinicians, to insist that the tools of interoperability be a prerequisite for the purchase of any and all health information technology systems.

  9. Design and implementation of a CORBA-based genome mapping system prototype.

    PubMed

    Hu, J; Mungall, C; Nicholson, D; Archibald, A L

    1998-01-01

    CORBA (Common Object Request Broker Architecture), as an open standard, is considered to be a good solution for the development and deployment of applications in distributed heterogeneous environments. This technology can be applied in the bioinformatics area to enhance utilization, management and interoperation between biological resources. This paper investigates issues in developing CORBA applications for genome mapping information systems in the Internet environment with emphasis on database connectivity and graphical user interfaces. The design and implementation of a CORBA prototype for an animal genome mapping database are described. The prototype demonstration is available via: http://www.ri.bbsrc.ac.uk/ark_corba/. jian.hu@bbsrc.ac.uk

  10. IRIS Toxicological Review for Carbon Tetrachloride ...

    EPA Pesticide Factsheets

    EPA released the draft report, Toxicological Review for Carbon Tetrachloride (Interagency Science Discussion Draft), which was distributed to Federal agencies and White House Offices for comment during the Science Discussion step of the IRIS Assessment Development Process. Comments received from other Federal agencies and White House Offices are provided below with external peer review panel comments. EPA is conducting a peer review of the scientific basis supporting the human health hazard and dose-response assessment of carbon tetrachloride that will appear in the Integrated Risk Information System (IRIS) database.

  11. Flexible solution for interoperable cloud healthcare systems.

    PubMed

    Vida, Mihaela Marcella; Lupşe, Oana Sorina; Stoicu-Tivadar, Lăcrămioara; Bernad, Elena

    2012-01-01

    It is extremely important for the healthcare domain to have standardized communication, because it will improve the quality of information and, in the end, the resulting benefits will improve patients' quality of life. The standards proposed are HL7 CDA and CCD. For better access to medical data, a solution based on cloud computing (CC) is investigated. CC is a technology that supports flexibility, seamless care, and reduced costs of care. To ensure interoperability between healthcare information systems, a solution based on a Web Custom Control is presented. The control shows the database tables and fields used to configure the two standards. This control will facilitate the work of medical staff and hospital administrators, because they can easily configure the local system and prepare it for communication with other systems. The resulting information will have higher quality and will provide knowledge that supports better patient management and diagnosis.
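
    To give a flavor of the standards involved, the sketch below emits a skeletal CDA-style XML header with Python's standard library. The element names follow the HL7 v3 namespace, but a conformant CDA document requires far more content (templateIds, coded vocabularies, a structured body), and the helper function and sample values here are purely illustrative.

```python
import xml.etree.ElementTree as ET

CDA_NS = "urn:hl7-org:v3"   # HL7 v3 XML namespace used by CDA

def q(tag):
    """Qualify a tag name with the CDA namespace."""
    return f"{{{CDA_NS}}}{tag}"

def minimal_cda(doc_id, given, family):
    """Build a skeletal, non-conformant CDA-style document for illustration."""
    ET.register_namespace("", CDA_NS)
    root = ET.Element(q("ClinicalDocument"))
    ET.SubElement(root, q("id"), {"root": doc_id})
    target = ET.SubElement(root, q("recordTarget"))
    role = ET.SubElement(target, q("patientRole"))
    patient = ET.SubElement(role, q("patient"))
    name = ET.SubElement(patient, q("name"))
    ET.SubElement(name, q("given")).text = given
    ET.SubElement(name, q("family")).text = family
    return ET.tostring(root, encoding="unicode")

xml_doc = minimal_cda("2.16.840.1.113883.19.5", "Ada", "Lovelace")
```

    A configuration control of the kind the abstract describes would map local database tables and fields onto elements like these, so that each hospital system can emit the same standardized structure.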

  12. Model for Semantically Rich Point Cloud Data

    NASA Astrophysics Data System (ADS)

    Poux, F.; Neuville, R.; Hallot, P.; Billen, R.

    2017-10-01

    This paper proposes an interoperable model for managing high-dimensional point clouds while integrating semantics. Point clouds from sensors are a direct source of information physically describing the 3D state of the recorded environment. As such, they are an exhaustive representation of the real world at every scale: 3D reality-based spatial data. Their generation is increasingly fast, but processing routines and data models lack the knowledge to reason from information extraction rather than interpretation. The Smart Point Cloud model developed here brings intelligence to point clouds via three connected meta-models, while linking available knowledge and classification procedures to permit semantic injection. Interoperability drives the model's adaptation to potentially many applications through specialized domain ontologies. A first prototype, implemented in Python on a PostgreSQL database, allows combining semantic and spatial concepts in basic hybrid queries over different point clouds.
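
    A hybrid semantic-plus-spatial query of the kind described can be sketched as below, with sqlite3 standing in for PostgreSQL. The table layout, column names, and semantic labels are illustrative assumptions, not the paper's actual schema.

```python
import sqlite3

# Toy point-cloud table: coordinates plus a semantic class per point.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE point (x REAL, y REAL, z REAL, label TEXT)")
con.executemany("INSERT INTO point VALUES (?, ?, ?, ?)", [
    (0.1, 0.2, 0.0, "ground"),
    (0.4, 0.5, 2.1, "wall"),
    (0.6, 0.3, 2.4, "wall"),
    (0.2, 0.9, 3.0, "roof"),
])

# Hybrid query: a spatial predicate (height band) combined with a
# semantic predicate (classified as 'wall').
walls = con.execute(
    "SELECT COUNT(*) FROM point "
    "WHERE z BETWEEN 2 AND 3 AND label = 'wall'"
).fetchone()[0]
```

    In the full model, the semantic side would come from linked ontologies rather than a flat label column, but the query pattern (spatial filter joined with semantic classification) is the same.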

  13. The future application of GML database in GIS

    NASA Astrophysics Data System (ADS)

    Deng, Yuejin; Cheng, Yushu; Jing, Lianwen

    2006-10-01

    In 2004, the Geography Markup Language (GML) Implementation Specification (version 3.1.1) was published by the Open Geospatial Consortium, Inc. More and more applications in geospatial data sharing and interoperability now depend on GML. The primary purpose of GML is the exchange and transport of geo-information through standard modeling and encoding of geographic phenomena. However, applications face the problem of how to organize and access large volumes of GML data effectively; research on GML databases focuses on this problem. The effective storage of GML data is a hot topic in the GIS community today. A GML Database Management System (GDBMS) mainly deals with the storage and management of GML data. Two types of XML database are commonly distinguished: native XML databases and XML-enabled databases. Since GML is an application of the XML standard to geographic data, XML database systems can also be used to manage GML. In this paper, we review the state of the art of XML databases, including storage, indexing, query languages, and management systems, and then move on to GML databases. Finally, the future prospects of GML databases in GIS applications are presented.
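
    For readers unfamiliar with the encoding, the snippet below reads a minimal GML point with Python's standard XML tools. Real GML application schemas are much richer than this fragment; the coordinate values and SRS name are just sample data.

```python
import xml.etree.ElementTree as ET

GML = "http://www.opengis.net/gml"   # GML 3 namespace

# A minimal GML point: a position inside a spatial reference system.
doc = f"""<gml:Point xmlns:gml="{GML}" srsName="EPSG:4326">
  <gml:pos>51.5 -0.12</gml:pos>
</gml:Point>"""

root = ET.fromstring(doc)
# gml:pos holds whitespace-separated coordinates.
lat, lon = (float(v) for v in root.find(f"{{{GML}}}pos").text.split())
```

    A GDBMS faces exactly this structure at scale: millions of such features must be decomposed into storable, indexable records rather than re-parsed from text on every query.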

  14. Virtual Atomic and Molecular Data Center (VAMDC) and Stark-B Database

    NASA Astrophysics Data System (ADS)

    Dimitrijevic, M. S.; Sahal-Brechot, S.; Kovacevic, A.; Jevremovic, D.; Popovic, L. C.; VAMDC Consortium; Dubernet, Marie-Lise

    2012-01-01

    The Virtual Atomic and Molecular Data Center (VAMDC) is a European FP7 project that aims to build a flexible, interoperable e-science environment providing an interface to existing atomic and molecular data. VAMDC is built upon the expertise of existing atomic and molecular databases, data producers and service providers, with the specific aim of creating an infrastructure that is easily tuned to the requirements of a wide variety of users in academic, governmental, industrial or public communities. VAMDC will also include the STARK-B database, which contains Stark broadening parameters for a large number of lines, obtained by the semiclassical perturbation method during more than 30 years of collaboration between the authors of this work (MSD and SSB) and their co-workers. In this contribution we review the VAMDC project and the STARK-B database, and discuss the benefits of both for data users.

  15. Software Application Profile: Opal and Mica: open-source software solutions for epidemiological data management, harmonization and dissemination.

    PubMed

    Doiron, Dany; Marcon, Yannick; Fortier, Isabel; Burton, Paul; Ferretti, Vincent

    2017-10-01

    Improving the dissemination of information on existing epidemiological studies and facilitating the interoperability of study databases are essential to maximizing the use of resources and accelerating improvements in health. To address this, Maelstrom Research proposes Opal and Mica, two inter-operable open-source software packages providing out-of-the-box solutions for epidemiological data management, harmonization and dissemination. Opal and Mica are two standalone but inter-operable web applications written in Java, JavaScript and PHP. They provide web services and modern user interfaces to access them. Opal allows users to import, manage, annotate and harmonize study data. Mica is used to build searchable web portals disseminating study and variable metadata. When used conjointly, Mica users can securely query and retrieve summary statistics on geographically dispersed Opal servers in real-time. Integration with the DataSHIELD approach allows conducting more complex federated analyses involving statistical models. Opal and Mica are open-source and freely available at [www.obiba.org] under a General Public License (GPL) version 3, and the metadata models and taxonomies that accompany them are available under a Creative Commons licence. © The Author 2017; all rights reserved. Published by Oxford University Press on behalf of the International Epidemiological Association

  16. The Osseus platform: a prototype for advanced web-based distributed simulation

    NASA Astrophysics Data System (ADS)

    Franceschini, Derrick; Riecken, Mark

    2016-05-01

    Recent technological advances in web-based distributed computing and database technology have made possible a deeper and more transparent integration of some modeling and simulation applications. Despite these advances towards true integration of capabilities, disparate systems, architectures, and protocols will remain in the inventory for some time to come. These disparities present interoperability challenges for distributed modeling and simulation whether the application is training, experimentation, or analysis. Traditional approaches call for building gateways to bridge between disparate protocols and retaining interoperability specialists. Challenges in reconciling data models also persist. These challenges and their traditional mitigation approaches directly contribute to higher costs, schedule delays, and frustration for the end users. Osseus is a prototype software platform originally funded as a research project by the Defense Modeling & Simulation Coordination Office (DMSCO) to examine interoperability alternatives using modern, web-based technology and taking inspiration from the commercial sector. Osseus provides tools and services for nonexpert users to connect simulations, targeting the time and skillset needed to successfully connect disparate systems. The Osseus platform presents a web services interface to allow simulation applications to exchange data using modern techniques efficiently over Local or Wide Area Networks. Further, it provides Service Oriented Architecture capabilities such that finer granularity components such as individual models can contribute to simulation with minimal effort.

  17. Protein Information Resource: a community resource for expert annotation of protein data

    PubMed Central

    Barker, Winona C.; Garavelli, John S.; Hou, Zhenglin; Huang, Hongzhan; Ledley, Robert S.; McGarvey, Peter B.; Mewes, Hans-Werner; Orcutt, Bruce C.; Pfeiffer, Friedhelm; Tsugita, Akira; Vinayaka, C. R.; Xiao, Chunlin; Yeh, Lai-Su L.; Wu, Cathy

    2001-01-01

    The Protein Information Resource, in collaboration with the Munich Information Center for Protein Sequences (MIPS) and the Japan International Protein Information Database (JIPID), produces the most comprehensive and expertly annotated protein sequence database in the public domain, the PIR-International Protein Sequence Database. To provide timely and high quality annotation and promote database interoperability, the PIR-International employs rule-based and classification-driven procedures based on controlled vocabulary and standard nomenclature and includes status tags to distinguish experimentally determined from predicted protein features. The database contains about 200 000 non-redundant protein sequences, which are classified into families and superfamilies and their domains and motifs identified. Entries are extensively cross-referenced to other sequence, classification, genome, structure and activity databases. The PIR web site features search engines that use sequence similarity and database annotation to facilitate the analysis and functional identification of proteins. The PIR-International databases and search tools are accessible on the PIR web site at http://pir.georgetown.edu/ and at the MIPS web site at http://www.mips.biochem.mpg.de. The PIR-International Protein Sequence Database and other files are also available by FTP. PMID:11125041

  18. 78 FR 43218 - Notice of Kidney Interagency Coordinating Committee Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-19

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Notice of Kidney Interagency Coordinating Committee Meeting SUMMARY: The Kidney Interagency Coordinating Committee (KICC) will hold a meeting on September 27, 2013, about interagency collaboration to improve outcomes in Chronic Kidney...

  19. 78 FR 23970 - Interagency Task Force on Veterans Small Business Development

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-23

    ... SMALL BUSINESS ADMINISTRATION Interagency Task Force on Veterans Small Business Development AGENCY: U.S. Small Business Administration. ACTION: Notice of open Federal Interagency Task Force Meeting. SUMMARY: This document corrects the SBA's Interagency Task Force on Veterans Small Business Developments...

  20. 75 FR 62611 - Interagency Task Force on Veterans Small Business Development

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-12

    ... SMALL BUSINESS ADMINISTRATION Interagency Task Force on Veterans Small Business Development AGENCY: U.S. Small Business Administration. ACTION: Notice of open Federal Interagency Task Force meeting... public meeting of the Interagency Task Force on Veterans Small Business Development. The meeting will be...

  1. 77 FR 41472 - Interagency Task Force on Veterans Small Business Development

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-13

    ... SMALL BUSINESS ADMINISTRATION Interagency Task Force on Veterans Small Business Development AGENCY: U.S. Small Business Administration. ACTION: Notice of open Federal Interagency Task Force meeting... public meeting of the Interagency Task Force on Veterans Small Business Development. The meeting will be...

  2. 76 FR 8393 - Interagency Task Force on Veterans Small Business Development

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-14

    ... SMALL BUSINESS ADMINISTRATION Interagency Task Force on Veterans Small Business Development AGENCY: U.S. Small Business Administration. ACTION: Notice of open Federal Interagency Task Force meeting... public meeting of the Interagency Task Force on Veterans Small Business Development. The meeting will be...

  3. 75 FR 62438 - Interagency Task Force on Veterans Small Business Development Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-08

    ... SMALL BUSINESS ADMINISTRATION Interagency Task Force on Veterans Small Business Development Meeting AGENCY: U.S. Small Business Administration. ACTION: Notice of open Federal Interagency Task Force... first public meeting of the Interagency Task Force on Veterans Small Business Development. The meeting...

  4. 48 CFR 42.002 - Interagency agreements.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 1 2011-10-01 2011-10-01 false Interagency agreements. 42... MANAGEMENT CONTRACT ADMINISTRATION AND AUDIT SERVICES 42.002 Interagency agreements. (a) Agencies shall avoid... agency, through the use of interagency agreements. (b) Subject to the fiscal regulations of the agencies...

  5. 48 CFR 42.002 - Interagency agreements.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 1 2014-10-01 2014-10-01 false Interagency agreements. 42... MANAGEMENT CONTRACT ADMINISTRATION AND AUDIT SERVICES 42.002 Interagency agreements. (a) Agencies shall avoid... agency, through the use of interagency agreements. (b) Subject to the fiscal regulations of the agencies...

  6. 48 CFR 42.002 - Interagency agreements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Interagency agreements. 42... MANAGEMENT CONTRACT ADMINISTRATION AND AUDIT SERVICES 42.002 Interagency agreements. (a) Agencies shall avoid... agency, through the use of interagency agreements. (b) Subject to the fiscal regulations of the agencies...

  7. 78 FR 23935 - Federal Acquisition Regulation; Information Collection; Contractor Use of Interagency Fleet...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-23

    ...; Information Collection; Contractor Use of Interagency Fleet Management System Vehicles AGENCY: Department of... previously approved information collection requirement concerning contractor use of interagency fleet... Collection 9000- 0032, Contractor Use of Interagency Fleet Management System Vehicles, by any of the...

  8. 78 FR 64499 - Federal Acquisition Regulation; Submission for OMB Review; Contractor Use of Interagency Fleet...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-29

    ...; Submission for OMB Review; Contractor Use of Interagency Fleet Management System Vehicles AGENCY: Department... previously approved information collection requirement concerning contractor use of interagency fleet... comments identified by Information Collection 9000- 0032, Contractor Use of Interagency Fleet Management...

  9. 78 FR 7849 - Interagency Task Force on Veterans Small Business Development

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-04

    ... SMALL BUSINESS ADMINISTRATION Interagency Task Force on Veterans Small Business Development AGENCY: U.S. Small Business Administration. ACTION: Notice of open Federal Interagency Task Force Meeting... meeting of the Interagency Task Force on Veterans Small Business Development. The meeting will be open to...

  10. 78 FR 70087 - Interagency Task Force on Veterans Small Business Development

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-22

    ... SMALL BUSINESS ADMINISTRATION Interagency Task Force on Veterans Small Business Development AGENCY: U.S. Small Business Administration. ACTION: Notice of open Federal Interagency Task Force meeting... meeting of the Interagency Task Force on Veterans Small Business Development. The meeting will be open to...

  11. 78 FR 45996 - Interagency Task Force on Veterans Small Business Development

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-30

    ... SMALL BUSINESS ADMINISTRATION Interagency Task Force on Veterans Small Business Development AGENCY: U.S. Small Business Administration. ACTION: Notice of open Federal Interagency Task Force meeting... meeting of the Interagency Task Force on Veterans Small Business Development. The meeting will be open to...

  12. 78 FR 21492 - Interagency Task Force on Veterans Small Business Development

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-10

    ... SMALL BUSINESS ADMINISTRATION Interagency Task Force on Veterans Small Business Development AGENCY: U.S. Small Business Administration. ACTION: Notice of open Federal Interagency Task Force Meeting... meeting of the Interagency Task Force on Veterans Small Business Development. The meeting will be open to...

  13. Implementing GermWatcher, an enterprise infection control application.

    PubMed

    Doherty, Joshua; Noirot, Laura A; Mayfield, Jennie; Ramiah, Sridhar; Huang, Christine; Dunagan, Wm Claiborne; Bailey, Thomas C

    2006-01-01

    Automated surveillance tools can provide significant advantages to infection control practitioners. When stored in a relational database, the data collected can also be used to support numerous research and quality improvement opportunities. A previously described electronic infection control surveillance system was remodeled to provide multi-hospital support, an XML based rule set, and interoperability with an enterprise terminology server. This paper describes the new architecture being used at hospitals across BJC HealthCare.
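
    The abstract does not publish GermWatcher's actual rule format, so the sketch below is purely hypothetical: it shows how an XML-based surveillance rule of the general kind described might be matched against a culture record, with all element and attribute names invented for illustration.

```python
import xml.etree.ElementTree as ET

# Hypothetical XML surveillance rule; the real GermWatcher rule set
# format is not described in the abstract above.
RULE = '<rule organism="MRSA" specimen="blood" action="flag"/>'

def matches(rule_xml, culture):
    """Return True when a culture record satisfies the rule's attributes."""
    rule = ET.fromstring(rule_xml).attrib
    return (culture.get("organism") == rule["organism"]
            and culture.get("specimen") == rule["specimen"])

hit = matches(RULE, {"organism": "MRSA", "specimen": "blood"})
```

    Externalizing rules as data in this way is what lets an infection-control system add or adjust surveillance criteria without changing application code.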

  14. 22 CFR 94.8 - Interagency coordinating group.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 22 Foreign Relations 1 2013-04-01 2013-04-01 false Interagency coordinating group. 94.8 Section 94... § 94.8 Interagency coordinating group. The U.S. Central Authority shall nominate federal employees and may, from time to time, nominate private citizens to serve on an interagency coordinating group to...

  15. 22 CFR 94.8 - Interagency coordinating group.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 22 Foreign Relations 1 2012-04-01 2012-04-01 false Interagency coordinating group. 94.8 Section 94... § 94.8 Interagency coordinating group. The U.S. Central Authority shall nominate federal employees and may, from time to time, nominate private citizens to serve on an interagency coordinating group to...

  16. 22 CFR 94.8 - Interagency coordinating group.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 22 Foreign Relations 1 2014-04-01 2014-04-01 false Interagency coordinating group. 94.8 Section 94... § 94.8 Interagency coordinating group. The U.S. Central Authority shall nominate federal employees and may, from time to time, nominate private citizens to serve on an interagency coordinating group to...

  17. 77 FR 71471 - Interagency Task Force on Veterans Small Business Development; Notice of Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-30

    ... SMALL BUSINESS ADMINISTRATION Interagency Task Force on Veterans Small Business Development; Notice of Meeting AGENCY: U.S. Small Business Administration. ACTION: Notice of open Federal Interagency... agenda for its public meeting of the Interagency Task Force on Veterans Small Business Development. The...

  18. 75 FR 20237 - Interagency Group on Insular Areas

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-19

    ... Order 13537 of April 14, 2010 Interagency Group on Insular Areas By the authority vested in me as...: Section 1. Interagency Group on Insular Areas. (a) There is established, within the Department of the Interior for administrative purposes, the Interagency Group on Insular Areas (IGIA) to address policies...

  19. Interagency Training Catalog of Courses, 1972-1973.

    ERIC Educational Resources Information Center

    Civil Service Commission, Washington, DC. Bureau of Training.

    Training programs offered by various Federal agencies for Federal, State and local government employees are compiled in this catalog. Designed primarily for employees in the Washington, D.C. area, the catalog is divided into Open Interagency courses and Limited Interagency courses. Interagency courses are listed by major category such as Automatic…

  20. ACHP | News | Native Hawaiian Federal Interagency Working Group Created

    Science.gov Websites

    Improving consultations on unique issues involving Native Hawaiian organizations is the purpose of a new interagency working group established by the…

  1. 15 CFR 750.4 - Procedures for processing license applications.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... interagency groups established to provide expertise and coordinate interagency consultation. These interagency groups consist of: (i) The Missile Technology Export Control Group (MTEC). The MTEC, chaired by the... paper copy. (d) Review by other agencies and/or interagency groups. (1) Within 10 days of receipt of a...

  2. 15 CFR 750.4 - Procedures for processing license applications.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... interagency groups established to provide expertise and coordinate interagency consultation. These interagency groups consist of: (i) The Missile Technology Export Control Group (MTEC). The MTEC, chaired by the... paper copy. (d) Review by other agencies and/or interagency groups. (1) Within 10 days of receipt of a...

  3. 15 CFR 750.4 - Procedures for processing license applications.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... interagency groups established to provide expertise and coordinate interagency consultation. These interagency groups consist of: (i) The Missile Technology Export Control Group (MTEC). The MTEC, chaired by the... paper copy. (d) Review by other agencies and/or interagency groups. (1) Within 10 days of receipt of a...

  4. 15 CFR 750.4 - Procedures for processing license applications.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... interagency groups established to provide expertise and coordinate interagency consultation. These interagency groups consist of: (i) The Missile Technology Export Control Group (MTEC). The MTEC, chaired by the... paper copy. (d) Review by other agencies and/or interagency groups. (1) Within 10 days of receipt of a...

  5. 15 CFR 750.4 - Procedures for processing license applications.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... interagency groups established to provide expertise and coordinate interagency consultation. These interagency groups consist of: (i) The Missile Technology Export Control Group (MTEC). The MTEC, chaired by the... paper copy. (d) Review by other agencies and/or interagency groups. (1) Within 10 days of receipt of a...

  6. Computational toxicology using the OpenTox application programming interface and Bioclipse

    PubMed Central

    2011-01-01

    Background Toxicity is a complex phenomenon involving the potential adverse effect on a range of biological functions. Predicting toxicity involves using a combination of experimental data (endpoints) and computational methods to generate a set of predictive models. Such models rely strongly on being able to integrate information from many sources. Integrating biological and chemical information sources requires, however, a common language to express our knowledge ontologically, and interoperating services to build reliable predictive toxicology applications. Findings This article describes progress in extending the integrative bio- and cheminformatics platform Bioclipse to interoperate with OpenTox, a semantic web framework which supports open data exchange and toxicology model building. The Bioclipse workbench environment enables functionality from OpenTox web services and easy access to OpenTox resources for evaluating toxicity properties of query molecules. Relevant cases and interfaces based on ten neurotoxins are described to demonstrate the capabilities provided to the user. The integration takes advantage of semantic web technologies, thereby providing an open communication standard that simplifies integration. Additionally, the use of ontologies ensures proper interoperation and reliable integration of toxicity information from both experimental and computational sources. Conclusions A novel computational toxicity assessment platform was generated from integration of two open science platforms related to toxicology: Bioclipse, which combines a rich scriptable and graphical workbench environment for integration of diverse sets of information sources, and OpenTox, a platform for interoperable toxicology data and computational services. The combination provides improved reliability and operability for handling large data sets by the use of the Open Standards from the OpenTox Application Programming Interface. 
This enables simultaneous access to a variety of distributed predictive toxicology databases, and algorithm and model resources, taking advantage of the Bioclipse workbench handling the technical layers. PMID:22075173

  7. Argo: an integrative, interactive, text mining-based workbench supporting curation

    PubMed Central

    Rak, Rafal; Rowley, Andrew; Black, William; Ananiadou, Sophia

    2012-01-01

    Curation of biomedical literature is often supported by the automatic analysis of textual content that generally involves a sequence of individual processing components. Text mining (TM) has been used to enhance the process of manual biocuration, but has been focused on specific databases and tasks rather than an environment integrating TM tools into the curation pipeline, catering for a variety of tasks, types of information and applications. Processing components usually come from different sources and often lack interoperability. The well-established Unstructured Information Management Architecture is a framework that addresses interoperability by defining common data structures and interfaces. However, most of the efforts are targeted towards software developers and are not suitable for curators, or are otherwise inconvenient to use on a higher level of abstraction. To overcome these issues we introduce Argo, an interoperable, integrative, interactive and collaborative system for text analysis with a convenient graphical user interface to ease the development of processing workflows and boost productivity in labour-intensive manual curation. Robust, scalable text analytics follow a modular approach, adopting component modules for distinct levels of text analysis. The user interface is available entirely through a web browser, saving the user from often complicated and platform-dependent installation procedures. Argo comes with a predefined set of processing components commonly used in text analysis, while giving the users the ability to deposit their own components. The system accommodates various areas and levels of user expertise, from TM and computational linguistics to ontology-based curation. One of the key functionalities of Argo is its ability to seamlessly incorporate user-interactive components, such as manual annotation editors, into otherwise completely automatic pipelines. 
As a use case, we demonstrate the functionality of an in-built manual annotation editor that is well suited for in-text corpus annotation tasks. Database URL: http://www.nactem.ac.uk/Argo PMID:22434844

  8. The LatHyS database for planetary plasma environment investigations: Overview and a case study of data/model comparisons

    NASA Astrophysics Data System (ADS)

    Modolo, R.; Hess, S.; Génot, V.; Leclercq, L.; Leblanc, F.; Chaufray, J.-Y.; Weill, P.; Gangloff, M.; Fedorov, A.; Budnik, E.; Bouchemit, M.; Steckiewicz, M.; André, N.; Beigbeder, L.; Popescu, D.; Toniutti, J.-P.; Al-Ubaidi, T.; Khodachenko, M.; Brain, D.; Curry, S.; Jakosky, B.; Holmström, M.

    2018-01-01

    We present the Latmos Hybrid Simulation (LatHyS) database, dedicated to investigations of planetary plasma environments. Simulation results for several planetary objects (Mars, Mercury, Ganymede) are available in an online catalogue. The full description of the simulations and their results is compliant with a data model developed in the framework of the FP7 IMPEx project. The catalogue is interfaced with VO-visualization tools such as AMDA, 3DView, TOPCAT, CLweb and the IMPEx portal. Web services provide the means to access and extract simulated quantities/data. We illustrate the interoperability between the simulation database and VO tools with a detailed science case that focuses on a three-dimensional representation of the solar wind interaction with the Martian upper atmosphere, combining MAVEN and Mars Express observations with simulation results.

  9. A portal for the ocean biogeographic information system

    USGS Publications Warehouse

    Zhang, Yunqing; Grassle, J. F.

    2002-01-01

    Since its inception in 1999, the Ocean Biogeographic Information System (OBIS) has developed into an international science program as well as a globally distributed network of biogeographic databases. An OBIS portal at Rutgers University provides the links and functional interoperability among member database systems. Protocols and standards have been established to support effective communication between the portal and these functional units. The portal provides distributed data searching, a taxonomy name service, a GIS with access to relevant environmental data, biological modeling, and education modules for mariners, students, environmental managers, and scientists. The portal will integrate Census of Marine Life field projects, national data archives, and other functional modules, and provide network-wide analysis and modeling tools.

  10. The United States Special Operations Command Civil Military Engagement Program - A Model for Military-Interagency Low Cost / Small Footprint Activities

    DTIC Science & Technology

    2014-05-02

    Interagency Coordination Centers (JIACs), Interagency Task Forces (IATFs) are found within GCCs and subordinate military units in an attempt to bridge...Interagency Task Forces (IATFs) that exist at each Geographic Combatant Command (GCC). Rather, this chapter serves to highlight the Civil Military

  11. NOAA Atmospheric, Marine and Arctic Monitoring Using UASs (including Rapid Response)

    NASA Astrophysics Data System (ADS)

    Coffey, J. J.; Jacobs, T.

    2015-12-01

    Unmanned systems have the potential to efficiently, effectively, economically, and safely bridge critical observation requirements in an environmentally friendly manner. As the United States' Atmospheric, Marine and Arctic areas of interest expand to include hard-to-reach regions of the Earth (such as the Arctic and remote oceanic areas), optimizing unmanned capabilities will be needed to advance the United States' science, technology and security efforts. Through increased multi-mission and multi-agency operations using improved interoperable and autonomous unmanned systems, the research and operations communities will better collect environmental intelligence and better protect our country against hazardous weather and environmental, marine and polar hazards. This presentation will examine NOAA's Atmospheric, Marine and Arctic Monitoring Unmanned Aircraft System (UAS) strategies, which include developing a coordinated effort to maximize the efficiency and capabilities of unmanned systems across the federal government and research partners. Numerous intra- and inter-agency operational demonstrations and assessments have been made to verify and validate these strategies. This includes the introduction of the Targeted Autonomous Insitu Sensing and Rapid Response (TAISRR) with UAS concept of operations. The presentation will also discuss the requisite UAS capabilities and our experience in using them.

  12. Geothermal Energy; (USA)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Raridon, M.H.; Hicks, S.C.

    1991-01-01

    Geothermal Energy (GET) announces on a bimonthly basis the current worldwide information available on the technologies required for economic recovery of geothermal energy and its use as direct heat or for electric power production. This publication contains the abstracts of DOE reports, journal articles, conference papers, patents, theses, and monographs added to the Energy Science and Technology Database (EDB) during the past two months. Also included are US information obtained through acquisition programs or interagency agreements and international information obtained through the International Energy Agency's Energy Technology Data Exchange or government-to-government agreements.

  13. 77 FR 75349 - Seventy-First Report of the TSCA Interagency Testing Committee to the Administrator of the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-19

    ... Report of the TSCA Interagency Testing Committee to the Administrator of the Environmental Protection...-2012-0820; FRL-9370-9] Seventy-First Report of the TSCA Interagency Testing Committee to the...) Interagency Testing Committee (ITC) transmitted its 71st ITC Report to the EPA Administrator on November 14...

  14. 36 CFR 73.11 - Federal Interagency Panel for World Heritage.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 36 Parks, Forests, and Public Property 1 2010-07-01 2010-07-01 false Federal Interagency Panel for World Heritage. 73.11 Section 73.11 Parks, Forests, and Public Property NATIONAL PARK SERVICE, DEPARTMENT OF THE INTERIOR WORLD HERITAGE CONVENTION § 73.11 Federal Interagency Panel for World Heritage. (a) Responsibilities. The Federal Interagency...

  15. Interagency Contracting: An Overview of Federal Procurement and Appropriations Law

    DTIC Science & Technology

    2010-08-30

    Management Reform Act of 1994 and other authorities creating franchise funds and interagency assisting entities. Unlike multi-agency contracts, GWACs...and the FSS, franchise funds and interagency assisting entities are not themselves contracting vehicles, but they play a prominent role in...Expansion of the FSS to State and Local Governments.................................................. 22 Franchise Funds and Interagency Assisting Entities

  16. Making Information Visible, Accessible, and Understandable: Meta-Data and Registries

    DTIC Science & Technology

    2007-07-01

    the data created, the length of play time, album name, and the genre. Without resource metadata, portable digital music players would not be so...notion of a catalog card in a library. An example of metadata is the description of a music file specifying the creator, the artist that performed the song...describe structure and formatting which are critical to interoperability and the management of databases. Going back to the portable music player example

  17. An Overview of Genomic Sequence Variation Markup Language (GSVML)

    PubMed Central

    Nakaya, Jun; Hiroi, Kaei; Ido, Keisuke; Yang, Woosung; Kimura, Michio

    2006-01-01

    Internationally accumulated genomic sequence variation data on humans require an interoperable data exchange format. We developed GSVML as such a format. GSVML is oriented towards human health and has three categories. Analyses of use cases in the human health domain and an investigation of existing databases and markup languages were conducted. The ability to interface with the Health Level Seven Genotype Model was examined. GSVML provides a sharable platform for both clinical and research applications.

  18. The ISO Data Archive and Interoperability with Other Archives

    NASA Astrophysics Data System (ADS)

    Salama, Alberto; Arviset, Christophe; Hernández, José; Dowson, John; Osuna, Pedro

    ESA's Infrared Space Observatory (ISO), an unprecedented observatory for infrared astronomy launched in November 1995, successfully made nearly 30,000 scientific observations in its 2.5-year mission. The ISO data can be retrieved from the ISO Data Archive (IDA), available online and comprising about 150,000 observations, including parallel and serendipity mode observations. A user-friendly Java interface permits queries to the database and data retrieval. The interface currently offers a wide variety of links to other archives, such as name resolution with NED and SIMBAD, access to electronic articles from ADS and CDS/VizieR, and access to IRAS data. In the past year development has focused on improving the IDA's interoperability with other astronomical archives, either by accessing other relevant archives or by providing direct access to the ISO data for external services. A mechanism of information transfer has been developed, allowing direct queries to the IDA via a Java Server Page that returns quick-look ISO images and relevant, observation-specific information embedded in an HTML page. This method has been used to link from the CDS/VizieR Data Centre and ADS, and work with IPAC to allow access to the ISO Archive from IRSA, including display capabilities of the observed sky regions onto other mission images, is in progress. Prospects for further links to and from other archives and databases are also addressed.

  19. Automated Database Mediation Using Ontological Metadata Mappings

    PubMed Central

    Marenco, Luis; Wang, Rixin; Nadkarni, Prakash

    2009-01-01

    Objective To devise an automated approach for integrating federated database information using database ontologies constructed from their extended metadata. Background One challenge of database federation is that the granularity of representation of equivalent data varies across systems. Dealing effectively with this problem is analogous to dealing with precoordinated vs. postcoordinated concepts in biomedical ontologies. Model Description The authors describe an approach based on ontological metadata mapping rules defined with elements of a global vocabulary, which allows a query specified at one granularity level to fetch data, where possible, from databases within the federation that use different granularities. This is implemented in OntoMediator, a newly developed production component of our previously described Query Integrator System. OntoMediator's operation is illustrated with a query that accesses three geographically separate, interoperating databases. An example based on SNOMED also illustrates the applicability of high-level rules to support the enforcement of constraints that can prevent inappropriate curator or power-user actions. Summary A rule-based framework simplifies the design and maintenance of systems where categories of data must be mapped to each other, for the purpose of either cross-database query or for curation of the contents of compositional controlled vocabularies. PMID:19567801
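    The granularity-mapping idea described in this record can be sketched in a few lines of Python. The database names, concepts, and rule table below are invented for illustration only and do not reflect OntoMediator's actual rule language; the sketch only shows the general pattern of translating a global-vocabulary concept into the finer- or equal-grained terms each federated database stores.

    ```python
    # Hypothetical mapping rules: for each database, a coarse query concept maps
    # to the finer-grained concepts that database uses (or to itself when the
    # granularities already match).
    MAPPING_RULES = {
        "neuron_db": {"dendrite": ["apical_dendrite", "basal_dendrite"]},
        "anatomy_db": {"dendrite": ["dendrite"]},
    }

    def expand_query_concept(database, concept):
        """Translate a global-vocabulary concept into the terms a database stores."""
        rules = MAPPING_RULES.get(database, {})
        return rules.get(concept, [concept])  # fall back to the concept itself

    print(expand_query_concept("neuron_db", "dendrite"))
    # → ['apical_dendrite', 'basal_dendrite']
    ```

    A query specified once at the coarse granularity can then be fanned out to every database in the federation, each receiving the terms it actually understands.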

  20. Nuclear Reactors and Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cason, D.L.; Hicks, S.C.

    1992-01-01

    This publication Nuclear Reactors and Technology (NRT) announces on a monthly basis the current worldwide information available from the open literature on nuclear reactors and technology, including all aspects of power reactors, components and accessories, fuel elements, control systems, and materials. This publication contains the abstracts of DOE reports, journal articles, conference papers, patents, theses, and monographs added to the Energy Science and Technology Database during the past month. Also included are US information obtained through acquisition programs or interagency agreements and international information obtained through the International Energy Agency`s Energy Technology Data Exchange or government-to-government agreements. The digests inmore » NRT and other citations to information on nuclear reactors back to 1948 are available for online searching and retrieval on the Energy Science and Technology Database and Nuclear Science Abstracts (NSA) database. Current information, added daily to the Energy Science and Technology Database, is available to DOE and its contractors through the DOE Integrated Technical Information System. Customized profiles can be developed to provide current information to meet each user`s needs.« less

  1. Processing biological literature with customizable Web services supporting interoperable formats.

    PubMed

    Rak, Rafal; Batista-Navarro, Riza Theresa; Carter, Jacob; Rowley, Andrew; Ananiadou, Sophia

    2014-01-01

    Web services have become a popular means of interconnecting solutions for processing a body of scientific literature. This has fuelled research on high-level data exchange formats suitable for a given domain and ensuring the interoperability of Web services. In this article, we focus on the biological domain and consider four interoperability formats, BioC, BioNLP, XMI and RDF, that represent domain-specific and generic representations and include well-established as well as emerging specifications. We use the formats in the context of customizable Web services created in our Web-based, text-mining workbench Argo that features an ever-growing library of elementary analytics and capabilities to build and deploy Web services straight from a convenient graphical user interface. We demonstrate a 2-fold customization of Web services: by building task-specific processing pipelines from a repository of available analytics, and by configuring services to accept and produce a combination of input and output data interchange formats. We provide qualitative evaluation of the formats as well as quantitative evaluation of automatic analytics. The latter was carried out as part of our participation in the fourth edition of the BioCreative challenge. Our analytics built into Web services for recognizing biochemical concepts in BioC collections achieved the highest combined scores out of 10 participating teams. Database URL: http://argo.nactem.ac.uk. © The Author(s) 2014. Published by Oxford University Press.
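    To give a concrete sense of one of the interchange formats this record discusses, the following sketch serialises a minimal one-document BioC collection using only Python's standard library. The element structure follows the published BioC layout (collection, source, document, id, passage, offset, text) as I understand it; the document content is invented for illustration.

    ```python
    import xml.etree.ElementTree as ET

    def make_bioc_collection(source, doc_id, text):
        """Build a minimal one-document BioC collection as an XML string."""
        collection = ET.Element("collection")
        ET.SubElement(collection, "source").text = source
        document = ET.SubElement(collection, "document")
        ET.SubElement(document, "id").text = doc_id
        passage = ET.SubElement(document, "passage")
        ET.SubElement(passage, "offset").text = "0"  # offset of the passage in the document
        ET.SubElement(passage, "text").text = text
        return ET.tostring(collection, encoding="unicode")

    # Invented document content, for illustration only.
    xml = make_bioc_collection("example", "12345", "Caffeine inhibits phosphodiesterase.")
    print(xml)
    ```

    A real BioC producer would add dates, keys and annotation elements, but even this skeleton is enough for two services to exchange a passage of text with stable identifiers and offsets.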

  2. Harmonising phenomics information for a better interoperability in the rare disease field.

    PubMed

    Maiella, Sylvie; Olry, Annie; Hanauer, Marc; Lanneau, Valérie; Lourghi, Halima; Donadille, Bruno; Rodwell, Charlotte; Köhler, Sebastian; Seelow, Dominik; Jupp, Simon; Parkinson, Helen; Groza, Tudor; Brudno, Michael; Robinson, Peter N; Rath, Ana

    2018-02-07

    HIPBI-RD (Harmonising phenomics information for a better interoperability in the rare disease field) is a three-year project, started in 2016, funded via the E-Rare 3 ERA-NET program. This project builds on three resources largely adopted by the rare disease (RD) community: Orphanet, its ontology ORDO (the Orphanet Rare Disease Ontology), HPO (the Human Phenotype Ontology) as well as PhenoTips software for the capture and sharing of structured phenotypic data for RD patients. Our project is further supported by resources developed by the European Bioinformatics Institute and the Garvan Institute. HIPBI-RD aims to provide the community with an integrated, RD-specific bioinformatics ecosystem that will harmonise the way phenomics information is stored in databases and patient files worldwide, and thereby contribute to interoperability. This ecosystem will consist of a suite of tools and ontologies, optimized to work together, and made available through commonly used software repositories. The HIPBI-RD ecosystem will contribute to the interpretation of variants identified through exome and full genome sequencing by harmonising the way phenotypic information is collected, thus improving diagnostics and delineation of RD. The ultimate goal of HIPBI-RD is to provide a resource that will contribute to bridging genome-scale biology and a disease-centered view on human pathobiology. Copyright © 2018. Published by Elsevier Masson SAS.

  3. Processing biological literature with customizable Web services supporting interoperable formats

    PubMed Central

    Rak, Rafal; Batista-Navarro, Riza Theresa; Carter, Jacob; Rowley, Andrew; Ananiadou, Sophia

    2014-01-01

    Web services have become a popular means of interconnecting solutions for processing a body of scientific literature. This has fuelled research on high-level data exchange formats suitable for a given domain and ensuring the interoperability of Web services. In this article, we focus on the biological domain and consider four interoperability formats, BioC, BioNLP, XMI and RDF, that represent domain-specific and generic representations and include well-established as well as emerging specifications. We use the formats in the context of customizable Web services created in our Web-based, text-mining workbench Argo that features an ever-growing library of elementary analytics and capabilities to build and deploy Web services straight from a convenient graphical user interface. We demonstrate a 2-fold customization of Web services: by building task-specific processing pipelines from a repository of available analytics, and by configuring services to accept and produce a combination of input and output data interchange formats. We provide qualitative evaluation of the formats as well as quantitative evaluation of automatic analytics. The latter was carried out as part of our participation in the fourth edition of the BioCreative challenge. Our analytics built into Web services for recognizing biochemical concepts in BioC collections achieved the highest combined scores out of 10 participating teams. Database URL: http://argo.nactem.ac.uk. PMID:25006225

  4. Interoperability in planetary research for geospatial data analysis

    NASA Astrophysics Data System (ADS)

    Hare, Trent M.; Rossi, Angelo P.; Frigeri, Alessandro; Marmo, Chiara

    2018-01-01

    For more than a decade there has been a push in the planetary science community to support interoperable methods for accessing and working with geospatial data. Common geospatial data products for planetary research include image mosaics, digital elevation or terrain models, geologic maps, geographic location databases (e.g., craters, volcanoes) or any data that can be tied to the surface of a planetary body (including moons, comets or asteroids). Several U.S. and international cartographic research institutions have converged on mapping standards that embrace standardized geospatial image formats, geologic mapping conventions, U.S. Federal Geographic Data Committee (FGDC) cartographic and metadata standards, and notably on-line mapping services as defined by the Open Geospatial Consortium (OGC). The latter includes defined standards such as the OGC Web Map Services (simple image maps), Web Map Tile Services (cached image tiles), Web Feature Services (feature streaming), Web Coverage Services (rich scientific data streaming), and Catalog Services for the Web (data searching and discoverability). While these standards were developed for application to Earth-based data, they can be just as valuable for the planetary domain. Another initiative, called VESPA (Virtual European Solar and Planetary Access), will marry several of the above geoscience standards and astronomy-based standards as defined by the International Virtual Observatory Alliance (IVOA). This work outlines the current state of interoperability initiatives in use or in the process of being researched within the planetary geospatial community.
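    The OGC services named in this record are, at bottom, parameterised HTTP requests. As a minimal sketch of that pattern, the following Python composes a WMS 1.3.0 GetMap request URL; the endpoint and layer name are hypothetical placeholders, not services named in the record.

    ```python
    from urllib.parse import urlencode

    def build_wms_getmap_url(base_url, layer, bbox, width=512, height=512):
        """Compose an OGC WMS 1.3.0 GetMap request URL for a single layer."""
        params = {
            "SERVICE": "WMS",
            "VERSION": "1.3.0",
            "REQUEST": "GetMap",
            "LAYERS": layer,
            "CRS": "CRS:84",  # planetary services typically define body-specific CRSs instead
            "BBOX": ",".join(str(v) for v in bbox),
            "WIDTH": width,
            "HEIGHT": height,
            "FORMAT": "image/png",
        }
        return base_url + "?" + urlencode(params)

    # Hypothetical endpoint and layer, for illustration only.
    url = build_wms_getmap_url(
        "https://example.org/mars/wms", "mola_dem", (-180, -90, 180, 90)
    )
    print(url)
    ```

    Because the request is just a URL, any GIS client, browser, or script that understands the WMS key-value convention can fetch the same map, which is precisely the interoperability the standards aim for.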

  5. 32 CFR 148.1 - Interagency reciprocal acceptance.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ..., MILITARY AND CIVILIAN NATIONAL POLICY AND IMPLEMENTATION OF RECIPROCITY OF FACILITIES National Policy on Reciprocity of Use and Inspections of Facilities § 148.1 Interagency reciprocal acceptance . Interagency...

  6. 32 CFR 148.1 - Interagency reciprocal acceptance.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ..., MILITARY AND CIVILIAN NATIONAL POLICY AND IMPLEMENTATION OF RECIPROCITY OF FACILITIES National Policy on Reciprocity of Use and Inspections of Facilities § 148.1 Interagency reciprocal acceptance . Interagency...

  7. 32 CFR 148.1 - Interagency reciprocal acceptance.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., MILITARY AND CIVILIAN NATIONAL POLICY AND IMPLEMENTATION OF RECIPROCITY OF FACILITIES National Policy on Reciprocity of Use and Inspections of Facilities § 148.1 Interagency reciprocal acceptance . Interagency...

  8. Interagency Coordination Structures in Stabilization and Reconstruction Operations

    DTIC Science & Technology

    2010-06-11

    poor interagency coordination and lack of unity as significant problems that compromise USG efforts in Afghanistan. This lack of interagency...objectives into consolidated tactical level goals and initiatives. Lack of interagency coordination structures at this level causes a variety of...Iraqi Reconstruction noticed a similar series of issues in an audit of civil police training in Afghanistan and Iraq. The lack of coordination

  9. The US Arctic Observing Network - Mobilizing Interagency Observing Actions in an Era of Rapid Change

    NASA Astrophysics Data System (ADS)

    Starkweather, S.

    2017-12-01

    US agencies have long relied upon sustained Arctic observing to achieve their missions, be they in support of long-term monitoring, operationalized forecasts, or long-term process studies. One inventory of Arctic observing activities (arcticobservingviewer.org) suggests that there are more than 10,000 sustained data collection sites that have been supported by US agencies. Yet despite calls from academia (e.g. National Research Council, 2006) and agency leadership (e.g. IARPC, 2007) for more integrated approaches, such coherence, in the form of a US Arctic Observing Network (US AON), has been slow to emerge and ad hoc. Two approaches have been invoked in systematically creating networks of greater coherence. One involves solving the "backward problem": drawing existing observations into interoperable, multi-sensor, value-added data products. These approaches have the benefit that they build from existing assets and extend observations over greater time and space scales than individual efforts can approach. They suffer from being high-energy undertakings, often proceeding through voluntary efforts, and are limited by the observational assets already in place. Solving the "forward problem", or designing the network that is "needed", entails its own challenges of aligning multiple agency needs and capabilities into coordinated frameworks, often tied into a societal benefit structure. The solutions to the forward problem are greatly constrained by financial and technical feasibility. The benefit of such approaches is that interoperability and user needs are baked into the network design, and some critical prioritization has been invoked. In September 2016, NOAA and other US agencies advanced plans to formally establish and fund the coordination of a US AON initiative. 
This US AON initiative brings new coordination capabilities on-line to support and strengthen US engagement in sustained and coordinated pan-Arctic observing and data sharing systems that serve societal needs. This work describes the capabilities of the new US AON initiative and how those capabilities are being mobilized towards both the "backward" and "forward" problems of Arctic observing.

  10. A Lifecycle Approach to Brokered Data Management for Hydrologic Modeling Data Using Open Standards.

    NASA Astrophysics Data System (ADS)

    Blodgett, D. L.; Booth, N.; Kunicki, T.; Walker, J.

    2012-12-01

    The U.S. Geological Survey Center for Integrated Data Analytics has formalized an information-management architecture to facilitate hydrologic modeling and subsequent decision support throughout a project's lifecycle. The architecture is based on open standards and open source software to decrease the adoption barrier and to build on existing, community supported software. The components of this system have been developed and evaluated to support data management activities of the interagency Great Lakes Restoration Initiative, the Department of the Interior's Climate Science Centers and the WaterSmart National Water Census. Much of the research and development of this system has been in cooperation with international interoperability experiments conducted within the Open Geospatial Consortium. Community-developed standards and software, implemented to meet the unique requirements of specific disciplines, are used as a system of interoperable, discipline specific, data types and interfaces. This approach has allowed adoption of existing software that satisfies the majority of system requirements. Four major features of the system include: 1) assistance in model parameter and forcing creation from large enterprise data sources; 2) conversion of model results and calibrated parameters to standard formats, making them available via standard web services; 3) tracking a model's processes, inputs, and outputs as a cohesive metadata record, allowing provenance tracking via reference to web services; and 4) generalized decision support tools which rely on a suite of standard data types and interfaces, rather than particular manually curated model-derived datasets. Recent progress made in data and web service standards related to sensor and/or model derived station time series, dynamic web processing, and metadata management are central to this system's function and will be presented briefly along with a functional overview of the applications that make up the system. 
As the separate pieces of this system progress, they will be combined and generalized to form a sort of social network for nationally consistent hydrologic modeling.
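The third feature above, tracking a model's processes, inputs, and outputs as one cohesive metadata record that references web services, can be sketched as follows. The record structure, field names, and URLs are illustrative assumptions, not the actual USGS schema.

```python
# Illustrative sketch: a provenance record that ties a model run to the
# standard web services its inputs and outputs are published through.
# Field names and URLs are hypothetical, not the USGS CIDA schema.

def make_provenance_record(model_id, process_steps, inputs, outputs):
    """Bundle a model run's lineage into one metadata record.

    inputs/outputs are lists of (dataset_name, service_url) pairs, so
    provenance is tracked by *reference* to web services rather than by
    copying the data itself.
    """
    return {
        "model_id": model_id,
        "processes": list(process_steps),
        "inputs": [{"name": n, "service": u} for n, u in inputs],
        "outputs": [{"name": n, "service": u} for n, u in outputs],
    }

record = make_provenance_record(
    "prms-greatlakes-01",
    ["parameterization", "calibration", "simulation"],
    inputs=[("daily_precip", "https://example.gov/sos/precip")],
    outputs=[("streamflow", "https://example.gov/sos/streamflow")],
)
```

Because the record stores service references rather than data, a decision support tool can later re-fetch any input through its standard interface.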

  11. Benchmarking desktop and mobile handwriting across COTS devices: The e-BioSign biometric database

    PubMed Central

    Tolosana, Ruben; Vera-Rodriguez, Ruben; Fierrez, Julian; Morales, Aythami; Ortega-Garcia, Javier

    2017-01-01

    This paper describes the design, acquisition process, and baseline evaluation of the new e-BioSign database, which includes dynamic signature and handwriting information. Data were acquired from five different COTS devices: three Wacom devices (STU-500, STU-530 and DTU-1031) specifically designed to capture dynamic signatures and handwriting, and two general-purpose tablets (Samsung Galaxy Note 10.1 and Samsung ATIV 7). For the two Samsung tablets, data were collected using both a pen stylus and the finger in order to study the performance of signature verification in a mobile scenario. Data were collected in two sessions from 65 subjects and include dynamic information for the signature, the full name, and alphanumeric sequences. Skilled forgeries were also performed for signatures and full names. We also report a benchmark evaluation based on e-BioSign for person verification under three different real scenarios: 1) intra-device, 2) inter-device, and 3) mixed writing-tool. We exercised the proposed benchmark using the main existing approaches for signature verification: feature-based and time-functions-based. As a result, new insights into the problem of signature biometrics in sensor-interoperable scenarios have been obtained, namely the importance of specific methods for dealing with device interoperability and the need for a deeper analysis of signatures acquired using the finger as the writing tool. This public e-BioSign database allows the research community to: 1) further analyse and develop signature verification systems in realistic scenarios, and 2) work towards a better understanding of the nature of human handwriting when captured using electronic COTS devices in realistic conditions. PMID:28475590

  12. Benchmarking desktop and mobile handwriting across COTS devices: The e-BioSign biometric database.

    PubMed

    Tolosana, Ruben; Vera-Rodriguez, Ruben; Fierrez, Julian; Morales, Aythami; Ortega-Garcia, Javier

    2017-01-01

    This paper describes the design, acquisition process, and baseline evaluation of the new e-BioSign database, which includes dynamic signature and handwriting information. Data were acquired from five different COTS devices: three Wacom devices (STU-500, STU-530 and DTU-1031) specifically designed to capture dynamic signatures and handwriting, and two general-purpose tablets (Samsung Galaxy Note 10.1 and Samsung ATIV 7). For the two Samsung tablets, data were collected using both a pen stylus and the finger in order to study the performance of signature verification in a mobile scenario. Data were collected in two sessions from 65 subjects and include dynamic information for the signature, the full name, and alphanumeric sequences. Skilled forgeries were also performed for signatures and full names. We also report a benchmark evaluation based on e-BioSign for person verification under three different real scenarios: 1) intra-device, 2) inter-device, and 3) mixed writing-tool. We exercised the proposed benchmark using the main existing approaches for signature verification: feature-based and time-functions-based. As a result, new insights into the problem of signature biometrics in sensor-interoperable scenarios have been obtained, namely the importance of specific methods for dealing with device interoperability and the need for a deeper analysis of signatures acquired using the finger as the writing tool. This public e-BioSign database allows the research community to: 1) further analyse and develop signature verification systems in realistic scenarios, and 2) work towards a better understanding of the nature of human handwriting when captured using electronic COTS devices in realistic conditions.

  13. Research on Historic Bim of Built Heritage in Taiwan - a Case Study of Huangxi Academy

    NASA Astrophysics Data System (ADS)

    Lu, Y. C.; Shih, T. Y.; Yen, Y. N.

    2018-05-01

    Digital archiving technology for conserving cultural heritage is an important subject nowadays. The Taiwanese Ministry of Culture continues working to align conservation concepts and technology with international conventions. However, the products of these different technologies are not yet integrated, owing to a lack of research and development in this field; there is currently no effective HBIM schema for Taiwanese cultural heritage. The aim of this research is to establish an HBIM schema for Chinese built heritage in Taiwan. The proposed method starts from the perspective of the components of built heritage buildings and investigates the important properties of those components through major international charters and Taiwanese cultural heritage conservation laws. An object-oriented class diagram and an ontology were then defined at the component scale to clarify the concepts and increase interoperability. A historical database was established for the historical information of the components and brought into the BIM concept in order to build a 3D model of heritage objects that can be used for visualization. An integration platform was developed to let users browse and manipulate the database and the 3D model simultaneously. In addition, this research evaluated the feasibility of the method in a case study of the Huangxi Academy in Taiwan. The conclusions show that the class diagram helps with the establishment of the database and its application to different Chinese built heritage objects, and that the ontology helps convey knowledge and increase interoperability. Compared with traditional documentation methods, the platform's query results were more accurate and less prone to human error.

  14. BioHackathon series in 2011 and 2012: penetration of ontology and linked data in life science domains

    PubMed Central

    2014-01-01

    The application of semantic technologies to the integration of biological data and the interoperability of bioinformatics analysis and visualization tools has been the common theme of a series of annual BioHackathons hosted in Japan for the past five years. Here we provide a review of the activities and outcomes from the BioHackathons held in 2011 in Kyoto and 2012 in Toyama. In order to efficiently implement semantic technologies in the life sciences, participants formed various sub-groups and worked on the following topics: Resource Description Framework (RDF) models for specific domains, text mining of the literature, ontology development, essential metadata for biological databases, platforms to enable efficient Semantic Web technology development and interoperability, and the development of applications for Semantic Web data. In this review, we briefly introduce the themes covered by these sub-groups. The observations made, conclusions drawn, and software development projects that emerged from these activities are discussed. PMID:24495517

  15. A vital signs telemonitoring system - interoperability supported by a personal health record system and a cloud service.

    PubMed

    Gutiérrez, Miguel F; Cajiao, Alejandro; Hidalgo, José A; Cerón, Jesús D; López, Diego M; Quintero, Víctor M; Rendón, Alvaro

    2014-01-01

    This article presents the development process of an acquisition and data storage system that manages clinical variables through a cloud storage service and a Personal Health Record (PHR) system. First, the paper explains the design of a Wireless Body Area Network (WBAN) that captures data from two sensors measuring arterial pressure and heart rate. Second, it illustrates how the data collected by the WBAN are transmitted to a cloud storage service, which stores them persistently in an online database system. Finally, the paper describes how the data stored in the cloud service are sent to the Indivo PHR system, where they are registered and charted for later review by health professionals. The research demonstrated the feasibility of implementing WBAN networks for the acquisition of clinical data, and in particular of using Web technologies and standards to provide interoperability with PHR systems at the technical and syntactic levels.
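The data path described above, sensor readings packaged into a syntactically interoperable document before being pushed to the cloud and on to the PHR, can be sketched as below. The field names and payload layout are illustrative assumptions, not the actual Indivo or cloud-service schema.

```python
# Minimal sketch of the WBAN-to-cloud step: one sample from the two
# sensors (arterial pressure, heart rate) is serialized as JSON so the
# cloud service and PHR can consume it at the syntactic level.
# Field names are hypothetical, not the schema used in the paper.
import json
from datetime import datetime, timezone

def vitals_payload(patient_id, systolic, diastolic, heart_rate):
    """Package one WBAN sample as a JSON document for cloud storage."""
    return json.dumps({
        "patient_id": patient_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "arterial_pressure": {"systolic": systolic, "diastolic": diastolic,
                              "unit": "mmHg"},
        "heart_rate": {"value": heart_rate, "unit": "bpm"},
    })

doc = json.loads(vitals_payload("p-001", 120, 80, 72))
# In the real system this document would be POSTed to the cloud service
# and later forwarded to the PHR for charting.
```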

  16. Multi-disciplinary interoperability challenges (Ian McHarg Medal Lecture)

    NASA Astrophysics Data System (ADS)

    Annoni, Alessandro

    2013-04-01

    Global sustainability research requires multi-disciplinary efforts to address the key research challenges and increase our understanding of the complex relationships between environment and society. For this reason, dependence on the interoperability of ICT systems is growing rapidly but, despite some relevant technological improvements, operational interoperable solutions are still lacking in practice. Among the causes is the absence of a generally accepted definition of "interoperability" in all its broader aspects: interoperability is just a concept, and the more popular definitions do not address all the challenges of realizing operational interoperable solutions. The problem becomes even more complex when multi-disciplinary interoperability is required, because in that case solutions for the interoperability of different interoperable solutions must be envisaged. In this lecture the following definition will be used: "interoperability is the ability to exchange information and to use it". The main challenges in addressing multi-disciplinary interoperability will be presented, and a set of proposed approaches and solutions briefly introduced.

  17. A Story of a Crashed Plane in US-Mexican border

    NASA Astrophysics Data System (ADS)

    Bermudez, Luis; Hobona, Gobe; Vretanos, Peter; Peterson, Perry

    2013-04-01

    A plane has crashed on the US-Mexican border. The search and rescue command center planner needs to find information about the crash site, a mountain; nearby mountains for the establishment of a communications tower; and ranches for setting up a local incident center. Events like this occur all over the world, and exchanging information seamlessly is key to saving lives and preventing further disasters. This abstract describes an interoperability testbed that applied this scenario using technologies based on Open Geospatial Consortium (OGC) standards. The OGC, which has about 500 members, serves as a global forum for collaboration among developers and users of spatial data products and services, and advances the development of international standards for geospatial interoperability. The OGC Interoperability Program conducts international interoperability testbeds, such as OGC Web Services Phase 9 (OWS-9), that encourage rapid development, testing, validation, demonstration and adoption of open, consensus-based standards and best practices. The Cross-Community Interoperability (CCI) thread in OWS-9 advanced the Web Feature Service for Gazetteers (WFS-G) by providing a Single Point of Entry Global Gazetteer (SPEGG), where a user can submit a single query and access global geographic names data across multiple Federal names databases. Currently, users must make two queries with differing input parameters against two separate databases to obtain authoritative cross-border geographic names data. The gazetteers in this scenario were GNIS and GNS. GNIS, the Geographic Names Information System, is managed by the USGS; it was first developed in 1964 and contains information about domestic and Antarctic names. GNS, the GeoNET Names Server, provides the Geographic Names Data Base (GNDB) and is managed by the National Geospatial-Intelligence Agency (NGA). 
GNS has been in service since 1994 and serves names for areas outside the United States and its dependent areas, as well as names for undersea features. The following challenges were advanced: cascaded WFS-G servers (allowing a "parent" WFS to query multiple WFSs), query name filters (e.g., fuzzy search and text search), handling of multilingualism and diacritics, advanced spatial constraints (e.g., radial search and nearest neighbor), and semantically mediated feature types (e.g., mountain vs. hill). To enable semantic mediation, a series of semantic mappings was defined between the NGA GNS, the USGS GNIS, and the Alexandria Digital Library (ADL) Gazetteer. The mappings were encoded in the Web Ontology Language (OWL) so that they could be used by semantic web technologies. The semantic mappings were then published for ingestion into a semantic mediator that used them to associate location types from one gazetteer with location types in another. The semantic mediator was then able to transform requests on the fly, providing a single-point-of-entry WFS-G to multiple gazetteers. The presentation will include a live demonstration of the work performed, highlight the main developments, and discuss future development.
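The semantic mediation step, rewriting one query's feature types for each backend gazetteer via published mappings, can be sketched as follows. The mapping table is a toy illustration, not the actual OWL mappings produced in OWS-9.

```python
# Sketch of semantic mediation between gazetteers: a table of
# location-type mappings lets a single-point-of-entry service rewrite a
# query on the fly for each backend. The terms below are hypothetical
# stand-ins for the OWS-9 OWL mappings.

# feature-type mappings: (source_vocab, term) -> {target_vocab: term}
MAPPINGS = {
    ("GNIS", "Summit"): {"GNS": "MT", "ADL": "mountains"},
    ("GNIS", "Ridge"):  {"GNS": "RDGE", "ADL": "ridges"},
}

def mediate(feature_type, source, target):
    """Translate a feature type from one gazetteer's vocabulary to another's."""
    entry = MAPPINGS.get((source, feature_type))
    if entry is None or target not in entry:
        raise KeyError(f"no mapping from {source}:{feature_type} to {target}")
    return entry[target]

# A single query for "Summit" can now be forwarded to the GNS backend:
gns_type = mediate("Summit", "GNIS", "GNS")
```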

  18. Development of Web-based Distributed Cooperative Development Environment of Sign-Language Animation System and its Evaluation

    NASA Astrophysics Data System (ADS)

    Yuizono, Takaya; Hara, Kousuke; Nakayama, Shigeru

    A web-based distributed cooperative development environment for a sign-language animation system has been developed. It extends our previous animation system, which was constructed as a three-tiered system consisting of a sign-language animation interface layer, a sign-language data processing layer, and a sign-language animation database. Two components, a web client using a VRML plug-in and a web servlet, have been added to the previous system. The system supports a humanoid-model avatar for interoperability and can use stored sign-language animation data shared on the database. The evaluation showed that the web client's inverse kinematics function improves the authoring of sign-language animations.

  19. Military Interoperable Digital Hospital Testbed (MIDHT)

    DTIC Science & Technology

    2011-10-01


  20. Tool and data interoperability in the SSE system

    NASA Technical Reports Server (NTRS)

    Shotton, Chuck

    1988-01-01

    Tool and data interoperability in the Software Support Environment (SSE) is presented in viewgraph form, covering industry problems, SSE system interoperability issues, SSE solutions to tool and data interoperability, and the attainment of heterogeneous tool/data interoperability.

  1. The 2006 Cape Canaveral Air Force Station Range Reference Atmosphere Model Validation Study and Sensitivity Analysis to the National Aeronautics and Space Administration's Space Shuttle

    NASA Technical Reports Server (NTRS)

    Decker, Ryan K.; Burns, Lee; Merry, Carl; Harrington, Brian

    2008-01-01

    Atmospheric parameters are essential in assessing the flight performance of aerospace vehicles. The effects of the Earth's atmosphere on aerospace vehicles influence various aspects of the vehicle during ascent, ranging from its flight trajectory to the structural dynamics and aerodynamic heating on the vehicle. Atmospheric databases characterizing the wind and thermodynamic environments, known as Range Reference Atmospheres (RRA), have been developed at space launch ranges by a governmental interagency working group for use by aerospace vehicle programs. The National Aeronautics and Space Administration's (NASA) Space Shuttle Program (SSP), which launches from Kennedy Space Center, utilizes atmospheric statistics derived from the Cape Canaveral Air Force Station Range Reference Atmosphere (CCAFS RRA) database to evaluate environmental constraints on various aspects of the vehicle during ascent.

  2. UHF (Ultra High Frequency) Military Satellite Communications Ground Equipment Interoperability.

    DTIC Science & Technology

    1986-10-06

    crisis management requires interoperability between various services. These short-term crises often arise from unforeseen circumstances in which...Scheduler Qualcomm has prepared an interoperability study for the JTC3A (Reference 15) as a TA/CE for USCINCLANT ROC 5-84 requirements. It has defined a...interoperability is fundamental. A number of operational crises have occurred where interoperable communications or the lack of interoperable

  3. Towards G2G: Systems of Technology Database Systems

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; Bell, David

    2005-01-01

    We present an approach and methodology for developing Government-to-Government (G2G) Systems of Technology Database Systems. G2G will deliver technologies for distributed and remote integration of technology data for internal use in analysis and planning as well as for external communications. G2G enables NASA managers, engineers, operational teams and information systems to "compose" technology roadmaps and plans by selecting, combining, extending, specializing and modifying components of technology database systems. G2G will interoperate information and knowledge distributed across the organizational entities involved, which is ideal for NASA's future Exploration Enterprise. A key contribution of the G2G system will be an integrated approach that sustains effective management of technology investments while allowing the various technology database systems to be independently managed. The integration technology will comply with emerging open standards, so applications can be customized for local needs while enabling an integrated management-of-technology approach that serves the global needs of NASA. The G2G capabilities will build on NASA's breakthroughs in database "composition" and integration technology, will use and advance emerging open standards, and will use commercial information technologies to enable effective Systems of Technology Database systems.

  4. NCBI2RDF: enabling full RDF-based access to NCBI databases.

    PubMed

    Anguita, Alberto; García-Remesal, Miguel; de la Iglesia, Diana; Maojo, Victor

    2013-01-01

    RDF has become the standard technology for enabling interoperability among heterogeneous biomedical databases. The NCBI provides access to a large set of life sciences databases through a common interface called Entrez. However, the latter does not provide RDF-based access to such databases, and, therefore, they cannot be integrated with other RDF-compliant databases and accessed via SPARQL query interfaces. This paper presents the NCBI2RDF system, aimed at providing RDF-based access to the complete NCBI data repository. This API creates a virtual endpoint for servicing SPARQL queries over different NCBI repositories and presenting the query results to users in SPARQL results format, thus enabling this data to be integrated and/or stored with other RDF-compliant repositories. SPARQL queries are dynamically resolved, decomposed, and forwarded to the NCBI-provided E-utilities programmatic interface to access the NCBI data. Furthermore, we show how our approach increases the expressiveness of the native NCBI querying system, allowing several databases to be accessed simultaneously. This feature significantly boosts productivity when working with complex queries and saves time and effort for biomedical researchers. Our approach has been validated with a large number of SPARQL queries, thus proving its reliability and enhanced capabilities in biomedical environments.
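The forwarding step described above, translating a decomposed query fragment into a call against NCBI's E-utilities, can be sketched as below. The `esearch.fcgi` endpoint and its `db`/`term`/`retmode` parameters are the real E-utilities interface; the single-fragment translation shown here is a simplified stand-in for NCBI2RDF's internal decomposition logic.

```python
# Sketch: resolve one decomposed query fragment by building an
# E-utilities esearch request for the target NCBI database.
from urllib.parse import urlencode

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils"

def esearch_url(database, term, retmax=20):
    """Build an E-utilities esearch request URL for one NCBI database."""
    query = urlencode({"db": database, "term": term,
                       "retmax": retmax, "retmode": "json"})
    return f"{EUTILS}/esearch.fcgi?{query}"

# One triple pattern over PubMed might be resolved as:
url = esearch_url("pubmed", "interoperability[Title]")
```

A mediator built this way can fan the same SPARQL query out to several databases at once simply by varying the `db` parameter and merging the results.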

  5. Towards technical interoperability in telemedicine.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Craft, Richard Layne, II

    2004-05-01

    For telemedicine to realize the vision of anywhere, anytime access to care, the question of how to create a fully interoperable technical infrastructure must be addressed. After briefly discussing how 'technical interoperability' compares with other types of interoperability being addressed in the telemedicine community today, this paper describes reasons for pursuing technical interoperability, presents a proposed framework for realizing technical interoperability, identifies key issues that will need to be addressed if technical interoperability is to be achieved, and suggests a course of action that the telemedicine community might follow to accomplish this goal.

  6. Interagency Cooperation for Irregular Warfare at the Combatant Command

    DTIC Science & Technology

    2009-01-01

    Directorate, and the USSOCOM Interagency Task Force ( IATF ) offer examples of JIACGs coping with the issues of IW. Each organization possesses strengths...46 USSOCOM IATF History...Force for Irregular Warfare ( IATF IW).33 EUCOM conducts interagency

  7. The interoperability force in the ERP field

    NASA Astrophysics Data System (ADS)

    Boza, Andrés; Cuenca, Llanos; Poler, Raúl; Michaelides, Zenon

    2015-04-01

    Enterprise resource planning (ERP) systems participate in interoperability projects, and this participation sometimes leads to new proposals for the ERP field. The aim of this paper is to identify the role that interoperability plays in the evolution of ERP systems. To this end, ERP systems were first identified within interoperability frameworks. Second, initiatives in the ERP field driven by interoperability requirements were identified from two perspectives: technological and business. The ERP field is evolving from classical ERP systems acting as information system integrators to a new generation of fully interoperable ERP. Interoperability is changing the way business is run, and ERP systems are changing to adapt to the current stream of interoperability.

  8. Challenges to the Standardization of Burn Data Collection: A Call for Common Data Elements for Burn Care.

    PubMed

    Schneider, Jeffrey C; Chen, Liang; Simko, Laura C; Warren, Katherine N; Nguyen, Brian Phu; Thorpe, Catherine R; Jeng, James C; Hickerson, William L; Kazis, Lewis E; Ryan, Colleen M

    2018-02-20

    The use of common data elements (CDEs) is growing in medical research; CDEs have demonstrated benefit in maximizing the impact of existing research infrastructure and funding. However, the field of burn care does not have a standard set of CDEs. The objective of this study is to examine the extent of common data collected in current burn databases. This study examines the data dictionaries of six U.S. burn databases to ascertain the extent of common data. This was assessed from a quantitative and qualitative perspective. Thirty-two demographic and clinical data elements were examined. The number of databases that collect each data element was calculated. The data values for each data element were compared across the six databases for common terminology. Finally, the data prompts of the data elements were examined for common language and structure. Five (16%) of the 32 data elements are collected by all six burn databases; additionally, five data elements (16%) are present in only one database. Furthermore, there are considerable variations in data values and prompts used among the burn databases. Only one of the 32 data elements (age) contains the same data values across all databases. The burn databases examined show minimal evidence of common data. There is a need to develop CDEs and standardized coding to enhance interoperability of burn databases.
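The study's core tabulation, how many candidate data elements are collected by all databases versus by only one, reduces to counting each element's occurrences across the data dictionaries. The dictionaries below are toy stand-ins, not the six actual burn registries.

```python
# Sketch of the quantitative comparison: each database's data dictionary
# is a set of element names; we count how many databases collect each
# element. The example dictionaries are illustrative only.

dictionaries = {
    "db_a": {"age", "sex", "tbsa", "inhalation_injury"},
    "db_b": {"age", "sex", "tbsa"},
    "db_c": {"age", "tbsa", "ethnicity"},
}

def element_counts(dicts):
    """Count, for every data element, how many databases collect it."""
    counts = {}
    for elements in dicts.values():
        for e in elements:
            counts[e] = counts.get(e, 0) + 1
    return counts

counts = element_counts(dictionaries)
common = {e for e, c in counts.items() if c == len(dictionaries)}  # in all
unique = {e for e, c in counts.items() if c == 1}                  # in one
# common == {"age", "tbsa"}; unique == {"inhalation_injury", "ethnicity"}
```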

  9. 48 CFR 242.002 - Interagency agreements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Interagency agreements. 242.002 Section 242.002 Federal Acquisition Regulations System DEFENSE ACQUISITION REGULATIONS SYSTEM, DEPARTMENT OF DEFENSE CONTRACT MANAGEMENT CONTRACT ADMINISTRATION AND AUDIT SERVICES 242.002 Interagency...

  10. Application of Coalition Battle Management Language (C-BML) and C-BML Services to Live, Virtual, and Constructive (LVC) Simulation Environments

    DTIC Science & Technology

    2011-12-01

    Task Based Approach to Planning.” Paper 08F- SIW -033. In Proceed- ings of the Fall Simulation Interoperability Workshop. Simulation Interoperability...Paper 06F- SIW -003. In Proceed- 2597 Blais ings of the Fall Simulation Interoperability Workshop. Simulation Interoperability Standards Organi...MSDL).” Paper 10S- SIW -003. In Proceedings of the Spring Simulation Interoperability Workshop. Simulation Interoperability Standards Organization

  11. Maturity model for enterprise interoperability

    NASA Astrophysics Data System (ADS)

    Guédria, Wided; Naudet, Yannick; Chen, David

    2015-01-01

    Historically, progress occurs when entities communicate, share information and together create something that no one individually could do alone. Moving beyond people to machines and systems, interoperability is becoming a key factor of success in all domains. In particular, interoperability has become a challenge for enterprises, to exploit market opportunities, to meet their own objectives of cooperation or simply to survive in a growing competitive world where the networked enterprise is becoming a standard. Within this context, many research works have been conducted over the past few years and enterprise interoperability has become an important area of research, ensuring the competitiveness and growth of European enterprises. Among others, enterprises have to control their interoperability strategy and enhance their ability to interoperate. This is the purpose of the interoperability assessment. Assessing interoperability maturity allows a company to know its strengths and weaknesses in terms of interoperability with its current and potential partners, and to prioritise actions for improvement. The objective of this paper is to define a maturity model for enterprise interoperability that takes into account existing maturity models while extending the coverage of the interoperability domain. The assessment methodology is also presented. Both are demonstrated with a real case study.
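A maturity assessment of the kind described above can be sketched as scoring each interoperability concern and letting the weakest one bound the overall level, so improvement actions target the lowest-scoring areas first. The concern names, level labels, and 0-4 scale below are illustrative assumptions, not the model's actual definitions.

```python
# Sketch of an interoperability maturity assessment: the enterprise's
# overall level is limited by its weakest concern, which also identifies
# where to prioritise improvement. Labels and scale are hypothetical.

LEVELS = ["Unprepared", "Defined", "Aligned", "Organized", "Adapted"]

def maturity(scores):
    """Return (overall level name, weakest concerns) for 0-4 scores."""
    level = min(scores.values())
    weakest = [c for c, s in scores.items() if s == level]
    return LEVELS[level], weakest

level, weakest = maturity({"business": 3, "process": 2,
                           "service": 2, "data": 1})
# -> overall level "Defined", with "data" as the concern to improve first
```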

  12. EMPACT: THE LAS VEGAS INTERAGENCY PILOT PROGRAM

    EPA Science Inventory

    EMPACT: The Las Vegas Interagency Pilot Project

    The Las Vegas Interagency Pilot Project of the EMPACT program has involved eleven efforts. These efforts are described in brief on the poster presentation. They include: Las Vegas Environmental Monitoring Inventory, the Qual...

  13. 48 CFR 51.201 - Policy.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... GOVERNMENT SOURCES BY CONTRACTORS Contractor Use of Interagency Fleet Management System (IFMS) 51.201 Policy... contractors to obtain, for official purposes only, interagency fleet management system (IFMS) vehicles and... instance. (c) Government contractors shall not be authorized to obtain interagency fleet management system...

  14. Evaluation of the Inhalation Carcinogenicity of Ethylene Oxide ...

    EPA Pesticide Factsheets

    In December 2016, EPA finalized its Evaluation of the Inhalation Carcinogenicity of Ethylene Oxide. EPA's evaluation was reviewed internally by EPA and by other federal agencies and White House Offices in October 2016, before public release. Consistent with the May 2009 IRIS assessment development process, all written comments on IRIS assessments submitted by other federal agencies and White House Offices are made publicly available. Accordingly, interagency comments and the interagency science discussion materials provided to other agencies, including interagency review drafts of the EPA's Evaluation of the Inhalation Carcinogenicity of Ethylene Oxide, are posted on this site. Note: No major science comments were received on the Interagency Science Discussion Draft.

  15. 7 CFR 201.75 - Interagency certification.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ..., Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE (CONTINUED) FEDERAL SEED ACT FEDERAL SEED ACT REGULATIONS Certified Seed § 201.75 Interagency certification. Interagency certification may be accomplished... certify a lot of seed. (a) The certifying agency issuing labels for all classes of certified seed shall...

  16. 7 CFR 201.75 - Interagency certification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ..., Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE (CONTINUED) FEDERAL SEED ACT FEDERAL SEED ACT REGULATIONS Certified Seed § 201.75 Interagency certification. Interagency certification may be accomplished... certify a lot of seed. (a) The certifying agency issuing labels for all classes of certified seed shall...

  17. 7 CFR 201.75 - Interagency certification.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ..., Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE (CONTINUED) FEDERAL SEED ACT FEDERAL SEED ACT REGULATIONS Certified Seed § 201.75 Interagency certification. Interagency certification may be accomplished... certify a lot of seed. (a) The certifying agency issuing labels for all classes of certified seed shall...

  18. 7 CFR 201.75 - Interagency certification.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ..., Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE (CONTINUED) FEDERAL SEED ACT FEDERAL SEED ACT REGULATIONS Certified Seed § 201.75 Interagency certification. Interagency certification may be accomplished... certify a lot of seed. (a) The certifying agency issuing labels for all classes of certified seed shall...

  19. 7 CFR 201.75 - Interagency certification.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ..., Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE (CONTINUED) FEDERAL SEED ACT FEDERAL SEED ACT REGULATIONS Certified Seed § 201.75 Interagency certification. Interagency certification may be accomplished... certify a lot of seed. (a) The certifying agency issuing labels for all classes of certified seed shall...

  20. 76 FR 67747 - Interagency Autism Coordinating Committee; Call for Nominations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-02

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Interagency Autism Coordinating Committee; Call for Nominations In accordance with Public Law 112-32, The Combating Autism... Interagency Autism Coordinating Committee (IACC) until September 30, 2014 and is seeking nominations for...

  1. 41 CFR 101-39.300 - General.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... FEDERAL PROPERTY MANAGEMENT REGULATIONS AVIATION, TRANSPORTATION, AND MOTOR VEHICLES 39-INTERAGENCY FLEET MANAGEMENT SYSTEMS 39.3-Use and Care of GSA Interagency Fleet Management System Vehicles § 101-39.300 General. (a) The objective of the General Services Administration (GSA) Interagency Fleet Management System...

  2. 41 CFR 101-39.302 - Rotation.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... FEDERAL PROPERTY MANAGEMENT REGULATIONS AVIATION, TRANSPORTATION, AND MOTOR VEHICLES 39-INTERAGENCY FLEET MANAGEMENT SYSTEMS 39.3-Use and Care of GSA Interagency Fleet Management System Vehicles § 101-39.302 Rotation. GSA Interagency Fleet Management System (IFMS) vehicles on high mileage assignments may be...

  3. 48 CFR 41.206 - Interagency agreements.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 1 2011-10-01 2011-10-01 false Interagency agreements. 41... agreements. Agencies shall use interagency agreements (e.g., consolidated purchase, joint use, or cross-service agreements) when acquiring utility service or facilities from other Government agencies and shall...

  4. 48 CFR 41.206 - Interagency agreements.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 1 2014-10-01 2014-10-01 false Interagency agreements. 41... agreements. Agencies shall use interagency agreements (e.g., consolidated purchase, joint use, or cross-service agreements) when acquiring utility service or facilities from other Government agencies and shall...

  5. 48 CFR 41.206 - Interagency agreements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Interagency agreements. 41... agreements. Agencies shall use interagency agreements (e.g., consolidated purchase, joint use, or cross-service agreements) when acquiring utility service or facilities from other Government agencies and shall...

  6. 77 FR 64347 - Notice of Diabetes Mellitus Interagency Coordinating Committee Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-19

    ... November 15, 2012, DMICC meeting will focus on ``Federal Initiatives To Address Gestational Diabetes... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Notice of Diabetes Mellitus Interagency Coordinating Committee Meeting SUMMARY: The Diabetes Mellitus Interagency Coordinating Committee...

  7. Common Data Model for Neuroscience Data and Data Model Exchange

    PubMed Central

    Gardner, Daniel; Knuth, Kevin H.; Abato, Michael; Erde, Steven M.; White, Thomas; DeBellis, Robert; Gardner, Esther P.

    2001-01-01

Objective: Generalizing the data models underlying two prototype neurophysiology databases, the authors describe and propose the Common Data Model (CDM) as a framework for federating a broad spectrum of disparate neuroscience information resources. Design: Each component of the CDM derives from one of five superclasses—data, site, method, model, and reference—or from relations defined between them. A hierarchic attribute-value scheme for metadata enables interoperability with variable tree depth to serve specific intra- or broad inter-domain queries. To mediate data exchange between disparate systems, the authors propose a set of XML-derived schema for describing not only data sets but data models. These include biophysical description markup language (BDML), which mediates interoperability between data resources by providing a meta-description for the CDM. Results: The set of superclasses potentially spans the data needs of contemporary neuroscience. Data elements abstracted from neurophysiology time series and histogram data represent data sets that differ in dimension and concordance. Site elements transcend neurons to describe subcellular compartments, circuits, regions, or slices; non-neuroanatomic sites range from sequences to patients. Methods and models are highly domain-dependent. Conclusions: True federation of data resources requires explicit public description, in a metalanguage, of the contents, query methods, data formats, and data models of each data resource. Any data model that can be derived from the defined superclasses is potentially conformant, and interoperability can be enabled by recognition of BDML-described compatibilities. Such metadescriptions can buffer technologic changes. PMID:11141510
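The hierarchic attribute-value scheme described above can be sketched as a nested tree whose paths are queried at variable depth: a short prefix serves broad inter-domain queries, a full path serves a specific intra-domain one. All names and values below are illustrative placeholders, not actual CDM or BDML vocabulary.

```python
# Minimal sketch of a hierarchic attribute-value metadata tree
# (illustrative names only, not real CDM/BDML terms).

def flatten(tree, prefix=()):
    """Yield (path, value) pairs from a nested attribute-value tree."""
    for key, val in tree.items():
        path = prefix + (key,)
        if isinstance(val, dict):
            yield from flatten(val, path)
        else:
            yield path, val

def query(tree, path_prefix):
    """Return all values whose attribute path starts with path_prefix."""
    n = len(path_prefix)
    return {p: v for p, v in flatten(tree) if p[:n] == tuple(path_prefix)}

# Hypothetical metadata for one neurophysiology recording site.
record = {
    "site": {"region": "S1", "compartment": "soma"},
    "method": {"recording": {"type": "patch clamp", "mode": "whole-cell"}},
}

# Broad inter-domain query: everything under "method".
print(query(record, ["method"]))
# Specific intra-domain query: only the recording mode.
print(query(record, ["method", "recording", "mode"]))
```

The variable tree depth is what lets one scheme serve both query styles: shallow prefixes ignore domain-specific detail, deep paths exploit it.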

  8. Solutions for data integration in functional genomics: a critical assessment and case study.

    PubMed

    Smedley, Damian; Swertz, Morris A; Wolstencroft, Katy; Proctor, Glenn; Zouberakis, Michael; Bard, Jonathan; Hancock, John M; Schofield, Paul

    2008-11-01

The torrent of data emerging from the application of new technologies to functional genomics and systems biology can no longer be contained within the traditional modes of data sharing and publication, with the consequence that data is being deposited in, distributed across and disseminated through an increasing number of databases. The resulting fragmentation poses serious problems for the model organism community, which increasingly relies on data mining and computational approaches that require gathering of data from a range of sources. In the light of these problems, the European Commission has funded a coordination action, CASIMIR (coordination and sustainability of international mouse informatics resources), with a remit to assess the technical and social aspects of database interoperability that currently prevent the full realization of the potential of data integration in mouse functional genomics. In this article, we assess the current problems with interoperability, with particular reference to mouse functional genomics, and critically review the technologies that can be deployed to overcome them. We describe a typical use-case in which an investigator wishes to gather data on variation, genomic context and metabolic pathway involvement for genes discovered in a genome-wide screen. We go on to develop an automated approach involving an in silico experimental workflow tool, Taverna, using web services, BioMart and MOLGENIS technologies for data retrieval. Finally, we focus on the current impediments to adopting such an approach in a wider context, and strategies to overcome them.
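The use-case's workflow pattern, fanning each gene out to several data sources and merging the answers, can be sketched with the web-service calls stubbed out as local functions. Every function name and data value below is a hypothetical stand-in, not a real BioMart or MOLGENIS response.

```python
# Sketch of a Taverna-style data-flow: each gene is sent to every
# source, and the per-source records are merged (all names and data
# here are illustrative placeholders).

def variation_source(gene):
    return {"gene": gene, "snps": 3}

def context_source(gene):
    return {"gene": gene, "chromosome": "11"}

def pathway_source(gene):
    return {"gene": gene, "pathway": "glycolysis"}

def workflow(genes, sources):
    """Fan each gene out to every source and merge the results."""
    results = {}
    for gene in genes:
        merged = {}
        for source in sources:
            merged.update(source(gene))
        results[gene] = merged
    return results

hits = workflow(["Hk1"], [variation_source, context_source, pathway_source])
print(hits["Hk1"])
```

In the real workflow each stub would be a web-service call, but the merge-per-gene shape of the pipeline is the same.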

  9. A step-by-step methodology for enterprise interoperability projects

    NASA Astrophysics Data System (ADS)

    Chalmeta, Ricardo; Pazos, Verónica

    2015-05-01

Enterprise interoperability is one of the key factors for enhancing enterprise competitiveness. Achieving enterprise interoperability is an extremely complex process which involves different technological, human and organisational elements. In this paper we present a framework to help achieve enterprise interoperability. The framework has been developed taking into account the three domains of interoperability: Enterprise Modelling, Architecture and Platform, and Ontologies. The main novelty of the framework in comparison to existing ones is that it includes a step-by-step methodology that explains how to carry out an enterprise interoperability project taking into account different interoperability views, such as business, process, human resources, technology, knowledge and semantics.

  10. 75 FR 71792 - Federal Interagency Committee on Emergency Medical Services Meeting Notice

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-24

    ..., Directorate of Emergency Preparedness and Response of the Department of Homeland Security, to provide.... NHTSA-2010-0156] Federal Interagency Committee on Emergency Medical Services Meeting Notice AGENCY... Committee on Emergency Medical Services. SUMMARY: NHTSA announces a meeting of the Federal Interagency...

  11. 10 CFR 473.23 - Interagency review panel.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 3 2013-01-01 2013-01-01 false Interagency review panel. 473.23 Section 473.23 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION AUTOMOTIVE PROPULSION RESEARCH AND DEVELOPMENT Review and Certification of Grants, Cooperative Agreements, Contracts, and Projects § 473.23 Interagency review panel. (a...

  12. 10 CFR 473.23 - Interagency review panel.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 3 2012-01-01 2012-01-01 false Interagency review panel. 473.23 Section 473.23 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION AUTOMOTIVE PROPULSION RESEARCH AND DEVELOPMENT Review and Certification of Grants, Cooperative Agreements, Contracts, and Projects § 473.23 Interagency review panel. (a...

  13. 10 CFR 473.23 - Interagency review panel.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 3 2011-01-01 2011-01-01 false Interagency review panel. 473.23 Section 473.23 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION AUTOMOTIVE PROPULSION RESEARCH AND DEVELOPMENT Review and Certification of Grants, Cooperative Agreements, Contracts, and Projects § 473.23 Interagency review panel. (a...

  14. 10 CFR 473.23 - Interagency review panel.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 3 2010-01-01 2010-01-01 false Interagency review panel. 473.23 Section 473.23 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION AUTOMOTIVE PROPULSION RESEARCH AND DEVELOPMENT Review and Certification of Grants, Cooperative Agreements, Contracts, and Projects § 473.23 Interagency review panel. (a...

  15. 10 CFR 473.23 - Interagency review panel.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 3 2014-01-01 2014-01-01 false Interagency review panel. 473.23 Section 473.23 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION AUTOMOTIVE PROPULSION RESEARCH AND DEVELOPMENT Review and Certification of Grants, Cooperative Agreements, Contracts, and Projects § 473.23 Interagency review panel. (a...

  16. 41 CFR 101-39.202 - Contractor authorized services.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... VEHICLES 39-INTERAGENCY FLEET MANAGEMENT SYSTEMS 39.2-GSA Interagency Fleet Management System Services... related GSA Interagency Fleet Management System (IFMS) services solely for official purposes. (b) To the... -leased equipment which is not controlled by a GSA IFMS fleet management center, or for authorized...

  17. 78 FR 12334 - Proposed Collection; Comment Request: Federal Interagency Traumatic Brain Injury Research (FITBIR...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-22

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Proposed Collection; Comment Request: Federal Interagency Traumatic Brain Injury Research (FITBIR) Informatics System Data Access...-days of the date of this publication. Proposed Collection: Federal Interagency Traumatic Brain Injury...

  18. Perspectives on Interagency Collaboration.

    ERIC Educational Resources Information Center

    Western States Technical Assistance Resource, Monmouth, OR.

    Seven papers are presented from a 1982 conference on interagency collaboration in special education. Participants at the conference represented a variety of disciplines. B. McNulty and E. Soper cite 14 critical elements of successful interagency practice in chapter I, including communication, conflict resolution strategies, commitment to an…

  19. Meaningful use of health information technology and declines in in-hospital adverse drug events.

    PubMed

    Furukawa, Michael F; Spector, William D; Rhona Limcangco, M; Encinosa, William E

    2017-07-01

Nationwide initiatives have promoted greater adoption of health information technology as a means to reduce adverse drug events (ADEs). Hospital adoption of electronic health records with Meaningful Use (MU) capabilities expected to improve medication safety has grown rapidly. However, evidence that MU capabilities are associated with declines in in-hospital ADEs is lacking. Data came from the 2010-2013 Medicare Patient Safety Monitoring System and the 2008-2013 Healthcare Information and Management Systems Society (HIMSS) Analytics Database. Two-level random intercept logistic regression was used to estimate the association of MU capabilities and occurrence of ADEs, adjusting for patient characteristics, hospital characteristics, and year of observation. Rates of in-hospital ADEs declined by 19% from 2010 to 2013. Adoption of MU capabilities was associated with 11% lower odds of an ADE (95% confidence interval [CI], 0.84-0.96). Interoperability capability was associated with 19% lower odds of an ADE (95% CI, 0.67-0.98). Adoption of MU capabilities explained 22% of the observed reduction in ADEs, or 67,000 fewer ADEs averted by MU. Concurrent with the rapid uptake of MU and interoperability, occurrence of in-hospital ADEs declined significantly from 2010 to 2013. MU capabilities and interoperability were associated with lower occurrence of ADEs, but the effects did not vary by experience with MU. About one-fifth of the decline in ADEs from 2010 to 2013 was attributable to MU capabilities. Findings support the contention that adoption of MU capabilities and interoperability spurred by the Health Information Technology for Economic and Clinical Health Act contributed in part to the recent decline in ADEs. Published by Oxford University Press on behalf of the American Medical Informatics Association 2017. This work is written by US Government employees and is in the public domain in the United States.
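The "11% lower odds" figure above is how a logistic-regression odds ratio (OR) is conventionally read: the model estimates a coefficient beta, exp(beta) is the OR, and (1 - OR) x 100 is the percent reduction in odds. The beta below is back-computed for illustration only, not taken from the study.

```python
import math

# Reading a logistic-regression coefficient as "percent lower odds".
# beta_mu is an illustrative value back-computed from OR = 0.89,
# not the study's actual estimate.
beta_mu = math.log(0.89)
odds_ratio = math.exp(beta_mu)        # exp(beta) gives the odds ratio
pct_lower = (1 - odds_ratio) * 100    # 0.89 -> 11% lower odds

print(f"OR = {odds_ratio:.2f}, i.e. {pct_lower:.0f}% lower odds")
```

The same reading applies to the interoperability result: an OR of 0.81 corresponds to 19% lower odds.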

  20. Infrastructure for Planetary Sciences: Universal planetary database development project

    NASA Astrophysics Data System (ADS)

    Kasaba, Yasumasa; Capria, M. T.; Crichton, D.; Zender, J.; Beebe, R.

The International Planetary Data Alliance (IPDA), formally formed under COSPAR (formal start at COSPAR 2008 in Montreal), is a joint international effort to enable global access and exchange of high-quality planetary science data, and to establish archive standards that make it easier to share data across international boundaries. In 2008-2009, thanks to many contributors from several agencies and institutions, we achieved fruitful results in six projects: (1) Interoperable Planetary Data Access Protocol (PDAP) implementations [led by J. Salgado@ESA], (2) Small bodies interoperability [led by I. Shinohara@JAXA and N. Hirata@U. Aizu], (3) PDAP assessment [led by Y. Yamamoto@JAXA], (4) Architecture and standards definition [led by D. Crichton@NASA], (5) Information model and data dictionary [led by S. Hughes@NASA], and (6) Venus Express interoperability [led by N. Chanover@NMSU]. 'IPDA 2009-2010' is especially important because the NASA/PDS system reform is now being reviewed as it develops for application at the international level. IPDA is the gateway to establishing the future infrastructure. We are running 8 projects: (1) IPDA Assessment of PDS4 Data Standards [led by S. Hughes (NASA/JPL)], (2) IPDA Archive Guide [led by M.T. Capria (IASF/INAF) and D. Heather (ESA/PSA)], (3) IPDA Standards Identification [led by E. Rye (NASA/PDS) and G. Krishna (ISRO)], (4) Ancillary Data Standards [led by C. Acton (NASA/JPL)], (5) IPDA Registries Definition [led by D. Crichton (NASA/JPL)], (6) PDAP Specification [led by J. Salgado (ESA/PSA) and Y. Yamamoto (JAXA)], (7) Interoperability Assessment [led by R. Beebe (NMSU) and D. Heather (ESA/PSA)], and (8) PDAP Geographic Information System (GIS) extension [led by N. Hirata (Univ. Aizu) and T. Hare (USGS: thare@usgs.gov)]. This paper presents our achievements and plans as summarized at the IPDA 5th Steering Committee meeting at DLR in July 2010.

  1. Progress of Interoperability in Planetary Research for Geospatial Data Analysis

    NASA Astrophysics Data System (ADS)

    Hare, T. M.; Gaddis, L. R.

    2015-12-01

For nearly a decade there has been a push in the planetary science community to support interoperable methods of accessing and working with geospatial data. Common geospatial data products for planetary research include image mosaics, digital elevation or terrain models, geologic maps, geographic location databases (i.e., craters, volcanoes) or any data that can be tied to the surface of a planetary body (including moons, comets or asteroids). Several U.S. and international cartographic research institutions have converged on mapping standards that embrace standardized image formats that retain geographic information (e.g., GeoTIFF, GeoJPEG2000), digital geologic mapping conventions, planetary extensions for symbols that comply with U.S. Federal Geographic Data Committee cartographic and geospatial metadata standards, and notably on-line mapping services as defined by the Open Geospatial Consortium (OGC). The latter includes defined standards such as the OGC Web Mapping Services (simple image maps), Web Feature Services (feature streaming), Web Coverage Services (rich scientific data streaming), and Catalog Services for the Web (data searching and discoverability). While these standards were developed for application to Earth-based data, they have been modified to support the planetary domain. The motivation to support common, interoperable data format and delivery standards is not only to improve access to higher-level products but also to address the increasingly distributed nature of the rapidly growing volumes of data. The strength of using an OGC approach is that it provides consistent access to data that are distributed across many facilities. While data-streaming standards are well supported by the more sophisticated tools of the Geographic Information System (GIS) and remote sensing industries, they are also supported by many lightweight browsers, which facilitates both large and small focused science applications and public use.
Here we provide an overview of the interoperability initiatives that are currently ongoing in the planetary research community, examples of their successful application, and challenges that remain.
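The OGC services named above are plain HTTP requests with standardized parameters, so a client can be as simple as a URL builder. The sketch below forms a WMS GetMap request; the endpoint and layer name are hypothetical placeholders, but the parameter names follow the WMS 1.3.0 convention.

```python
from urllib.parse import urlencode

# Sketch of an OGC WMS 1.3.0 GetMap request URL. The endpoint and
# layer below are placeholders, not a real planetary WMS service.

def wms_getmap_url(endpoint, layer, bbox, crs, size, fmt="image/png"):
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": size[0],
        "HEIGHT": size[1],
        "FORMAT": fmt,
    }
    return endpoint + "?" + urlencode(params)

url = wms_getmap_url(
    "https://example.org/mars/wms",   # placeholder endpoint
    "shaded_relief",                  # placeholder layer name
    bbox=(-90, -180, 90, 180),
    crs="CRS:84",
    size=(1024, 512),
)
print(url)
```

Because every conformant server accepts the same parameters, the same client code works against any facility's WMS endpoint, which is the interoperability payoff the abstract describes.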

  2. Big issues, small systems: managing with information in medical research.

    PubMed

    Jones, J; Preston, H

    2000-08-01

The subject of this article is the design of a database system for handling files related to the work of the Molecular Genetics Department of the International Blood Group Reference Laboratory. It examines specialist information needs identified within this organization and indicates how the design of the Rhesus Information Tracking System was able to meet current needs. Rapid Applications Development prototyping forms the basis of the investigation, linked to interview, questionnaire, and observation techniques in order to establish requirements for interoperability. In particular, the place of this specialist database within the much broader information strategy of the National Blood Service is examined. This unique situation is analogous to management activities in broader environments, and a number of generic issues are highlighted by the research.

  3. 76 FR 50234 - National Institute of Environmental Health Sciences Notice of Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-12

    ..., as amended (5 U.S.C. App.), notice is hereby given of meetings of the Interagency Breast Cancer and... the meeting. Name of Committee: Interagency Breast Cancer and Environmental Research [email protected] . Name of Committee: Interagency Breast Cancer and Environmental Research Coordinating...

  4. 76 FR 59147 - National Institute of Environmental Health Sciences Notice of Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-23

    ..., as amended (5 U.S.C. App.), notice is hereby given of meetings of the Interagency Breast Cancer and... the meeting. Name of Committee: Interagency Breast Cancer and Environmental Research [email protected] . Name of Committee: Interagency Breast Cancer and Environmental Research Coordinating...

  5. 22 CFR 94.8 - Interagency coordinating group.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Interagency coordinating group. 94.8 Section 94.8 Foreign Relations DEPARTMENT OF STATE LEGAL AND RELATED SERVICES INTERNATIONAL CHILD ABDUCTION § 94.8 Interagency coordinating group. The U.S. Central Authority shall nominate federal employees and...

  6. 41 CFR 109-39.105-2 - Agency requests to withdraw participation.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... AVIATION, TRANSPORTATION, AND MOTOR VEHICLES 39-INTERAGENCY FLEET MANAGEMENT SYSTEMS 39.1-Establishment, Modification, and Discontinuance of Interagency Fleet Management Systems § 109-39.105-2 Agency requests to... of participation by a DOE organization of a given interagency fleet management system, the...

  7. 41 CFR 101-39.201 - Services available.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...-INTERAGENCY FLEET MANAGEMENT SYSTEMS 39.2-GSA Interagency Fleet Management System Services § 101-39.201 Services available. GSA Interagency Fleet Management System (IFMS) vehicles and services shall be used in... 41 Public Contracts and Property Management 2 2010-07-01 2010-07-01 true Services available. 101...

  8. 78 FR 14561 - Notice of Diabetes Mellitus Interagency Coordinating Committee Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-06

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Notice of Diabetes Mellitus Interagency Coordinating Committee Meeting SUMMARY: The Diabetes Mellitus Interagency Coordinating Committee... Diabetes Mellitus.'' The meeting is open to the public. DATES: The meeting will be held on March 28, 2013...

  9. 18 CFR 701.54 - Interagency Liaison Committee.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 18 Conservation of Power and Water Resources 2 2013-04-01 2012-04-01 true Interagency Liaison Committee. 701.54 Section 701.54 Conservation of Power and Water Resources WATER RESOURCES COUNCIL COUNCIL ORGANIZATION Headquarters Organization § 701.54 Interagency Liaison Committee. There is established within the...

  10. 18 CFR 701.54 - Interagency Liaison Committee.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 18 Conservation of Power and Water Resources 2 2014-04-01 2014-04-01 false Interagency Liaison Committee. 701.54 Section 701.54 Conservation of Power and Water Resources WATER RESOURCES COUNCIL COUNCIL ORGANIZATION Headquarters Organization § 701.54 Interagency Liaison Committee. There is established within the...

  11. 18 CFR 701.54 - Interagency Liaison Committee.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 2 2010-04-01 2010-04-01 false Interagency Liaison Committee. 701.54 Section 701.54 Conservation of Power and Water Resources WATER RESOURCES COUNCIL COUNCIL ORGANIZATION Headquarters Organization § 701.54 Interagency Liaison Committee. There is established within the...

  12. 18 CFR 701.54 - Interagency Liaison Committee.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 2 2012-04-01 2012-04-01 false Interagency Liaison Committee. 701.54 Section 701.54 Conservation of Power and Water Resources WATER RESOURCES COUNCIL COUNCIL ORGANIZATION Headquarters Organization § 701.54 Interagency Liaison Committee. There is established within the...

  13. 18 CFR 701.54 - Interagency Liaison Committee.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 2 2011-04-01 2011-04-01 false Interagency Liaison Committee. 701.54 Section 701.54 Conservation of Power and Water Resources WATER RESOURCES COUNCIL COUNCIL ORGANIZATION Headquarters Organization § 701.54 Interagency Liaison Committee. There is established within the...

  14. 78 FR 37834 - Submission for OMB review; 30-Day Comment Request; Federal Interagency Traumatic Brain Injury...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-24

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Submission for OMB review; 30-Day Comment Request; Federal Interagency Traumatic Brain Injury Research (FITBIR) Informatics... Interagency Traumatic Brain Injury Research (FITBIR) Informatics System Data Access Request. 0925-NEW...

  15. Obtaining Related Services through Local Interagency Collaboration.

    ERIC Educational Resources Information Center

    Olsen, Kenneth R.

    Designed as a resource for local school administrators, the report describes the experiences of 15 local special education agencies in providing related services at reasonable cost through interagency cooperation. An introductory chapter discusses the role of interagency committees (both policy and direct service types), and provides information…

  16. A Demonstration Model of Interagency Collaboration for Students with Disabilities: A Multilevel Approach

    ERIC Educational Resources Information Center

    Flowers, Claudia; Test, David W.; Povenmire-Kirk, Tiana C.; Diegelmann, Karen M.; Bunch-Crump, Kimberly R.; Kemp-Inman, Amy; Goodnight, Crystalyn I.

    2018-01-01

    Communicating Interagency Relationships and Collaborative Linkages for Exceptional Students (CIRCLES) is a transition-planning service delivery model designed to guide schools in implementing interagency collaboration. This study examined the impact of CIRCLES on students' self-determination and participation in individualized education program…

  17. How State and Local Interagency Partnerships Work.

    ERIC Educational Resources Information Center

    Rachal, Patricia, Ed.

    1996-01-01

    This newsletter theme issue describes a state-local team partnership model for interagency transition efforts for young adults with deaf-blindness. Excepts from a presentation by Jane M. Everson identify key aspects and characteristics of effective state and local interagency partnerships. These include: (1) strategies for initiating and…

  18. SPANG: a SPARQL client supporting generation and reuse of queries for distributed RDF databases.

    PubMed

    Chiba, Hirokazu; Uchiyama, Ikuo

    2017-02-08

Toward improved interoperability of distributed biological databases, an increasing number of datasets have been published in the standardized Resource Description Framework (RDF). Although the powerful SPARQL Protocol and RDF Query Language (SPARQL) provides a basis for exploiting RDF databases, writing SPARQL code is burdensome for users, including bioinformaticians. Thus, an easy-to-use interface is necessary. We developed SPANG, a SPARQL client that has unique features for querying RDF datasets. SPANG dynamically generates typical SPARQL queries according to specified arguments. It can also call SPARQL template libraries constructed in a local system or published on the Web. Further, it enables combinatorial execution of multiple queries, each with a distinct target database. These features facilitate easy and effective access to RDF datasets and integrative analysis of distributed data. SPANG helps users to exploit RDF datasets by generation and reuse of SPARQL queries through a simple interface. This client will enhance integrative exploitation of biological RDF datasets distributed across the Web. This software package is freely available at http://purl.org/net/spang.
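Generating "typical SPARQL queries according to specified arguments" amounts to filling a query template from parameters, which the sketch below illustrates. The template shape is generic SPARQL, written in the spirit of SPANG but not taken from SPANG's own library.

```python
# Sketch of template-driven SPARQL generation (the template here is a
# generic illustration, not SPANG's actual output).

def select_by_predicate(predicate, limit=10):
    """Generate a typical SELECT query retrieving subject/object
    pairs linked by one predicate."""
    return (
        "SELECT ?s ?o\n"
        f"WHERE {{ ?s {predicate} ?o . }}\n"
        f"LIMIT {limit}"
    )

q = select_by_predicate("rdfs:label", limit=5)
print(q)
```

A client built this way can send the same generated query to several SPARQL endpoints in turn, which is the combinatorial, multi-database execution the abstract describes.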

  19. Interagency registry for mechanically assisted circulatory support report on the total artificial heart.

    PubMed

    Arabía, Francisco A; Cantor, Ryan S; Koehl, Devin A; Kasirajan, Vigneshwar; Gregoric, Igor; Moriguchi, Jaime D; Esmailian, Fardad; Ramzy, Danny; Chung, Joshua S; Czer, Lawrence S; Kobashigawa, Jon A; Smith, Richard G; Kirklin, James K

    2018-04-26

    We sought to better understand the patient population who receive a temporary total artificial heart (TAH) as bridge to transplant or as bridge to decision by evaluating data from the Interagency Registry for Mechanically Assisted Circulatory Support (INTERMACS) database. We examined data related to survival, adverse events, and competing outcomes from patients who received TAHs between June 2006 and April 2017 and used hazard function analysis to explore risk factors for mortality. Data from 450 patients (87% men; mean age, 50 years) were available in the INTERMACS database. The 2 most common diagnoses were dilated cardiomyopathy (50%) and ischemic cardiomyopathy (20%). Risk factors for right heart failure were present in 82% of patients. Most patients were INTERMACS Profile 1 (43%) or 2 (37%) at implantation. There were 266 patients who eventually underwent transplantation, and 162 died. Overall 3-, 6-, and 12-month actuarial survival rates were 73%, 62%, and 53%, respectively. Risk factors for death included older age (p = 0.001), need for pre-implantation dialysis (p = 0.006), higher creatinine (p = 0.008) and lower albumin (p < 0.001) levels, and implantation at a low-volume center (≤10 TAHs; p < 0.001). Competing-outcomes analysis showed 71% of patients in high-volume centers were alive on the device or had undergone transplantation at 12 months after TAH implantation vs 57% in low-volume centers (p = 0.003). Patients receiving TAHs have rapidly declining cardiac function and require prompt intervention. Experienced centers have better outcomes, likely related to patient selection, timing of implantation, patient care, and device management. Organized transfer of knowledge to low-volume centers could improve outcomes. Copyright © 2018 International Society for Heart and Lung Transplantation. Published by Elsevier Inc. All rights reserved.

  20. 76 FR 7225 - National Institute of Environmental Health Sciences; Notice of Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-09

    ... below at least 10 days in advance of the meeting. Name of Committee: Interagency Breast Cancer and... 27709, (919) 541-4980, [email protected] . Name of Committee: Interagency Breast Cancer and... 27709, (919) 541-4980, [email protected] . Name of Committee: Interagency Breast Cancer and...

  1. 76 FR 24899 - National Institute of Environmental Health Sciences; Notice of Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-03

    ... Act, as amended (5 U.S.C. App.), notice is hereby given of meetings of the Interagency Breast Cancer... listed below in advance of the meeting. Name of Committee: Interagency Breast Cancer and Environmental...: Interagency Breast Cancer and Environmental Research Coordinating Committee (IBCERC) State of the Science...

  2. 76 FR 24896 - National Institute of Environmental Health Sciences; Notice of Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-03

    ... Act, as amended (5 U.S.C. App.), notice is hereby given of meetings of the Interagency Breast Cancer... listed below in advance of the meeting. Name of Committee: Interagency Breast Cancer and Environmental...: Interagency Breast Cancer and Environmental Research Coordinating Committee (IBCERC) Research Process...

  3. The Relationship between Mental Health, Vocational Rehabilitation Interagency Functioning, and Outcome of Psychiatrically Disabled Persons.

    ERIC Educational Resources Information Center

    Dellario, Donald J.

    1985-01-01

    Conducted structured interviews of seven selected Mental Health (MH) and Vocational Rehabilitation (VR) dyads to assess interagency functioning, and compared results to selected interagency performance indicators. Results suggested that improved MH-VR linkages can increase the probability of successful rehabilitation outcomes for psychiatrically…

  4. 48 CFR 18.113 - Interagency acquisitions under the Economy Act.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Interagency acquisitions under the Economy Act. 18.113 Section 18.113 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION CONTRACTING METHODS AND CONTRACT TYPES EMERGENCY ACQUISITIONS Available Acquisition Flexibilities 18.113 Interagency acquisitions under...

  5. 41 CFR 101-39.304 - Modification or installation of accessory equipment.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., TRANSPORTATION, AND MOTOR VEHICLES 39-INTERAGENCY FLEET MANAGEMENT SYSTEMS 39.3-Use and Care of GSA Interagency Fleet Management System Vehicles § 101-39.304 Modification or installation of accessory equipment. The modification of a GSA Interagency Fleet Management System (IFMS) vehicle or the permanent installation of...

  6. 41 CFR 101-39.204 - Obtaining motor vehicles for indefinite assignment.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., TRANSPORTATION, AND MOTOR VEHICLES 39-INTERAGENCY FLEET MANAGEMENT SYSTEMS 39.2-GSA Interagency Fleet Management... related services of the GSA Interagency Fleet Management System (IFMS) are provided to requesting agencies... have been consolidated into the supporting GSA IFMS fleet management center, and no agency-owned...

  7. 41 CFR 109-39.300 - General.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 39-INTERAGENCY FLEET MANAGEMENT SYSTEMS 39.3-Use and Care of GSA Interagency Fleet Management System... operators and passengers in GSA Interagency Fleet Management System (IFMS) motor vehicles are aware of the... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false General. 109-39.300...

  8. 75 FR 32942 - National Toxicology Program (NTP); NTP Interagency Center for the Evaluation of Alternative...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-10

    ... Progress Report of the Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM... of the Biennial Progress Report 2008-2009: Interagency Coordinating Committee on the Validation of...) 919-541-0947, (e-mail) [email protected] . FOR FURTHER INFORMATION CONTACT: Dr. William S. Stokes...

  9. 36 CFR 73.11 - Federal Interagency Panel for World Heritage.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... World Heritage. 73.11 Section 73.11 Parks, Forests, and Public Property NATIONAL PARK SERVICE, DEPARTMENT OF THE INTERIOR WORLD HERITAGE CONVENTION § 73.11 Federal Interagency Panel for World Heritage. (a) Responsibilities. The Federal Interagency Panel for World Heritage is established to advise the Department of the...

  10. Interagency Plan for Children with Special Needs.

    ERIC Educational Resources Information Center

    Maryland State Dept. of Health and Mental Hygiene, Baltimore.

    The Interagency Plan for Children with Special Needs for Maryland residents has three major purposes: (1) to set priorities for developing or expanding services required by special needs children and their families; (2) to ensure that resources targeted for special needs children are administered effectively by increasing interagency coordination…

  11. Advanced SPARQL querying in small molecule databases.

    PubMed

    Galgonek, Jakub; Hurt, Tomáš; Michlíková, Vendula; Onderka, Petr; Schwarz, Jan; Vondrášek, Jiří

    2016-01-01

    In recent years, the Resource Description Framework (RDF) and the SPARQL query language have become more widely used in the area of cheminformatics and bioinformatics databases. These technologies allow better interoperability of various data sources and powerful searching facilities. However, we identified several deficiencies that make usage of such RDF databases restrictive or challenging for common users. We extended a SPARQL engine to be able to use special procedures inside SPARQL queries. This allows the user to work with data that cannot be simply precomputed and thus cannot be directly stored in the database. We designed an algorithm that checks a query against data ontology to identify possible user errors. This greatly improves query debugging. We also introduced an approach to visualize retrieved data in a user-friendly way, based on templates describing visualizations of resource classes. To integrate all of our approaches, we developed a simple web application. Our system was implemented successfully, and we demonstrated its usability on the ChEBI database transformed into RDF form. To demonstrate procedure call functions, we employed compound similarity searching based on OrChem. The application is publicly available at https://bioinfo.uochb.cas.cz/projects/chemRDF.

  12. Spatial cyberinfrastructures, ontologies, and the humanities.

    PubMed

    Sieber, Renee E; Wellen, Christopher C; Jin, Yuan

    2011-04-05

    We report on research into building a cyberinfrastructure for Chinese biographical and geographic data. Our cyberinfrastructure contains (i) the McGill-Harvard-Yenching Library Ming Qing Women's Writings database (MQWW), the only online database on historical Chinese women's writings, (ii) the China Biographical Database, the authority for Chinese historical people, and (iii) the China Historical Geographical Information System, one of the first historical geographic information systems. Key to this integration is that linked databases retain separate identities as bases of knowledge, while they possess sufficient semantic interoperability to allow for multidatabase concepts and to support cross-database queries on an ad hoc basis. Computational ontologies create underlying semantics for database access. This paper focuses on the spatial component in a humanities cyberinfrastructure, which includes issues of conflicting data, heterogeneous data models, disambiguation, and geographic scale. First, we describe the methodology for integrating the databases. Then we detail the system architecture, which includes a tier of ontologies and schema. We describe the user interface and applications that allow for cross-database queries. For instance, users should be able to analyze the data, examine hypotheses on spatial and temporal relationships, and generate historical maps with datasets from MQWW for research, teaching, and publication on Chinese women writers, their familial relations, publishing venues, and the literary and social communities. Last, we discuss the social side of cyberinfrastructure development, as people are considered to be as critical as the technical components for its success.

  13. tmBioC: improving interoperability of text-mining tools with BioC.

    PubMed

    Khare, Ritu; Wei, Chih-Hsuan; Mao, Yuqing; Leaman, Robert; Lu, Zhiyong

    2014-01-01

The lack of interoperability among biomedical text-mining tools is a major bottleneck in creating more complex applications. Despite the availability of numerous methods and techniques for various text-mining tasks, combining different tools requires substantial efforts and time owing to heterogeneity and variety in data formats. In response, BioC is a recent proposal that offers a minimalistic approach to tool interoperability by stipulating minimal changes to existing tools and applications. BioC is a family of XML formats that define how to present text documents and annotations, and also provides easy-to-use functions to read/write documents in the BioC format. In this study, we introduce our text-mining toolkit, which is designed to perform several challenging and significant tasks in the biomedical domain, and repackage the toolkit into BioC to enhance its interoperability. Our toolkit consists of six state-of-the-art tools for named-entity recognition, normalization and annotation (PubTator) of genes (GenNorm), diseases (DNorm), mutations (tmVar), species (SR4GN) and chemicals (tmChem). Although developed within the same group, each tool is designed to process input articles and output annotations in a different format. We modify these tools and enable them to read/write data in the proposed BioC format. We find that, using the BioC family of formats and functions, only minimal changes were required to build the newer versions of the tools. The resulting BioC wrapped toolkit, which we have named tmBioC, consists of our tools in BioC, an annotated full-text corpus in BioC, and a format detection and conversion tool. Furthermore, through participation in the 2013 BioCreative IV Interoperability Track, we empirically demonstrate that the tools in tmBioC can be more efficiently integrated with each other as well as with external tools: our experimental results show that using BioC reduces the lines of code required for text-mining tool integration by more than 60%.
The tmBioC toolkit is publicly available at http://www.ncbi.nlm.nih.gov/CBBresearch/Lu/Demo/tmTools/. Database URL: http://www.ncbi.nlm.nih.gov/CBBresearch/Lu/Demo/tmTools/. Published by Oxford University Press 2014. This work is written by US Government employees and is in the public domain in the US.
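The BioC interchange idea — one XML shape for documents, passages, and annotations that any tool can read and write — can be sketched with the standard library alone. The element names below follow the published BioC format only loosely, and the document text and annotation are invented; consult the BioC DTD for the authoritative schema.

```python
# Sketch: writing and re-reading a minimal BioC-style XML document with
# xml.etree.ElementTree. Element names approximate the BioC format.
import xml.etree.ElementTree as ET

def make_collection(doc_id, text, annotations):
    """Build a one-document BioC-style collection.

    annotations: list of (ann_id, ann_type, offset, length, surface) tuples.
    """
    collection = ET.Element("collection")
    ET.SubElement(collection, "source").text = "tmBioC-sketch"
    document = ET.SubElement(collection, "document")
    ET.SubElement(document, "id").text = doc_id
    passage = ET.SubElement(document, "passage")
    ET.SubElement(passage, "offset").text = "0"
    ET.SubElement(passage, "text").text = text
    for ann_id, ann_type, offset, length, surface in annotations:
        ann = ET.SubElement(passage, "annotation", id=ann_id)
        infon = ET.SubElement(ann, "infon", key="type")
        infon.text = ann_type
        ET.SubElement(ann, "location", offset=str(offset), length=str(length))
        ET.SubElement(ann, "text").text = surface
    return collection

text = "BRCA1 mutations are linked to breast cancer."
coll = make_collection("12345", text, [("T1", "Gene", 0, 5, "BRCA1")])
xml_bytes = ET.tostring(coll)

# Any BioC-aware tool could now consume this; here we simply re-parse it.
reparsed = ET.fromstring(xml_bytes)
for ann in reparsed.iter("annotation"):
    print(ann.get("id"), ann.findtext("text"))
```

The point of the shared shape is that a gene tagger like GenNorm and a disease tagger like DNorm can each emit annotations into the same passage without knowing about each other's native formats.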

  14. Medical Device Plug-and-Play Interoperability Standards and Technology Leadership

    DTIC Science & Technology

    2017-10-01

Award Number: W81XWH-09-1-0705. TITLE: "Medical Device Plug-and-Play Interoperability Standards and Technology Leadership." PRINCIPAL INVESTIGATOR...Sept 2016 – 20 Sept 2017...efficiency through interoperable medical technologies. We played a leadership role on interoperability safety standards (AAMI, AAMI/UL Joint

  15. Documenting Models for Interoperability and Reusability ...

    EPA Pesticide Factsheets

    Many modeling frameworks compartmentalize science via individual models that link sets of small components to create larger modeling workflows. Developing integrated watershed models increasingly requires coupling multidisciplinary, independent models, as well as collaboration between scientific communities, since component-based modeling can integrate models from different disciplines. Integrated Environmental Modeling (IEM) systems focus on transferring information between components by capturing a conceptual site model; establishing local metadata standards for input/output of models and databases; managing data flow between models and throughout the system; facilitating quality control of data exchanges (e.g., checking units, unit conversions, transfers between software languages); warning and error handling; and coordinating sensitivity/uncertainty analyses. Although many computational software systems facilitate communication between, and execution of, components, there are no common approaches, protocols, or standards for turn-key linkages between software systems and models, especially if modifying components is not the intent. Using a standard ontology, this paper reviews how models can be described for discovery, understanding, evaluation, access, and implementation to facilitate interoperability and reusability. In the proceedings of the International Environmental Modelling and Software Society (iEMSs), 8th International Congress on Environmental Mod

  16. The Protein Information Resource: an integrated public resource of functional annotation of proteins

    PubMed Central

    Wu, Cathy H.; Huang, Hongzhan; Arminski, Leslie; Castro-Alvear, Jorge; Chen, Yongxing; Hu, Zhang-Zhi; Ledley, Robert S.; Lewis, Kali C.; Mewes, Hans-Werner; Orcutt, Bruce C.; Suzek, Baris E.; Tsugita, Akira; Vinayaka, C. R.; Yeh, Lai-Su L.; Zhang, Jian; Barker, Winona C.

    2002-01-01

    The Protein Information Resource (PIR) serves as an integrated public resource of functional annotation of protein data to support genomic/proteomic research and scientific discovery. The PIR, in collaboration with the Munich Information Center for Protein Sequences (MIPS) and the Japan International Protein Information Database (JIPID), produces the PIR-International Protein Sequence Database (PSD), the major annotated protein sequence database in the public domain, containing about 250 000 proteins. To improve protein annotation and the coverage of experimentally validated data, a bibliography submission system is developed for scientists to submit, categorize and retrieve literature information. Comprehensive protein information is available from iProClass, which includes family classification at the superfamily, domain and motif levels, structural and functional features of proteins, as well as cross-references to over 40 biological databases. To provide timely and comprehensive protein data with source attribution, we have introduced a non-redundant reference protein database, PIR-NREF. The database consists of about 800 000 proteins collected from PIR-PSD, SWISS-PROT, TrEMBL, GenPept, RefSeq and PDB, with composite protein names and literature data. To promote database interoperability, we provide XML data distribution and open database schema, and adopt common ontologies. The PIR web site (http://pir.georgetown.edu/) features data mining and sequence analysis tools for information retrieval and functional identification of proteins based on both sequence and annotation information. The PIR databases and other files are also available by FTP (ftp://nbrfa.georgetown.edu/pir_databases). PMID:11752247

  17. Open system environment procurement

    NASA Technical Reports Server (NTRS)

    Fisher, Gary

    1994-01-01

Relationships between the request for procurement (RFP) process and open system environment (OSE) standards are described. A guide was prepared to help Federal agency personnel overcome problems in writing an adequate statement of work and developing realistic evaluation criteria when transitioning to an OSE. The guide contains appropriate decision points and transition strategies for developing applications that are affordable, scalable and interoperable across a broad range of computing environments. While useful, the guide does not eliminate the requirement that agencies possess in-depth expertise in software development, communications, and database technology in order to evaluate open systems.

  18. NCBI2RDF: Enabling Full RDF-Based Access to NCBI Databases

    PubMed Central

    Anguita, Alberto; García-Remesal, Miguel; de la Iglesia, Diana; Maojo, Victor

    2013-01-01

    RDF has become the standard technology for enabling interoperability among heterogeneous biomedical databases. The NCBI provides access to a large set of life sciences databases through a common interface called Entrez. However, the latter does not provide RDF-based access to such databases, and, therefore, they cannot be integrated with other RDF-compliant databases and accessed via SPARQL query interfaces. This paper presents the NCBI2RDF system, aimed at providing RDF-based access to the complete NCBI data repository. This API creates a virtual endpoint for servicing SPARQL queries over different NCBI repositories and presenting to users the query results in SPARQL results format, thus enabling this data to be integrated and/or stored with other RDF-compliant repositories. SPARQL queries are dynamically resolved, decomposed, and forwarded to the NCBI-provided E-utilities programmatic interface to access the NCBI data. Furthermore, we show how our approach increases the expressiveness of the native NCBI querying system, allowing several databases to be accessed simultaneously. This feature significantly boosts productivity when working with complex queries and saves time and effort to biomedical researchers. Our approach has been validated with a large number of SPARQL queries, thus proving its reliability and enhanced capabilities in biomedical environments. PMID:23984425
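The query-forwarding idea behind NCBI2RDF — decomposing a SPARQL request and mapping the pieces onto the NCBI E-utilities programmatic interface — can be sketched without any network traffic. Only the `esearch.fcgi` endpoint and its `db`, `term`, and `retmode` parameters below are real E-utilities features; the filter format and the translator function are invented for illustration and are far simpler than the paper's actual SPARQL decomposition.

```python
# Sketch of SPARQL-to-E-utilities forwarding: a hypothetical translator that
# turns simple field/value filters (standing in for decomposed SPARQL triple
# patterns) into an NCBI ESearch request URL. No request is actually sent.
from urllib.parse import urlencode

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def pattern_to_esearch(db, filters, retmax=20):
    """Map (field, value) filters onto a single ESearch URL.

    Entrez search syntax places the field name in brackets after the value,
    e.g. 'BRCA1[Title]'; multiple filters are joined with AND.
    """
    term = " AND ".join(f"{value}[{field}]" for field, value in filters)
    params = {"db": db, "term": term, "retmode": "json", "retmax": retmax}
    return f"{EUTILS}?{urlencode(params)}"

# One decomposed sub-query: PubMed articles with BRCA1 in the title field.
url = pattern_to_esearch("pubmed", [("Title", "BRCA1")])
print(url)
```

In the full system, several such sub-queries would be issued per SPARQL query, their XML/JSON results converted back to RDF bindings, and the bindings joined to produce a standard SPARQL result set.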

  19. 76 FR 7574 - National Institute of Environmental Health Sciences; Notice of Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-10

    ... below at least 10 days in advance of the meeting. Name of Committee: Interagency Breast Cancer and... . Name of Committee: Interagency Breast Cancer and Environmental Research Coordinating Committee (IBCERC... Act, as amended (5 U.S.C. Appendix 2), notice is hereby given of a meeting of the Interagency Breast...

  20. 76 FR 50235 - National Institute of Environmental Health Sciences; Notice of Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-12

    ... Act, as amended (5 U.S.C. App.), notice is hereby given of meetings of the Interagency Breast Cancer... listed below in advance of the meeting. Name of Committee: Interagency Breast Cancer and Environmental...: Interagency Breast Cancer and Environmental Research Coordinating Committee. Date: November 8, 2011. Time: 3 p...

  1. Interagency Collaboration for Young Adults with Deaf-Blindness: Toward a Common Transition Goal.

    ERIC Educational Resources Information Center

    Everson, Jane M.; And Others

    This monograph is a compilation of the knowledge gained by the Technical Assistance Center (TAC) of the Helen Keller National Center, from training and technical assistance activities conducted with state interagency teams serving youth and young adults with deaf-blindness. The book views interagency collaboration as essential in achieving…

  2. 10 CFR 1015.103 - Antitrust, fraud, tax, interagency, transportation account audit, acquisition contract, and...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... does not apply to tax debts. (c) Part 1015 does not apply to claims between Federal agencies. Federal... 10 Energy 4 2010-01-01 2010-01-01 false Antitrust, fraud, tax, interagency, transportation account... General § 1015.103 Antitrust, fraud, tax, interagency, transportation account audit, acquisition contract...

  3. 78 FR 5477 - Agency Information Collection Activities: Inter-Agency Alien Witness and Informant Record, Form I...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-25

    ...-0046] Agency Information Collection Activities: Inter-Agency Alien Witness and Informant Record, Form I... collection. (2) Title of the Form/Collection: Inter-Agency Alien Witness and Informant Record. (3) Agency...- 854 is used by law enforcement agencies to bring alien witnesses and informants to the United States...

  4. 45 CFR 30.3 - Antitrust, fraud, exception in the account of an accountable official, and interagency claims...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... accountable official, and interagency claims excluded. 30.3 Section 30.3 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION CLAIMS COLLECTION General Provisions § 30.3 Antitrust, fraud, exception in the account of an accountable official, and interagency claims excluded. (a) Claims involving...

  5. 78 FR 7215 - Disclosure and Delivery Requirements for Copies of Appraisals and Other Written Valuations Under...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-31

    ... described above, the inter-agency group is issuing a final rule under section 129H of TILA (2103 Interagency... Communication Group, Inc. (Kleimann), which specializes in consumer financial disclosures. The Bureau and... Interagency Appraisals Proposal and involved a large bank, a trade group of smaller depository institutions...

  6. Schools and the Community: A Necessary Partnership: A Guide to Interagency Collaboration.

    ERIC Educational Resources Information Center

    Alberta Dept. of Education, Edmonton. Education Response Centre.

    The problems facing students and families in Alberta, Canada, have been recognized as community problems that require community solutions. Interagency collaboration has become a necessity indicative of the changing times and the global focus on integration rather than isolation. Interagency collaboration is an arrangement in which agencies work…

  7. 76 FR 30174 - Interagency Committee on Smoking and Health: Notice of Charter Renewal

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-24

    ... Committee on Smoking and Health: Notice of Charter Renewal This gives notice under the Federal Advisory Committee Act (Pub. L. 92-463) of October 6, 1972, that the Interagency Committee on Smoking and Health... information, contact Dana Shelton, Designated Federal Officer, Interagency Committee on Smoking and Health...

  8. 77 FR 37060 - National Institute of Diabetes and Digestive and Kidney Diseases Diabetes Mellitus Interagency...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-20

    ... Diabetes and Digestive and Kidney Diseases Diabetes Mellitus Interagency Coordinating Committee; Notice of Meeting The Diabetes Mellitus Interagency Coordinating Committee (DMICC) will hold a web conference on July 18, 2012, from 1 to 3:30 p.m. The public is invited to participate in the web conference. For...

  9. Interagency Rare Plant Team inventory results - 1998 through 2003

    Treesearch

    Deborah J. Clark; David A. Tait

    2007-01-01

    Fishlake National Forest, Dixie National Forest, Bureau of Land Management - Richfield Field Office, and Capitol Reef National Park became partners in an Interagency Agreement to inventory and monitor threatened, endangered, and sensitive plant species shared by these agencies. From 1998 to 2003, the Interagency Rare Plant Team surveyed and recorded over 650 new...

  10. Closing the Inter-Agency Gap: Role of the Marine Infantry Battalion on the Future Battlefield

    DTIC Science & Technology

    2010-04-26

Master of Military Studies Research Paper, September 2009 – April 2010. Closing the Inter-Agency Gap: Role of the...battalion on the future battlefield will be to close the inter-agency gap by utilizing a comprehensive government approach to locate, close with, and...

  11. Children, Families and Interagency Work: Experiences of Partnership Work in Primary Education Settings

    ERIC Educational Resources Information Center

    Milbourne, Linda

    2005-01-01

    Despite UK government initiatives intended to address social exclusion, those with poor access to social and economic resources continue to experience unresponsive services. In these circumstances, small inter-agency projects may offer accessible alternatives. This article explores the implementation of inter-agency work at a local level, focusing…

  12. 75 FR 14335 - Revisions to the Export Administration Regulations To Enhance U.S. Homeland Security: Addition of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-25

    ... identified by an interagency working group that is reviewing export control issues related to homeland security. The interagency working group is made up of representatives from the Departments of Commerce, Defense, Homeland Security and State. The purpose of the interagency working group is to ensure that...

  13. Putting the Pieces Together: Making Interagency Collaboration Work. Preschool Interagency Council: A Model.

    ERIC Educational Resources Information Center

    Morgan, Jan

    The manual suggests steps, procedures, best practices, and key concepts to create and maintain successful interagency collaboration in the delivery of services to preschool handicapped children. Considered are the concept of collaboration, the rationale for its development, and a means of assessing a community to determine the needs and potential…

  14. Scientific Use Cases for the Virtual Atomic and Molecular Data Center

    NASA Astrophysics Data System (ADS)

    Dubernet, M. L.; Aboudarham, J.; Ba, Y. A.; Boiziot, M.; Bottinelli, S.; Caux, E.; Endres, C.; Glorian, J. M.; Henry, F.; Lamy, L.; Le Sidaner, P.; Møller, T.; Moreau, N.; Rénié, C.; Roueff, E.; Schilke, P.; Vastel, C.; Zwoelf, C. M.

    2014-12-01

The VAMDC Consortium is a worldwide consortium that federates interoperable atomic and molecular databases through an e-science infrastructure. The contained data are of the highest scientific quality and are crucial for many applications: astrophysics, atmospheric physics, fusion, plasma and lighting technologies, health, etc. In this paper we present astrophysical scientific use cases in relation to the use of the VAMDC e-infrastructure. These cover very different applications, such as: (i) modeling the spectra of interstellar objects using the myXCLASS software tool implemented in the Common Astronomy Software Applications package (CASA) or using the CASSIS software tool, in its stand-alone version or implemented in the Herschel Interactive Processing Environment (HIPE); (ii) the use of Virtual Observatory tools accessing VAMDC databases; (iii) the access of VAMDC from the Paris solar BASS2000 portal; (iv) the combination of tools and database from the APIS service (Auroral Planetary Imaging and Spectroscopy); (v) the combination of heterogeneous data for the application to the interstellar medium from the SPECTCOL tool.

  15. The NorWeST Stream Temperature Database, Model, and Climate Scenarios for the Northwest U.S. (Invited)

    NASA Astrophysics Data System (ADS)

    Isaak, D.; Wenger, S.; Peterson, E.; Ver Hoef, J.; Luce, C.; Hostetler, S. W.; Kershner, J.; Dunham, J.; Nagel, D.; Roper, B.

    2013-12-01

    Anthropogenic climate change is warming the Earth's rivers and streams and threatens significant changes to aquatic biodiversity. Effective threat response will require prioritization of limited conservation resources and coordinated interagency efforts guided by accurate information about climate, and climate change, at scales relevant to the distributions of species across landscapes. Here, we describe the NorWeST (i.e., NorthWest Stream Temperature) project to develop a comprehensive interagency stream temperature database and high-resolution climate scenarios across Washington, Oregon, Idaho, Montana, and Wyoming (~400,000 stream kilometers). The NorWeST database consists of stream temperature data contributed by >60 state, federal, tribal, and private resource agencies and may be the largest of its kind in the world (>45,000,000 hourly temperature recordings at >15,000 unique monitoring sites). These data are being used with spatial statistical network models to accurately downscale (R2 = 90%; RMSE < 1 C) global climate patterns to all perennially flowing reaches within river networks at 1-kilometer resolution. Historic stream temperature scenarios are developed using air temperature data from RegCM3 runs for the NCEP historical reanalysis and future scenarios (2040s and 2080s) are developed by applying bias corrected air temperature and discharge anomalies from ensemble climate and hydrology model runs for A1B and A2 warming trajectories. At present, stream temperature climate scenarios have been developed for 230,000 stream kilometers across Idaho and western Montana using data from more than 7,000 monitoring sites. The raw temperature data and stream climate scenarios are made available as ArcGIS geospatial products for download through the NorWeST website as individual river basins are completed (http://www.fs.fed.us/rm/boise/AWAE/projects/NorWeST.shtml). 
By providing open access to temperature data and scenarios, the project is fostering new research on stream temperatures and better collaborative management of aquatic resources through improved: 1) climate vulnerability assessments for sensitive species, 2) decision support tools that use regionally consistent scenarios, 3) water quality assessments, and 4) temperature and biological monitoring programs. Additional project details are contained in this Great Northern Landscape Conservation Cooperative newsletter (http://greatnorthernlcc.org/features/streamtemp-database).

  16. A Model of Federal Interagency Cooperation: The National Interagency Confederation for Biological Research

    PubMed Central

    Wright, Mary; Clifford Lane, H.; Schoomaker, Eric B.

    2014-01-01

    The terrorist attacks of September 11 and the anthrax mailings a month later prompted a sweeping response by the federal government to improve the preparedness of the US to meet the potential threat posed by a terrorist using a biological agent. This response transcended traditional interagency boundaries, creating new opportunities while producing unique fiscal and leadership challenges. The National Interagency Confederation for Biological Research has made significant progress over the past 12 years because of its ability to adapt to the need for interagency cooperation and overcome many of these challenges. As construction of the National Interagency Biodefense Campus at Fort Detrick nears completion, the US has the capability to pursue a unique whole-of-government approach to the development of medical measures to counter the threat of bioterrorism. In addition to the high-level support of many in the federal government, the key success factors for this effort have been (1) a critical mass of leaders with the right leadership characteristics, (2) development of a compelling vision and accompanying narrative understood and articulated by all partnering organizations, and (3) recognition of the need for a partnership office to do the important communication and collaboration work in the organization to synchronize the information available to all the partners. The major barrier to interagency cooperative efforts of this kind is the inability to comingle funds from different appropriations. PMID:24819736

  17. A model of federal interagency cooperation: the National Interagency Confederation for Biological Research.

    PubMed

    Gilman, James K; Wright, Mary; Clifford Lane, H; Schoomaker, Eric B

    2014-01-01

    The terrorist attacks of September 11 and the anthrax mailings a month later prompted a sweeping response by the federal government to improve the preparedness of the US to meet the potential threat posed by a terrorist using a biological agent. This response transcended traditional interagency boundaries, creating new opportunities while producing unique fiscal and leadership challenges. The National Interagency Confederation for Biological Research has made significant progress over the past 12 years because of its ability to adapt to the need for interagency cooperation and overcome many of these challenges. As construction of the National Interagency Biodefense Campus at Fort Detrick nears completion, the US has the capability to pursue a unique whole-of-government approach to the development of medical measures to counter the threat of bioterrorism. In addition to the high-level support of many in the federal government, the key success factors for this effort have been (1) a critical mass of leaders with the right leadership characteristics, (2) development of a compelling vision and accompanying narrative understood and articulated by all partnering organizations, and (3) recognition of the need for a partnership office to do the important communication and collaboration work in the organization to synchronize the information available to all the partners. The major barrier to interagency cooperative efforts of this kind is the inability to comingle funds from different appropriations.

  18. Supply Chain Interoperability Measurement

    DTIC Science & Technology

    2015-06-19

Supply Chain Interoperability Measurement. DISSERTATION, June 2015. Christos E. Chalyvidis, Major, Hellenic Air...ENS-DS-15-J-001. Presented to the Faculty, Department of Operational Sciences...Christos E. Chalyvidis, BS, MSc, Major, Hellenic Air Force. Committee Membership: Dr. A.W. Johnson, Chair

  19. Disease management as a performance improvement strategy.

    PubMed

    McClatchey, S

    2001-11-01

    Disease management is a strategy of organizing care and services for a patient population across the continuum. It is characterized by a population database, interdisciplinary and interagency collaboration, and evidence-based clinical information. The effectiveness of a disease management program has been measured by a combination of clinical, financial, and quality of life outcomes. In early 1997, driven by a strategic planning process that established three Centers of Excellence (COE), we implemented disease management as the foundation for a new approach to performance improvement utilizing five key strategies. The five implementation strategies are outlined, in addition to a review of the key elements in outcome achievement.

  20. 78 FR 76889 - Proposed Addendum to the Interagency Policy Statement on Income Tax Allocation in a Holding...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-19

    ... Interagency Policy Statement on Income Tax Allocation in a Holding Company Structure AGENCY: Board of... ``Interagency Policy Statement on Income Tax Allocation in a Holding Company Structure'' (63 FR 64757, Nov. 23... appropriate relationship regarding the payment of taxes and treatment of tax refunds. The Proposed Addendum...

  1. 75 FR 62399 - Public Meeting To Solicit Input for a Strategic Plan for Federal Youth Policy

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-08

    ... that the Interagency Working Group on Youth Programs solicit input from young people, State children's... Services, in its role as the Chair of the Interagency Working Group on Youth Programs, is announcing a... site for the Interagency Working Group on Youth Programs at http://www.FindYouthInfo.gov for...

  2. 75 FR 80054 - Input for a Strategic Plan for Federal Youth Policy

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-21

    ... role as the Chair of the Interagency Working Group on Youth Programs requests public comments to inform...: Visit the Web site for the Interagency Working Group on Youth Programs at http://www.FindYouthInfo.gov... to FindYouth[email protected] . SUPPLEMENTARY INFORMATION: I. Overview of the Interagency Working Group on...

  3. 75 FR 48690 - Public Meeting To Solicit Input for a Strategic Plan for Federal Youth Policy

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-11

    ... directing the Interagency Working Group on Youth Programs to solicit input from young people, State children... Services, in its role as the Chair of the Interagency Working Group on Youth Programs, is announcing a... site for the Interagency Working Group on Youth Programs at http://www.FindYouthInfo.gov for...

  4. 75 FR 60756 - Public Meeting to Solicit Input for a Strategic Plan for Federal Youth Policy

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-01

    ... Services, in its role as the Chair of the Interagency Working Group on Youth Programs, is announcing a... FURTHER INFORMATION CONTACT: Visit the Web site for the Interagency Working Group on Youth Programs at http://www.FindYouthInfo.gov for information on how to register, or contact the Interagency Working...

  5. 49 CFR 801.55 - Interagency and intra-agency exchanges.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 7 2010-10-01 2010-10-01 false Interagency and intra-agency exchanges. 801.55... Interagency and intra-agency exchanges. (a) Pursuant to 5 U.S.C. 552(b)(5), any record prepared by an NTSB..., or contracting officer. (b) The purpose of this section is to protect the full and frank exchange of...

  6. 78 FR 24429 - Agency Information Collection Activities: Inter-Agency Alien Witness and Informant Record, Form I...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-25

    ...-0046] Agency Information Collection Activities: Inter-Agency Alien Witness and Informant Record, Form I... the Form/Collection: Inter-Agency Alien Witness and Informant Record. (3) Agency form number, if any... Government. Form I-854 is used by law enforcement agencies to bring alien witnesses and informants to the...

  7. Interoperability and information discovery

    USGS Publications Warehouse

    Christian, E.

    2001-01-01

    In the context of information systems, there is interoperability when the distinctions between separate information systems are not a barrier to accomplishing a task that spans those systems. Interoperability so defined implies that there are commonalities among the systems involved and that one can exploit such commonalities to achieve interoperability. The challenge of a particular interoperability task is to identify relevant commonalities among the systems involved and to devise mechanisms that exploit those commonalities. The present paper focuses on the particular interoperability task of information discovery. The Global Information Locator Service (GILS) is described as a policy, standards, and technology framework for addressing interoperable information discovery on a global and long-term basis. While there are many mechanisms for people to discover and use all manner of data and information resources, GILS initiatives exploit certain key commonalities that seem to be sufficient to realize useful information discovery interoperability at a global, long-term scale. This paper describes ten of the specific commonalities that are key to GILS initiatives. It presents some of the practical implications for organizations in various roles: content provider, system engineer, intermediary, and searcher. The paper also provides examples of interoperable information discovery as deployed using GILS in four types of information communities: bibliographic, geographic, environmental, and government.
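
    The core idea above, that discovery works when heterogeneous systems expose a small set of shared fields, can be reduced to a short sketch. This is an illustrative simplification, not the GILS protocol itself; the common field names and record shapes below are assumptions.

```python
def normalize(record: dict, field_map: dict) -> dict:
    """Map a system-specific record onto shared discovery fields.

    `field_map` says which local key holds each common field; the common
    field names used here ('title', 'originator') are illustrative.
    """
    return {common: record.get(local) for common, local in field_map.items()}

def discover(term: str, systems: list) -> list:
    """Search several systems through their common fields only.

    `systems` is a list of (records, field_map) pairs, one per system.
    """
    hits = []
    for records, field_map in systems:
        for rec in records:
            norm = normalize(rec, field_map)
            if term.lower() in (norm.get("title") or "").lower():
                hits.append(norm)
    return hits
```

    Each system keeps its native schema; only the thin mapping layer, the exploited commonality, has to be agreed on.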

  8. 80 FR 46010 - Promoting Semantic Interoperability of Laboratory Data; Public Workshop; Request for Comments

    Federal Register 2010, 2011, 2012, 2013, 2014

    2015-08-03

    ...] Promoting Semantic Interoperability of Laboratory Data; Public Workshop; Request for Comments AGENCY: Food... workshop entitled ``FDA/CDC/NLM Workshop on Promoting Semantic Interoperability of Laboratory Data.'' The... to promoting the semantic interoperability of laboratory data between in vitro diagnostic devices and...

  9. Metadata mapping and reuse in caBIG.

    PubMed

    Kunz, Isaac; Lin, Ming-Chin; Frey, Lewis

    2009-02-05

    This paper proposes that interoperability across biomedical databases can be improved by utilizing a repository of Common Data Elements (CDEs), UML model class-attributes, and simple lexical algorithms to facilitate the building of domain models. This is examined in the context of an existing system, the National Cancer Institute (NCI)'s cancer Biomedical Informatics Grid (caBIG). The goal is to demonstrate the deployment of open source tools that can be used to effectively map models and enable the reuse of existing information objects and CDEs in the development of new models for translational research applications. This effort is intended to help developers reuse appropriate CDEs to enable interoperability of their systems when developing within the caBIG framework or other frameworks that use metadata repositories. The Dice (di-gram-based) and Dynamic algorithms are compared, and both have similar performance in matching UML model class-attributes to CDE class object-property pairs. With the algorithms used, the baselines for automatically finding the matches are reasonable for the data models examined. This suggests that automatic mapping of UML models and CDEs is feasible within the caBIG framework and potentially any framework that uses a metadata repository. This work opens up the possibility of using mapping algorithms to reduce the cost and time required to map local data models to a reference data model such as those used within caBIG. This effort contributes to facilitating the development of interoperable systems within caBIG as well as other metadata frameworks. Such efforts are critical to address the need to develop systems that can handle the enormous amounts of diverse data that can be leveraged from new biomedical methodologies.
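
    The lexical matching described above can be illustrated with a Dice coefficient over character di-grams. This is a minimal sketch of the general technique, not the caBIG tooling itself; the example identifiers are invented.

```python
def bigrams(s: str) -> set:
    """Character bigrams of a lower-cased string, ignoring non-alphanumerics."""
    s = "".join(ch for ch in s.lower() if ch.isalnum())
    return {s[i:i + 2] for i in range(len(s) - 1)}

def dice(a: str, b: str) -> float:
    """Dice coefficient 2|A∩B| / (|A| + |B|) over character bigrams."""
    ba, bb = bigrams(a), bigrams(b)
    if not ba or not bb:
        return 0.0
    return 2 * len(ba & bb) / (len(ba) + len(bb))

def best_match(attribute: str, cde_properties: list) -> tuple:
    """Return the CDE property with the highest Dice score for a UML attribute."""
    return max(((p, dice(attribute, p)) for p in cde_properties),
               key=lambda pair: pair[1])
```

    For example, `best_match("patientBirthDate", [...])` would rank "Patient Birth Date" above unrelated property names, since case and whitespace are normalized away before the bigram comparison.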

  10. Designing for scale: optimising the health information system architecture for mobile maternal health messaging in South Africa (MomConnect)

    PubMed Central

    Seebregts, Christopher; Dane, Pierre; Parsons, Annie Neo; Fogwill, Thomas; Rogers, Debbie; Bekker, Marcha; Shaw, Vincent; Barron, Peter

    2018-01-01

    MomConnect is a national initiative coordinated by the South African National Department of Health that sends text-based mobile phone messages free of charge to pregnant women who voluntarily register at any public healthcare facility in South Africa. We describe the system design and architecture of the MomConnect technical platform, planned as a nationally scalable and extensible initiative. It uses a health information exchange that can connect any standards-compliant electronic front-end application to any standards-compliant electronic back-end database. The implementation of the MomConnect technical platform, in turn, is a national reference application for electronic interoperability in line with the South African National Health Normative Standards Framework. The use of open content and messaging standards enables the architecture to include any application adhering to the selected standards. Its national implementation at scale demonstrates both the use of this technology and a key objective of global health information systems, which is to achieve implementation scale. The system’s limited clinical information, initially, allowed the architecture to focus on the base standards and profiles for interoperability in a resource-constrained environment with limited connectivity and infrastructural capacity. Maintenance of the system requires mobilisation of national resources. Future work aims to use the standard interfaces to include data from additional applications as well as to extend and interface the framework with other public health information systems in South Africa. The development of this platform has also shown the benefits of interoperability at both an organisational and technical level in South Africa. PMID:29713506

  11. Designing for scale: optimising the health information system architecture for mobile maternal health messaging in South Africa (MomConnect).

    PubMed

    Seebregts, Christopher; Dane, Pierre; Parsons, Annie Neo; Fogwill, Thomas; Rogers, Debbie; Bekker, Marcha; Shaw, Vincent; Barron, Peter

    2018-01-01

    MomConnect is a national initiative coordinated by the South African National Department of Health that sends text-based mobile phone messages free of charge to pregnant women who voluntarily register at any public healthcare facility in South Africa. We describe the system design and architecture of the MomConnect technical platform, planned as a nationally scalable and extensible initiative. It uses a health information exchange that can connect any standards-compliant electronic front-end application to any standards-compliant electronic back-end database. The implementation of the MomConnect technical platform, in turn, is a national reference application for electronic interoperability in line with the South African National Health Normative Standards Framework. The use of open content and messaging standards enables the architecture to include any application adhering to the selected standards. Its national implementation at scale demonstrates both the use of this technology and a key objective of global health information systems, which is to achieve implementation scale. The system's limited clinical information, initially, allowed the architecture to focus on the base standards and profiles for interoperability in a resource-constrained environment with limited connectivity and infrastructural capacity. Maintenance of the system requires mobilisation of national resources. Future work aims to use the standard interfaces to include data from additional applications as well as to extend and interface the framework with other public health information systems in South Africa. The development of this platform has also shown the benefits of interoperability at both an organisational and technical level in South Africa.
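
    The front-end/back-end decoupling described above rests on every participant emitting an agreed message shape: any compliant front-end can then reach any compliant back-end. A minimal sketch of such a compliance gate follows; the field names are invented for illustration, since MomConnect's actual profiles are defined by the national Normative Standards Framework.

```python
import json

def to_exchange_message(raw: dict) -> str:
    """Validate a registration against the agreed fields and serialize it.

    Non-compliant messages are rejected at the exchange boundary rather
    than propagated to back-end databases.
    """
    required = ("facility_code", "patient_id", "gestational_age_weeks")
    missing = [f for f in required if f not in raw]
    if missing:
        raise ValueError(f"non-compliant message, missing: {missing}")
    # Keep only the agreed fields so every back-end sees the same shape.
    return json.dumps({f: raw[f] for f in required}, sort_keys=True)
```

    The design choice mirrors the text: the exchange owns the contract, so adding a new front-end application requires no change to any back-end.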

  12. Filling the gaps between tools and users: a tool comparator, using protein-protein interaction as an example.

    PubMed

    Kano, Yoshinobu; Nguyen, Ngan; Saetre, Rune; Yoshida, Kazuhiro; Miyao, Yusuke; Tsuruoka, Yoshimasa; Matsubayashi, Yuichiro; Ananiadou, Sophia; Tsujii, Jun'ichi

    2008-01-01

    Recently, several text mining programs have reached a near-practical level of performance. Some systems are already being used by biologists and database curators. However, it has also been recognized that current Natural Language Processing (NLP) and Text Mining (TM) technology is not easy to deploy, since research groups tend to develop systems that cater specifically to their own requirements. One of the major reasons for the difficulty of deployment of NLP/TM technology is that re-usability and interoperability of software tools are typically not considered during development. While some effort has been invested in making interoperable NLP/TM toolkits, the developers of end-to-end systems still often struggle to reuse NLP/TM tools, and often opt to develop similar programs from scratch instead. This is particularly the case in BioNLP, since the requirements of biologists are so diverse that NLP tools have to be adapted and re-organized in a much more extensive manner than was originally expected. Although generic frameworks like UIMA (Unstructured Information Management Architecture) provide promising ways to solve this problem, the solution that they provide is only partial. In order for truly interoperable toolkits to become a reality, we also need sharable type systems and a developer-friendly environment for software integration that includes functionality for systematic comparisons of available tools, a simple I/O interface, and visualization tools. In this paper, we describe such an environment that was developed based on UIMA, and we show its feasibility through our experience in developing a protein-protein interaction (PPI) extraction system.
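
    The systematic tool comparison the authors call for can be reduced to scoring agreement between annotation spans expressed in a shared type system. The following is a toy stand-in for that idea, not UIMA code; the span representation is an assumption.

```python
def span_prf(gold: set, predicted: set) -> tuple:
    """Precision, recall, and F1 for annotation spans.

    Each span is a (start, end, type) tuple from a shared type system,
    so the outputs of two different tools are directly comparable.
    """
    tp = len(gold & predicted)                       # exact-span matches
    p = tp / len(predicted) if predicted else 0.0
    r = tp / len(gold) if gold else 0.0
    f = 2 * p * r / (p + r) if p + r else 0.0
    return p, r, f
```

    With a common span representation, swapping one PPI-extraction component for another becomes a measurable decision rather than a rewrite.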

  13. Assessing the readiness of precision medicine interoperability: An exploratory study of the National Institutes of Health genetic testing registry.

    PubMed

    Ronquillo, Jay G; Weng, Chunhua; Lester, William T

    2017-11-17

    Precision medicine involves three major innovations currently taking place in healthcare: electronic health records, genomics, and big data. A major challenge for healthcare providers, however, is understanding the readiness for practical application of initiatives like precision medicine. To better understand the current state and challenges of precision medicine interoperability, we used a national genetic testing registry as a starting point, placed in the context of established interoperability formats. We performed an exploratory analysis of the National Institutes of Health Genetic Testing Registry. Relevant standards included the Health Level Seven International Version 3 Implementation Guide for Family History, the Human Genome Organization Gene Nomenclature Committee (HGNC) database, and the Systematized Nomenclature of Medicine - Clinical Terms (SNOMED CT). We analyzed the distribution of genetic testing laboratories, genetic test characteristics, and standardized genome/clinical code mappings, stratified by laboratory setting. There were a total of 25472 genetic tests from 240 laboratories testing for approximately 3632 distinct genes. Most tests focused on diagnosis, mutation confirmation, and/or risk assessment of germline mutations that could be passed to offspring. Genes were successfully mapped to all HGNC identifiers, but less than half of the tests mapped to SNOMED CT codes, highlighting significant gaps when linking genetic tests to the standardized clinical codes that explain the medical motivations behind test ordering. While precision medicine could potentially transform healthcare, successful practical and clinical application will first require the comprehensive and responsible adoption of interoperable standards, terminologies, and formats across all aspects of the precision medicine pipeline.
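
    The coverage analysis reported here, i.e., what fraction of tests carry a standardized clinical code, stratified by laboratory setting, reduces to a grouped ratio. A sketch with hypothetical record fields (the registry's actual data model is richer):

```python
from collections import defaultdict

def mapping_coverage(tests: list) -> dict:
    """Fraction of tests with a SNOMED CT code, per laboratory setting.

    `tests` is a list of dicts with illustrative keys 'setting' and
    'snomed_code' (None when no standardized clinical code was mapped).
    """
    totals, mapped = defaultdict(int), defaultdict(int)
    for t in tests:
        totals[t["setting"]] += 1
        if t.get("snomed_code"):
            mapped[t["setting"]] += 1
    return {s: mapped[s] / totals[s] for s in totals}
```

    Ratios well below 1.0 in such an analysis are what the abstract describes as gaps between genetic tests and standardized clinical codes.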

  14. Inter-University Upper Atmosphere Global Observation Network (IUGONET) Metadata Database and Its Interoperability

    NASA Astrophysics Data System (ADS)

    Yatagai, A. I.; Iyemori, T.; Ritschel, B.; Koyama, Y.; Hori, T.; Abe, S.; Tanaka, Y.; Shinbori, A.; Umemura, N.; Sato, Y.; Yagi, M.; Ueno, S.; Hashiguchi, N. O.; Kaneda, N.; Belehaki, A.; Hapgood, M. A.

    2013-12-01

    The IUGONET is a Japanese program to build a metadata database for ground-based observations of the upper atmosphere [1]. The project began in 2009 with five Japanese institutions which archive data observed by radars, magnetometers, photometers, radio telescopes, helioscopes, and so on, at various altitudes from the Earth's surface to the Sun. Systems have been developed to allow searching of the above-described metadata, and we have been updating the system and adding new and updated metadata. The IUGONET development team adopted the SPASE metadata model [2] to describe the upper atmosphere data. This model is used as the common metadata format by the virtual observatories for solar-terrestrial physics. It includes metadata referring to each data file (called a 'Granule'), which enables a search for data files as well as data sets. Further details are described in [2] and [3]. Currently, three additional Japanese institutions are being incorporated into IUGONET. Furthermore, metadata from observations of the troposphere, taken at the observatories of the middle and upper atmosphere radar at Shigaraki and the meteor radar in Indonesia, have been incorporated. These additions will contribute to efficient interdisciplinary scientific research. In the beginning of 2013, the registration of the 'Observatory' and 'Instrument' metadata was completed, which makes it easy to gain an overview of the metadata database. The number of registered metadata records totalled 8.8 million as of the end of July, covering 793 observatories and 878 instruments. It is important to promote interoperability and/or metadata exchange between the database development groups. A memorandum of agreement establishing a framework for formal collaboration has been signed with the European Near-Earth Space Data Infrastructure for e-Science (ESPAS) project, which has objectives similar to IUGONET's.
Furthermore, observations by satellites and the International Space Station are being incorporated, with a view to making and linking metadata databases. The development of effective data systems will contribute to the progress of scientific research on solar-terrestrial physics, climate, and the geophysical environment. Any kind of cooperation, metadata input, and feedback, especially for linkage of the databases, is welcomed. References: 1. Hayashi, H. et al., Inter-university Upper Atmosphere Global Observation Network (IUGONET), Data Sci. J., 12, WDS179-184, 2013. 2. King, T. et al., SPASE 2.0: A standard data model for space physics, Earth Sci. Inform., 3, 67-73, 2010, doi:10.1007/s12145-010-0053-4. 3. Hori, T., et al., Development of IUGONET metadata format and metadata management system, J. Space Sci. Info. Jpn., 105-111, 2012. (in Japanese)
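
    Granule-level metadata of the sort SPASE defines is what enables file-level search by instrument and time window. A simplified sketch of such a search follows; the field names are illustrative, not the SPASE schema.

```python
from datetime import datetime

def find_granules(granules: list, instrument_id: str,
                  start: datetime, end: datetime) -> list:
    """Return URLs of granules for one instrument whose observation
    window overlaps the interval [start, end]."""
    hits = []
    for g in granules:
        if g["instrument"] != instrument_id:
            continue
        g_start = datetime.fromisoformat(g["start"])
        g_end = datetime.fromisoformat(g["end"])
        if g_start <= end and g_end >= start:   # interval overlap test
            hits.append(g["url"])
    return hits
```

    Because each granule carries its own time bounds, a user can locate individual data files rather than whole data sets, which is the benefit the abstract attributes to the Granule-level metadata.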

  15. BioWarehouse: a bioinformatics database warehouse toolkit

    PubMed Central

    Lee, Thomas J; Pouliot, Yannick; Wagner, Valerie; Gupta, Priyanka; Stringer-Calvert, David WJ; Tenenbaum, Jessica D; Karp, Peter D

    2006-01-01

    Background This article addresses the problem of interoperation of heterogeneous bioinformatics databases. Results We introduce BioWarehouse, an open source toolkit for constructing bioinformatics database warehouses using the MySQL and Oracle relational database managers. BioWarehouse integrates its component databases into a common representational framework within a single database management system, thus enabling multi-database queries using the Structured Query Language (SQL) but also facilitating a variety of database integration tasks such as comparative analysis and data mining. BioWarehouse currently supports the integration of a pathway-centric set of databases including ENZYME, KEGG, and BioCyc, and in addition the UniProt, GenBank, NCBI Taxonomy, and CMR databases, and the Gene Ontology. Loader tools, written in the C and Java languages, parse and load these databases into a relational database schema. The loaders also apply a degree of semantic normalization to their respective source data, decreasing semantic heterogeneity. The schema supports the following bioinformatics datatypes: chemical compounds, biochemical reactions, metabolic pathways, proteins, genes, nucleic acid sequences, features on protein and nucleic-acid sequences, organisms, organism taxonomies, and controlled vocabularies. As an application example, we applied BioWarehouse to determine the fraction of biochemically characterized enzyme activities for which no sequences exist in the public sequence databases. The answer is that no sequence exists for 36% of enzyme activities for which EC numbers have been assigned. These gaps in sequence data significantly limit the accuracy of genome annotation and metabolic pathway prediction, and are a barrier for metabolic engineering. Complex queries of this type provide examples of the value of the data warehousing approach to bioinformatics research.
Conclusion BioWarehouse embodies significant progress on the database integration problem for bioinformatics. PMID:16556315
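
    The orphan-enzyme query described above is, in essence, an anti-join across integrated source tables. A much-simplified sketch using SQLite follows; the real BioWarehouse schema and loaders are far richer, and the table and column names here are invented for illustration.

```python
import sqlite3

# Hypothetical, much-simplified warehouse tables.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE enzyme_activity (ec_number TEXT PRIMARY KEY);
    CREATE TABLE protein (id INTEGER PRIMARY KEY, ec_number TEXT,
                          source_db TEXT);  -- e.g. loaded from UniProt, GenBank
""")
con.executemany("INSERT INTO enzyme_activity VALUES (?)",
                [("1.1.1.1",), ("2.7.1.1",), ("4.2.1.99",)])
con.executemany("INSERT INTO protein VALUES (?, ?, ?)",
                [(1, "1.1.1.1", "UniProt"), (2, "2.7.1.1", "GenBank")])

# Anti-join: enzyme activities with no sequence in any loaded source database.
orphans = con.execute("""
    SELECT ea.ec_number FROM enzyme_activity ea
    LEFT JOIN protein p ON p.ec_number = ea.ec_number
    WHERE p.id IS NULL
""").fetchall()
```

    Because all sources sit in one schema, the question spans every loaded database in a single SQL statement, which is the warehousing payoff the article describes.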

  16. BioWarehouse: a bioinformatics database warehouse toolkit.

    PubMed

    Lee, Thomas J; Pouliot, Yannick; Wagner, Valerie; Gupta, Priyanka; Stringer-Calvert, David W J; Tenenbaum, Jessica D; Karp, Peter D

    2006-03-23

    This article addresses the problem of interoperation of heterogeneous bioinformatics databases. We introduce BioWarehouse, an open source toolkit for constructing bioinformatics database warehouses using the MySQL and Oracle relational database managers. BioWarehouse integrates its component databases into a common representational framework within a single database management system, thus enabling multi-database queries using the Structured Query Language (SQL) but also facilitating a variety of database integration tasks such as comparative analysis and data mining. BioWarehouse currently supports the integration of a pathway-centric set of databases including ENZYME, KEGG, and BioCyc, and in addition the UniProt, GenBank, NCBI Taxonomy, and CMR databases, and the Gene Ontology. Loader tools, written in the C and Java languages, parse and load these databases into a relational database schema. The loaders also apply a degree of semantic normalization to their respective source data, decreasing semantic heterogeneity. The schema supports the following bioinformatics datatypes: chemical compounds, biochemical reactions, metabolic pathways, proteins, genes, nucleic acid sequences, features on protein and nucleic-acid sequences, organisms, organism taxonomies, and controlled vocabularies. As an application example, we applied BioWarehouse to determine the fraction of biochemically characterized enzyme activities for which no sequences exist in the public sequence databases. The answer is that no sequence exists for 36% of enzyme activities for which EC numbers have been assigned. These gaps in sequence data significantly limit the accuracy of genome annotation and metabolic pathway prediction, and are a barrier for metabolic engineering. Complex queries of this type provide examples of the value of the data warehousing approach to bioinformatics research. BioWarehouse embodies significant progress on the database integration problem for bioinformatics.

  17. 75 FR 63462 - Smart Grid Interoperability Standards; Notice of Docket Designation for Smart Grid...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-15

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. RM11-2-000] Smart Grid Interoperability Standards; Notice of Docket Designation for Smart Grid Interoperability Standards October 7, 2010... directs the development of a framework to achieve interoperability of smart grid devices and systems...

  18. 81 FR 68435 - Workshop on Promoting Semantic Interoperability of Laboratory Data; Public Workshop; Request for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2016-10-04

    ...] Workshop on Promoting Semantic Interoperability of Laboratory Data; Public Workshop; Request for Comments... Semantic Interoperability of Laboratory Data.'' The purpose of this public workshop is to receive and... Semantic Interoperability of Laboratory Data.'' Received comments will be placed in the docket and, except...

  19. Evaluation Methodology for UML and GML Application Schemas Quality

    NASA Astrophysics Data System (ADS)

    Chojka, Agnieszka

    2014-05-01

    INSPIRE Directive implementation in Poland has caused a significant increase of interest in making spatial data and services available, particularly among public administration and private institutions. This has entailed a series of initiatives that aim to harmonise different spatial data sets, so as to ensure their internal logical and semantic coherence. Harmonisation makes spatial databases interoperable and, among other things, enables them to be joined together. The process of harmonisation requires either working out new data structures or adjusting the existing data structures of spatial databases to INSPIRE guidelines and recommendations. Data structures are described with the use of UML and GML application schemas. Working out accurate and correct application schemas is not an easy task, however. Many issues have to be considered, for instance the recommendations of the ISO 19100 series of Geographic Information Standards, the regulations appropriate to a given problem or topic, and production opportunities and limitations (software, tools). In addition, a GML application schema is deeply connected with its UML application schema; it should be a translation of it. Not everything that can be expressed in UML, though, can be directly expressed in GML, and this can significantly influence the interoperability of spatial data sets and thereby the ability to exchange valid data. For these reasons, the capability to examine and estimate the quality of UML and GML application schemas, including the capability to explore their entropy, would be very important. The principal subject of this research is to propose an evaluation methodology for the quality of UML and GML application schemas prepared in the Head Office of Geodesy and Cartography in Poland within the INSPIRE Directive implementation works.
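
    One of the quality indicators mentioned above, schema entropy, can be computed from the distribution of names used in a schema. The following is a minimal sketch; the choice of counting element names is an assumption, since the paper leaves the exact measure open.

```python
import math
from collections import Counter

def name_entropy(element_names: list) -> float:
    """Shannon entropy (in bits) of the distribution of element names.

    As an illustrative quality signal: very low entropy (highly repetitive
    naming) or very high entropy (near-random naming) can both flag
    schema-design issues worth a closer look.
    """
    counts = Counter(element_names)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())
```

    The same function applies to a UML schema and its GML translation, so diverging entropy values between the two could hint that the translation lost or distorted structure.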

  20. A Hybrid EAV-Relational Model for Consistent and Scalable Capture of Clinical Research Data.

    PubMed

    Khan, Omar; Lim Choi Keung, Sarah N; Zhao, Lei; Arvanitis, Theodoros N

    2014-01-01

    Many clinical research databases are built for specific purposes and their design is often guided by the requirements of their particular setting. Not only does this lead to issues of interoperability and reusability between research groups in the wider community but, within the project itself, changes and additions to the system could be implemented using an ad hoc approach, which may make the system difficult to maintain and even more difficult to share. In this paper, we outline a hybrid Entity-Attribute-Value and relational model approach for modelling data, in light of frequently changing requirements, which enables the back-end database schema to remain static, improving the extensibility and scalability of an application. The model also facilitates data reuse. The methods used build on the modular architecture previously introduced in the CURe project.
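
    The hybrid idea above, a fixed relational core plus an EAV table absorbing new attributes without schema changes, can be sketched in a few lines of SQL via SQLite. The table and column names are invented for illustration, not taken from the CURe schema.

```python
import sqlite3

con = sqlite3.connect(":memory:")
# Relational core holds stable entities; the EAV table takes whatever
# attributes a study later adds, with no ALTER TABLE required.
con.executescript("""
    CREATE TABLE subject (id INTEGER PRIMARY KEY, enrolled TEXT);
    CREATE TABLE subject_attr (subject_id INTEGER REFERENCES subject(id),
                               attribute TEXT, value TEXT);
""")
con.execute("INSERT INTO subject VALUES (1, '2014-01-01')")
con.executemany("INSERT INTO subject_attr VALUES (?, ?, ?)",
                [(1, "smoking_status", "former"), (1, "bmi", "27.4")])

# Pivot the EAV rows back into columns for analysis.
row = con.execute("""
    SELECT s.id,
           MAX(CASE WHEN a.attribute = 'smoking_status' THEN a.value END),
           MAX(CASE WHEN a.attribute = 'bmi' THEN a.value END)
    FROM subject s JOIN subject_attr a ON a.subject_id = s.id
    GROUP BY s.id
""").fetchone()
```

    The trade-off is the one the paper motivates: inserts never change the back-end schema, at the cost of pivot queries like the one above when a conventional row shape is needed.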

  1. Robotics Systems Joint Project Office (RSJPO) Interoperability Profiles (IOPS) 101

    DTIC Science & Technology

    2012-07-01

    interoperability, although they are supported by some interoperability attributes  For example, stair climbing » Stair climbing is not something that...IOPs need to specify » However, the mobility & actuation related interoperable messages can be used to provide stair climbing » Also...interoperability can enable management of different poses or modes, one of which may be stair climbing R O B O T IC S Y S T E M S J P O L e a d e r s h i p

  2. Alternatives to animal testing: information resources via the Internet and World Wide Web.

    PubMed

    Hakkinen, P J Bert; Green, Dianne K

    2002-04-25

    Many countries, including the United States, Canada, European Union member states, and others, require that a comprehensive search for possible alternatives be completed before beginning some or all research involving animals. Completing comprehensive alternatives searches and keeping current with information associated with alternatives to animal testing is a challenge that will be made easier as people throughout the world gain access to the Internet and World Wide Web. Numerous Internet and World Wide Web resources are available to provide guidance and other information on in vitro and other alternatives to animal testing. A comprehensive Web site is Alternatives to Animal Testing on the Web (Altweb), which serves as an online clearinghouse for resources, information, and news about alternatives to animal testing. Examples of other important Web sites include the joint one for the (US) Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM) and the National Toxicology Program (NTP) Interagency Center for the Evaluation of Alternative Toxicological Methods (NICEATM) and the Norwegian Reference Centre for Laboratory Animal Science and Alternatives (The NORINA database). Internet mailing lists and online access to bulletin boards, discussion areas, newsletters, and journals are other ways to access and share information to stay current with alternatives to animal testing.

  3. Interprofessional and Interagency Training for Working with Young People with Harmful Sexual Behaviours: An Evaluation of Outcomes

    ERIC Educational Resources Information Center

    Hackett, Simon; Carpenter, John; Patsios, Demi; Szilassy, Eszter

    2013-01-01

    This study evaluates the outcomes of short interagency training courses provided by six Local Safeguarding Children Boards in England. The aim was to develop practical skills in recognising and responding to the needs of children with harmful sexual behaviour in an interagency context. The courses all employed interactive learning and teaching…

  4. 75 FR 62400 - Public Meeting To Solicit Input for a Strategic Plan for Federal Youth Policy

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-08

    ... directing the Interagency Working Group on Youth Programs to solicit input from young people, State children... Services, in its role as the Chair of the Interagency Working Group on Youth Programs, is announcing a... the Web site for the Interagency Working Group on Youth Programs at http://www.FindYouthInfo.gov for...

  5. Redefining HHS International Response: Challenges and Recommendations for Interagency Partnerships

    DTIC Science & Technology

    2009-04-01

    Approach and Enhance Interagency Planning, 1. 22 David Hinson , U.S. Military Interaction with Humanitarian Assistance Organizations During Small-Scale...Defense University Press, 1996); http://www.dodccrp.org/files/Hayes_Interagency.pdf (accessed March 2, 2009. 31 Hinson , Military Interaction with...Humanitarian Assistance Organizations, 29. 32 Benton and Ware, “Haiti: A Case Study”. 33 Hinson , Military Interaction with Humanitarian Assistance

  6. Applying Goldwater-Nichols Reforms to Foster Interagency Cooperation Between Public Safety Agencies in New York City

    DTIC Science & Technology

    2007-03-01

    release; distribution is unlimited APPLYING GOLDWATER-NICHOLS REFORMS TO FOSTER INTERAGENCY COOPERATION BETWEEN PUBLIC SAFTEY AGENCIES IN NEW YORK...NAVAL POSTGRADUATE SCHOOL MONTEREY, CALIFORNIA THESIS Approved for public release; distribution is unlimited APPLYING GOLDWATER... Applying Goldwater-Nichols Reforms to Foster Interagency Cooperation Between Public Safety Agencies in New York City 6. AUTHOR(S) Joseph P

  7. Keeping it wild 2: An updated interagency strategy to monitor trends in wilderness character across the National Wilderness Preservation System

    Treesearch

    Peter Landres; Chris Barns; Steve Boutcher; Tim Devine; Peter Dratch; Adrienne Lindholm; Linda Merigliano; Nancy Roeper; Emily Simpson

    2015-01-01

    Keeping It Wild 2 is an interagency strategy to monitor trends in selected attributes of wilderness character based on lessons learned from 15 years of developing and implementing wilderness character monitoring across the National Wilderness Preservation System. This document updates and replaces Keeping It Wild: An Interagency Strategy for Monitoring Wilderness...

  8. Federal Interagency Coordination for Invasive Plant Issues -- The Federal Interagency Committee for the Management of Noxious and Exotic Weeds (FICMNEW)

    USGS Publications Warehouse

    Westbrooks, Randy G.

    2011-01-01

    The U.S. Federal Interagency Committee for the Management of Noxious and Exotic Weeds (FICMNEW) is a formal partnership between 16 federal agencies that have invasive plant management and regulatory responsibilities for the United States and its territories. Efforts to develop a national level federal interagency committee to coordinate federal activities were initiated by national weed program managers with the USDA Forest Service and the Bureau of Land Management in 1989. FICMNEW was formally established through a Memorandum of Understanding that was signed by agency administrators of member agencies in August, 1994.

  9. 76 FR 66040 - NIST Framework and Roadmap for Smart Grid Interoperability Standards, Release 2.0 (Draft...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-25

    ...-01] NIST Framework and Roadmap for Smart Grid Interoperability Standards, Release 2.0 (Draft... draft version of the NIST Framework and Roadmap for Smart Grid Interoperability Standards, Release 2.0... Roadmap for Smart Grid Interoperability Standards, Release 2.0 (Release 2.0) (Draft) for public review and...

  10. Warfighter IT Interoperability Standards Study

    DTIC Science & Technology

    2012-07-22

    data (e.g. messages) between systems? ii) What process did you use to validate and certify semantic interoperability between your...other systems at this time There was no requirement to validate and certify semantic interoperability The DLS program exchanges data with... semantics Testing for System Compliance with Data Models Verify and Certify Interoperability Using Data

  11. Arctic Observing Network Data Management: Current Capabilities and Their Promise for the Future

    NASA Astrophysics Data System (ADS)

    Collins, J.; Fetterer, F.; Moore, J. A.

    2008-12-01

    CADIS (the Cooperative Arctic Data and Information Service) serves as the data management, discovery and delivery component of the Arctic Observing Network (AON). As an International Polar Year (IPY) initiative, AON comprises 34 land, atmosphere and ocean observation sites, and will acquire much of the data coming from the interagency Study of Environmental Arctic Change (SEARCH). CADIS is tasked with ensuring that these observational data are managed for long-term use by members of the entire Earth System Science community. Portions of CADIS are either in use by the community or available for testing. We now have an opportunity to evaluate the feedback received from our users, to identify any design shortcomings, and to identify those elements which serve their purpose well and will support future development. This presentation will focus on the nuts-and-bolts of the CADIS development to date, with an eye towards presenting lessons learned and best practices based on our experiences so far. The topics include:
    - How did we assess our users' needs, and how are those contributions reflected in the end product and its capabilities?
    - Why did we develop a CADIS metadata profile, and how does it allow CADIS to support preservation and scientific interoperability?
    - How can we shield the user from metadata complexities (especially those associated with various standards) while still obtaining the metadata needed to support an effective data management system?
    - How can we bridge the gap between the data storage formats considered convenient by researchers in the field, and those which are necessary to provide data interoperability?
    - What challenges have been encountered in our efforts to provide access to federated data (data stored outside of the CADIS system)?
    - What are the data browsing and visualization needs of the AON community, and which tools and technologies are most promising in terms of supporting those needs?
A live demonstration of the current capabilities of the CADIS system will be included as time and logistics allow. CADIS is a joint effort of the University Corporation for Atmospheric Research (UCAR), the National Snow and Ice Data Center (NSIDC), and the National Center for Atmospheric Research (NCAR).

  12. Enabling interoperability in planetary sciences and heliophysics: The case for an information model

    NASA Astrophysics Data System (ADS)

    Hughes, J. Steven; Crichton, Daniel J.; Raugh, Anne C.; Cecconi, Baptiste; Guinness, Edward A.; Isbell, Christopher E.; Mafi, Joseph N.; Gordon, Mitchell K.; Hardman, Sean H.; Joyner, Ronald S.

    2018-01-01

    The Planetary Data System has developed the PDS4 Information Model to enable interoperability across diverse science disciplines. The Information Model is based on an integration of International Organization for Standardization (ISO) standards for trusted digital archives, information model development, and metadata registries. Whereas controlled vocabularies provide a basic level of interoperability through a common set of terms for communication between machines and humans, the Information Model improves interoperability by means of an ontology that provides semantic information, or additional related context, for the terms. The Information Model was defined by a team of computer scientists and science experts from each of the diverse disciplines in the Planetary Science community, including Atmospheres, Geosciences, Cartography and Imaging Sciences, Navigational and Ancillary Information, Planetary Plasma Interactions, Ring-Moon Systems, and Small Bodies. The model was designed to be extensible beyond the Planetary Science community; for example, there are overlaps between certain PDS disciplines and the Heliophysics and Astrophysics disciplines. "Interoperability" can apply to many aspects of both the developer and the end-user experience, for example agency-to-agency, semantic-level, and application-level interoperability. We define these types of interoperability and focus on semantic-level interoperability, the type most directly enabled by an information model.
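
    The distinction between the two levels of interoperability described above can be sketched in a few lines of Python (a hypothetical illustration; the terms and relations are not drawn from the actual PDS4 model): a controlled vocabulary is a flat set of shared terms, while an ontology attaches typed relations that supply semantic context machines can traverse.

```python
# Hypothetical sketch: a controlled vocabulary vs. an ontology-like model.
# Neither structure is taken from PDS4; both only illustrate the idea.

# A controlled vocabulary: shared terms, no relationships.
vocabulary = {"Image", "Spectrum", "Instrument", "Target"}

# A minimal ontology: terms plus typed relations that add context.
ontology = {
    "Image":      {"is_a": "Product", "acquired_by": "Instrument"},
    "Spectrum":   {"is_a": "Product", "acquired_by": "Instrument"},
    "Instrument": {"is_a": "System", "observes": "Target"},
}

def related(term, relation):
    """Follow a typed relation from a term, if the model defines one."""
    return ontology.get(term, {}).get(relation)

# A vocabulary can only confirm membership; the model can also answer
# semantic questions such as "what acquires an Image?"
print("Image" in vocabulary)
print(related("Image", "acquired_by"))
```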

  13. Spatial cyberinfrastructures, ontologies, and the humanities

    PubMed Central

    Sieber, Renee E.; Wellen, Christopher C.; Jin, Yuan

    2011-01-01

    We report on research into building a cyberinfrastructure for Chinese biographical and geographic data. Our cyberinfrastructure contains (i) the McGill-Harvard-Yenching Library Ming Qing Women's Writings database (MQWW), the only online database on historical Chinese women's writings, (ii) the China Biographical Database, the authority for Chinese historical people, and (iii) the China Historical Geographical Information System, one of the first historical geographic information systems. Key to this integration is that linked databases retain separate identities as bases of knowledge, while they possess sufficient semantic interoperability to allow for multidatabase concepts and to support cross-database queries on an ad hoc basis. Computational ontologies create underlying semantics for database access. This paper focuses on the spatial component in a humanities cyberinfrastructure, which includes issues of conflicting data, heterogeneous data models, disambiguation, and geographic scale. First, we describe the methodology for integrating the databases. Then we detail the system architecture, which includes a tier of ontologies and schema. We describe the user interface and applications that allow for cross-database queries. For instance, users should be able to analyze the data, examine hypotheses on spatial and temporal relationships, and generate historical maps with datasets from MQWW for research, teaching, and publication on Chinese women writers, their familial relations, publishing venues, and the literary and social communities. Last, we discuss the social side of cyberinfrastructure development, as people are considered to be as critical as the technical components for its success. PMID:21444819

  14. SAADA: Astronomical Databases Made Easier

    NASA Astrophysics Data System (ADS)

    Michel, L.; Nguyen, H. N.; Motch, C.

    2005-12-01

    Many astronomers wish to share datasets with their community but lack the manpower to develop databases with the functionality required for high-level scientific applications. The SAADA project aims at automating the creation and deployment of such databases. A generic but scientifically relevant data model has been designed which allows one to build databases by providing only a limited number of product mapping rules. Databases created by SAADA rely on a relational database supporting JDBC, covered by a Java layer that includes a large amount of generated code. Such databases can simultaneously host spectra, images, source lists and plots. Data are grouped in user-defined collections whose content can be seen as one unique set per data type even if their formats differ. Datasets can be correlated with one another using qualified links. These links help, for example, to characterise the nature of a cross-identification (e.g., a distance or a likelihood) or to describe its scientific content (e.g., by associating a spectrum with a catalog entry). The SAADA query engine is based on a language well suited to the data model which can handle constraints on linked data, in addition to classical astronomical queries. These constraints can be applied to the linked objects (number, class and attributes) and/or to the link qualifier values. Databases created by SAADA are accessed through a rich Web interface or a Java API. We are currently developing an interoperability module implementing VO protocols.
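
    The qualified-link idea above can be illustrated with a short sketch (hypothetical Python; the function names and qualifier fields are illustrative, not the SAADA API): a link between two data products carries qualifiers such as a cross-identification distance or likelihood, and queries can constrain on those qualifier values.

```python
# Hypothetical sketch of SAADA-style qualified links: a correlation
# between two datasets carries qualifiers (e.g. a cross-identification
# distance or likelihood). All names are illustrative, not the SAADA API.

links = []

def add_link(source, target, **qualifiers):
    """Link two data products and record the qualifiers of the match."""
    links.append({"source": source, "target": target, **qualifiers})

# Associate a spectrum with a catalog entry, qualifying the match.
add_link("spectrum_42", "catalog_entry_7", distance_arcsec=0.8, likelihood=0.97)

def links_from(source, min_likelihood=0.0):
    """Query links by source, constrained on a link qualifier value."""
    return [l for l in links
            if l["source"] == source and l.get("likelihood", 0) >= min_likelihood]
```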

  15. Interagency Task Forces: The Right Tools for the Job

    DTIC Science & Technology

    2011-01-01

    shortcomings. This analysis discusses four organizational reform models and recommends the interagency task force ( IATF ) as the preferred structure...model.64 Still others recommend creating and deploying ad hoc IATFs for crisis operations. These interagency task forces would be task- organized to...forces assigned for planning, exercises, and mission execution.65 A 2005 article in Policy Review recommended developing IATFs as needed for specific

  16. The Adequacy of Current Interagency Doctrine

    DTIC Science & Technology

    2007-06-15

    proposed new initiatives, the CSIS model calls for the establishment of an Interagency Task Force ( IATF ) to achieve unity of effort at the tactical...level, specifically 15 for reconstruction and stability operations. The IATF would assume the lead from the COCOM once major combat operations were...complete in a given area or region. The IATF would be civilian-led, directly control full interagency resources, and have dedicated funding authority

  17. Synchronizing U.S. Government Efforts Toward Collaborative Health Care Policymaking in Iraq

    DTIC Science & Technology

    2010-03-01

    Cerami and Boggs, eds., The Interagency and Counterinsurgency Warfare, pp. 25-46; see also Amanda Smith, “Strategic Communication: Interagency Rhetoric...Security Presidential Directive 44, Management of Interagency Efforts, December 7, 2005; see also Douglas C. Lovelace , Jr., “Foreword” in Greg Kaufmann...U.S. ARMY WAR COLLEGE Major General Robert M. Williams Commandant ***** STRATEGIC STUDIES INSTITUTE Director Professor Douglas C. Lovelace , Jr

  18. Keeping it wild: an interagency strategy to monitor trends in wilderness character across the National Wilderness Preservation System

    Treesearch

    Peter Landres; Chris Barns; John G. Dennis; Tim Devine; Paul Geissler; Curtis S. McCasland; Linda Merigliano; Justin Seastrand; Ralph Swain

    2008-01-01

    The Interagency Wilderness Character Monitoring Team--representing the Department of the Interior (DOI) Bureau of Land Management, DOI Fish and Wildlife Service, DOI National Park Service, DOI U.S. Geological Survey, and the U.S. Forest Service-offers in this document an interagency strategy to monitor trends in wilderness character across the National Wilderness...

  19. 76 FR 51271 - Implementing a Nationwide, Broadband, Interoperable Public Safety Network in the 700 MHz Band

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-18

    ... Docket 07-100; FCC 11-6] Implementing a Nationwide, Broadband, Interoperable Public Safety Network in the... interoperable public safety broadband network. The establishment of a common air interface for 700 MHz public safety broadband networks will create a foundation for interoperability and provide a clear path for the...

  20. Political, policy and social barriers to health system interoperability: emerging opportunities of Web 2.0 and 3.0.

    PubMed

    Juzwishin, Donald W M

    2009-01-01

    Achieving effective health informatics interoperability in a fragmented and uncoordinated health system is by definition not possible. Interoperability requires the simultaneous integration of health care processes and information across different types and levels of care (systems thinking). The fundamental argument of this paper is that information system interoperability will remain an unfulfilled hope until health reforms effectively address the governance (accountability), structural and process barriers to interoperability of health care delivery. The ascendency of Web 2.0 and 3.0, although still unproven, signals the opportunity to accelerate patients' access to health information and their health record. Policy suggestions for simultaneously advancing health system delivery and information system interoperability are posited.

  1. D-ATM, a working example of health care interoperability: From dirt path to gravel road.

    PubMed

    DeClaris, John-William

    2009-01-01

    For many years, there have been calls for interoperability within health care systems. The technology currently exists and is being used in business areas like banking and commerce, to name a few. Yet the question remains, why has interoperability not been achieved in health care? This paper examines issues encountered and success achieved with interoperability during the development of the Digital Access To Medication (D-ATM) project, sponsored by the Substance Abuse and Mental Health Services Administration (SAMHSA). D-ATM is the first government funded interoperable patient management system. The goal of this paper is to provide lessons learned and propose one possible road map for health care interoperability within private industry and how government can help.

  2. Publication of nuclear magnetic resonance experimental data with semantic web technology and the application thereof to biomedical research of proteins.

    PubMed

    Yokochi, Masashi; Kobayashi, Naohiro; Ulrich, Eldon L; Kinjo, Akira R; Iwata, Takeshi; Ioannidis, Yannis E; Livny, Miron; Markley, John L; Nakamura, Haruki; Kojima, Chojiro; Fujiwara, Toshimichi

    2016-05-05

    The nuclear magnetic resonance (NMR) spectroscopic data for biological macromolecules archived at the BioMagResBank (BMRB) provide a rich resource of biophysical information at atomic resolution. The NMR data archived in NMR-STAR ASCII format have been implemented in a relational database. However, it is still fairly difficult for users to retrieve data from the NMR-STAR files or the relational database in association with data from other biological databases. To enhance the interoperability of the BMRB database, we present a full conversion of BMRB entries to two standard structured data formats, XML and RDF, as common open representations of the NMR-STAR data. Moreover, a SPARQL endpoint has been deployed. The described case study demonstrates that a simple query of the SPARQL endpoints of the BMRB, UniProt, and Online Mendelian Inheritance in Man (OMIM) can be used in NMR- and structure-based analysis of proteins combined with information on single nucleotide polymorphisms (SNPs) and their phenotypes. We have developed BMRB/XML and BMRB/RDF and demonstrate their use in performing a federated SPARQL query linking the BMRB to other databases through standard semantic web technologies. This will facilitate data exchange across diverse information resources.
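
    The shape of such a federated query can be sketched as follows (a hypothetical illustration: the endpoint URLs and predicate names below are placeholders, not the real BMRB, UniProt, or OMIM vocabularies). A local graph pattern is joined with patterns that SPARQL 1.1 `SERVICE` clauses evaluate at remote endpoints.

```python
# Hypothetical sketch of a federated SPARQL query in the style described
# above: a local pattern joined to remote endpoints via SERVICE clauses.
# Endpoint URLs and predicates are placeholders, not the real vocabularies.

def federated_query(uniprot_endpoint, omim_endpoint):
    """Assemble a SPARQL string that joins local BMRB-style triples
    with patterns evaluated at two remote endpoints."""
    return f"""
    SELECT ?entry ?protein ?phenotype WHERE {{
      ?entry a :NMRDataset ;            # local (BMRB-style) pattern
             :describesProtein ?protein .
      SERVICE <{uniprot_endpoint}> {{   # remote join: protein variants
        ?protein :hasVariant ?snp .
      }}
      SERVICE <{omim_endpoint}> {{      # remote join: SNP phenotypes
        ?snp :associatedPhenotype ?phenotype .
      }}
    }}"""

q = federated_query("https://example.org/uniprot/sparql",
                    "https://example.org/omim/sparql")
```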

  3. Interoperability of Information Systems Managed and Used by the Local Health Departments.

    PubMed

    Shah, Gulzar H; Leider, Jonathon P; Luo, Huabin; Kaur, Ravneet

    2016-01-01

    In the post-Affordable Care Act era, marked by interorganizational collaborations and the availability of large amounts of electronic data from other community partners, it is imperative to assess the interoperability of information systems used by local health departments (LHDs). The objectives were to describe the level of interoperability of LHD information systems and to identify factors associated with lack of interoperability. This mixed-methods research uses data from the 2015 Informatics Capacity and Needs Assessment Survey, with a target population of all LHDs in the United States. A representative sample of 650 LHDs was drawn using a stratified random sampling design. A total of 324 completed responses were received (50% response rate). Qualitative data were drawn from a key informant interview study of LHD informatics staff from across the United States. Qualitative data were independently coded by 2 researchers and analyzed thematically. Survey data were cleaned, bivariate comparisons were conducted, and a multivariable logistic regression was run to characterize factors associated with interoperability. For 30% of LHDs, no systems were interoperable, and 38% of LHD respondents indicated some of their systems were interoperable. Significant determinants of interoperability included LHDs having leadership support (adjusted odds ratio [AOR] = 3.54), control of information technology budget allocation (AOR = 2.48), control of data systems (AOR = 2.31), having a strategic plan for information systems (AOR = 1.92), and existence of business process analysis and redesign (AOR = 1.49). Interoperability of all systems may be an informatics goal, but only a small proportion of LHDs reported having interoperable systems, pointing to a substantial need among LHDs nationwide.
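
    The adjusted odds ratios reported above come directly from fitted logistic regression coefficients via AOR = exp(β). A minimal sketch (the coefficient below is back-derived from the reported leadership-support AOR, for illustration only):

```python
import math

# Adjusted odds ratio from a logistic regression coefficient: AOR = exp(beta).
def adjusted_odds_ratio(beta):
    return math.exp(beta)

# Back-derived illustration using the leadership-support AOR reported above.
beta_leadership = math.log(3.54)
print(round(adjusted_odds_ratio(beta_leadership), 2))  # 3.54
```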

  4. National electronic health record interoperability chronology.

    PubMed

    Hufnagel, Stephen P

    2009-05-01

    The federal initiative for electronic health record (EHR) interoperability began in 2000 and set the stage for the establishment of the 2004 Executive Order for EHR interoperability by 2014. This article discusses the chronology from the 2001 e-Government Consolidated Health Informatics (CHI) initiative through the current congressional mandates for an aligned, interoperable, and agile DoD AHLTA and VA VistA.

  5. Force of Choice: Optimizing Theater Special Operations Commands to Achieve Synchronized Effects

    DTIC Science & Technology

    2012-12-01

    GCC Geographic Combatant Command GFM Global Force Management GSN Global SOF Network (aka EGSN) IA Interagency IATF Interagency Task Force...and through African partners.75 SOCOM NCR was chosen because it is a primary outgrowth of the SOCOM Interagency Task Force ( IATF ), and the...result, SOCOM established the IATF and Special Operations Support Teams (SOST). While the IATF remained at SOCOM Headquarters at MacDill AFB, the

  6. 7 CFR Exhibit D to Subpart B of... - Fact Sheet-The Federal Interagency Task Force on Food and Shelter for the Homeless

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 14 2010-01-01 2009-01-01 true Fact Sheet-The Federal Interagency Task Force on Food... (CONTINUED) PROPERTY MANAGEMENT Management of Property Exhibit D to Subpart B of Part 1955—Fact Sheet—The Federal Interagency Task Force on Food and Shelter for the Homeless Editorial Note: Exhibit D is not...

  7. On the formal definition of the systems' interoperability capability: an anthropomorphic approach

    NASA Astrophysics Data System (ADS)

    Zdravković, Milan; Luis-Ferreira, Fernando; Jardim-Goncalves, Ricardo; Trajanović, Miroslav

    2017-03-01

    The extended view of enterprise information systems in the Internet of Things (IoT) introduces additional complexity to interoperability problems. In response, the problem of systems' interoperability is revisited by taking into account different aspects of philosophy, psychology, linguistics and artificial intelligence, namely by analysing the potential analogies between the processes of human and system communication. The capability to interoperate, as a property of the system, is then defined as a complex ability to seamlessly sense and perceive a stimulus from its environment (assumed to be a message from any other system), make an informed decision about this perception and, consequently, articulate a meaningful and useful action or response based on this decision. Although this capability is defined on the basis of existing interoperability theories, the proposed approach to its definition excludes the assumption of awareness of the co-existence of two interoperating systems. Thus, it establishes links between research on the interoperability of systems and on intelligent software agents, as one of the systems' digital identities.
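
    The sense-perceive-decide-act capability described above can be sketched as a minimal agent loop (hypothetical Python, not drawn from the paper; the message format and decision rule are illustrative):

```python
# Hypothetical sketch of the anthropomorphic interoperability capability:
# a system that senses a stimulus (a message), perceives it (parses it),
# makes a decision, and articulates a response. All names are illustrative.

class InteroperableSystem:
    def sense(self, raw):
        """Receive a stimulus from the environment (here, a raw string)."""
        return raw

    def perceive(self, stimulus):
        """Interpret the stimulus into a structured perception."""
        verb, _, obj = stimulus.partition(" ")
        return {"intent": verb, "subject": obj}

    def decide(self, perception):
        """Make an informed decision about the perception."""
        return "acknowledge" if perception["intent"] == "report" else "ignore"

    def act(self, decision, perception):
        """Articulate a meaningful response based on the decision."""
        return f"{decision}: {perception['subject']}"

    def interoperate(self, raw):
        perception = self.perceive(self.sense(raw))
        return self.act(self.decide(perception), perception)

print(InteroperableSystem().interoperate("report temperature"))
```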

  8. Employing Semantic Technologies for the Orchestration of Government Services

    NASA Astrophysics Data System (ADS)

    Sabol, Tomáš; Furdík, Karol; Mach, Marián

    The main aim of eGovernment is to provide efficient, secure, inclusive services for citizens and businesses. The necessity to integrate services and information resources, to increase accessibility, and to reduce the administrative burden on citizens and enterprises - these are only a few reasons why the eGovernment paradigm has shifted from the supply-driven approach toward connected governance, emphasizing the concept of interoperability (Archmann and Nielsen 2008). On the EU level, interoperability is explicitly addressed as one of the four main challenges, notably in the i2010 strategy (i2010 2005). The Commission's Communication (Interoperability for Pan-European eGovernment Services 2006) strongly emphasizes the necessity of interoperable eGovernment services based on standards, open specifications, and open interfaces. The Pan-European interoperability initiatives, such as the European Interoperability Framework (2004) and IDABC, as well as many projects supported by the European Commission within the IST Program and the Competitiveness and Innovation Program (CIP), illustrate the importance of interoperability on the EU level.

  9. Empowering open systems through cross-platform interoperability

    NASA Astrophysics Data System (ADS)

    Lyke, James C.

    2014-06-01

    Most of the motivations for open systems lie in the expectation of interoperability, sometimes referred to as "plug-and-play". Nothing in the notion of "open-ness", however, guarantees this outcome, which makes the increased interest in open architecture more perplexing. In this paper, we explore certain themes of open architecture. We introduce the concept of "windows of interoperability", which can be used to align disparate portions of architecture. Such "windows of interoperability", which concentrate on a reduced set of protocol and interface features, might achieve many of the broader purposes assigned as benefits in open architecture. Since it is possible to engineer proprietary systems that interoperate effectively, this nuanced definition of interoperability may in fact be a more important concept to understand and nurture for effective systems engineering and maintenance.
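
    The "window of interoperability" notion above - aligning two systems on a reduced set of protocol and interface features - can be sketched as a set intersection (hypothetical Python; the feature names are illustrative):

```python
# Hypothetical sketch: two systems interoperate through the "window"
# formed by the protocol/interface features they both support.

def interoperability_window(features_a, features_b):
    """The reduced feature set both systems agree on."""
    return features_a & features_b

proprietary_system = {"telemetry-v2", "auth-basic", "vendor-ext-x"}
open_system        = {"telemetry-v2", "auth-basic", "discovery-v1"}

window = interoperability_window(proprietary_system, open_system)
# Even a proprietary and an open system can interoperate effectively
# if the window covers the features a given task needs.
print(sorted(window))
```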

  10. The ChIP-Seq tools and web server: a resource for analyzing ChIP-seq and other types of genomic data.

    PubMed

    Ambrosini, Giovanna; Dreos, René; Kumar, Sunil; Bucher, Philipp

    2016-11-18

    ChIP-seq and related high-throughput chromatin profiling assays generate ever-increasing volumes of highly valuable biological data. To make sense of these data, biologists need versatile, efficient and user-friendly tools for access, visualization and integrative analysis. Here we present the ChIP-Seq command line tools and web server, implementing basic algorithms for ChIP-seq data analysis starting from a read alignment file. The tools are optimized for memory efficiency and speed, allowing large data volumes to be processed on inexpensive hardware. The web interface provides access to a large database of public data. The ChIP-Seq tools have a modular and interoperable design in that the output from one application can serve as input to another one. Complex and innovative tasks can thus be achieved by running several tools in a cascade. The various ChIP-Seq command line tools and web services either complement or compare favorably to related bioinformatics resources in terms of computational efficiency, ease of access to public data and interoperability with other web-based tools. The ChIP-Seq server is accessible at http://ccg.vital-it.ch/chipseq/.
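
    The modular, cascading design described above - the output of one tool serving as input to the next - can be sketched as composed functions (hypothetical Python; the step names and parameters are illustrative, not the actual ChIP-Seq command line tools):

```python
# Hypothetical cascade in the spirit of the modular design described
# above: each step consumes the previous step's output. Step names are
# illustrative, not the real ChIP-Seq tools.

def count_reads_per_bin(alignment_positions, bin_size=100):
    """Aggregate read positions into fixed-size genomic bins."""
    counts = {}
    for pos in alignment_positions:
        counts[pos // bin_size] = counts.get(pos // bin_size, 0) + 1
    return counts

def call_peaks(bin_counts, threshold=3):
    """Keep bins whose read count reaches a threshold."""
    return sorted(b for b, c in bin_counts.items() if c >= threshold)

# Running the tools in a cascade: alignment positions -> bins -> peaks.
reads = [101, 105, 150, 420, 430, 433, 436, 900]
peaks = call_peaks(count_reads_per_bin(reads))
```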

  11. The Banana Genome Hub

    PubMed Central

    Droc, Gaëtan; Larivière, Delphine; Guignon, Valentin; Yahiaoui, Nabila; This, Dominique; Garsmeur, Olivier; Dereeper, Alexis; Hamelin, Chantal; Argout, Xavier; Dufayard, Jean-François; Lengelle, Juliette; Baurens, Franc-Christophe; Cenci, Alberto; Pitollat, Bertrand; D’Hont, Angélique; Ruiz, Manuel; Rouard, Mathieu; Bocs, Stéphanie

    2013-01-01

    Banana is one of the world’s favorite fruits and one of the most important crops for developing countries. The banana reference genome sequence (Musa acuminata) was recently released. Given the taxonomic position of Musa, the completed genomic sequence has particular comparative value for providing fresh insights into the evolution of the monocotyledons. The study of the banana genome has been enhanced by a number of tools and resources that allow its sequence to be harnessed. First, we set up essential tools such as a Community Annotation System, phylogenomics resources and metabolic pathways. Then, to support post-genomic efforts, we improved existing banana systems (e.g. web front end, query builder), integrated available Musa data into generic systems (e.g. markers and genetic maps, synteny blocks), made other existing systems containing Musa data (e.g. transcriptomics, rice reference genome, workflow manager) interoperable with the banana hub, and finally generated new results from sequence analyses (e.g. SNP and polymorphism analysis). Several use cases illustrate how the Banana Genome Hub can be used to study gene families. Overall, with this collaborative effort, we discuss the importance of interoperability for data integration between existing information systems. Database URL: http://banana-genome.cirad.fr/ PMID:23707967

  12. Clinical Predictive Modeling Development and Deployment through FHIR Web Services.

    PubMed

    Khalilia, Mohammed; Choi, Myung; Henderson, Amelia; Iyengar, Sneha; Braunstein, Mark; Sun, Jimeng

    2015-01-01

    Clinical predictive modeling involves two challenging tasks: model development and model deployment. In this paper we demonstrate a software architecture for developing and deploying clinical predictive models using web services via the Health Level 7 (HL7) Fast Healthcare Interoperability Resources (FHIR) standard. The services enable model development using electronic health records (EHRs) stored in OMOP CDM databases and model deployment for scoring individual patients through FHIR resources. The MIMIC2 ICU dataset and a synthetic outpatient dataset were transformed into OMOP CDM databases for predictive model development. The resulting predictive models are deployed as FHIR resources, which receive requests of patient information, perform prediction against the deployed predictive model and respond with prediction scores. To assess the practicality of this approach we evaluated the response and prediction time of the FHIR modeling web services. We found the system to be reasonably fast with one second total response time per patient prediction.
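
    The request/score/respond cycle described above can be sketched as a minimal scoring-service core (hypothetical Python; the model weights and feature names are illustrative, not the actual MIMIC2-derived models or FHIR resource schema):

```python
import math

# Hypothetical sketch of a FHIR-style scoring service core: receive a
# patient's features, score them against a deployed logistic model, and
# respond with a prediction score. Weights and feature names are illustrative.

MODEL = {"intercept": -2.0, "age": 0.03, "heart_rate": 0.01}

def score_patient(features):
    """Return a risk probability for one patient's feature dict."""
    z = MODEL["intercept"] + sum(
        MODEL[name] * value for name, value in features.items() if name in MODEL
    )
    return 1.0 / (1.0 + math.exp(-z))

# A request body (as a FHIR resource might be flattened into features)
# and the prediction-score response returned to the caller:
response = {"prediction": round(score_patient({"age": 70, "heart_rate": 90}), 3)}
```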

  13. [Security specifications for electronic medical records on the Internet].

    PubMed

    Mocanu, Mihai; Mocanu, Carmen

    2007-01-01

    The extension of Electronic Medical Record applications to the Web seems both interesting and promising. Correlated with the expansion of the Internet in our country, it allows the interconnection of physicians of different specialties and their collaboration for better treatment of patients. In this respect, ophthalmologic medical applications offer increased possibilities for monitoring chronic ocular diseases and for identifying elements for early diagnosis and risk-factor supervision. In this survey we emphasize some possible solutions to the problems of interconnecting medical information systems to the Internet: achieving interoperability within medical organizations through the use of open standards, automated input and processing of ocular imaging, the use of data-reduction techniques to increase the speed of image retrieval in large databases, and, last but not least, the resolution of security and confidentiality problems in medical databases.

  14. EcoliWiki: a wiki-based community resource for Escherichia coli

    PubMed Central

    McIntosh, Brenley K.; Renfro, Daniel P.; Knapp, Gwendowlyn S.; Lairikyengbam, Chanchala R.; Liles, Nathan M.; Niu, Lili; Supak, Amanda M.; Venkatraman, Anand; Zweifel, Adrienne E.; Siegele, Deborah A.; Hu, James C.

    2012-01-01

    EcoliWiki is the community annotation component of the PortEco (http://porteco.org; formerly EcoliHub) project, an online data resource that integrates information on laboratory strains of Escherichia coli, its phages, plasmids and mobile genetic elements. As one of the early adopters of the wiki approach to model organism databases, EcoliWiki was designed to not only facilitate community-driven sharing of biological knowledge about E. coli as a model organism, but also to be interoperable with other data resources. EcoliWiki content currently covers genes from five laboratory E. coli strains, 21 bacteriophage genomes, F plasmid and eight transposons. EcoliWiki integrates the Mediawiki wiki platform with other open-source software tools and in-house software development to extend how wikis can be used for model organism databases. EcoliWiki can be accessed online at http://ecoliwiki.net. PMID:22064863

  15. Clinical Predictive Modeling Development and Deployment through FHIR Web Services

    PubMed Central

    Khalilia, Mohammed; Choi, Myung; Henderson, Amelia; Iyengar, Sneha; Braunstein, Mark; Sun, Jimeng

    2015-01-01

    Clinical predictive modeling involves two challenging tasks: model development and model deployment. In this paper we demonstrate a software architecture for developing and deploying clinical predictive models using web services via the Health Level 7 (HL7) Fast Healthcare Interoperability Resources (FHIR) standard. The services enable model development using electronic health records (EHRs) stored in OMOP CDM databases and model deployment for scoring individual patients through FHIR resources. The MIMIC2 ICU dataset and a synthetic outpatient dataset were transformed into OMOP CDM databases for predictive model development. The resulting predictive models are deployed as FHIR resources, which receive requests of patient information, perform prediction against the deployed predictive model and respond with prediction scores. To assess the practicality of this approach we evaluated the response and prediction time of the FHIR modeling web services. We found the system to be reasonably fast with one second total response time per patient prediction. PMID:26958207

  16. Effectiveness of Interagency Cooperation at the Provincial Reconstruction Team Level in Afghanistan

    DTIC Science & Technology

    2011-04-07

    Carlisle, PA: Strategic Studies Institute, U.S. Army War College, 2009. Marcella, Gabriel, ed. Affairs of State: The Interagency and National Security...Joint Operations, vi. 12 O’Neil, 4. 13 Gabriel Marcella, ed., Affairs of State: The Interagency and National Security (Carlisle, PA: Strategic Studies...O’Neil, 4. 16 Marcella, 4-6. 17 Michele A. Flournoy, "Prepared Statement For House Armed Services Subcommittee on Oversight and Investigations

  17. Managing Interoperability for GEOSS - A Report from the SIF

    NASA Astrophysics Data System (ADS)

    Khalsa, S. J.; Actur, D.; Nativi, S.; Browdy, S.; Eglitis, P.

    2009-04-01

    The Global Earth Observation System of Systems (GEOSS) is a coordinating and integrating framework for Earth observing and information systems, which are contributed on a voluntary basis by Members and Participating Organizations of the intergovernmental Group on Earth Observations (GEO). GEOSS exists to support informed decision making for the benefit of society, including the implementation of international environmental treaty obligations. GEO Members and Participating Organizations use the GEOSS Common Infrastructure (GCI) to register their Earth observation resources, thereby making them discoverable and consumable by both humans and client applications. Essential to meeting GEO user needs is a process for supporting interoperability of observing, processing, modeling and dissemination capabilities. The GEO Standards and Interoperability Forum (SIF) was created to develop, implement and oversee this process. The SIF supports GEO organizations contributing resources to the GEOSS by helping them understand and work with the GEOSS interoperability guidelines and encouraging them to register their "interoperability arrangements" (standards or other ad hoc arrangements for interoperability) in the GEOSS standards registry, which is part of the GCI. These registered interoperability arrangements support the actual services used to achieve interoperability of systems. By making information about these interoperability arrangements available to users of the GEOSS, the SIF enhances the understanding and utility of contributed resources. We describe the procedures that the SIF has enacted to carry out its work. To operate effectively the SIF uses a workflow system and is establishing a set of regional teams and domain experts. In the near term our work has focused on population and review of the GEOSS Standards Registry, but we are also developing approaches to achieving progressive convergence on, and uptake of, an optimal set of interoperability arrangements for all of GEOSS.

  18. Interoperability of Information Systems Managed and Used by the Local Health Departments

    PubMed Central

    Leider, Jonathon P.; Luo, Huabin; Kaur, Ravneet

    2016-01-01

    Background: In the post-Affordable Care Act era marked by interorganizational collaborations and availability of large amounts of electronic data from other community partners, it is imperative to assess the interoperability of information systems used by the local health departments (LHDs). Objectives: To describe the level of interoperability of LHD information systems and identify factors associated with lack of interoperability. Data and Methods: This mixed-methods research uses data from the 2015 Informatics Capacity and Needs Assessment Survey, with a target population of all LHDs in the United States. A representative sample of 650 LHDs was drawn using a stratified random sampling design. A total of 324 completed responses were received (50% response rate). Qualitative data were used from a key informant interview study of LHD informatics staff from across the United States. Qualitative data were independently coded by 2 researchers and analyzed thematically. Survey data were cleaned, bivariate comparisons were conducted, and a multivariable logistic regression was run to characterize factors associated with interoperability. Results: For 30% of LHDs, no systems were interoperable, and 38% of LHD respondents indicated some of the systems were interoperable. Significant determinants of interoperability included LHDs having leadership support (adjusted odds ratio [AOR] = 3.54), control of information technology budget allocation (AOR = 2.48), control of data systems (AOR = 2.31), having a strategic plan for information systems (AOR = 1.92), and existence of business process analysis and redesign (AOR = 1.49). Conclusion: Interoperability of all systems may be an informatics goal, but only a small proportion of LHDs reported having interoperable systems, pointing to a substantial need among LHDs nationwide. PMID:27684616

  19. Metadata mapping and reuse in caBIG™

    PubMed Central

    Kunz, Isaac; Lin, Ming-Chin; Frey, Lewis

    2009-01-01

    Background This paper proposes that interoperability across biomedical databases can be improved by utilizing a repository of Common Data Elements (CDEs), UML model class-attributes and simple lexical algorithms to facilitate the building of domain models. This is examined in the context of an existing system, the National Cancer Institute (NCI)'s cancer Biomedical Informatics Grid (caBIG™). The goal is to demonstrate the deployment of open source tools that can be used to effectively map models and enable the reuse of existing information objects and CDEs in the development of new models for translational research applications. This effort is intended to help developers reuse appropriate CDEs to enable interoperability of their systems when developing within the caBIG™ framework or other frameworks that use metadata repositories. Results The Dice (di-grams) and Dynamic algorithms were compared, and both have similar performance in matching UML model class-attributes to CDE class object-property pairs. With the algorithms used, the baselines for automatically finding the matches are reasonable for the data models examined. This suggests that automatic mapping of UML models and CDEs is feasible within the caBIG™ framework and potentially any framework that uses a metadata repository. Conclusion This work opens up the possibility of using mapping algorithms to reduce the cost and time required to map local data models to a reference data model such as those used within caBIG™. This effort contributes to facilitating the development of interoperable systems within caBIG™ as well as other metadata frameworks. Such efforts are critical to address the need to develop systems to handle the enormous amounts of diverse data that can be leveraged from new biomedical methodologies. PMID:19208192
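
    The Dice di-gram matching evaluated here is simple to reproduce. The sketch below computes the Dice coefficient over character di-grams and ranks candidate CDE names against a UML attribute name; the example CDE strings are made up for illustration.

```python
def digrams(s):
    """Set of adjacent character pairs (di-grams), case-folded."""
    s = s.lower()
    return {s[i:i + 2] for i in range(len(s) - 1)}

def dice(a, b):
    """Dice coefficient over di-gram sets: 2|A ∩ B| / (|A| + |B|)."""
    da, db = digrams(a), digrams(b)
    if not da or not db:
        return 0.0
    return 2.0 * len(da & db) / (len(da) + len(db))

def best_match(attribute, cdes):
    """Rank candidate CDE names against one UML class-attribute name."""
    return max(cdes, key=lambda c: dice(attribute, c))

# Invented candidate CDE names for illustration.
cdes = ["Patient Birth Date", "Patient Gender Code", "Specimen Type"]
match = best_match("patientBirthDate", cdes)  # -> "Patient Birth Date"
```

    A production mapper would add a score threshold below which no match is proposed, leaving those attributes for manual curation.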

  20. Content Model Use and Development to Redeem Thin Section Records

    NASA Astrophysics Data System (ADS)

    Hills, D. J.

    2014-12-01

    The National Geothermal Data System (NGDS) is a catalog of documents and datasets that provide information about geothermal resources located primarily within the United States. The goal of NGDS is to make large quantities of geothermal-relevant geoscience data available to the public by creating a national, sustainable, distributed, and interoperable network of data providers. The Geological Survey of Alabama (GSA) has been a data provider in the initial phase of NGDS. One method by which NGDS facilitates interoperability is through the use of content models. Content models provide a schema (structure) for submitted data. Schemas dictate where and how data should be entered. Content models use templates that simplify data formatting to expedite use by data providers. These methodologies implemented by NGDS can extend beyond geothermal data to all geoscience data. The GSA, using the NGDS physical samples content model, has tested and refined a content model for thin sections and thin section photos. Countless thin sections have been taken from oil and gas well cores housed at the GSA, and many of those thin sections have related photomicrographs. Record keeping for these thin sections has been scattered at best, and it is critical to capture their metadata while the content creators are still available. A next step will be to register the GSA's thin sections with SESAR (System for Earth Sample Registration) and assign an IGSN (International Geo Sample Number) to each thin section. Additionally, the thin section records will be linked to the GSA's online record database. When complete, the GSA's thin sections will be more readily discoverable and have greater interoperability. Moving forward, the GSA is implementing use of NGDS-like content models and registration with SESAR and IGSN to improve collection maintenance and management of additional physical samples.

  1. A case study in open source innovation: developing the Tidepool Platform for interoperability in type 1 diabetes management.

    PubMed

    Neinstein, Aaron; Wong, Jenise; Look, Howard; Arbiter, Brandon; Quirk, Kent; McCanne, Steve; Sun, Yao; Blum, Michael; Adi, Saleh

    2016-03-01

    Develop a device-agnostic cloud platform to host diabetes device data and catalyze an ecosystem of software innovation for type 1 diabetes (T1D) management. An interdisciplinary team decided to establish a nonprofit company, Tidepool, and build open-source software. Through a user-centered design process, the authors created a software platform, the Tidepool Platform, to upload and host T1D device data in an integrated, device-agnostic fashion, as well as an application ("app"), Blip, to visualize the data. Tidepool's software utilizes the principles of modular components, modern web design including REST APIs and JavaScript, cloud computing, agile development methodology, and robust privacy and security. By consolidating the currently scattered and siloed T1D device data ecosystem into one open platform, Tidepool can improve access to the data and enable new possibilities and efficiencies in T1D clinical care and research. The Tidepool Platform decouples diabetes apps from diabetes devices, allowing software developers to build innovative apps without requiring them to design a unique back-end (e.g., database and security) or unique ways of ingesting device data. It allows people with T1D to choose to use any preferred app regardless of which device(s) they use. The authors believe that the Tidepool Platform can solve two current problems in the T1D device landscape: 1) limited access to T1D device data and 2) poor interoperability of data from different devices. If proven effective, Tidepool's open source, cloud model for health data interoperability is applicable to other healthcare use cases. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association.
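
    The device-agnostic ingestion idea can be illustrated with a small sketch: per-device normalisers map vendor-specific records into one common schema, so downstream apps never see vendor formats. The vendor record layouts and field names below are invented, not Tidepool's actual data model.

```python
# Illustrative sketch (not the real Tidepool API): normalising records from
# different diabetes devices into one device-agnostic platform schema.
def normalize_cgm(record):
    """Hypothetical vendor A: glucose in mg/dL under 'sg'; convert to mmol/L."""
    return {"type": "cbg", "value_mmol_l": round(record["sg"] / 18.0, 2),
            "time": record["ts"]}

def normalize_pump(record):
    """Hypothetical vendor B: bolus insulin in units under 'delivered'."""
    return {"type": "bolus", "units": record["delivered"],
            "time": record["timestamp"]}

NORMALIZERS = {"vendorA_cgm": normalize_cgm, "vendorB_pump": normalize_pump}

def ingest(device_id, records):
    """Route raw device records through the matching normaliser."""
    fn = NORMALIZERS[device_id]
    return [fn(r) for r in records]

uploaded = ingest("vendorA_cgm", [{"sg": 126, "ts": "2016-01-01T08:00:00Z"}])
```

    The decoupling the authors describe falls out of this shape: supporting a new device means adding one normaliser, while every app keeps reading the same common schema.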

  2. Clinical information modeling processes for semantic interoperability of electronic health records: systematic review and inductive analysis.

    PubMed

    Moreno-Conde, Alberto; Moner, David; Cruz, Wellington Dimas da; Santos, Marcelo R; Maldonado, José Alberto; Robles, Montserrat; Kalra, Dipak

    2015-07-01

    This systematic review aims to identify and compare the existing processes and methodologies that have been published in the literature for defining clinical information models (CIMs) that support the semantic interoperability of electronic health record (EHR) systems. Following the preferred reporting items for systematic reviews and meta-analyses systematic review methodology, the authors reviewed papers published between 2000 and 2013 that covered the semantic interoperability of EHRs, found by searching the PubMed, IEEE Xplore, and ScienceDirect databases. Additionally, after selection of a final group of articles, an inductive content analysis was done to summarize the steps and methodologies followed in order to build the CIMs described in those articles. Three hundred and seventy-eight articles were screened and thirty-six were selected for full review. The articles selected for full review were analyzed to extract relevant information for the analysis and characterized according to the steps the authors had followed for clinical information modeling. Most of the reviewed papers lack a detailed description of the modeling methodologies used to create CIMs. A representative example is the lack of description related to the definition of terminology bindings and the publication of the generated models. However, this systematic review confirms that most clinical information modeling activities follow very similar steps for the definition of CIMs. Having a robust and shared methodology could improve their correctness, reliability, and quality. Independently of implementation technologies and standards, it is possible to find common patterns in methods for developing CIMs, suggesting the viability of defining a unified good practice methodology to be used by any clinical information modeler. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  3. Implementation of a scalable, web-based, automated clinical decision support risk-prediction tool for chronic kidney disease using C-CDA and application programming interfaces.

    PubMed

    Samal, Lipika; D'Amore, John D; Bates, David W; Wright, Adam

    2017-11-01

    Clinical decision support tools for risk prediction are readily available, but typically require workflow interruptions and manual data entry, and so are rarely used. Due to new data interoperability standards for electronic health records (EHRs), other options are available. As a clinical case study, we sought to build a scalable, web-based system that would automate calculation of kidney failure risk and display clinical decision support to users in primary care practices. We developed a single-page application, web server, database, and application programming interface to calculate and display kidney failure risk. Data were extracted from the EHR using the Consolidated Clinical Document Architecture interoperability standard for Continuity of Care Documents (CCDs). EHR users were presented with a noninterruptive alert on the patient's summary screen and a hyperlink to details and recommendations provided through a web application. Clinic schedules and CCDs were retrieved using existing application programming interfaces to the EHR, and we provided a clinical decision support hyperlink to the EHR as a service. We debugged a series of terminology and technical issues. The application was validated with data from 255 patients and subsequently deployed to 10 primary care clinics where, over the course of 1 year, 569,533 CCD documents were processed. We validated the use of interoperable documents and open-source components to develop a low-cost tool for automated clinical decision support. Since Consolidated Clinical Document Architecture-based data extraction extends to any certified EHR, this demonstrates a successful modular approach to clinical decision support. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association.
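
    The extraction step can be sketched against a trimmed, illustrative CCD fragment: pull a coded lab observation out of the HL7 v3 XML by its LOINC code. The fragment below omits the full C-CDA section structure for brevity, and the eGFR value is invented.

```python
import xml.etree.ElementTree as ET

# Heavily trimmed, illustrative CCD fragment (real C-CDA documents nest
# observations inside structured sections and entries).
CCD = """<ClinicalDocument xmlns="urn:hl7-org:v3">
  <observation>
    <code code="33914-3" displayName="eGFR"/>
    <value value="48" unit="mL/min"/>
  </observation>
</ClinicalDocument>"""

NS = {"v3": "urn:hl7-org:v3"}

def extract_lab(ccd_xml, loinc):
    """Return the numeric value of the first observation with this code."""
    root = ET.fromstring(ccd_xml)
    for obs in root.findall(".//v3:observation", NS):
        code = obs.find("v3:code", NS)
        if code is not None and code.get("code") == loinc:
            return float(obs.find("v3:value", NS).get("value"))
    return None

egfr = extract_lab(CCD, "33914-3")
```

    Values extracted this way would then feed the risk equation, with the alert and hyperlink surfaced back through the EHR as the abstract describes.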

  4. A case study in open source innovation: developing the Tidepool Platform for interoperability in type 1 diabetes management

    PubMed Central

    Wong, Jenise; Look, Howard; Arbiter, Brandon; Quirk, Kent; McCanne, Steve; Sun, Yao; Blum, Michael; Adi, Saleh

    2016-01-01

    Objective Develop a device-agnostic cloud platform to host diabetes device data and catalyze an ecosystem of software innovation for type 1 diabetes (T1D) management. Materials and Methods An interdisciplinary team decided to establish a nonprofit company, Tidepool, and build open-source software. Results Through a user-centered design process, the authors created a software platform, the Tidepool Platform, to upload and host T1D device data in an integrated, device-agnostic fashion, as well as an application (“app”), Blip, to visualize the data. Tidepool’s software utilizes the principles of modular components, modern web design including REST APIs and JavaScript, cloud computing, agile development methodology, and robust privacy and security. Discussion By consolidating the currently scattered and siloed T1D device data ecosystem into one open platform, Tidepool can improve access to the data and enable new possibilities and efficiencies in T1D clinical care and research. The Tidepool Platform decouples diabetes apps from diabetes devices, allowing software developers to build innovative apps without requiring them to design a unique back-end (e.g., database and security) or unique ways of ingesting device data. It allows people with T1D to choose to use any preferred app regardless of which device(s) they use. Conclusion The authors believe that the Tidepool Platform can solve two current problems in the T1D device landscape: 1) limited access to T1D device data and 2) poor interoperability of data from different devices. If proven effective, Tidepool’s open source, cloud model for health data interoperability is applicable to other healthcare use cases. PMID:26338218

  5. Investigating the capabilities of semantic enrichment of 3D CityEngine data

    NASA Astrophysics Data System (ADS)

    Solou, Dimitra; Dimopoulou, Efi

    2016-08-01

    In recent years, the development of technology and the lifting of several technical limitations have brought the third dimension to the fore. The complexity of urban environments and the strong need for land administration intensify the need for a three-dimensional cadastral system. Despite the progress in the field of geographic information systems and 3D modeling techniques, there is no fully digital 3D cadastre. The existing geographic information systems and the different methods of three-dimensional modeling allow for better management, visualization and dissemination of information. Nevertheless, these opportunities cannot be fully exploited because of deficiencies in standardization and interoperability in these systems. Within this context, CityGML was developed as an international standard of the Open Geospatial Consortium (OGC) for the representation and exchange of 3D city models. CityGML defines geometry and topology for city modeling, also focusing on semantic aspects of 3D city information. The scope of CityGML is to reach common terminology, also addressing the imperative need for interoperability and data integration, taking into account the number of available geographic information systems and modeling techniques. The aim of this paper is to develop an application for managing the semantic information of a model generated through procedural modeling. The model was initially implemented in ESRI's CityEngine software and then imported into the ArcGIS environment. The final goal was the original model's semantic enrichment and its conversion to CityGML format. Semantic information management and interoperability proved feasible using ESRI's 3DCities Project tools, since their database structure supports adding semantic information to the CityEngine model and automatic conversion to CityGML for advanced analysis and visualization in different application areas.

  6. Ontology of Earth's nonlinear dynamic complex systems

    NASA Astrophysics Data System (ADS)

    Babaie, Hassan; Davarpanah, Armita

    2017-04-01

    As a complex system, Earth and its major integrated and dynamically interacting subsystems (e.g., hydrosphere, atmosphere) display nonlinear behavior in response to internal and external influences. The Earth Nonlinear Dynamic Complex Systems (ENDCS) ontology formally represents the semantics of the knowledge about the nonlinear system element (agent) behavior, function, and structure, inter-agent and agent-environment feedback loops, and the emergent collective properties of the whole complex system as the result of interaction of the agents with other agents and their environment. It also models nonlinear concepts such as aperiodic, random chaotic behavior, sensitivity to initial conditions, bifurcation of dynamic processes, levels of organization, self-organization, aggregated and isolated functionality, and emergence of collective complex behavior at the system level. By incorporating several existing ontologies, the ENDCS ontology represents the dynamic system variables and the rules of transformation of their state, emergent state, and other features of complex systems such as the trajectories in state (phase) space (attractor and strange attractor), basins of attractions, basin divide (separatrix), fractal dimension, and system's interface to its environment. The ontology also defines different object properties that change the system behavior, function, and structure and trigger instability. ENDCS will help to integrate the data and knowledge related to the five complex subsystems of Earth by annotating common data types, unifying the semantics of shared terminology, and facilitating interoperability among different fields of Earth science.

  7. Approaching semantic interoperability in Health Level Seven

    PubMed Central

    Alschuler, Liora

    2010-01-01

    ‘Semantic Interoperability’ is a driving objective behind many of Health Level Seven's standards. The objective in this paper is to take a step back, and consider what semantic interoperability means, assess whether or not it has been achieved, and, if not, determine what concrete next steps can be taken to get closer. A framework for measuring semantic interoperability is proposed, using a technique called the ‘Single Logical Information Model’ framework, which relies on an operational definition of semantic interoperability and an understanding that interoperability improves incrementally. Whether semantic interoperability tomorrow will enable one computer to talk to another, much as one person can talk to another person, is a matter for speculation. It is assumed, however, that what gets measured gets improved, and in that spirit this framework is offered as a means to improvement. PMID:21106995

  8. Designing Reliable Cohorts of Cardiac Patients across MIMIC and eICU

    PubMed Central

    Chronaki, Catherine; Shahin, Abdullah; Mark, Roger

    2016-01-01

    The design of the patient cohort is an essential and fundamental part of any clinical patient study. Knowledge of the Electronic Health Records, underlying Database Management System, and the relevant clinical workflows are central to an effective cohort design. However, with technical, semantic, and organizational interoperability limitations, the database queries associated with a patient cohort may need to be reconfigured in every participating site. i2b2 and SHRINE advance the notion of patient cohorts as first-class objects to be shared, aggregated, and recruited for research purposes across clinical sites. This paper reports on initial efforts to assess the integration of Medical Information Mart for Intensive Care (MIMIC) and Philips eICU, two large-scale anonymized intensive care unit (ICU) databases, using standard terminologies, i.e. LOINC, ICD9-CM and SNOMED-CT. The focus of this work is lab and microbiology observations and key demographics for patients with a primary cardiovascular ICD9-CM diagnosis. Results and discussion, reflecting on core reference terminology standards, offer insights on efforts to combine detailed intensive care data from multiple ICUs worldwide. PMID:27774488
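
    A minimal sketch of the cohort-selection step, assuming a simplified diagnosis table (the real MIMIC and eICU schemas differ): primary cardiovascular diagnoses are those whose ICD9-CM code falls in the 390-459 range (Diseases of the Circulatory System).

```python
import sqlite3

# Toy in-memory stand-in for an ICU diagnosis table; table and column
# names are invented, not the actual MIMIC or eICU schema.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE diagnoses
                (patient_id INTEGER, seq_num INTEGER, icd9_code TEXT)""")
conn.executemany("INSERT INTO diagnoses VALUES (?, ?, ?)", [
    (1, 1, "4280"),   # heart failure: cardiovascular, primary
    (2, 1, "5849"),   # acute kidney failure: not cardiovascular
    (3, 2, "4019"),   # hypertension, but a secondary diagnosis
    (4, 1, "41401"),  # coronary atherosclerosis: cardiovascular, primary
])

def cardiovascular_cohort(conn):
    """Patients whose primary (seq_num = 1) ICD9 code falls in 390-459."""
    rows = conn.execute("""
        SELECT DISTINCT patient_id FROM diagnoses
        WHERE seq_num = 1
          AND CAST(substr(icd9_code, 1, 3) AS INTEGER) BETWEEN 390 AND 459
        ORDER BY patient_id""").fetchall()
    return [r[0] for r in rows]

cohort = cardiovascular_cohort(conn)  # -> [1, 4]
```

    The interoperability problem the abstract raises is visible even in this toy: the same query has to be rewritten for each site's table names, code formats, and terminology choices unless the cohort definition itself is shared as a first-class object.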

  9. Statewide Inventories of Heritage Resources: Macris and the Experience in Massachusetts

    NASA Astrophysics Data System (ADS)

    Stott, P. H.

    2017-08-01

    The Massachusetts Historical Commission (MHC) is the State Historic Preservation Office for Massachusetts. Established in 1963, MHC has been inventorying historic properties for over half a century. Since 1987, it has maintained a heritage database, the Massachusetts Cultural Resource Information System, or MACRIS. Today MACRIS holds over 206,000 records from the 351 towns and cities across the Commonwealth. Since 2004, a selection of the more than 150 MACRIS fields has been available online at mhcmacris.net. MACRIS is widely used by independent consultants preparing project review files, by MHC staff in its regulatory responsibilities, by local historical commissions monitoring threats to their communities, as well as by scholars, historical organizations, genealogists, property owners, reporters, and the general public interested in the history of the built environment. In 2016 MACRIS began migrating off its three-decade-old Pick multivalue database to SQL Server, and in 2017, the first redesign of its thirteen-year-old web interface should start to improve usability. Longer-term improvements have the goal of standardizing terminology and ultimately bringing interoperability with other heritage databases closer to reality.

  10. Towards semantic interoperability for electronic health records.

    PubMed

    Garde, Sebastian; Knaup, Petra; Hovenga, Evelyn; Heard, Sam

    2007-01-01

    In the field of open electronic health records (EHRs), openEHR as an archetype-based approach is being increasingly recognised. It is the objective of this paper to briefly describe this approach, and to analyse how openEHR archetypes impact on health professionals and semantic interoperability. Analysis of current approaches to EHR systems, terminology and standards developments. In addition to literature reviews, we organised face-to-face and additional telephone interviews and tele-conferences with members of relevant organisations and committees. The openEHR archetypes approach enables syntactic interoperability and semantic interpretability -- both important prerequisites for semantic interoperability. Archetypes enable the formal definition of clinical content by clinicians. To enable comprehensive semantic interoperability, the development and maintenance of archetypes needs to be coordinated internationally and across health professions. Domain knowledge governance comprises a set of processes that enable the creation, development, organisation, sharing, dissemination, use and continuous maintenance of archetypes. It needs to be supported by information technology. To enable EHRs, semantic interoperability is essential. The openEHR archetypes approach enables syntactic interoperability and semantic interpretability. However, without coordinated archetype development and maintenance, 'rank growth' of archetypes would jeopardize semantic interoperability. We therefore believe that openEHR archetypes and domain knowledge governance together create the knowledge environment required to adopt EHRs.

  11. The Health Service Bus: an architecture and case study in achieving interoperability in healthcare.

    PubMed

    Ryan, Amanda; Eklund, Peter

    2010-01-01

    Interoperability in healthcare is a requirement for effective communication between entities, to ensure timely access to up-to-date patient information and medical knowledge, and thus facilitate consistent patient care. An interoperability framework called the Health Service Bus (HSB), based on the Enterprise Service Bus (ESB) middleware software architecture, is presented here as a solution to all three levels of interoperability as defined by the HL7 EHR Interoperability Work Group in their definitive white paper "Coming to Terms". A prototype HSB system was implemented based on the Mule Open-Source ESB and is outlined and discussed, followed by a clinically-based example.

  12. PACS/information systems interoperability using Enterprise Communication Framework.

    PubMed

    alSafadi, Y; Lord, W P; Mankovich, N J

    1998-06-01

    Interoperability among healthcare applications goes beyond connectivity to allow components to exchange structured information and work together in a predictable, coordinated fashion. To facilitate building an interoperability infrastructure, an Enterprise Communication Framework (ECF) was developed by the members of the Andover Working Group for Healthcare Interoperability (AWG-OHI). The ECF consists of four models: 1) Use Case Model, 2) Domain Information Model (DIM), 3) Interaction Model, and 4) Message Model. To realize this framework, a software component called the Enterprise Communicator (EC) is used. In this paper, we will demonstrate the use of the framework in interoperating a picture archiving and communication system (PACS) with a radiology information system (RIS).

  13. The role of architecture and ontology for interoperability.

    PubMed

    Blobel, Bernd; González, Carolina; Oemig, Frank; Lopéz, Diego; Nykänen, Pirkko; Ruotsalainen, Pekka

    2010-01-01

    Turning from organization-centric to process-controlled or even personalized approaches, advanced healthcare settings have to meet special interoperability challenges. eHealth and pHealth solutions must assure interoperability between actors cooperating to achieve common business objectives. Here, the interoperability chain includes not only individually tailored technical systems but also sensors and actuators. For enabling corresponding pervasive computing and even autonomic computing, individualized systems have to be based on an architecture framework covering many domains, scientifically managed by specialized disciplines using their specific ontologies in a formalized way. Therefore, interoperability has to advance from a communication protocol to an architecture-centric approach mastering ontology coordination challenges.

  14. Glocal clinical registries: pacemaker registry design and implementation for global and local integration--methodology and case study.

    PubMed

    da Silva, Kátia Regina; Costa, Roberto; Crevelari, Elizabeth Sartori; Lacerda, Marianna Sobral; de Moraes Albertini, Caio Marcos; Filho, Martino Martinelli; Santana, José Eduardo; Vissoci, João Ricardo Nickenig; Pietrobon, Ricardo; Barros, Jacson V

    2013-01-01

The ability to apply standard, interoperable solutions for implementing and managing medical registries, and to aggregate, reproduce, and access data sets across legacy and advanced standard formats, platforms, and operating systems, is crucial for both clinical healthcare and biomedical research settings. Our study describes a reproducible, highly scalable, standard framework for a device registry implementation addressing both local data quality components and global linking problems. We developed a device registry framework involving the following steps: (1) data standards definition and representation of the research workflow; (2) development of electronic case report forms using REDCap (Research Electronic Data Capture); (3) data collection according to the clinical research workflow; (4) data augmentation by enriching the registry database with local electronic health records, governmental databases, and linked open data collections; (5) data quality control; and (6) data dissemination through the registry Web site. Our registry adopted all applicable standardized data elements proposed by the American College of Cardiology/American Heart Association Clinical Data Standards, as well as variables derived from randomized trials of cardiac devices and the Clinical Data Interchange Standards Consortium. Local interoperability was established between REDCap and data derived from the Electronic Health Record system. The original data set was also augmented by incorporating the reimbursement values paid by the Brazilian government during a hospitalization for pacemaker implantation. By linking our registry to the open data collection repository Linked Clinical Trials (LinkedCT) we found 130 clinical trials that are potentially correlated with our pacemaker registry. This study demonstrates how standard and reproducible solutions can be applied in the implementation of medical registries to constitute a re-usable framework. Such an approach has the potential to facilitate data integration between healthcare and research settings and offers a useful framework for other biomedical registries.
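The data-augmentation step described in this abstract amounts to a keyed join of registry records against other local sources. The snippet below is an illustration only, not the authors' implementation; the field names (patient_id, device_model, reimbursed_brl) are hypothetical.

```python
# Hedged sketch of registry "data augmentation": enrich registry records
# with fields from an EHR extract and a reimbursement table, left-joined
# on a shared patient identifier. All field names are invented.

registry = [
    {"patient_id": "P001", "device_model": "pacemaker-A"},
    {"patient_id": "P002", "device_model": "pacemaker-B"},
]
ehr_extract = {"P001": {"age": 71}, "P002": {"age": 64}}
reimbursement = {"P001": {"reimbursed_brl": 12500.0}}

def augment(records, *sources):
    """Left-join each registry record with any matching source rows."""
    enriched = []
    for rec in records:
        merged = dict(rec)
        for source in sources:
            merged.update(source.get(rec["patient_id"], {}))
        enriched.append(merged)
    return enriched

augmented = augment(registry, ehr_extract, reimbursement)
```

A left join (rather than an inner join) keeps registry records even when a source has no match, which mirrors how real registries tolerate incomplete linkage.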

  15. Creating a data exchange strategy for radiotherapy research: towards federated databases and anonymised public datasets.

    PubMed

    Skripcak, Tomas; Belka, Claus; Bosch, Walter; Brink, Carsten; Brunner, Thomas; Budach, Volker; Büttner, Daniel; Debus, Jürgen; Dekker, Andre; Grau, Cai; Gulliford, Sarah; Hurkmans, Coen; Just, Uwe; Krause, Mechthild; Lambin, Philippe; Langendijk, Johannes A; Lewensohn, Rolf; Lühr, Armin; Maingon, Philippe; Masucci, Michele; Niyazi, Maximilian; Poortmans, Philip; Simon, Monique; Schmidberger, Heinz; Spezi, Emiliano; Stuschke, Martin; Valentini, Vincenzo; Verheij, Marcel; Whitfield, Gillian; Zackrisson, Björn; Zips, Daniel; Baumann, Michael

    2014-12-01

Disconnected cancer research data management and a lack of information exchange about planned and ongoing research complicate the utilisation of internationally collected medical information for improving cancer patient care. Rapidly collecting/pooling data can accelerate translational research in radiation therapy and oncology. The exchange of study data is one of the fundamental principles behind data aggregation and data mining. The possibilities of reproducing the original study results, performing further analyses on existing research data to generate new hypotheses or developing computational models to support medical decisions (e.g. risk/benefit analysis of treatment options) represent just a fraction of the potential benefits of medical data-pooling. Distributed machine learning and knowledge exchange from federated databases can be considered one of several attractive approaches for knowledge generation within "Big Data". Data interoperability between research institutions should be the major concern behind a wider collaboration. Information captured in electronic patient records (EPRs) and study case report forms (eCRFs), linked together with medical imaging and treatment planning data, is deemed fundamental for large multi-centre studies in the field of radiation therapy and oncology. To fully utilise the captured medical information, the study data have to be more than just an electronic version of a traditional (un-modifiable) paper CRF. Challenges that have to be addressed are data interoperability, utilisation of standards, data quality and privacy concerns, data ownership, rights to publish, and data pooling architecture and storage. This paper discusses a framework of conceptual packages of ideas focused on a strategic development for international research data exchange in the field of radiation therapy and oncology. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
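The distributed-learning idea mentioned in the abstract can be sketched minimally: each federated site shares only fitted model parameters, never raw patient records, and a coordinator combines them. This is a generic illustration, not the authors' method; the simple least-squares model and the data are invented.

```python
# Minimal federated-learning sketch: each site fits a local linear model
# and shares only (slope, intercept, n); the coordinator computes a
# sample-size-weighted average, so raw data never leave the institution.

def fit_local(xs, ys):
    """Ordinary least squares slope/intercept for one site's data."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx, n

def federated_average(site_models):
    """Sample-size-weighted average of (slope, intercept, n) tuples."""
    total = sum(n for _, _, n in site_models)
    slope = sum(s * n for s, _, n in site_models) / total
    intercept = sum(i * n for _, i, n in site_models) / total
    return slope, intercept

site_a = fit_local([1, 2, 3], [2, 4, 6])        # local model at site A
site_b = fit_local([1, 2, 3, 4], [3, 5, 7, 9])  # local model at site B
slope, intercept = federated_average([site_a, site_b])
```

Real federated systems iterate this exchange and add privacy safeguards; the point here is only the pattern of sharing derived parameters instead of records.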

  16. Creating a data exchange strategy for radiotherapy research: Towards federated databases and anonymised public datasets

    PubMed Central

    Skripcak, Tomas; Belka, Claus; Bosch, Walter; Brink, Carsten; Brunner, Thomas; Budach, Volker; Büttner, Daniel; Debus, Jürgen; Dekker, Andre; Grau, Cai; Gulliford, Sarah; Hurkmans, Coen; Just, Uwe; Krause, Mechthild; Lambin, Philippe; Langendijk, Johannes A.; Lewensohn, Rolf; Lühr, Armin; Maingon, Philippe; Masucci, Michele; Niyazi, Maximilian; Poortmans, Philip; Simon, Monique; Schmidberger, Heinz; Spezi, Emiliano; Stuschke, Martin; Valentini, Vincenzo; Verheij, Marcel; Whitfield, Gillian; Zackrisson, Björn; Zips, Daniel; Baumann, Michael

    2015-01-01

Disconnected cancer research data management and a lack of information exchange about planned and ongoing research complicate the utilisation of internationally collected medical information for improving cancer patient care. Rapidly collecting/pooling data can accelerate translational research in radiation therapy and oncology. The exchange of study data is one of the fundamental principles behind data aggregation and data mining. The possibilities of reproducing the original study results, performing further analyses on existing research data to generate new hypotheses or developing computational models to support medical decisions (e.g. risk/benefit analysis of treatment options) represent just a fraction of the potential benefits of medical data-pooling. Distributed machine learning and knowledge exchange from federated databases can be considered one of several attractive approaches for knowledge generation within “Big Data”. Data interoperability between research institutions should be the major concern behind a wider collaboration. Information captured in electronic patient records (EPRs) and study case report forms (eCRFs), linked together with medical imaging and treatment planning data, is deemed fundamental for large multi-centre studies in the field of radiation therapy and oncology. To fully utilise the captured medical information, the study data have to be more than just an electronic version of a traditional (un-modifiable) paper CRF. Challenges that have to be addressed are data interoperability, utilisation of standards, data quality and privacy concerns, data ownership, rights to publish, and data pooling architecture and storage. This paper discusses a framework of conceptual packages of ideas focused on a strategic development for international research data exchange in the field of radiation therapy and oncology. PMID:25458128

  17. 48 CFR 1417.502 - General.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... shall develop procedures governing the use of interagency acquisitions under the Economy Act that are... CONTRACT TYPES SPECIAL CONTRACTING METHODS Interagency Acquisitions Under the Economy Act 1417.502 General...

  18. Building a portable data and information interoperability infrastructure-framework for a standard Taiwan Electronic Medical Record Template.

    PubMed

    Jian, Wen-Shan; Hsu, Chien-Yeh; Hao, Te-Hui; Wen, Hsyien-Chia; Hsu, Min-Huei; Lee, Yen-Liang; Li, Yu-Chuan; Chang, Polun

    2007-11-01

Traditional electronic health record (EHR) data are produced by various hospital information systems, and until the advent of XML technology they could not exist independently of the systems that produced them. The interoperability of a healthcare system can be divided into two dimensions: functional interoperability and semantic interoperability. Currently, no single EHR standard provides complete EHR interoperability. In order to establish a national EHR standard, we developed a set of local EHR templates. The Taiwan Electronic Medical Record Template (TMT) is a standard that aims to achieve semantic interoperability in EHR exchanges nationally. The TMT architecture is composed of forms, components, sections, and elements. Data are stored in the elements, which can be referenced by code set, data type, and narrative block. The TMT was established with the following requirements in mind: (1) transformable to international standards; (2) minimal impact on the existing healthcare system; (3) easy to implement and deploy; and (4) compliant with Taiwan's current laws and regulations. The TMT provides a basis for building a portable, interoperable information infrastructure for EHR exchange in Taiwan.
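A minimal sketch of the forms → components → sections → elements hierarchy the TMT abstract describes, serialised with Python's standard XML library. The tag and attribute names here are illustrative assumptions, not the actual TMT schema.

```python
# Hypothetical TMT-like hierarchy rendered as XML; tag/attribute names
# (form, component, section, element, code, dataType) are invented for
# illustration and do not follow the real TMT specification.
import xml.etree.ElementTree as ET

form = ET.Element("form", id="discharge-summary")
component = ET.SubElement(form, "component", id="diagnosis")
section = ET.SubElement(component, "section", id="primary")
element = ET.SubElement(section, "element", code="ICD-10:I48", dataType="CD")
element.text = "Atrial fibrillation"  # narrative block of the element

xml_text = ET.tostring(form, encoding="unicode")
```

Because every datum lives in a leaf element tagged with a code and a data type, a receiving system can interpret content without knowing the sending hospital's internal database layout, which is the essence of the semantic interoperability the template aims for.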

  19. US Interagency Regional Foreign Policy Implementation: A Survey of Current Practice and an Analysis of Options for Improvement

    DTIC Science & Technology

    2014-06-01

    Culture 143 Congressional Support and Legislation 144 What If There Is No Appetite for Interagency Reform? 148 Appendix Interagency Reform at the... environmental disasters. Humanitarian and military operations will often depend on access rights in many different countries.”3 As US foreign policy...dination. However, because the JIACG is located in one agency (the DOD) and has no presidential directive or legislative sanction, other agencies are

  20. System and methods of resource usage using an interoperable management framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heileman, Gregory L.; Jamkhedkar, Pramod A.; Lamb, Christopher C.

A generic rights expression language allows interoperability across different computing environments, including resource usage by different applications. A formal framework for usage management provides scaffolding upon which interoperable usage management systems can be built. Certain features of the framework, such as the operational semantics, are standardized, while other areas are intentionally left free of standards, allowing the choice and innovation needed to balance flexibility and usability in usage management systems.
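One way to picture the operational semantics of such a framework is a licence expressed as a set of declarative rules that a usage request must satisfy. The sketch below is a generic illustration, not the cited framework; all rule and field names are hypothetical.

```python
# Tiny usage-management check: a licence is a list of predicate rules,
# and a request is permitted only if every rule evaluates true.
# The rules and request fields (action, copies, region) are invented.

def evaluate(licence, request):
    """Return True if the request satisfies every rule in the licence."""
    return all(rule(request) for rule in licence)

licence = [
    lambda r: r["action"] in {"view", "print"},   # permitted actions
    lambda r: r["copies"] <= 2,                   # usage-count limit
    lambda r: r["region"] == "US",                # territorial constraint
]

ok = evaluate(licence, {"action": "print", "copies": 1, "region": "US"})
denied = evaluate(licence, {"action": "share", "copies": 1, "region": "US"})
```

Standardizing how rules are evaluated (the semantics) while leaving the rule vocabulary open is one plausible reading of the standardized/non-standardized split the abstract describes.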

  1. Information Management Challenges in Achieving Coalition Interoperability

    DTIC Science & Technology

    2001-12-01

by J. Dyer SESSION I: ARCHITECTURES AND STANDARDS: FUNDAMENTAL ISSUES Chairman: Dr I. WHITE (UK) Planning for Interoperability 1 by W.M. Gentleman...framework – a crucial step toward achieving coalition C4I interoperability. TOPICS TO BE COVERED: 1) Maintaining secure interoperability 2) Command...of a coalition. TOPICS TO BE EXAMINED: 1) Maintaining secure interoperability 2) Command system interfaces: 2a

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Widergren, Steven E.; Knight, Mark R.; Melton, Ronald B.

The Interoperability Strategic Vision whitepaper aims to promote a common understanding of the meaning and characteristics of interoperability and to provide a strategy to advance the state of interoperability as applied to integration challenges facing grid modernization. This includes addressing the quality of integrating devices and systems and the discipline to improve the process of successfully integrating these components as business models and information technology improve over time. The strategic vision for interoperability described in this document applies throughout the electric energy generation, delivery, and end-use supply chain. Its scope includes interactive technologies and business processes from bulk energy levels to lower voltage level equipment and the millions of appliances that are becoming equipped with processing power and communication interfaces. A transformational aspect of a vision for interoperability in the future electric system is the coordinated operation of intelligent devices and systems at the edges of grid infrastructure. This challenge offers an example for addressing interoperability concerns throughout the electric system.

  3. Space Network Interoperability Panel (SNIP) study

    NASA Technical Reports Server (NTRS)

    Ryan, Thomas; Lenhart, Klaus; Hara, Hideo

    1991-01-01

    The Space Network Interoperability Panel (SNIP) study is a tripartite study that involves the National Aeronautics and Space Administration (NASA), the European Space Agency (ESA), and the National Space Development Agency (NASDA) of Japan. SNIP involves an ongoing interoperability study of the Data Relay Satellite (DRS) Systems of the three organizations. The study is broken down into two parts; Phase one deals with S-band (2 GHz) interoperability and Phase two deals with Ka-band (20/30 GHz) interoperability (in addition to S-band). In 1987 the SNIP formed a Working Group to define and study operations concepts and technical subjects to assure compatibility of the international data relay systems. Since that time a number of Panel and Working Group meetings have been held to continue the study. Interoperability is of interest to the three agencies because it offers a number of potential operation and economic benefits. This paper presents the history and status of the SNIP study.

  4. The IAGOS Information System

    NASA Astrophysics Data System (ADS)

    Boulanger, Damien; Thouret, Valérie; Brissebrat, Guillaume

    2017-04-01

IAGOS (In-service Aircraft for a Global Observing System) is a European research infrastructure that provides long-term, regular, and spatially resolved in situ observations of atmospheric composition. IAGOS observation systems are deployed on a fleet of commercial aircraft and measure aerosols, cloud particles, greenhouse gases, ozone, water vapor, and nitrogen oxides from the surface to the lower stratosphere. The IAGOS database is an essential part of the global atmospheric monitoring network. It contains IAGOS-core data and IAGOS-CARIBIC (Civil Aircraft for the Regular Investigation of the Atmosphere Based on an Instrument Container) data. The IAGOS Data Portal (http://www.iagos.org; contact: damien.boulanger@obs-mip.fr) is part of the French atmospheric chemistry data center AERIS (http://www.aeris-data.fr). In 2016 the new IAGOS Data Portal was released. In addition to data download, the portal provides improved and new services such as download in NetCDF or NASA Ames formats and plotting tools (maps, time series, vertical profiles, etc.). New added-value products are, or will soon be, available through the portal: back trajectories, origin of air masses, co-location with satellite data, etc. Web services allow users to download IAGOS metadata such as flight and airport information. Administration tools have been implemented for user management and instrument monitoring. A major improvement is interoperability with international portals and other databases in order to improve IAGOS data discovery. In the frame of the IGAS project (IAGOS for the Copernicus Atmospheric Service), a data network has been set up. It is composed of three data centers: the IAGOS database in Toulouse, the HALO research aircraft database at DLR (https://halo-db.pa.op.dlr.de), and the CAMS (Copernicus Atmosphere Monitoring Service) data center in Jülich (http://join.iek.fz-juelich.de). The link with the CAMS data center, through the JOIN interface, makes it possible to combine model outputs with IAGOS data for inter-comparison. The CAMS project is a prominent user of the IGAS data network. During the coming year IAGOS will improve metadata standardization and dissemination through collaborations with the AERIS data center, GAW (for which IAGOS is a contributing network), and the ENVRI+ European project. Metadata about measurement traceability and quality will be made available, DOIs will be implemented, and interoperability with other European infrastructures will be set up through standardized web services.

  5. Intersectoral interagency partnerships to promote financial capability in older people.

    PubMed

    Hean, Sarah; Fenge, Lee Ann; Worswick, Louise; Wilkinson, Charlie; Fearnley, Stella

    2012-09-01

From the second quarter of 2008, the UK economy entered a period of economic decline. Older people are particularly vulnerable during such times. To promote ways in which older people can be better supported to maintain their financial well-being, this study explored the sources older people use to keep themselves financially informed. Interviews with older people (n = 28) showed that they access trusted sources of information (e.g. healthcare professionals) rather than specialist financial information providers (e.g. financial advisors), highlighting the need for interagency working between financial services in the private, public and voluntary sectors. An example of how such interagency partnerships might be achieved in practice is presented, with recommendations on directions for future research into interagency working that spans the public, private and voluntary sectors.

  6. Implementing Interoperability in the Seafood Industry: Learning from Experiences in Other Sectors.

    PubMed

    Bhatt, Tejas; Gooch, Martin; Dent, Benjamin; Sylvia, Gilbert

    2017-08-01

Interoperability of communication and information technologies within and between businesses operating along supply chains is being pursued and implemented in numerous industries worldwide to increase the efficiency and effectiveness of operations. The desire for greater interoperability is also driven by the need to reduce business risk through more informed management decisions. Interoperability is achieved by the development of a technology architecture that guides the design and implementation of communication systems existing within individual businesses and between businesses comprising the supply chain. Technology architectures are developed through a purposeful dialogue about why the architecture is required, the benefits and opportunities that the architecture offers the industry, and how the architecture will translate into practical results. An assessment of how the finance, travel, and health industries, and one sector of the food industry (fresh produce), have implemented interoperability was conducted to identify lessons learned that can aid the development of interoperability in the seafood industry. The findings include identification of the need for strong, effective governance during the establishment and operation of an interoperability initiative to ensure the existence of common protocols and standards. The resulting insights were distilled into a series of principles for enabling syntactic and semantic interoperability in any industry, which we summarize in this article. Categorized as "structural," "operational," and "integrative," the principles describe requirements and solutions that are pivotal to enabling businesses to create and capture value from full chain interoperability. The principles are also fundamental to allowing governments and advocacy groups to use traceability for public good. © 2017 Institute of Food Technologists®.
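Semantic interoperability of the kind these principles target can be illustrated by translating two trading partners' differing field names and species codes into one shared schema. Every vocabulary in this sketch is invented for the example; real seafood traceability uses agreed industry code lists.

```python
# Two partners report the same catch event under different field names
# and species codes; a shared mapping normalises both into a common
# schema so downstream systems can compare records directly.
# All vocabularies below are hypothetical.

COMMON_FIELDS = {
    "vendor_a": {"sp": "species", "wt_kg": "weight_kg"},
    "vendor_b": {"speciesCode": "species", "netWeight": "weight_kg"},
}
SPECIES_CODES = {"SAL": "salmon", "salmo-salar": "salmon"}

def to_common(vendor, record):
    """Rename fields, then normalise the species code."""
    out = {COMMON_FIELDS[vendor][k]: v for k, v in record.items()}
    out["species"] = SPECIES_CODES[out["species"]]
    return out

a = to_common("vendor_a", {"sp": "SAL", "wt_kg": 120})
b = to_common("vendor_b", {"speciesCode": "salmo-salar", "netWeight": 120})
```

Syntactic interoperability (agreeing on message formats) gets the records exchanged; the mapping step above is what makes them mean the same thing, which is why governance over shared code lists matters.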

  7. Interagency field manual for the collection of water-quality data

    USGS Publications Warehouse

    Lurry, Dee L.; Kolbe, Christine M.

    2000-01-01

The USEPA, IBWC, USGS, and Texas Natural Resource Conservation Commission (TNRCC) have been working cooperatively to establish a Water-Quality Monitoring Council for the international reach of the Rio Grande (Río Bravo). A similar effort is occurring along the western international boundary with interested partners including the U.S. Bureau of Reclamation (BOR), Arizona Department of Environmental Quality (ADEQ), and the California Regional Water Quality Control Board (CRWQCB). As of February 1997, the partners agreed to work towards greater cooperation, specifically: 1. to revise the 1977 Joint Report of IBWC Engineers as specified in IBWC Minute No. 289; 2. to implement a binational Intergovernmental Task Force for Water-Quality Monitoring (ITFM) workgroup by inviting the participation of cooperators from Mexico; 3. to review and revise each agency’s existing monitoring network to reduce interagency redundancy; 4. to develop a bilingual manual for water-quality monitoring that would describe the various field methods used for sampling water, aquatic biology, and sediment and for assessing stream habitat, and that would address the selection of methods on the basis of DQOs, representativeness, and limitations; 5. to establish a common, easily accessible water-quality database; and 6. to hold joint training programs in water-quality monitoring and data management. Part of the fourth goal, to develop a field manual for water-sample-collection methods, will be accomplished with the publication of this manual.

  8. Child mortality estimation 2013: an overview of updates in estimation methods by the United Nations Inter-agency Group for Child Mortality Estimation.

    PubMed

    Alkema, Leontine; New, Jin Rou; Pedersen, Jon; You, Danzhen

    2014-01-01

    In September 2013, the United Nations Inter-agency Group for Child Mortality Estimation (UN IGME) published an update of the estimates of the under-five mortality rate (U5MR) and under-five deaths for all countries. Compared to the UN IGME estimates published in 2012, updated data inputs and a new method for estimating the U5MR were used. We summarize the new U5MR estimation method, which is a Bayesian B-spline Bias-reduction model, and highlight differences with the previously used method. Differences in UN IGME U5MR estimates as published in 2012 and those published in 2013 are presented and decomposed into differences due to the updated database and differences due to the new estimation method to explain and motivate changes in estimates. Compared to the previously used method, the new UN IGME estimation method is based on a different trend fitting method that can track (recent) changes in U5MR more closely. The new method provides U5MR estimates that account for data quality issues. Resulting differences in U5MR point estimates between the UN IGME 2012 and 2013 publications are small for the majority of countries but greater than 10 deaths per 1,000 live births for 33 countries in 2011 and 19 countries in 1990. These differences can be explained by the updated database used, the curve fitting method as well as accounting for data quality issues. Changes in the number of deaths were less than 10% on the global level and for the majority of MDG regions. The 2013 UN IGME estimates provide the most recent assessment of levels and trends in U5MR based on all available data and an improved estimation method that allows for closer-to-real-time monitoring of changes in the U5MR and takes account of data quality issues.

  9. Child Mortality Estimation 2013: An Overview of Updates in Estimation Methods by the United Nations Inter-Agency Group for Child Mortality Estimation

    PubMed Central

    Alkema, Leontine; New, Jin Rou; Pedersen, Jon; You, Danzhen

    2014-01-01

    Background In September 2013, the United Nations Inter-agency Group for Child Mortality Estimation (UN IGME) published an update of the estimates of the under-five mortality rate (U5MR) and under-five deaths for all countries. Compared to the UN IGME estimates published in 2012, updated data inputs and a new method for estimating the U5MR were used. Methods We summarize the new U5MR estimation method, which is a Bayesian B-spline Bias-reduction model, and highlight differences with the previously used method. Differences in UN IGME U5MR estimates as published in 2012 and those published in 2013 are presented and decomposed into differences due to the updated database and differences due to the new estimation method to explain and motivate changes in estimates. Findings Compared to the previously used method, the new UN IGME estimation method is based on a different trend fitting method that can track (recent) changes in U5MR more closely. The new method provides U5MR estimates that account for data quality issues. Resulting differences in U5MR point estimates between the UN IGME 2012 and 2013 publications are small for the majority of countries but greater than 10 deaths per 1,000 live births for 33 countries in 2011 and 19 countries in 1990. These differences can be explained by the updated database used, the curve fitting method as well as accounting for data quality issues. Changes in the number of deaths were less than 10% on the global level and for the majority of MDG regions. Conclusions The 2013 UN IGME estimates provide the most recent assessment of levels and trends in U5MR based on all available data and an improved estimation method that allows for closer-to-real-time monitoring of changes in the U5MR and takes account of data quality issues. PMID:25013954

  10. The NorWeST project: Crowd-sourcing a big data stream temperature database and high-resolution climate scenarios for western rivers and streams

    NASA Astrophysics Data System (ADS)

    Isaak, D.; Wenger, S. J.; Peterson, E.; Ver Hoef, J.; Luce, C.; Hostetler, S.

    2015-12-01

Climate change is warming streams across the western U.S. and threatens billions of dollars of investments made to conserve valuable cold-water species like trout and salmon. Efficient threat response requires prioritization of limited conservation resources and coordinated interagency efforts guided by accurate information about climate at scales relevant to the distributions of species across landscapes. To provide that information, the NorWeST project was initiated in 2011 to aggregate stream temperature data from all available sources and create high-resolution climate scenarios. The database has since grown into the largest of its kind globally, and now consists of >60,000,000 hourly temperature recordings at >20,000 unique stream sites that were contributed by hundreds of professionals working for >95 state, federal, tribal, municipal, county, and private resource agencies. This poster shows a high-resolution (1-kilometer) summer temperature scenario created with these data and mapped to 800,000 kilometers of network across eight western states (ID, WA, OR, MT, WY, UT, NV, CA). The geospatial data associated with this climate scenario and the thirty others developed in this project are distributed in user-friendly digital formats through the NorWeST website (http://www.fs.fed.us/rm/boise/AWAE/projects/NorWeST.shtml). The accuracy, utility, and convenience of NorWeST data products have led to their rapid adoption and use by the management and research communities for conservation planning, inter-agency coordination of monitoring networks, and new research on stream temperatures and thermal ecology. A project of this scope and utility was possible only through crowd-sourcing techniques, which have also served to engage data contributors in the process of science creation while strengthening the social networks needed for effective conservation.
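The core aggregation step, reducing crowd-sourced hourly recordings to per-site summaries, can be sketched as follows. This is an illustration with invented data, not NorWeST's actual processing pipeline.

```python
# Reduce hourly stream-temperature recordings to a per-site summer mean,
# the kind of summary a temperature scenario is built from. The site
# names, timestamps, and readings are invented for the example.
from statistics import mean

hourly = [
    ("site1", "2014-07-01T00:00", 11.2),
    ("site1", "2014-07-01T01:00", 10.8),
    ("site2", "2014-07-01T00:00", 14.5),
]

def summer_means(records):
    """Group readings by site and return each site's mean, in deg C."""
    by_site = {}
    for site, _timestamp, temp_c in records:
        by_site.setdefault(site, []).append(temp_c)
    return {site: round(mean(temps), 2) for site, temps in by_site.items()}

means = summer_means(hourly)
```

At NorWeST's scale (tens of millions of recordings), the same grouping logic would run in a database or a spatial-statistical model rather than in-memory lists, but the site-keyed reduction is the same idea.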

  11. Interagency Suspension and Debarment Committee

    EPA Pesticide Factsheets

    The Interagency Suspension and Debarment Committee (ISDC) was created, as an Office of Management and Budget (OMB) committee, by Executive Order 12549 for the purpose of monitoring the implementation of the Order.

  12. WWC Review of the Report "Meeting the Challenge of Combating Chronic Absenteeism: Impact of the NYC Mayor's Interagency Task Force on Chronic Absenteeism and School Attendance and Its Implications for Other Cities." What Works Clearinghouse Single Study Review

    ERIC Educational Resources Information Center

    What Works Clearinghouse, 2014

    2014-01-01

    The 2013 study, "Meeting the Challenge of Combating Chronic Absenteeism: Impact of the NYC Mayor's Interagency Task Force on Chronic Absenteeism and School Attendance and Its Implications for Other Cities", examined the impact of the strategies developed by an interagency task force in New York City to combat chronic absenteeism in…

  13. Sharing and Shaping Usable Science through the Inter-Agency Climate Change Forum

    DTIC Science & Technology

    2011-10-31

pressure on participants' time, this increase is a clear indicator that the forum serves a valuable role, with more persons connecting as they learn about...Science through the Inter-Agency Climate Change Forum, William D. Goran, U.S. Army Corps of Engineers; Sam Higuchi...Sharing and Shaping Usable Science through the Inter-Agency Climate Change Forum

  14. Impact of coalition interoperability on PKI

    NASA Astrophysics Data System (ADS)

    Krall, Edward J.

    2003-07-01

    This paper examines methods for providing PKI interoperability among units of a coalition of armed forces drawn from different nations. The area in question is tactical identity management, for the purposes of confidentiality, integrity and non-repudiation in such a dynamic coalition. The interoperating applications under consideration range from email and other forms of store-and-forward messaging to TLS and IPSEC-protected real-time communications. Six interoperability architectures are examined with advantages and disadvantages of each described in the paper.

  15. Telemedicine system interoperability architecture: concept description and architecture overview.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Craft, Richard Layne, II

    2004-05-01

    In order for telemedicine to realize the vision of anywhere, anytime access to care, it must address the question of how to create a fully interoperable infrastructure. This paper describes the reasons for pursuing interoperability, outlines operational requirements that any interoperability approach needs to consider, proposes an abstract architecture for meeting these needs, identifies candidate technologies that might be used for rendering this architecture, and suggests a path forward that the telemedicine community might follow.

  16. 47 CFR 0.192 - Emergency Response Interoperability Center.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 1 2014-10-01 2014-10-01 false Emergency Response Interoperability Center. 0.192 Section 0.192 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL COMMISSION ORGANIZATION Organization Public Safety and Homeland Security Bureau § 0.192 Emergency Response Interoperability Center. (a...

  17. 47 CFR 0.192 - Emergency Response Interoperability Center.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 1 2013-10-01 2013-10-01 false Emergency Response Interoperability Center. 0.192 Section 0.192 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL COMMISSION ORGANIZATION Organization Public Safety and Homeland Security Bureau § 0.192 Emergency Response Interoperability Center. (a...

  18. 47 CFR 0.192 - Emergency Response Interoperability Center.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 1 2011-10-01 2011-10-01 false Emergency Response Interoperability Center. 0.192 Section 0.192 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL COMMISSION ORGANIZATION Organization Public Safety and Homeland Security Bureau § 0.192 Emergency Response Interoperability Center. (a...

  19. Environmental Models as a Service: Enabling Interoperability through RESTful Endpoints and API Documentation

    EPA Science Inventory

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...

  20. Environmental Models as a Service: Enabling Interoperability through RESTful Endpoints and API Documentation.

    EPA Science Inventory

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...

  1. Data resource profile: United Nations Children's Fund (UNICEF).

    PubMed

    Murray, Colleen; Newby, Holly

    2012-12-01

    The United Nations Children's Fund (UNICEF) plays a leading role in the collection, compilation, analysis and dissemination of data to inform sound policies, legislation and programmes for promoting children's rights and well-being, and for global monitoring of progress towards the Millennium Development Goals. UNICEF maintains a set of global databases representing nearly 200 countries and covering the areas of child mortality, child health, maternal health, nutrition, immunization, water and sanitation, HIV/AIDS, education and child protection. These databases consist of internationally comparable and statistically sound data, and are updated annually through a process that draws on a wealth of data provided by UNICEF's wide network of >150 field offices. The databases are composed primarily of estimates from household surveys, with data from censuses, administrative records, vital registration systems and statistical models contributing to some key indicators as well. The data are assessed for quality based on a set of objective criteria to ensure that only the most reliable nationally representative information is included. For most indicators, data are available at the global, regional and national levels, plus sub-national disaggregation by sex, urban/rural residence and household wealth. The global databases are featured in UNICEF's flagship publications, inter-agency reports, including the Secretary General's Millennium Development Goals Report and Countdown to 2015, sector-specific reports and statistical country profiles. They are also publicly available on www.childinfo.org, together with trend data and equity analyses.

  2. caCORE version 3: Implementation of a model driven, service-oriented architecture for semantic interoperability.

    PubMed

    Komatsoulis, George A; Warzel, Denise B; Hartel, Francis W; Shanbhag, Krishnakant; Chilukuri, Ram; Fragoso, Gilberto; Coronado, Sherri de; Reeves, Dianne M; Hadfield, Jillaine B; Ludet, Christophe; Covitz, Peter A

    2008-02-01

    One of the requirements for a federated information system is interoperability, the ability of one computer system to access and use the resources of another system. This feature is particularly important in biomedical research systems, which need to coordinate a variety of disparate types of data. In order to meet this need, the National Cancer Institute Center for Bioinformatics (NCICB) has created the cancer Common Ontologic Representation Environment (caCORE), an interoperability infrastructure based on Model Driven Architecture. The caCORE infrastructure provides a mechanism to create interoperable biomedical information systems. Systems built using the caCORE paradigm address both aspects of interoperability: the ability to access data (syntactic interoperability) and understand the data once retrieved (semantic interoperability). This infrastructure consists of an integrated set of three major components: a controlled terminology service (Enterprise Vocabulary Services), a standards-based metadata repository (the cancer Data Standards Repository) and an information system with an Application Programming Interface (API) based on Domain Model Driven Architecture. This infrastructure is being leveraged to create a Semantic Service-Oriented Architecture (SSOA) for cancer research by the National Cancer Institute's cancer Biomedical Informatics Grid (caBIG).

  3. caCORE version 3: Implementation of a model driven, service-oriented architecture for semantic interoperability

    PubMed Central

    Komatsoulis, George A.; Warzel, Denise B.; Hartel, Frank W.; Shanbhag, Krishnakant; Chilukuri, Ram; Fragoso, Gilberto; de Coronado, Sherri; Reeves, Dianne M.; Hadfield, Jillaine B.; Ludet, Christophe; Covitz, Peter A.

    2008-01-01

    One of the requirements for a federated information system is interoperability, the ability of one computer system to access and use the resources of another system. This feature is particularly important in biomedical research systems, which need to coordinate a variety of disparate types of data. In order to meet this need, the National Cancer Institute Center for Bioinformatics (NCICB) has created the cancer Common Ontologic Representation Environment (caCORE), an interoperability infrastructure based on Model Driven Architecture. The caCORE infrastructure provides a mechanism to create interoperable biomedical information systems. Systems built using the caCORE paradigm address both aspects of interoperability: the ability to access data (syntactic interoperability) and understand the data once retrieved (semantic interoperability). This infrastructure consists of an integrated set of three major components: a controlled terminology service (Enterprise Vocabulary Services), a standards-based metadata repository (the cancer Data Standards Repository) and an information system with an Application Programming Interface (API) based on Domain Model Driven Architecture. This infrastructure is being leveraged to create a Semantic Service Oriented Architecture (SSOA) for cancer research by the National Cancer Institute’s cancer Biomedical Informatics Grid (caBIG™). PMID:17512259

  4. Maturity Model for Advancing Smart Grid Interoperability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Knight, Mark; Widergren, Steven E.; Mater, J.

    2013-10-28

    Interoperability is about the properties of devices and systems to connect and work properly. Advancing interoperability eases integration and maintenance of the resulting interconnection. This leads to faster integration, lower labor and component costs, predictability of projects and the resulting performance, and evolutionary paths for upgrade. When specifications are shared and standardized, competition and novel solutions can bring new value streams to the community of stakeholders involved. Advancing interoperability involves reaching agreement for how things join at their interfaces. The quality of the agreements and the alignment of parties involved in the agreement present challenges that are best met with process improvement techniques. The GridWise® Architecture Council (GWAC) sponsored by the United States Department of Energy is supporting an effort to use concepts from capability maturity models used in the software industry to advance interoperability of smart grid technology. An interoperability maturity model has been drafted and experience is being gained through trials on various types of projects and community efforts. This paper describes the value and objectives of maturity models, the nature of the interoperability maturity model and how it compares with other maturity models, and experiences gained with its use.
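The abstract above describes mapping the quality of interface agreements to maturity levels. A minimal sketch of that idea follows, assuming invented level names and criteria; the actual GWAC Interoperability Maturity Model defines its own levels and assessment process.

```python
from enum import IntEnum

# Hypothetical maturity levels loosely modeled on capability maturity models;
# the real GWAC model defines its own levels and criteria.
class MaturityLevel(IntEnum):
    INITIAL = 1
    MANAGED = 2
    DEFINED = 3
    MEASURED = 4
    OPTIMIZING = 5

def assess(criteria_met: dict) -> MaturityLevel:
    """Map the fraction of satisfied interface-agreement criteria to a level."""
    if not criteria_met:
        return MaturityLevel.INITIAL
    score = sum(criteria_met.values()) / len(criteria_met)
    thresholds = [(0.9, MaturityLevel.OPTIMIZING), (0.7, MaturityLevel.MEASURED),
                  (0.5, MaturityLevel.DEFINED), (0.3, MaturityLevel.MANAGED)]
    for cutoff, level in thresholds:
        if score >= cutoff:
            return level
    return MaturityLevel.INITIAL

# Example assessment of one hypothetical interface agreement.
level = assess({"shared_spec": True, "versioned_interface": True,
                "conformance_tests": False, "governance_process": False})
print(level.name)
```

The thresholds are arbitrary here; the point is that maturity assessment reduces agreement quality to a comparable, orderable score.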

  5. Reflections on the role of open source in health information system interoperability.

    PubMed

    Sfakianakis, S; Chronaki, C E; Chiarugi, F; Conforti, F; Katehakis, D G

    2007-01-01

    This paper reflects on the role of open source in health information system interoperability. Open source is a driving force in computer science research and the development of information systems. It facilitates the sharing of information and ideas, enables evolutionary development and open collaborative testing of code, and broadens the adoption of interoperability standards. In health care, information systems have been developed largely ad hoc following proprietary specifications and customized design. However, the wide deployment of integrated services such as Electronic Health Records (EHRs) over regional health information networks (RHINs) relies on interoperability of the underlying information systems and medical devices. This reflection is built on the experiences of the PICNIC project that developed shared software infrastructure components in open source for RHINs and the OpenECG network that offers open source components to lower the implementation cost of interoperability standards such as SCP-ECG, in electrocardiography. Open source components implementing standards and a community providing feedback from real-world use are key enablers of health care information system interoperability. Investing in open source is investing in interoperability and a vital aspect of a long term strategy towards comprehensive health services and clinical research.

  6. Evolution of Query Optimization Methods

    NASA Astrophysics Data System (ADS)

    Hameurlain, Abdelkader; Morvan, Franck

    Query optimization is the most critical phase in query processing. In this paper, we try to describe synthetically the evolution of query optimization methods from uniprocessor relational database systems to data Grid systems through parallel, distributed and data integration systems. We point out a set of parameters to characterize and compare query optimization methods, mainly: (i) size of the search space, (ii) type of method (static or dynamic), (iii) modification types of execution plans (re-optimization or re-scheduling), (iv) level of modification (intra-operator and/or inter-operator), (v) type of event (estimation errors, delay, user preferences), and (vi) nature of decision-making (centralized or decentralized control).
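The six comparison parameters (i)-(vi) listed in the abstract can be encoded as a simple record type, which makes method-by-method comparison mechanical. Field names and example values below are assumptions for illustration, not the authors' notation.

```python
from dataclasses import dataclass

# Illustrative encoding of the survey's comparison parameters (i)-(vi).
@dataclass(frozen=True)
class OptimizerProfile:
    search_space: str          # (i)  e.g. "left-deep" vs "bushy"
    method_type: str           # (ii) "static" or "dynamic"
    plan_modification: str     # (iii) "re-optimization" or "re-scheduling"
    modification_level: str    # (iv) "intra-operator", "inter-operator", "both"
    trigger_events: tuple      # (v)  estimation errors, delays, preferences
    decision_making: str       # (vi) "centralized" or "decentralized"

uniprocessor = OptimizerProfile("left-deep", "static", "re-optimization",
                                "intra-operator", ("estimation errors",),
                                "centralized")
grid = OptimizerProfile("bushy", "dynamic", "re-scheduling", "both",
                        ("estimation errors", "delay", "user preferences"),
                        "decentralized")

# Compare two optimizer profiles field by field.
diffs = {f: (getattr(uniprocessor, f), getattr(grid, f))
         for f in uniprocessor.__dataclass_fields__
         if getattr(uniprocessor, f) != getattr(grid, f)}
print(sorted(diffs))
```

The example values mirror the survey's broad contrast between uniprocessor relational optimizers and data Grid optimizers; any real classification would be more nuanced.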

  7. Minimum information required for a DMET experiment reporting.

    PubMed

    Kumuthini, Judit; Mbiyavanga, Mamana; Chimusa, Emile R; Pathak, Jyotishman; Somervuo, Panu; Van Schaik, Ron Hn; Dolzan, Vita; Mizzi, Clint; Kalideen, Kusha; Ramesar, Raj S; Macek, Milan; Patrinos, George P; Squassina, Alessio

    2016-09-01

    This work provides pharmacogenomics reporting guidelines and describes the information and tools required for reporting to public omic databases. For effective DMET data interpretation, sharing, interoperability, reproducibility and reporting, we propose the Minimum Information required for a DMET Experiment (MIDE) reporting guidelines. MIDE describes the information required for reporting, data storage and data sharing in the form of XML. The MIDE guidelines will benefit the scientific community with pharmacogenomics experiments, including reporting pharmacogenomics data from other technology platforms, with tools that ease and automate the generation of such reports using the standardized MIDE XML schema, facilitating the sharing, dissemination and reanalysis of datasets through accessible and transparent pharmacogenomics data reporting.
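As a sketch of XML-based report generation in the spirit of MIDE, the snippet below serializes minimal experiment metadata with the standard library. The element names and values are invented for illustration; the actual MIDE XML schema defines its own structure.

```python
import xml.etree.ElementTree as ET

# Hypothetical minimal report; element and attribute names are assumptions,
# not the MIDE schema itself.
def build_report(experiment_id: str, platform: str, genes: list) -> str:
    root = ET.Element("dmetExperiment", id=experiment_id)
    ET.SubElement(root, "platform").text = platform
    panel = ET.SubElement(root, "genePanel")
    for gene in genes:
        ET.SubElement(panel, "gene", symbol=gene)
    return ET.tostring(root, encoding="unicode")

xml_report = build_report("EXP-001", "DMET Plus", ["CYP2D6", "CYP2C19", "TPMT"])
print(xml_report)
```

Generating reports programmatically from a schema, rather than by hand, is what makes the kind of automated, reproducible reporting the abstract describes feasible.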

  8. SOAP based web services and their future role in VO projects

    NASA Astrophysics Data System (ADS)

    Topf, F.; Jacquey, C.; Génot, V.; Cecconi, B.; André, N.; Zhang, T. L.; Kallio, E.; Lammer, H.; Facsko, G.; Stöckler, R.; Khodachenko, M.

    2011-10-01

    Modern state-of-the-art web services are of crucial importance for the interoperability of different VO tools existing in the planetary community. SOAP based web services assure the interconnectability between different data sources and tools by providing a common protocol for communication. This paper will point out a best-practice approach with the Automated Multi-Dataset Analysis Tool (AMDA) developed by CDPP, Toulouse, and the provision of VEX/MAG data from a remote database located at IWF, Graz. Furthermore, a new FP7 project, IMPEx, will be introduced with a potential usage example of AMDA web services in conjunction with simulation models.
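The common protocol SOAP provides is an XML envelope with a defined namespace. A minimal sketch of constructing such an envelope for a hypothetical data request follows; the operation name, element names, and dataset identifier are illustrative assumptions, not the actual AMDA API.

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

# Minimal SOAP 1.1 envelope for a hypothetical getParameter request,
# as a client might exchange with a remote data service.
def make_envelope(dataset_id: str) -> str:
    ET.register_namespace("soap", SOAP_NS)
    env = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(env, f"{{{SOAP_NS}}}Body")
    req = ET.SubElement(body, "getParameter")
    ET.SubElement(req, "datasetID").text = dataset_id
    return ET.tostring(env, encoding="unicode")

envelope = make_envelope("VEX-MAG-example")
print(envelope)
```

Because every SOAP service speaks this same envelope structure, tools from different agencies can interoperate as long as they agree on the body payload, which is exactly the interconnectability the abstract highlights.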

  9. Open Access: From Myth to Paradox

    ScienceCinema

    Ginsparg, Paul [Cornell University, Ithaca, New York, United States]

    2018-04-19

    True open access to scientific publications not only gives readers the possibility to read articles without paying subscription, but also makes the material available for automated ingestion and harvesting by 3rd parties. Once articles and associated data become universally treatable as computable objects, openly available to 3rd party aggregators and value-added services, what new services can we expect, and how will they change the way that researchers interact with their scholarly communications infrastructure? I will discuss straightforward applications of existing ideas and services, including citation analysis, collaborative filtering, external database linkages, interoperability, and other forms of automated markup, and speculate on the sociology of the next generation of users.

  10. MGIS: managing banana (Musa spp.) genetic resources information and high-throughput genotyping data

    PubMed Central

    Guignon, V.; Sempere, G.; Sardos, J.; Hueber, Y.; Duvergey, H.; Andrieu, A.; Chase, R.; Jenny, C.; Hazekamp, T.; Irish, B.; Jelali, K.; Adeka, J.; Ayala-Silva, T.; Chao, C.P.; Daniells, J.; Dowiya, B.; Effa effa, B.; Gueco, L.; Herradura, L.; Ibobondji, L.; Kempenaers, E.; Kilangi, J.; Muhangi, S.; Ngo Xuan, P.; Paofa, J.; Pavis, C.; Thiemele, D.; Tossou, C.; Sandoval, J.; Sutanto, A.; Vangu Paka, G.; Yi, G.; Van den houwe, I.; Roux, N.

    2017-01-01

    Unraveling the genetic diversity held in genebanks on a large scale is underway, due to advances in next-generation sequencing (NGS)-based technologies that produce high-density genetic markers for a large number of samples at low cost. Genebank users should be in a position to identify and select germplasm from the global genepool based on a combination of passport, genotypic and phenotypic data. To facilitate this, a new generation of information systems is being designed to efficiently handle data and link it with other external resources such as genome or breeding databases. The Musa Germplasm Information System (MGIS), the database for global ex situ-held banana genetic resources, has been developed to address those needs in a user-friendly way. In developing MGIS, we selected a generic database schema (Chado), the robust content management system Drupal for the user interface, and Tripal, a set of Drupal modules which links the Chado schema to Drupal. MGIS allows germplasm collection examination, accession browsing, advanced search functions, and germplasm orders. Additionally, we developed unique graphical interfaces to compare accessions and to explore them based on their taxonomic information. Accession-based data has been enriched with publications, genotyping studies and associated genotyping datasets reporting on germplasm use. Finally, an interoperability layer has been implemented to facilitate the link with complementary databases like the Banana Genome Hub and the MusaBase breeding database. Database URL: https://www.crop-diversity.org/mgis/ PMID:29220435

  11. 77 FR 67815 - Federal Advisory Committee Act; Communications Security, Reliability, and Interoperability Council

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-14

    ..., Reliability, and Interoperability Council AGENCY: Federal Communications Commission. ACTION: Notice of public... persons that the Federal Communications Commission's (FCC) Communications Security, Reliability, and... the security, reliability, and interoperability of communications systems. On March 19, 2011, the FCC...

  12. Environmental Models as a Service: Enabling Interoperability through RESTful Endpoints and API Documentation (presentation)

    EPA Science Inventory

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...

  13. 48 CFR 2917.500 - Scope of subpart.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... CONTRACT TYPES SPECIAL CONTRACTING METHODS Interagency Acquisitions Under The Economy Act 2917.500 Scope of... of interagency acquisitions under the Economy Act (31 U.S.C. 1535) as prescribed by FAR 17.5. ...

  14. 78 FR 59662 - Annual Public Meeting of the Interagency Steering Committee on Multimedia Environmental Modeling

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-27

    ... applications and assessment of site specific, generic, and process-oriented multimedia environmental models as... development and simulation supports interagency interests in risk assessment, uncertainty analyses, management...

  15. Federal Interagency Committee on Indoor Air Quality

    EPA Pesticide Factsheets

    The Federal Interagency Committee on Indoor Air Quality (CIAQ), which meets three times a year, was established by Congress to coordinate the activities of the Federal Government on issues relating to Indoor Air Quality.

  16. UAS Integration in the NAS Project: DAA-TCAS Interoperability "mini" HITL Primary Results

    NASA Technical Reports Server (NTRS)

    Rorie, Conrad; Fern, Lisa; Shively, Jay; Santiago, Confesor

    2016-01-01

    At the May 2015 SC-228 meeting, requirements for TCAS II interoperability became elevated in priority. A TCAS interoperability workgroup was formed to identify and address key issues and questions. The TCAS workgroup came up with an initial list of questions and a plan to address them. As part of that plan, NASA proposed to run a mini HITL to address display, alerting and guidance issues. A TCAS Interoperability Workshop was held to determine potential display, alerting and guidance issues that could be explored in future NASA mini HITLs. The workshop produced consensus on the main functionality of DAA guidance when a TCAS II RA occurs, a prioritized list of independent variables for the experimental design, and a set of use cases to stress TCAS interoperability.

  17. An Ontological Solution to Support Interoperability in the Textile Industry

    NASA Astrophysics Data System (ADS)

    Duque, Arantxa; Campos, Cristina; Jiménez-Ruiz, Ernesto; Chalmeta, Ricardo

    Significant developments in information and communication technologies and challenging market conditions have forced enterprises to adapt their way of doing business. In this context, providing mechanisms to guarantee interoperability among heterogeneous organisations has become a critical issue. Even though prolific research has already been conducted in the area of enterprise interoperability, we have found that enterprises still struggle to introduce fully interoperable solutions, especially, in terms of the development and application of ontologies. Thus, the aim of this paper is to introduce basic ontology concepts in a simple manner and to explain the advantages of the use of ontologies to improve interoperability. We will also present a case study showing the implementation of an application ontology for an enterprise in the textile/clothing sector.
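The mediation role an application ontology plays between heterogeneous partners can be shown in miniature: two organisations keep their own field names, and translation goes through the shared concepts. All terms and mappings below are invented for this sketch, not taken from the paper's textile ontology.

```python
# Shared concepts of a toy application ontology (invented for illustration).
ontology_terms = {"FabricOrder", "DyeLot", "DeliveryDate"}

# Each partner maps its local field names onto the shared concepts.
partner_a = {"fabric_order": "FabricOrder", "dye_batch": "DyeLot",
             "ship_date": "DeliveryDate"}
partner_b = {"ORDER_TEXTILE": "FabricOrder", "LOT": "DyeLot",
             "DUE": "DeliveryDate"}

def translate(field: str, source: dict, target: dict) -> str:
    """Map a source field name to the target's name via the shared concept."""
    concept = source[field]
    assert concept in ontology_terms, f"unmapped concept: {concept}"
    inverse = {v: k for k, v in target.items()}
    return inverse[concept]

print(translate("dye_batch", partner_a, partner_b))
```

The design point is that neither partner needs to know the other's vocabulary, only its own mapping to the ontology, which is why ontologies scale better than pairwise field mappings as the number of partners grows.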

  18. The Arctic Observing Viewer: A Web-mapping Application for U.S. Arctic Observing Activities

    NASA Astrophysics Data System (ADS)

    Cody, R. P.; Manley, W. F.; Gaylord, A. G.; Kassin, A.; Villarreal, S.; Barba, M.; Dover, M.; Escarzaga, S. M.; Habermann, T.; Kozimor, J.; Score, R.; Tweedie, C. E.

    2015-12-01

    Although a great deal of progress has been made with various arctic observing efforts, it can be difficult to assess such progress when so many agencies, organizations, research groups and others are making such rapid progress over such a large expanse of the Arctic. To help meet the strategic needs of the U.S. SEARCH-AON program and facilitate the development of SAON and other related initiatives, the Arctic Observing Viewer (AOV; http://ArcticObservingViewer.org) has been developed. This web mapping application compiles detailed information pertaining to U.S. Arctic Observing efforts. Contributing partners include the U.S. NSF, USGS, ACADIS, ADIwg, AOOS, a2dc, AON, ARMAP, BAID, IASOA, INTERACT, and others. Over 7700 observation sites are currently in the AOV database and the application allows users to visualize, navigate, select, advance search, draw, print, and more. During 2015, the web mapping application has been enhanced by the addition of a query builder that allows users to create rich and complex queries. AOV is founded on principles of software and data interoperability and includes an emerging "Project" metadata standard, which uses ISO 19115-1 and compatible web services. Substantial efforts have focused on maintaining and centralizing all database information. In order to keep up with emerging technologies, the AOV data set has been structured and centralized within a relational database and the application front-end has been ported to HTML5 to enable mobile access. Other application enhancements include an embedded Apache Solr search platform which provides users with the capability to perform advance searches and an administration web based data management system that allows administrators to add, update, and delete information in real time. We encourage all collaborators to use AOV tools and services for their own purposes and to help us extend the impact of our efforts and ensure AOV complements other cyber-resources. 
Reinforcing dispersed but interoperable resources in this way will help to ensure improved capacities for conducting activities such as assessing the status of arctic observing efforts, optimizing logistic operations, and for quickly accessing external and project-focused web resources for more detailed information and access to scientific data and derived products.
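The embedded Apache Solr search platform mentioned above accepts queries composed of a keyword term plus filter clauses. A minimal sketch of composing such a query URL follows; the endpoint path and field names are assumptions for illustration, not the actual AOV schema.

```python
from urllib.parse import urlencode

# Compose a Solr-style select URL with a keyword query (q) and
# filter queries (fq), as a query builder might issue.
def solr_query(base: str, keyword: str, filters: dict, rows: int = 20) -> str:
    params = [("q", keyword), ("rows", rows), ("wt", "json")]
    params += [("fq", f"{field}:{value}") for field, value in sorted(filters.items())]
    return f"{base}/select?{urlencode(params)}"

url = solr_query("http://example.org/solr/aov",
                 "permafrost",
                 {"country": "US", "network": "AON"})
print(url)
```

Keeping filters as separate `fq` clauses (rather than folding them into `q`) lets Solr cache each filter independently, which is one reason the pattern suits faceted advance-search interfaces.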

  19. The Arctic Observing Viewer: A Web-mapping Application for U.S. Arctic Observing Activities

    NASA Astrophysics Data System (ADS)

    Kassin, A.; Gaylord, A. G.; Manley, W. F.; Villarreal, S.; Tweedie, C. E.; Cody, R. P.; Copenhaver, W.; Dover, M.; Score, R.; Habermann, T.

    2014-12-01

    Although a great deal of progress has been made with various arctic observing efforts, it can be difficult to assess such progress when so many agencies, organizations, research groups and others are making such rapid progress. To help meet the strategic needs of the U.S. SEARCH-AON program and facilitate the development of SAON and related initiatives, the Arctic Observing Viewer (AOV; http://ArcticObservingViewer.org) has been developed. This web mapping application compiles detailed information pertaining to U.S. Arctic Observing efforts. Contributing partners include the U.S. NSF, USGS, ACADIS, ADIwg, AOOS, a2dc, AON, ARMAP, BAID, IASOA, INTERACT, and others. Over 6100 sites are currently in the AOV database and the application allows users to visualize, navigate, select, advance search, draw, print, and more. AOV is founded on principles of software and data interoperability and includes an emerging "Project" metadata standard, which uses ISO 19115-1 and compatible web services. In the last year, substantial efforts have focused on maintaining and centralizing all database information. In order to keep up with emerging technologies and demand for the application, the AOV data set has been structured and centralized within a relational database; furthermore, the application front-end has been ported to HTML5. Porting the application to HTML5 will now provide access to mobile users utilizing tablets and cell phone devices. Other application enhancements include an embedded Apache Solr search platform which provides users with the capability to perform advance searches throughout the AOV dataset, and an administration web based data management system which allows the administrators to add, update, and delete data in real time. We encourage all collaborators to use AOV tools and services for their own purposes and to help us extend the impact of our efforts and ensure AOV complements other cyber-resources. 
Reinforcing dispersed but interoperable resources in this way will help to ensure improved capacities for conducting activities such as assessing the status of arctic observing efforts, optimizing logistic operations, and for quickly accessing external and project-focused web resources for more detailed information and data.

  20. 77 FR 37001 - Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-20

    ... of the Interoperability Services Layer, Attn: Ron Chen, 400 Gigling Road, Seaside, CA 93955. Title; Associated Form; and OMB Number: Interoperability Services Layer; OMB Control Number 0704-TBD. Needs and Uses... INFORMATION: Summary of Information Collection IoLS (Interoperability Layer Services) is an application in a...

  1. The eXtensible ontology development (XOD) principles and tool implementation to support ontology interoperability.

    PubMed

    He, Yongqun; Xiang, Zuoshuang; Zheng, Jie; Lin, Yu; Overton, James A; Ong, Edison

    2018-01-12

    Ontologies are critical to data/metadata and knowledge standardization, sharing, and analysis. With hundreds of biological and biomedical ontologies developed, it has become critical to ensure ontology interoperability and the usage of interoperable ontologies for standardized data representation and integration. The suite of web-based Ontoanimal tools (e.g., Ontofox, Ontorat, and Ontobee) support different aspects of extensible ontology development. By summarizing the common features of Ontoanimal and other similar tools, we identified and proposed an "eXtensible Ontology Development" (XOD) strategy and its associated four principles. These XOD principles reuse existing terms and semantic relations from reliable ontologies, develop and apply well-established ontology design patterns (ODPs), and involve community efforts to support new ontology development, promoting standardized and interoperable data and knowledge representation and integration. The adoption of the XOD strategy, together with robust XOD tool development, will greatly support ontology interoperability and robust ontology applications to support data to be Findable, Accessible, Interoperable and Reusable (i.e., FAIR).
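The XOD principle of reusing existing terms can be sketched as a lookup step in ontology development: before minting a new local term, check whether a reliable source ontology already defines it. The IRIs and labels below are invented examples, not actual ontology identifiers.

```python
# Hypothetical registry of terms from "reliable" source ontologies.
reliable_terms = {
    "assay": "http://example.org/source-ontology/0000070",
    "organism": "http://example.org/source-ontology/0100026",
}

def resolve_term(label: str, local_prefix: str = "http://example.org/myonto/"):
    """Reuse a reliable IRI when one exists; otherwise mint a local one.

    Returns (iri, reused_flag).
    """
    if label in reliable_terms:
        return reliable_terms[label], True
    return local_prefix + label.replace(" ", "_"), False

iri, reused = resolve_term("assay")
print(iri, reused)
```

In practice tools like Ontofox automate this extraction from published ontologies; the sketch only shows the decision the principle encodes: reuse first, mint last.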

  2. Achieving Interoperability in GEOSS - How Close Are We?

    NASA Astrophysics Data System (ADS)

    Arctur, D. K.; Khalsa, S. S.; Browdy, S. F.

    2010-12-01

    A primary goal of the Global Earth Observing System of System (GEOSS) is improving the interoperability between the observational, modelling, data assimilation, and prediction systems contributed by member countries. The GEOSS Common Infrastructure (GCI) comprises the elements designed to enable discovery and access to these diverse data and information sources. But to what degree can the mechanisms for accessing these data, and the data themselves, be considered interoperable? Will the separate efforts by Communities of Practice within GEO to build their own portals, such as for Energy, Biodiversity, and Air Quality, lead to fragmentation or synergy? What communication and leadership do we need with these communities to improve interoperability both within and across such communities? The Standards and Interoperability Forum (SIF) of GEO's Architecture and Data Committee has assessed progress towards achieving the goal of global interoperability and made recommendations regarding evolution of the architecture and overall data strategy to ensure fulfillment of the GEOSS vision. This presentation will highlight the results of this study, and directions for further work.

  3. Personal Health Records: Is Rapid Adoption Hindering Interoperability?

    PubMed Central

    Studeny, Jana; Coustasse, Alberto

    2014-01-01

    The establishment of the Meaningful Use criteria has created a critical need for robust interoperability of health records. A universal definition of a personal health record (PHR) has not been agreed upon. Standardized code sets have been built for specific entities, but integration between them has not been supported. The purpose of this research study was to explore the hindrance and promotion of interoperability standards in relationship to PHRs to describe interoperability progress in this area. The study was conducted following the basic principles of a systematic review, with 61 articles used in the study. Lagging interoperability has stemmed from slow adoption by patients, creation of disparate systems due to rapid development to meet requirements for the Meaningful Use stages, and rapid early development of PHRs prior to the mandate for integration among multiple systems. Findings of this study suggest that deadlines for implementation to capture Meaningful Use incentive payments are supporting the creation of PHR data silos, thereby hindering the goal of high-level interoperability. PMID:25214822

  4. Proceedings, U. S. Department of Agriculture interagency gypsy moth research review 1990

    Treesearch

    Kurt W. Gottschalk; Mark J. Twery; Shirley I. Smith; [Editors]

    1991-01-01

    Eight invited papers and 68 abstracts of volunteer presentations on gypsy moth biology, ecology, impacts, and management presented at the U. S. Department of Agriculture Interagency Gypsy Moth Research Review.

  5. 48 CFR 801.602-74 - Review requirements for an interagency agreement.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... officer or a contracting officer at the VA National Acquisition Center or the Denver Acquisition and Logistics Center may sign an interagency agreement if the dollar threshold is within the contracting officer...

  6. 48 CFR 801.602-74 - Review requirements for an interagency agreement.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... officer or a contracting officer at the VA National Acquisition Center or the Denver Acquisition and Logistics Center may sign an interagency agreement if the dollar threshold is within the contracting officer...

  7. 48 CFR 801.602-74 - Review requirements for an interagency agreement.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... officer or a contracting officer at the VA National Acquisition Center or the Denver Acquisition and Logistics Center may sign an interagency agreement if the dollar threshold is within the contracting officer...

  8. 48 CFR 801.602-74 - Review requirements for an interagency agreement.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... officer or a contracting officer at the VA National Acquisition Center or the Denver Acquisition and Logistics Center may sign an interagency agreement if the dollar threshold is within the contracting officer...

  9. 48 CFR 801.602-74 - Review requirements for an interagency agreement.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... officer or a contracting officer at the VA National Acquisition Center or the Denver Acquisition and Logistics Center may sign an interagency agreement if the dollar threshold is within the contracting officer...

  10. 41 CFR 109-39.107 - Limited exemptions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., AND MOTOR VEHICLES 39-INTERAGENCY FLEET MANAGEMENT SYSTEMS 39.1-Establishment, Modification, and Discontinuance of Interagency Fleet Management Systems § 109-39.107 Limited exemptions. The Director, Office of... exemptions from the fleet management system. ...

  11. 48 CFR 17.501 - General.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ....501 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION CONTRACTING METHODS AND CONTRACT TYPES SPECIAL CONTRACTING METHODS Interagency Acquisitions 17.501 General. (a) Interagency acquisitions are commonly conducted through indefinite-delivery contracts, such as task- and delivery-order...

  12. New Jersey interagency emergency management plan.

    DOT National Transportation Integrated Search

    2005-09-01

    This report outlines the research and work performed to lay the foundation for the : development of a New Jersey Interagency Emergency Management Plan. The : research into existing practices within the four state level transportation agencies : revea...

  13. Organisational Interoperability: Evaluation and Further Development of the OIM Model

    DTIC Science & Technology

    2003-06-01

    an Organizational Interoperability Maturity Model (OIM) to evaluate interoperability at the organizational level. The OIM considers the human ... activity aspects of military operations, which are not covered in other models. This paper describes how the model has been used to identify problems and to

  14. Sources and Implications of Bias and Uncertainty in a Century of US Wildfire Activity Data

    NASA Astrophysics Data System (ADS)

    Short, K.

    2013-12-01

    The statistical analysis of wildfire activity is a critical component of national wildfire planning, operations, and research in the United States (US). Wildfire activity data have been collected in the US for over a century. Yet, to this day, no single unified system of wildfire record-keeping exists. Data for analysis are generally harvested from archival summary reports from federal or interagency fire organizations; incident-level wildfire reporting systems of the federal, state, and local fire services; and, increasingly, remote-sensing programs. It is typical for research into wildfire activity patterns for all or part of the last century to require data from several of these sources and perhaps others. That work is complicated by the disunity of the various datasets and potentially compromised by inherent reporting biases, discussed here. The availability of wildfire records with the information content and geospatial precision generally sought for increasingly popular climatological analyses and the modeling of contemporary wildfire risk is limited to recent decades. We explain how the disunity and idiosyncrasies of US wildfire reporting have largely precluded true interagency, or all-lands, analyses of even recent wildfire activity and hamstrung some early risk modeling efforts. We then describe our efforts to acquire, standardize, error-check, compile, scrub, and evaluate the completeness of US federal, state, and local wildfire records from 1992-2011 for the national interagency Fire Program Analysis (FPA) application. The resulting FPA Fire-Occurrence Database (FPA FOD) includes nearly 1.6 million records from the 20-year period, with values for at least the following core data elements: location at least as precise as a Public Land Survey System section (2.6-km2 grid), discovery date, and final fire size. 
The FPA FOD is publicly available from the Research Data Archive of the US Department of Agriculture, Forest Service (http://dx.doi.org/10.2737/RDS-2013-0009). While necessarily incomplete in some aspects, the database is intended to facilitate fairly high-resolution geospatial analysis of wildfire activity over the past two decades, based on available information from the authoritative systems of record. Formal non-federal wildfire reporting has been on the rise over the past several decades, and users of national datasets like the FPA FOD must beware of state and local reporting biases to avoid drawing spurious conclusions when analysing the data. Apparent trends in the numbers and area burned by wildfires, for example, may be the result of multiple factors, including changes in climate, fuels, demographics (e.g. population density), fire-management policies, and - as we underscore here - levels of reporting.

  15. 76 FR 72715 - National Institute of Environmental Health Sciences; Notice of Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-25

    ..., as amended (5 U.S.C. App.), notice is hereby given of a meeting of the Interagency Breast Cancer and... the meeting. Name of Committee: Interagency Breast Cancer and Environmental Research Coordinating...

  16. 76 FR 80954 - National Institute of Environmental Health Sciences; Notice of Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-27

    ..., as amended (5 U.S.C. App.), notice is hereby given of a meeting of the Interagency Breast Cancer and... the meeting. Name of Committee: Interagency Breast Cancer and Environmental Research Coordinating...

  17. 41 CFR 101-39.100 - General.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... FEDERAL PROPERTY MANAGEMENT REGULATIONS AVIATION, TRANSPORTATION, AND MOTOR VEHICLES 39-INTERAGENCY FLEET MANAGEMENT SYSTEMS 39.1-Establishment, Modification, and Discontinuance of Interagency Fleet Management... fleet management systems. (a) Based on these studies, the Administrator of General Services, with the...

  18. 5 CFR 330.701 - Purpose.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS RECRUITMENT, SELECTION, AND PLACEMENT (GENERAL) Interagency Career Transition Assistance Plan for Displaced Employees § 330.701 Purpose... interagency career transition assistance program for Federal employees during a period of severe Federal...

  19. 5 CFR 330.711 - Oversight.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... PLACEMENT (GENERAL) Interagency Career Transition Assistance Plan for Displaced Employees § 330.711 Oversight. OPM is responsible for oversight of the Interagency Career Transition Assistance Plan for Displaced Employees and may conduct reviews of agency activity at any time. ...

  20. 77 FR 48153 - Federal Advisory Committee Act; Communications Security, Reliability, and Interoperability Council

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-13

    ..., Reliability, and Interoperability Council AGENCY: Federal Communications Commission. ACTION: Notice of public..., Reliability, and Interoperability Council (CSRIC) will hold its fifth meeting. The CSRIC will vote on... to the FCC regarding best practices and actions the FCC can take to ensure the security, reliability...

  1. Evaluation of Interoperability Protocols in Repositories of Electronic Theses and Dissertations

    ERIC Educational Resources Information Center

    Hakimjavadi, Hesamedin; Masrek, Mohamad Noorman

    2013-01-01

    Purpose: The purpose of this study is to evaluate the status of eight interoperability protocols within repositories of electronic theses and dissertations (ETDs) as an introduction to further studies on feasibility of deploying these protocols in upcoming areas of interoperability. Design/methodology/approach: Three surveys of 266 ETD…

  2. Examining the Relationship between Electronic Health Record Interoperability and Quality Management

    ERIC Educational Resources Information Center

    Purcell, Bernice M.

    2013-01-01

A lack of interoperability impairs data quality among health care providers' electronic health record (EHR) systems. The problem examined is whether International Organization for Standardization (ISO) 9000 principles relate to interoperability in the implementation of EHR systems. The purpose of the nonexperimental quantitative research…

  3. Interoperability of Demand Response Resources Demonstration in NY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wellington, Andre

    2014-03-31

    The Interoperability of Demand Response Resources Demonstration in NY (Interoperability Project) was awarded to Con Edison in 2009. The objective of the project was to develop and demonstrate methodologies to enhance the ability of customer sited Demand Response resources to integrate more effectively with electric delivery companies and regional transmission organizations.

  4. Watershed and Economic Data InterOperability (WEDO): Facilitating Discovery, Evaluation and Integration through the Sharing of Watershed Modeling Data

    EPA Science Inventory

    Watershed and Economic Data InterOperability (WEDO) is a system of information technologies designed to publish watershed modeling studies for reuse. WEDO facilitates three aspects of interoperability: discovery, evaluation and integration of data. This increased level of interop...

  5. Reminiscing about 15 years of interoperability efforts

    DOE PAGES

    Van de Sompel, Herbert; Nelson, Michael L.

    2015-11-01

Over the past fifteen years, our perspective on tackling information interoperability problems for web-based scholarship has evolved significantly. In this opinion piece, we look back at three efforts that we have been involved in that aptly illustrate this evolution: OAI-PMH, OAI-ORE, and Memento. Understanding that no interoperability specification is neutral, we attempt to characterize the perspectives and technical toolkits that provided the basis for these endeavors. In that regard, we consider repository-centric and web-centric interoperability perspectives, and the use of a Linked Data or a REST/HATEOAS technology stack, respectively. In addition, we lament the lack of interoperability across nodes that play a role in web-based scholarship, but end on a constructive note with some ideas regarding a possible path forward.
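The repository-centric perspective mentioned above is typified by OAI-PMH, in which a harvester issues verb-based HTTP requests and parses an XML envelope of metadata records. The following minimal sketch builds a ListRecords request URL and extracts Dublin Core titles from a canned response using only the standard library; the repository URL and record content are invented examples, not taken from any real archive.

```python
# Minimal OAI-PMH harvesting sketch: build a ListRecords request URL and parse
# a canned response. The base URL and record below are hypothetical examples.
from urllib.parse import urlencode
import xml.etree.ElementTree as ET

DC = "{http://purl.org/dc/elements/1.1/}"

def list_records_url(base_url, metadata_prefix="oai_dc"):
    """Construct an OAI-PMH ListRecords request (verb + metadataPrefix)."""
    return base_url + "?" + urlencode({"verb": "ListRecords",
                                       "metadataPrefix": metadata_prefix})

def parse_titles(response_xml):
    """Extract Dublin Core titles from a ListRecords response envelope."""
    root = ET.fromstring(response_xml)
    return [t.text for t in root.iter(DC + "title")]

# Canned response illustrating the envelope an OAI-PMH endpoint returns.
SAMPLE = """<?xml version="1.0"?>
<OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/">
  <ListRecords>
    <record>
      <metadata>
        <oai_dc:dc xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/"
                   xmlns:dc="http://purl.org/dc/elements/1.1/">
          <dc:title>Sample thesis on interoperability</dc:title>
        </oai_dc:dc>
      </metadata>
    </record>
  </ListRecords>
</OAI-PMH>"""

print(list_records_url("https://repo.example.org/oai"))
print(parse_titles(SAMPLE))
```

In a real harvest the URL would be fetched over HTTP and the response paged with resumption tokens; the parsing step stays the same.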

  6. The HDF Product Designer - Interoperability in the First Mile

    NASA Astrophysics Data System (ADS)

    Lee, H.; Jelenak, A.; Habermann, T.

    2014-12-01

    Interoperable data have been a long-time goal in many scientific communities. The recent growth in analysis, visualization and mash-up applications that expect data stored in a standardized manner has brought the interoperability issue to the fore. On the other hand, producing interoperable data is often regarded as a sideline task in a typical research team for which resources are not readily available. The HDF Group is developing a software tool aimed at lessening the burden of creating data in standards-compliant, interoperable HDF5 files. The tool, named HDF Product Designer, lowers the threshold needed to design such files by providing a user interface that combines the rich HDF5 feature set with applicable metadata conventions. Users can quickly devise new HDF5 files while at the same time seamlessly incorporating the latest best practices and conventions from their community. That is what the term interoperability in the first mile means: enabling generation of interoperable data in HDF5 files from the onset of their production. The tool also incorporates collaborative features, allowing team approach in the file design, as well as easy transfer of best practices as they are being developed. The current state of the tool and the plans for future development will be presented. Constructive input from interested parties is always welcome.
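The "first mile" idea above is that convention compliance is checked at design time, before any file is produced. This is not the HDF Product Designer's actual API; it is a hedged sketch of the underlying pattern, representing a file design as a plain dict and validating it against an assumed convention profile (the required attributes `units` and `long_name` mimic CF-style conventions purely as an example).

```python
# Illustrative sketch (not the actual HDF Product Designer API): represent a
# file "design" as a dict and check it against a convention profile before any
# HDF5 file is written. The required-attribute set is an assumed example.

REQUIRED_DATASET_ATTRS = {"units", "long_name"}  # assumed CF-style profile

def check_design(design):
    """Return a list of convention violations found in a file-design dict."""
    problems = []
    for name, spec in design.get("datasets", {}).items():
        missing = REQUIRED_DATASET_ATTRS - set(spec.get("attributes", {}))
        for attr in sorted(missing):
            problems.append(f"dataset '{name}' is missing attribute '{attr}'")
    return problems

design = {
    "datasets": {
        "temperature": {"dtype": "f4", "shape": (180, 360),
                        "attributes": {"units": "K",
                                       "long_name": "surface temperature"}},
        "pressure": {"dtype": "f4", "shape": (180, 360),
                     "attributes": {"units": "Pa"}},  # long_name omitted
    }
}
print(check_design(design))
```

Catching the missing `long_name` here, rather than after the data producer has written thousands of granules, is precisely the interoperability-in-the-first-mile payoff.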

  7. Potential interoperability problems facing multi-site radiation oncology centers in The Netherlands

    NASA Astrophysics Data System (ADS)

    Scheurleer, J.; Koken, Ph; Wessel, R.

    2014-03-01

    Aim: To identify potential interoperability problems facing multi-site Radiation Oncology (RO) departments in the Netherlands and solutions for unambiguous multi-system workflows. Specific challenges confronting the RO department of VUmc (RO-VUmc), which is soon to open a satellite department, were characterized. Methods: A nationwide questionnaire survey was conducted to identify possible interoperability problems and solutions. Further detailed information was obtained by in-depth interviews at 3 Dutch RO institutes that already operate in more than one site. Results: The survey had a 100% response rate (n=21). Altogether 95 interoperability problems were described. Most reported problems were on a strategic and semantic level. The majority were DICOM(-RT) and HL7 related (n=65), primarily between treatment planning and verification systems or between departmental and hospital systems. Seven were identified as being relevant for RO-VUmc. Departments have overcome interoperability problems with their own, or with tailor-made vendor solutions. There was little knowledge about or utilization of solutions developed by Integrating the Healthcare Enterprise Radiation Oncology (IHE-RO). Conclusions: Although interoperability problems are still common, solutions have been identified. Awareness of IHE-RO needs to be raised. No major new interoperability problems are predicted as RO-VUmc develops into a multi-site department.

  8. A Large-Scale Study of Fingerprint Matching Systems for Sensor Interoperability Problem

    PubMed Central

    Hussain, Muhammad; AboAlSamh, Hatim; AlZuair, Mansour

    2018-01-01

    The fingerprint is a commonly used biometric modality that is widely employed for authentication by law enforcement agencies and commercial applications. The designs of existing fingerprint matching methods are based on the hypothesis that the same sensor is used to capture fingerprints during enrollment and verification. Advances in fingerprint sensor technology have raised the question about the usability of current methods when different sensors are employed for enrollment and verification; this is a fingerprint sensor interoperability problem. To provide insight into this problem and assess the status of state-of-the-art matching methods to tackle this problem, we first analyze the characteristics of fingerprints captured with different sensors, which makes cross-sensor matching a challenging problem. We demonstrate the importance of fingerprint enhancement methods for cross-sensor matching. Finally, we conduct a comparative study of state-of-the-art fingerprint recognition methods and provide insight into their abilities to address this problem. We performed experiments using a public database (FingerPass) that contains nine datasets captured with different sensors. We analyzed the effects of different sensors and found that cross-sensor matching performance deteriorates when different sensors are used for enrollment and verification. In view of our analysis, we propose future research directions for this problem. PMID:29597286

  9. A Large-Scale Study of Fingerprint Matching Systems for Sensor Interoperability Problem.

    PubMed

    AlShehri, Helala; Hussain, Muhammad; AboAlSamh, Hatim; AlZuair, Mansour

    2018-03-28

    The fingerprint is a commonly used biometric modality that is widely employed for authentication by law enforcement agencies and commercial applications. The designs of existing fingerprint matching methods are based on the hypothesis that the same sensor is used to capture fingerprints during enrollment and verification. Advances in fingerprint sensor technology have raised the question about the usability of current methods when different sensors are employed for enrollment and verification; this is a fingerprint sensor interoperability problem. To provide insight into this problem and assess the status of state-of-the-art matching methods to tackle this problem, we first analyze the characteristics of fingerprints captured with different sensors, which makes cross-sensor matching a challenging problem. We demonstrate the importance of fingerprint enhancement methods for cross-sensor matching. Finally, we conduct a comparative study of state-of-the-art fingerprint recognition methods and provide insight into their abilities to address this problem. We performed experiments using a public database (FingerPass) that contains nine datasets captured with different sensors. We analyzed the effects of different sensors and found that cross-sensor matching performance deteriorates when different sensors are used for enrollment and verification. In view of our analysis, we propose future research directions for this problem.

  10. Clever generation of rich SPARQL queries from annotated relational schema: application to Semantic Web Service creation for biological databases.

    PubMed

    Wollbrett, Julien; Larmande, Pierre; de Lamotte, Frédéric; Ruiz, Manuel

    2013-04-15

    In recent years, a large amount of "-omics" data have been produced. However, these data are stored in many different species-specific databases that are managed by different institutes and laboratories. Biologists often need to find and assemble data from disparate sources to perform certain analyses. Searching for these data and assembling them is a time-consuming task. The Semantic Web helps to facilitate interoperability across databases. A common approach involves the development of wrapper systems that map a relational database schema onto existing domain ontologies. However, few attempts have been made to automate the creation of such wrappers. We developed a framework, named BioSemantic, for the creation of Semantic Web Services that are applicable to relational biological databases. This framework makes use of both Semantic Web and Web Services technologies and can be divided into two main parts: (i) the generation and semi-automatic annotation of an RDF view; and (ii) the automatic generation of SPARQL queries and their integration into Semantic Web Services backbones. We have used our framework to integrate genomic data from different plant databases. BioSemantic is a framework that was designed to speed integration of relational databases. We present how it can be used to speed the development of Semantic Web Services for existing relational biological databases. Currently, it creates and annotates RDF views that enable the automatic generation of SPARQL queries. Web Services are also created and deployed automatically, and the semantic annotations of our Web Services are added automatically using SAWSDL attributes. BioSemantic is downloadable at http://southgreen.cirad.fr/?q=content/Biosemantic.
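The automatic SPARQL generation described above can be illustrated in miniature: given an annotated schema mapping relational columns to ontology properties, a wrapper can emit a SELECT query mechanically. This is a hedged sketch of the general technique, not BioSemantic's code; the `ex:` prefix, table, and property names are invented.

```python
# Sketch of SPARQL generation from an annotated relational schema: each column
# is mapped to an ontology property, and a SELECT query is built mechanically.
# The prefix, table name, and properties below are invented examples.

PREFIX = "PREFIX ex: <http://example.org/onto#>"

def generate_sparql(table, columns):
    """Build a SPARQL query selecting the properties mapped to `columns`."""
    vars_ = " ".join(f"?{c}" for c in columns)
    triples = " .\n  ".join(f"?{table} ex:{prop} ?{c}"
                            for c, prop in columns.items())
    return f"{PREFIX}\nSELECT {vars_}\nWHERE {{\n  {triples} .\n}}"

# Mapping for a hypothetical 'gene' table: column name -> ontology property.
query = generate_sparql("gene", {"symbol": "geneSymbol", "chrom": "locatedOn"})
print(query)
```

A production wrapper like the one described would additionally resolve the mappings from the annotated RDF view rather than from a hand-written dict, and would wrap the query behind a Web Service endpoint.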

  11. Clever generation of rich SPARQL queries from annotated relational schema: application to Semantic Web Service creation for biological databases

    PubMed Central

    2013-01-01

    Background In recent years, a large amount of “-omics” data have been produced. However, these data are stored in many different species-specific databases that are managed by different institutes and laboratories. Biologists often need to find and assemble data from disparate sources to perform certain analyses. Searching for these data and assembling them is a time-consuming task. The Semantic Web helps to facilitate interoperability across databases. A common approach involves the development of wrapper systems that map a relational database schema onto existing domain ontologies. However, few attempts have been made to automate the creation of such wrappers. Results We developed a framework, named BioSemantic, for the creation of Semantic Web Services that are applicable to relational biological databases. This framework makes use of both Semantic Web and Web Services technologies and can be divided into two main parts: (i) the generation and semi-automatic annotation of an RDF view; and (ii) the automatic generation of SPARQL queries and their integration into Semantic Web Services backbones. We have used our framework to integrate genomic data from different plant databases. Conclusions BioSemantic is a framework that was designed to speed integration of relational databases. We present how it can be used to speed the development of Semantic Web Services for existing relational biological databases. Currently, it creates and annotates RDF views that enable the automatic generation of SPARQL queries. Web Services are also created and deployed automatically, and the semantic annotations of our Web Services are added automatically using SAWSDL attributes. BioSemantic is downloadable at http://southgreen.cirad.fr/?q=content/Biosemantic. PMID:23586394

  12. Is There Evidence of Cost Benefits of Electronic Medical Records, Standards, or Interoperability in Hospital Information Systems? Overview of Systematic Reviews

    PubMed Central

    2017-01-01

Background Electronic health (eHealth) interventions may improve the quality of care by providing timely, accessible information about one patient or an entire population. Electronic patient care information forms the nucleus of computerized health information systems. However, interoperability among systems depends on the adoption of information standards. Additionally, investing in technology systems requires cost-effectiveness studies to ensure the sustainability of processes for stakeholders. Objective The objective of this study was to assess cost-effectiveness of the use of electronically available inpatient data systems, health information exchange, or standards to support interoperability among systems. Methods An overview of systematic reviews was conducted, assessing the MEDLINE, Cochrane Library, LILACS, and IEEE Library databases to identify relevant studies published through February 2016. The search was supplemented by citations from the selected papers. The primary outcome was cost-effectiveness, and the secondary outcome was the impact on quality of care. Independent reviewers selected studies, and disagreement was resolved by consensus. The quality of the included studies was evaluated using a measurement tool to assess systematic reviews (AMSTAR). Results The primary search identified 286 papers, and two papers were manually included. A total of 211 were systematic reviews. From the 20 studies that were selected after screening the title and abstract, 14 were deemed ineligible, and six met the inclusion criteria. The interventions did not show a measurable effect on cost-effectiveness. Despite the limited number of studies, the heterogeneity of electronic systems reported, and the types of intervention in hospital routines, it was possible to identify some preliminary benefits in quality of care. 
Hospital information systems, along with information sharing, had the potential to improve clinical practice by reducing staff errors or incidents, improving automated harm detection, monitoring infections more effectively, and enhancing the continuity of care during physician handoffs. Conclusions This review identified some benefits in the quality of care but did not provide evidence that the implementation of eHealth interventions had a measurable impact on cost-effectiveness in hospital settings. However, further evidence is needed to infer the impact of standards adoption or interoperability in cost benefits of health care; this in turn requires further research. PMID:28851681

  13. Is There Evidence of Cost Benefits of Electronic Medical Records, Standards, or Interoperability in Hospital Information Systems? Overview of Systematic Reviews.

    PubMed

    Reis, Zilma Silveira Nogueira; Maia, Thais Abreu; Marcolino, Milena Soriano; Becerra-Posada, Francisco; Novillo-Ortiz, David; Ribeiro, Antonio Luiz Pinho

    2017-08-29

Electronic health (eHealth) interventions may improve the quality of care by providing timely, accessible information about one patient or an entire population. Electronic patient care information forms the nucleus of computerized health information systems. However, interoperability among systems depends on the adoption of information standards. Additionally, investing in technology systems requires cost-effectiveness studies to ensure the sustainability of processes for stakeholders. The objective of this study was to assess cost-effectiveness of the use of electronically available inpatient data systems, health information exchange, or standards to support interoperability among systems. An overview of systematic reviews was conducted, assessing the MEDLINE, Cochrane Library, LILACS, and IEEE Library databases to identify relevant studies published through February 2016. The search was supplemented by citations from the selected papers. The primary outcome was cost-effectiveness, and the secondary outcome was the impact on quality of care. Independent reviewers selected studies, and disagreement was resolved by consensus. The quality of the included studies was evaluated using a measurement tool to assess systematic reviews (AMSTAR). The primary search identified 286 papers, and two papers were manually included. A total of 211 were systematic reviews. From the 20 studies that were selected after screening the title and abstract, 14 were deemed ineligible, and six met the inclusion criteria. The interventions did not show a measurable effect on cost-effectiveness. Despite the limited number of studies, the heterogeneity of electronic systems reported, and the types of intervention in hospital routines, it was possible to identify some preliminary benefits in quality of care. 
Hospital information systems, along with information sharing, had the potential to improve clinical practice by reducing staff errors or incidents, improving automated harm detection, monitoring infections more effectively, and enhancing the continuity of care during physician handoffs. This review identified some benefits in the quality of care but did not provide evidence that the implementation of eHealth interventions had a measurable impact on cost-effectiveness in hospital settings. However, further evidence is needed to infer the impact of standards adoption or interoperability in cost benefits of health care; this in turn requires further research. ©Zilma Silveira Nogueira Reis, Thais Abreu Maia, Milena Soriano Marcolino, Francisco Becerra-Posada, David Novillo-Ortiz, Antonio Luiz Pinho Ribeiro. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 29.08.2017.

  14. 28 CFR 42.413 - Interagency cooperation and delegations.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 28 Judicial Administration 1 2014-07-01 2014-07-01 false Interagency cooperation and delegations. 42.413 Section 42.413 Judicial Administration DEPARTMENT OF JUSTICE NONDISCRIMINATION; EQUAL EMPLOYMENT OPPORTUNITY; POLICIES AND PROCEDURES Coordination of Enforcement of Non-discrimination in...

  15. 76 FR 36174 - Federal Interagency Committee on Emergency Medical Services; Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-21

    ... DEPARTMENT OF TRANSPORTATION National Highway Traffic Safety Administration [NHTSA Docket No... Highway Traffic Safety Administration (NHTSA), DOT. ACTION: Meeting Notice--Federal Interagency Committee... CONTACT: Drew Dawson, Director, Office of Emergency Medical Services, National Highway Traffic Safety...

  16. 3 CFR 13580 - Executive Order 13580 of July 12, 2011. Interagency Working Group on Coordination of Domestic...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ...: Section 1. Policy. Interagency coordination is important for the safe, responsible, and efficient... agency or office, from: (i) the Council on Environmental Quality; (ii) the Office of Science and...

  17. Interagency cooperation : FEMA and DOD in domestic support operations.

    DOT National Transportation Integrated Search

    1997-05-01

    This paper studies the interagency cooperation between DOD and FEMA, focusing specifically on the evolution of doctrine and procedures for responding to natural disasters. While both FEMA and DOD have improved in their ability to respond to disasters...

  18. WHO'S WHO III IN THE INTERAGENCY ENERGY/ENVIRONMENT R AND D PROGRAM

    EPA Science Inventory

    This directory provides a means of access to information on specific projects currently underway within the Interagency Program. The 14 major categories covered are: Characterization, measurement, and monitoring, Environmental transport processes, Health effects, Ecological effec...

  19. Connected Lighting System Interoperability Study Part 1: Application Programming Interfaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gaidon, Clement; Poplawski, Michael

This report, the first in a series of studies focusing on interoperability as realized by the use of Application Programming Interfaces (APIs), explores the diversity of such interfaces in several connected lighting systems; characterizes the extent of interoperability that they provide; and illustrates challenges, limitations, and tradeoffs that were encountered during this exploration.

  20. Smart Grid Interoperability Maturity Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Widergren, Steven E.; Levinson, Alex; Mater, J.

    2010-04-28

The integration of automation associated with electricity resources (including transmission and distribution automation and demand-side resources operated by end-users) is key to supporting greater efficiencies and incorporating variable renewable resources and electric vehicles into the power system. The integration problems faced by this community are analogous to those faced in the health industry, emergency services, and other complex communities with many stakeholders. To highlight this issue and encourage communication and the development of a smart grid interoperability community, the GridWise Architecture Council (GWAC) created an Interoperability Context-Setting Framework. This "conceptual model" has been helpful to explain the importance of organizational alignment in addition to technical and informational interface specifications for "smart grid" devices and systems. As a next step to building a community sensitive to interoperability, the GWAC is investigating an interoperability maturity model (IMM) based on work done by others to address similar circumstances. The objective is to create a tool or set of tools that encourages a culture of interoperability in this emerging community. The tools would measure status and progress, analyze gaps, and prioritize efforts to improve the situation.

  1. Enabling model checking for collaborative process analysis: from BPMN to `Network of Timed Automata'

    NASA Astrophysics Data System (ADS)

    Mallek, Sihem; Daclin, Nicolas; Chapurlat, Vincent; Vallespir, Bruno

    2015-04-01

Interoperability is a prerequisite for partners seeking to collaborate. As a consequence, the lack of interoperability is now considered a major obstacle. The research work presented in this paper aims to develop an approach that allows specifying and verifying a set of interoperability requirements to be satisfied by each partner in the collaborative process prior to process implementation. To enable the verification of these interoperability requirements, it is necessary first and foremost to generate a model of the targeted collaborative process; for this research effort, the standardised language BPMN 2.0 is used. Afterwards, a verification technique must be introduced, and model checking is the preferred option herein. This paper focuses on application of the model checker UPPAAL in order to verify interoperability requirements for the given collaborative process model. At first, this step entails translating the collaborative process model from BPMN into a UPPAAL modelling language called 'Network of Timed Automata'. Second, it becomes necessary to formalise interoperability requirements into properties with the dedicated UPPAAL language, i.e. the temporal logic TCTL.
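The core of what a model checker does with a TCTL query such as `E<> done` ("some state satisfying `done` is reachable") can be shown on a toy scale. The sketch below checks the untimed core of such a reachability property by breadth-first search over an invented collaborative-process automaton; clock constraints, which UPPAAL handles symbolically, are deliberately omitted, and all state and property names are hypothetical.

```python
# Toy illustration of the kind of check a model checker performs. A TCTL query
# "E<> done" asks whether the state 'done' is reachable; here we answer the
# untimed version by BFS over an invented transition system (clocks omitted).
from collections import deque

# Hypothetical collaborative-process automaton: state -> successor states.
TRANSITIONS = {
    "start": ["negotiate"],
    "negotiate": ["exchange", "abort"],
    "exchange": ["done"],
    "abort": [],
    "done": [],
}

def reachable(initial, goal, transitions):
    """E<> goal: does some path from `initial` reach `goal`?"""
    seen, queue = {initial}, deque([initial])
    while queue:
        state = queue.popleft()
        if state == goal:
            return True
        for nxt in transitions.get(state, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

print(reachable("start", "done", TRANSITIONS))   # query E<> done
print(reachable("abort", "done", TRANSITIONS))   # no path from 'abort'
```

Real timed-automata checking additionally tracks clock zones per state, which is why a dedicated tool like UPPAAL is used rather than a plain graph search.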

  2. Functional requirements document for NASA/MSFC Earth Science and Applications Division: Data and information system (ESAD-DIS). Interoperability, 1992

    NASA Technical Reports Server (NTRS)

    Stephens, J. Briscoe; Grider, Gary W.

    1992-01-01

These Earth Science and Applications Division-Data and Information System (ESAD-DIS) interoperability requirements are designed to quantify the Earth Science and Application Division's hardware and software requirements in terms of communications between personal computers, visualization workstations, and mainframe computers. The electronic mail requirements and local area network (LAN) requirements are addressed. These interoperability requirements are top-level requirements framed around defining the existing ESAD-DIS interoperability and projecting known near-term requirements for both operational support and for management planning. Detailed requirements will be submitted on a case-by-case basis. This document is also intended as an overview of ESAD-DIS interoperability for newcomers and management not familiar with these activities. It is intended as background documentation to support requests for resources and support requirements.

  3. An open repositories network development for medical teaching resources.

    PubMed

    Soula, Gérard; Darmoni, Stefan; Le Beux, Pierre; Renard, Jean-Marie; Dahamna, Badisse; Fieschi, Marius

    2010-01-01

    The lack of interoperability between repositories of heterogeneous and geographically widespread data is an obstacle to the diffusion, sharing and reutilization of those data. We present the development of an open repositories network taking into account both the syntactic and semantic interoperability of the different repositories and based on international standards in this field. The network is used by the medical community in France for the diffusion and sharing of digital teaching resources. The syntactic interoperability of the repositories is managed using the OAI-PMH protocol for the exchange of metadata describing the resources. Semantic interoperability is based, on one hand, on the LOM standard for the description of resources and on MESH for the indexing of the latter and, on the other hand, on semantic interoperability management designed to optimize compliance with standards and the quality of the metadata.
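The semantic-interoperability layer described above amounts to a crosswalk between metadata standards, e.g. exposing LOM-described teaching resources to OAI-PMH harvesters that expect Dublin Core. The sketch below shows that idea with a simplified, invented LOM-style record and mapping table; the field names are examples, not the full LOM or DC element sets.

```python
# Illustrative semantic-interoperability crosswalk: map a simplified, invented
# LOM-style record onto Dublin Core fields so oai_dc harvesters can consume it.

LOM_TO_DC = {          # assumed crosswalk table (example fields only)
    "general.title": "title",
    "general.language": "language",
    "lifecycle.contribute.author": "creator",
}

def crosswalk(lom_record):
    """Project a LOM-style record onto Dublin Core via the mapping above."""
    return {dc: lom_record[lom] for lom, dc in LOM_TO_DC.items()
            if lom in lom_record}

record = {
    "general.title": "Cours d'anatomie",
    "general.language": "fr",
    "lifecycle.contribute.author": "Dupont, J.",
}
print(crosswalk(record))
```

In the network described in the abstract, the same projection would be applied server-side so that each repository answers OAI-PMH requests consistently regardless of its internal metadata format.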

  4. IRIS TOXICOLOGICAL REVIEW AND SUMMARY ...

    EPA Pesticide Factsheets

EPA's assessment of the noncancer health effects and carcinogenic potential of Beryllium was added to the IRIS database in 1998. The IRIS program is updating the IRIS assessment for Beryllium. This update will incorporate health effects information published since the last assessment was prepared as well as new risk assessment methods. The IRIS assessment for Beryllium will consist of an updated Toxicological Review and IRIS Summary. The Toxicological Review is a critical review of the physicochemical and toxicokinetic properties of the chemical and its toxicity in humans and experimental systems. The assessment will present reference values for noncancer effects of Beryllium (RfD and RfC) and a cancer assessment. The Toxicological Review and IRIS Summary will be subject to internal peer consultation, Agency and Interagency review, and external scientific peer review. The final products will constitute the Agency's opinion on the toxicity of Beryllium. Beryllium is a light alkaline earth metal used in metal alloys and in high-performance products in the metallurgical, aerospace, and nuclear industries. According to the Superfund database, beryllium is found in over 300 NPL sites.

  5. Special Topic Interoperability and EHR: Combining openEHR, SNOMED, IHE, and Continua as approaches to interoperability on national eHealth.

    PubMed

    Beštek, Mate; Stanimirović, Dalibor

    2017-08-09

    The main aims of the paper are to characterize and examine potential approaches to interoperability: openEHR, SNOMED, IHE, and Continua as combined interoperability approaches, the possibilities for their incorporation into the eHealth environment, and the main success factors in the field that are necessary for achieving the required interoperability and, consequently, for the successful implementation of eHealth projects in general. The paper presents an in-depth analysis of the potential application of the openEHR, SNOMED, IHE, and Continua approaches in the development and implementation of eHealth in Slovenia. The research method is both exploratory and deductive in nature. The methodological framework is grounded in information retrieval, with a special focus on surveying existing experience in the field and on sources, both electronic and written, that cover interoperability concepts and related implementation issues. The paper addresses three complementary questions: 1. Which of the potential approaches could alleviate the pertinent interoperability issues in the Slovenian eHealth context? 2. What are the possibilities (and requirements) for including them in the construction of individual eHealth solutions? 3. Which success factors in the interoperability field critically influence the efficient development and implementation of eHealth projects? The insights provided and the success factors identified could serve as strategic starting points for the continuous integration of interoperability principles into the healthcare domain. Moreover, broad implementation of the identified success factors could facilitate better penetration of ICT into the healthcare environment and enable an eHealth-based transformation of the health system, especially in countries that are still in an early phase of eHealth planning and development and are often confronted with differing interests, requirements, and contending strategies.

  6. Ensuring Sustainable Data Interoperability Across the Natural and Social Sciences

    NASA Astrophysics Data System (ADS)

    Downs, R. R.; Chen, R. S.

    2015-12-01

    Both the natural and social science data communities are attempting to address the long-term sustainability of their data infrastructures in rapidly changing research, technological, and policy environments. Many parts of these communities are also considering how to improve the interoperability and integration of their data and systems across natural, social, health, and other domains. However, these efforts have generally been undertaken in parallel, with little thought about how different sustainability approaches may impact long-term interoperability from scientific, legal, or economic perspectives, or vice versa, i.e., how improved interoperability could enhance (or threaten) infrastructure sustainability. Scientific progress depends substantially on the ability to learn from the legacy of previous work available for current and future scientists to study, often by integrating disparate data not previously assembled. Digital data are less likely than scientific publications to be usable in the future unless they are managed by science-oriented repositories that can support long-term data access with the documentation and services needed for future interoperability. We summarize recent discussions in the social and natural science communities on emerging approaches to sustainability and relevant interoperability activities, including efforts by the Belmont Forum E-Infrastructures project to address global change data infrastructure needs; the Group on Earth Observations to further implement data sharing and improve data management across diverse societal benefit areas; and the Research Data Alliance to develop legal interoperability principles and guidelines and to address challenges faced by domain repositories. We also examine emerging needs for data interoperability in the context of the post-2015 development agenda and the expected set of Sustainable Development Goals (SDGs), which set ambitious targets for sustainable development, poverty reduction, and environmental stewardship by 2030. These efforts suggest the need for a holistic approach towards improving and implementing strategies, policies, and practices that will ensure long-term sustainability and interoperability of scientific data repositories and networks across multiple scientific domains.

  7. A Pragmatic Approach to Sustainable Interoperability for the Web 2.0 World

    NASA Astrophysics Data System (ADS)

    Wright, D. J.; Sankaran, S.

    2015-12-01

    In the geosciences, interoperability is a fundamental requirement. Members of standards organizations such as the OGC and ISO/TC 211 have done yeoman service in promoting a standards-centric approach to the interoperability challenges that organizations face today. Organizations adopting interoperability patterns face many specific challenges. One approach, mandating the use of specific standards, has been reasonably successful. But scientific communities, like all others, ultimately want their solutions to be widely accepted and used, and to this end there is a pressing need to explore all possible interoperability patterns without restricting the choices to mandated standards. Standards are created by a slow, deliberative process that can take a long time to come to fruition and can therefore fall short of user expectations. Organizations are thus left with a set of seemingly orthogonal requirements when they pursue interoperability: they want a robust but agile solution, a mature approach that also satisfies the latest technology trends, and so on. Sustainable interoperability patterns need to be forward looking and should adopt the patterns and paradigms of the Web 2.0 generation. To this end, the key is to choose platform technologies that embrace multiple interoperability mechanisms, are built on fundamentally "open" principles, and align with popular mainstream patterns. We explore data-, metadata-, and web-service-related interoperability patterns through the prism of building solutions that encourage strong implementer and end-user engagement, improved usability and scalability, and appealing developer frameworks that can grow the audience. The path to tread is not new; the geocommunity only needs to observe and align its end goals with current Web 2.0 patterns to realize all the benefits that today we take for granted in our everyday use of technology.
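
    One concrete instance of the open, mainstream pattern this abstract advocates is exposing geoscience observations as GeoJSON (RFC 7946), i.e., plain JSON over HTTP rather than a bespoke format. The sketch below is illustrative only; the station identifier and property names are hypothetical.

```python
# Minimal sketch: wrap a point observation as a GeoJSON Feature,
# the kind of Web-2.0-friendly payload a mainstream web client or
# JavaScript mapping library can consume directly.
import json

def to_geojson_feature(station_id, lon, lat, properties):
    """Return a GeoJSON Feature for a point observation.

    Per RFC 7946, coordinates are ordered [longitude, latitude].
    """
    return {
        "type": "Feature",
        "id": station_id,
        "geometry": {"type": "Point", "coordinates": [lon, lat]},
        "properties": properties,
    }

# Hypothetical station record, serialized for an HTTP response body.
feature = to_geojson_feature("buoy-42", -70.1, 42.3, {"sst_celsius": 17.2})
print(json.dumps(feature))
```

    The design choice here mirrors the abstract's argument: a standards-grounded format (GeoJSON is an IETF standard) that is also the de facto pattern of mainstream web development, so implementers need no special tooling to participate.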

  8. 50 CFR 402.43 - Interagency exchanges of information.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... ADMINISTRATION, DEPARTMENT OF COMMERCE); ENDANGERED SPECIES COMMITTEE REGULATIONS SUBCHAPTER A INTERAGENCY COOPERATION-ENDANGERED SPECIES ACT OF 1973, AS AMENDED Counterpart Regulations Governing Actions by the U.S. Environmental Protection Agency Under the Federal Insecticide, Fungicide and Rodenticide Act § 402.43...

  9. 50 CFR 402.43 - Interagency exchanges of information.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... ADMINISTRATION, DEPARTMENT OF COMMERCE); ENDANGERED SPECIES COMMITTEE REGULATIONS SUBCHAPTER A INTERAGENCY COOPERATION-ENDANGERED SPECIES ACT OF 1973, AS AMENDED Counterpart Regulations Governing Actions by the U.S. Environmental Protection Agency Under the Federal Insecticide, Fungicide and Rodenticide Act § 402.43...

  10. 50 CFR 402.43 - Interagency exchanges of information.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... ADMINISTRATION, DEPARTMENT OF COMMERCE); ENDANGERED SPECIES COMMITTEE REGULATIONS SUBCHAPTER A INTERAGENCY COOPERATION-ENDANGERED SPECIES ACT OF 1973, AS AMENDED Counterpart Regulations Governing Actions by the U.S. Environmental Protection Agency Under the Federal Insecticide, Fungicide and Rodenticide Act § 402.43...

  11. 50 CFR 402.43 - Interagency exchanges of information.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... ADMINISTRATION, DEPARTMENT OF COMMERCE); ENDANGERED SPECIES COMMITTEE REGULATIONS SUBCHAPTER A INTERAGENCY COOPERATION-ENDANGERED SPECIES ACT OF 1973, AS AMENDED Counterpart Regulations Governing Actions by the U.S. Environmental Protection Agency Under the Federal Insecticide, Fungicide and Rodenticide Act § 402.43...

  12. 50 CFR 402.43 - Interagency exchanges of information.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... ADMINISTRATION, DEPARTMENT OF COMMERCE); ENDANGERED SPECIES COMMITTEE REGULATIONS SUBCHAPTER A INTERAGENCY COOPERATION-ENDANGERED SPECIES ACT OF 1973, AS AMENDED Counterpart Regulations Governing Actions by the U.S. Environmental Protection Agency Under the Federal Insecticide, Fungicide and Rodenticide Act § 402.43...

  13. Memorandum of Understanding Regarding Interagency Coordination and Collaboration for the Protection of Tribal Treaty Rights

    EPA Pesticide Factsheets

    Interagency Memorandum of Understanding (MOU) affirming protection of tribal treaty rights and similar tribal rights relating to natural resources when federal action is taken. It will be updated as additional federal agencies become signatories.

  14. 75 FR 34201 - Meeting Notice-Federal Interagency Committee on Emergency Medical Services

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-16

    ... DEPARTMENT OF TRANSPORTATION National Highway Traffic Safety Administration [NHTSA Docket No...: National Highway Traffic Safety Administration (NHTSA), DOT. ACTION: Meeting Notice--Federal Interagency..., Director, Office of Emergency Medical Services, National Highway Traffic Safety Administration, 1200 New...

  15. (Docket A-93-02) Category II-F: Interagency Review Materials

    EPA Pesticide Factsheets

    This Index lists Interagency review materials related to the decision to certify that DOE had met the compliance criteria established by EPA in 40 CFR Part 194 and the disposal regulations set by EPA in 40 CFR Part 191.

  16. Notification: Preliminary Research: Review of Independent Government Cost Estimates and Indirect Costs for EPA’s Interagency Agreements

    EPA Pesticide Factsheets

    Project #OA-FY14-0130, February 11, 2014. The EPA OIG plans to begin preliminary research of the independent government cost estimates and indirect costs for the EPA's funds-in interagency agreements.

  17. 34 CFR 303.523 - Interagency agreements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... agency for paying for early intervention services (consistent with State law and the requirements of this... AND REHABILITATIVE SERVICES, DEPARTMENT OF EDUCATION EARLY INTERVENTION PROGRAM FOR INFANTS AND... interagency agreements with other State-level agencies involved in the State's early intervention program...

  18. 34 CFR 303.523 - Interagency agreements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... agency for paying for early intervention services (consistent with State law and the requirements of this... AND REHABILITATIVE SERVICES, DEPARTMENT OF EDUCATION EARLY INTERVENTION PROGRAM FOR INFANTS AND... interagency agreements with other State-level agencies involved in the State's early intervention program...

  19. 75 FR 68612 - National Institute of Mental Health; Notice of Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-08

    ... Autism Coordinating Committee (IACC). The Interagency Autism Coordinating Committee (IACC) Services...: Interagency Autism Coordinating Committee (IACC) Type of meeting: Services Subcommittee. Date: November 29..., Access code: 1427016. Contact Person: Ms. Lina Perez, Office of Autism Research Coordination, National...

  20. 5 CFR 720.307 - Interagency report clearance.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 5 Administrative Personnel 2 2012-01-01 2012-01-01 false Interagency report clearance. 720.307 Section 720.307 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED) CIVIL SERVICE REGULATIONS (CONTINUED) AFFIRMATIVE EMPLOYMENT PROGRAMS Disabled Veterans Affirmative Action Program § 720.307...
