Sample records for message exchange interoperability

  1. Warfighter IT Interoperability Standards Study

    DTIC Science & Technology

    2012-07-22

    data (e.g. messages) between systems? (ii) What process did you use to validate and certify semantic interoperability between your...other systems? At this time there was no requirement to validate and certify semantic interoperability. The DLS program exchanges data with...semantics. Testing for System Compliance with Data Models; Verify and Certify Interoperability Using Data

  2. PACS/information systems interoperability using Enterprise Communication Framework.

    PubMed

    alSafadi, Y; Lord, W P; Mankovich, N J

    1998-06-01

    Interoperability among healthcare applications goes beyond connectivity to allow components to exchange structured information and work together in a predictable, coordinated fashion. To facilitate building an interoperability infrastructure, an Enterprise Communication Framework (ECF) was developed by the members of the Andover Working Group for Healthcare Interoperability (AWG-OHI). The ECF consists of four models: 1) Use Case Model, 2) Domain Information Model (DIM), 3) Interaction Model, and 4) Message Model. To realize this framework, a software component called the Enterprise Communicator (EC) is used. In this paper, we will demonstrate the use of the framework in interoperating a picture archiving and communication system (PACS) with a radiology information system (RIS).

  3. Comparison and Analysis of ISO/IEEE 11073, IHE PCD-01, and HL7 FHIR Messages for Personal Health Devices

    PubMed Central

    Do, Hyoungho

    2018-01-01

    Objectives: Increasing use of medical devices outside of healthcare facilities inevitably requires connectivity and interoperability between medical devices and healthcare information systems. To this end, standards have been developed and used to provide interoperability between personal health devices (PHDs) and external systems. ISO/IEEE 11073 standards and IHE PCD-01 standard messages have been used the most in the exchange of observation data from health devices. Recently, transmission of observation data using the HL7 FHIR standard has been devised under the name DoF (Devices on FHIR) and adopted rapidly. We compare and analyze these standards and suggest which standard will work best in different environments of device usage. Methods: We generated each message/resource of the three standards for vital signs observed from a blood pressure monitor and a thermometer. Then, the size, the contents, and the exchange processes of these messages were compared and analyzed. Results: The ISO/IEEE 11073 standard message has the smallest data size, but it cannot carry a key piece of information: patient information. On the other hand, PCD-01 messages and FHIR standards have fields for patient information. HL7 DoF standards support reuse of information units known as resources, and it is relatively easy to parse DoF messages since they use the widely known XML and JSON formats. Conclusions: ISO/IEEE 11073 standards are suitable for devices with very small computing power. IHE PCD-01 and HL7 DoF messages can be used for devices that need to be connected to hospital information systems that require patient information. When information reuse is frequent, DoF is advantageous over PCD-01. PMID:29503752

  4. Comparison and Analysis of ISO/IEEE 11073, IHE PCD-01, and HL7 FHIR Messages for Personal Health Devices.

    PubMed

    Lee, Sungkee; Do, Hyoungho

    2018-01-01

    Increasing use of medical devices outside of healthcare facilities inevitably requires connectivity and interoperability between medical devices and healthcare information systems. To this end, standards have been developed and used to provide interoperability between personal health devices (PHDs) and external systems. ISO/IEEE 11073 standards and IHE PCD-01 standard messages have been used the most in the exchange of observation data from health devices. Recently, transmission of observation data using the HL7 FHIR standard has been devised under the name DoF (Devices on FHIR) and adopted rapidly. We compare and analyze these standards and suggest which standard will work best in different environments of device usage. We generated each message/resource of the three standards for vital signs observed from a blood pressure monitor and a thermometer. Then, the size, the contents, and the exchange processes of these messages were compared and analyzed. The ISO/IEEE 11073 standard message has the smallest data size, but it cannot carry a key piece of information: patient information. On the other hand, PCD-01 messages and FHIR standards have fields for patient information. HL7 DoF standards support reuse of information units known as resources, and it is relatively easy to parse DoF messages since they use the widely known XML and JSON formats. ISO/IEEE 11073 standards are suitable for devices with very small computing power. IHE PCD-01 and HL7 DoF messages can be used for devices that need to be connected to hospital information systems that require patient information. When information reuse is frequent, DoF is advantageous over PCD-01.
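The size contrast the abstracts describe can be sketched by encoding the same blood-pressure reading both ways; the segment layout, codes, and patient reference below are illustrative assumptions, not data from the papers.

```python
import json

# IHE PCD-01 (HL7 v2 ORU^R01) carries an observation as a pipe-delimited
# OBX segment; patient identity travels separately in a PID segment.
obx = "OBX|1|NM|150021^MDC_PRESS_BLD_NONINV_SYS^MDC||120|266016^MDC_DIM_MMHG^MDC|||||F"

# HL7 FHIR (Devices on FHIR) models the same systolic reading as an
# Observation resource in JSON, with an inline patient reference.
observation = {
    "resourceType": "Observation",
    "status": "final",
    "code": {"coding": [{"system": "urn:iso:std:iso:11073:10101",
                         "code": "150021",
                         "display": "MDC_PRESS_BLD_NONINV_SYS"}]},
    "subject": {"reference": "Patient/example"},
    "valueQuantity": {"value": 120, "unit": "mmHg"},
}
fhir_json = json.dumps(observation)

# The FHIR resource is larger on the wire, but self-describing,
# patient-aware, and parsable with stock JSON tooling.
size_ratio = len(fhir_json) / len(obx)
```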

  5. Scalable and Resilient Middleware to Handle Information Exchange during Environment Crisis

    NASA Astrophysics Data System (ADS)

    Tao, R.; Poslad, S.; Moßgraber, J.; Middleton, S.; Hammitzsch, M.

    2012-04-01

    The EU FP7 TRIDEC project focuses on enabling real-time, intelligent information management of collaborative, complex, critical decision processes for earth management. A key challenge is to provide a communication infrastructure that facilitates interoperable environment information services during environmental events and crises, such as tsunamis and drilling incidents, during which increasing volumes and dimensionality of disparate information sources, including sensor-based and human-based ones, arise and need to be managed. Such a system needs to support: scalable, distributed messaging; asynchronous messaging; open messaging that handles changing clients, such as new and retired automated-system and human information sources coming online or going offline; flexible data filtering; and heterogeneous access networks (e.g., GSM, WLAN and LAN). In addition, the system needs to be resilient to ICT system failures, e.g. faults, degradation and overloads, during environmental events. There are several middleware choices for TRIDEC, based upon a Service-Oriented Architecture (SOA), Event-Driven Architecture (EDA), Cloud Computing, and an Enterprise Service Bus (ESB). In an SOA, everything is a service (e.g. data access, processing and exchange); clients can request services on demand or subscribe to services registered by providers; interaction is most often synchronous. In an EDA system, events that represent significant changes in state can be processed individually, as streams, or in complex combinations. Cloud computing is a virtualized, interoperable, and elastic resource-allocation model. An ESB, a fundamental component for enterprise messaging, supports synchronous and asynchronous message exchange models and has built-in resilience against ICT failure.
    Our middleware proposal is an ESB-based hybrid architecture model: an SOA extension supports more synchronous workflows; EDA assists the ESB in handling more complex event processing; and cloud computing can be used to increase and decrease the ESB resources on demand. To reify this hybrid, ESB-centric architecture, we will adopt two complementary approaches: an open-source one to improve scalability and resilience, and a commercial one for ultra-fast messaging, with a bridge between the two to support interoperability. In TRIDEC, to manage such a hybrid messaging system, overlay and underlay management techniques will be adopted. The managers (both global and local) will collect, store and update status information (e.g. CPU utilization, free space, number of clients) and balance usage, throughput, and delays to improve resilience and scalability. The expected resilience improvements include dynamic failover, self-healing, pre-emptive load balancing, and bottleneck prediction, while the expected scalability improvements include capacity estimation, an HTTP bridge, and automatic configuration and reconfiguration (e.g. adding or deleting clients and servers).
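The scalable, asynchronous, open messaging this middleware calls for rests on the publish/subscribe pattern that an ESB generalises. A minimal in-memory sketch follows; the class, topic, and message shapes are invented for illustration and are not TRIDEC's API.

```python
from collections import defaultdict

class Broker:
    """Toy topic-based broker: producers and consumers never address
    each other directly, so clients can come and go freely."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def unsubscribe(self, topic, callback):
        # Open messaging: clients may go offline at any time.
        self._subscribers[topic].remove(callback)

    def publish(self, topic, message):
        # Delivery fans out to every current subscriber of the topic.
        for callback in list(self._subscribers[topic]):
            callback(message)

received = []
broker = Broker()
broker.subscribe("tsunami.warnings", received.append)
broker.publish("tsunami.warnings", {"level": "watch", "region": "NE Atlantic"})
```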

  6. Lemnos interoperable security project.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Halbgewachs, Ronald D.

    2010-03-01

    With the Lemnos framework, interoperability of control security equipment is straightforward. To obtain interoperability between proprietary security appliance units today, one or both vendors must write cumbersome 'translation code.' If one party changes something, the translation code 'breaks.' The Lemnos project is developing and testing a framework that uses widely available open-source security functions and protocols, such as IPsec - to form a secure communications channel - and Syslog, to exchange security log messages. Using this model, security appliances from two or more different vendors can clearly and securely exchange information, helping to better protect the total system. The framework can also simplify regulatory compliance in a complicated security environment. As an electric utility, are you struggling to implement the NERC CIP standards and other regulations? Are you weighing the misery of multiple management interfaces against committing to a ubiquitous single-vendor solution? When vendors build their security appliances to interoperate using the Lemnos framework, it becomes practical to match best-of-breed offerings from an assortment of vendors to your specific control systems needs.

  7. An Architecture for Semantically Interoperable Electronic Health Records.

    PubMed

    Toffanello, André; Gonçalves, Ricardo; Kitajima, Adriana; Puttini, Ricardo; Aguiar, Atualpa

    2017-01-01

    Despite the increasing adoption of electronic health records, the challenge of semantic interoperability remains unsolved. The fact that different parties can exchange messages does not mean they can understand the underlying clinical meaning; semantic interoperability therefore cannot be assumed and must be treated as an explicit requirement. This work introduces an architecture designed to achieve semantic interoperability in a way in which organizations that follow different policies may still share medical information through a common infrastructure comparable to an ecosystem, whose organisms are exemplified within the Brazilian scenario. Moreover, the proposed approach describes a service-oriented design with modules adaptable to different contexts. We also discuss the establishment of an enterprise service bus to mediate a health infrastructure defined on top of international standards, such as openEHR and IHE. Finally, we argue that, in order to achieve truly semantic interoperability in a wide sense, a proper profile must be published and maintained.

  8. Information Interaction Study for DER and DMS Interoperability

    NASA Astrophysics Data System (ADS)

    Liu, Haitao; Lu, Yiming; Lv, Guangxian; Liu, Peng; Chen, Yu; Zhang, Xinhui

    The Common Information Model (CIM) is an abstract data model that can be used to represent the major objects in Distribution Management System (DMS) applications. Because the CIM does not model Distributed Energy Resources (DERs), it cannot meet the DER operation and management requirements of advanced DMS applications. Studying the modelling of DERs from a system point of view, this article first proposes a CIM-extended information model. Then, by analyzing the basic structure of the message interaction between the DMS and DERs, it proposes a bidirectional message-mapping method based on data exchange.
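The extension idea can be sketched as subclassing a CIM-style base class so DER equipment sits alongside the existing model. All class and attribute names below are invented for illustration; they are not part of the IEC CIM or the article's actual model.

```python
from dataclasses import dataclass

# Stand-in for a CIM base class such as a generic power system resource.
@dataclass
class PowerSystemResource:
    mRID: str   # CIM-style master resource identifier
    name: str

# Hypothetical extension: a DER unit added without touching the base model,
# so existing DMS applications that know only the base class keep working.
@dataclass
class DERUnit(PowerSystemResource):
    ratedPowerKW: float
    mode: str = "grid-following"

der = DERUnit(mRID="der-001", name="Rooftop PV", ratedPowerKW=5.0)
```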

  9. Electronic health record - public health (EHR-PH) system prototype for interoperability in 21st century healthcare systems.

    PubMed

    Orlova, Anna O; Dunnagan, Mark; Finitzo, Terese; Higgins, Michael; Watkins, Todd; Tien, Allen; Beales, Steven

    2005-01-01

    Information exchange, enabled by computable interoperability, is the key to many of the initiatives underway, including the development of Regional Health Information Exchanges, Regional Health Information Organizations, and the National Health Information Network. These initiatives must include public health as a full partner in the emerging transformation of our nation's healthcare system through the adoption and use of information technology. An electronic health record - public health (EHR-PH) system prototype was developed to demonstrate the feasibility of electronic data transfer from a health care provider, i.e. hospital or ambulatory care settings, to multiple customized public health systems, which include a Newborn Metabolic Screening Registry, a Newborn Hearing Screening Registry, an Immunization Registry and a Communicable Disease Registry, using HL7 messaging standards. Our EHR-PH system prototype can be considered a distributed EHR-based RHIE/RHIO model - a principal element of a potential technical architecture for an NHIN.

  10. HTML5 microdata as a semantic container for medical information exchange.

    PubMed

    Kimura, Eizen; Kobayashi, Shinji; Ishihara, Ken

    2014-01-01

    Achieving interoperability between clinical electronic medical records (EMR) systems and cloud computing systems is challenging because of the lack of a universal reference method as a standard for information exchange with a secure connection. Here we describe an information exchange scheme using HTML5 microdata, where the standard semantic container is an HTML document. We embed HL7 messages describing laboratory test results in the microdata. We also annotate items in the clinical research report with the microdata. We mapped the laboratory test result data into the clinical research report using an HL7 selector specified in the microdata. This scheme can provide secure cooperation between the cloud-based service and the EMR system.
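The container scheme described can be sketched by annotating an embedded HL7 v2 payload with HTML5 microdata attributes. The vocabulary URL, property names, and message content below are illustrative assumptions, not the paper's actual schema.

```python
from html import escape

# An HL7 v2 laboratory-result message; its reserved characters (&, <)
# must be HTML-escaped before embedding in the document.
hl7_message = ("MSH|^~\\&|LAB|HOSP|EMR|HOSP|20140101||ORU^R01|00001|P|2.5\r"
               "OBX|1|NM|GLU^Glucose||95|mg/dL||||||F")

# The HTML document acts as the semantic container: itemscope/itemtype
# declare the item, itemprop annotates each piece of the payload.
html = (
    '<div itemscope itemtype="http://example.org/hl7v2/ORU_R01">\n'
    '  <meta itemprop="messageType" content="ORU^R01">\n'
    f'  <pre itemprop="payload">{escape(hl7_message)}</pre>\n'
    '</div>'
)
```

A receiving system can then locate the payload by its `itemprop` selector rather than by position in the page.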

  11. Using a logical information model-driven design process in healthcare.

    PubMed

    Cheong, Yu Chye; Bird, Linda; Tun, Nwe Ni; Brooks, Colleen

    2011-01-01

    A hybrid standards-based approach has been adopted in Singapore to develop a Logical Information Model (LIM) for healthcare information exchange. The Singapore LIM uses a combination of international standards, including ISO13606-1 (a reference model for electronic health record communication), ISO21090 (healthcare datatypes), SNOMED CT (healthcare terminology) and HL7 v2 (healthcare messaging). This logic-based design approach also incorporates mechanisms for achieving bi-directional semantic interoperability.

  12. Designing for scale: optimising the health information system architecture for mobile maternal health messaging in South Africa (MomConnect)

    PubMed Central

    Seebregts, Christopher; Dane, Pierre; Parsons, Annie Neo; Fogwill, Thomas; Rogers, Debbie; Bekker, Marcha; Shaw, Vincent; Barron, Peter

    2018-01-01

    MomConnect is a national initiative coordinated by the South African National Department of Health that sends text-based mobile phone messages free of charge to pregnant women who voluntarily register at any public healthcare facility in South Africa. We describe the system design and architecture of the MomConnect technical platform, planned as a nationally scalable and extensible initiative. It uses a health information exchange that can connect any standards-compliant electronic front-end application to any standards-compliant electronic back-end database. The implementation of the MomConnect technical platform, in turn, is a national reference application for electronic interoperability in line with the South African National Health Normative Standards Framework. The use of open content and messaging standards enables the architecture to include any application adhering to the selected standards. Its national implementation at scale demonstrates both the use of this technology and a key objective of global health information systems, which is to achieve implementation scale. Initially, the system’s limited clinical information allowed the architecture to focus on the base standards and profiles for interoperability in a resource-constrained environment with limited connectivity and infrastructural capacity. Maintenance of the system requires mobilisation of national resources. Future work aims to use the standard interfaces to include data from additional applications as well as to extend and interface the framework with other public health information systems in South Africa. The development of this platform has also shown the benefits of interoperability at both an organisational and technical level in South Africa. PMID:29713506

  13. Designing for scale: optimising the health information system architecture for mobile maternal health messaging in South Africa (MomConnect).

    PubMed

    Seebregts, Christopher; Dane, Pierre; Parsons, Annie Neo; Fogwill, Thomas; Rogers, Debbie; Bekker, Marcha; Shaw, Vincent; Barron, Peter

    2018-01-01

    MomConnect is a national initiative coordinated by the South African National Department of Health that sends text-based mobile phone messages free of charge to pregnant women who voluntarily register at any public healthcare facility in South Africa. We describe the system design and architecture of the MomConnect technical platform, planned as a nationally scalable and extensible initiative. It uses a health information exchange that can connect any standards-compliant electronic front-end application to any standards-compliant electronic back-end database. The implementation of the MomConnect technical platform, in turn, is a national reference application for electronic interoperability in line with the South African National Health Normative Standards Framework. The use of open content and messaging standards enables the architecture to include any application adhering to the selected standards. Its national implementation at scale demonstrates both the use of this technology and a key objective of global health information systems, which is to achieve implementation scale. Initially, the system's limited clinical information allowed the architecture to focus on the base standards and profiles for interoperability in a resource-constrained environment with limited connectivity and infrastructural capacity. Maintenance of the system requires mobilisation of national resources. Future work aims to use the standard interfaces to include data from additional applications as well as to extend and interface the framework with other public health information systems in South Africa. The development of this platform has also shown the benefits of interoperability at both an organisational and technical level in South Africa.

  14. An Electronic Health Record - Public Health (EHR-PH) System Prototype for Interoperability in 21st Century Healthcare Systems

    PubMed Central

    Orlova, Anna O.; Dunnagan, Mark; Finitzo, Terese; Higgins, Michael; Watkins, Todd; Tien, Allen; Beales, Steven

    2005-01-01

    Information exchange, enabled by computable interoperability, is the key to many of the initiatives underway, including the development of Regional Health Information Exchanges, Regional Health Information Organizations, and the National Health Information Network. These initiatives must include public health as a full partner in the emerging transformation of our nation’s healthcare system through the adoption and use of information technology. An electronic health record - public health (EHR-PH) system prototype was developed to demonstrate the feasibility of electronic data transfer from a health care provider, i.e. hospital or ambulatory care settings, to multiple customized public health systems, which include a Newborn Metabolic Screening Registry, a Newborn Hearing Screening Registry, an Immunization Registry and a Communicable Disease Registry, using HL7 messaging standards. Our EHR-PH system prototype can be considered a distributed EHR-based RHIE/RHIO model - a principal element of a potential technical architecture for an NHIN. PMID:16779105

  15. Simplifying HL7 Version 3 messages.

    PubMed

    Worden, Robert; Scott, Philip

    2011-01-01

    HL7 Version 3 offers a semantically robust method for healthcare interoperability but has been criticized as overly complex to implement. This paper reviews initiatives to simplify HL7 Version 3 messaging and presents a novel approach based on semantic mapping. Based on user-defined definitions, precise transforms between simple and full messages are automatically generated. Systems can be interfaced with the simple messages and achieve interoperability with full Version 3 messages through the transforms. This reduces the costs of HL7 interfacing and will encourage better uptake of HL7 Version 3 and CDA.
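The mapping-driven approach can be sketched with a user-defined path mapping from which both directions of the transform are generated automatically. The field names and paths below are invented for illustration; they are not actual HL7 Version 3 structures.

```python
# User-defined mapping: each simple field name points to its path in the
# nested "full" message (paths here are hypothetical, not real V3 RIM paths).
mapping = {
    "patient_id": ("recordTarget", "patientRole", "id"),
    "given_name": ("recordTarget", "patientRole", "patient", "given"),
}

def simple_to_full(simple):
    """Generated transform: flat simple message -> nested full message."""
    full = {}
    for key, path in mapping.items():
        node = full
        for part in path[:-1]:
            node = node.setdefault(part, {})
        node[path[-1]] = simple[key]
    return full

def full_to_simple(full):
    """Generated inverse transform: nested full message -> flat message."""
    simple = {}
    for key, path in mapping.items():
        node = full
        for part in path:
            node = node[part]
        simple[key] = node
    return simple

full = simple_to_full({"patient_id": "12345", "given_name": "Ada"})
```

Because both transforms come from the same mapping, they are exact inverses, so systems interfacing against the simple form lose nothing.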

  16. A Collaboration-Oriented M2M Messaging Mechanism for the Collaborative Automation between Machines in Future Industrial Networks

    PubMed Central

    Gray, John

    2017-01-01

    Machine-to-machine (M2M) communication is a key enabling technology for industrial internet of things (IIoT)-empowered industrial networks, where machines communicate with one another for collaborative automation and intelligent optimisation. This new industrial computing paradigm features high-quality connectivity, ubiquitous messaging, and interoperable interactions between machines. However, manufacturing IIoT applications have specificities that distinguish them from many other internet of things (IoT) scenarios in machine communications. By highlighting the key requirements and the major technical gaps of M2M in industrial applications, this article describes a collaboration-oriented M2M (CoM2M) messaging mechanism focusing on flexible connectivity and discovery, ubiquitous messaging, and semantic interoperability, well suited to the production-line-scale interoperability of manufacturing applications. The designs toward machine collaboration and data interoperability at both the communication and semantic levels are presented. Then, the application scenarios of the presented methods are illustrated with a proof-of-concept implementation in the PicknPack food packaging line. Finally, the advantages and some potential issues are discussed based on the PicknPack practice. PMID:29165347
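Flexible discovery of the kind such M2M messaging relies on is often built on MQTT-style topic wildcards, which let a machine subscribe to a whole equipment family at once. The matcher and topic scheme below are an invented illustration, not the CoM2M design.

```python
def topic_matches(pattern, topic):
    """MQTT-style match: '+' matches exactly one level, '#' matches the rest."""
    p_parts, t_parts = pattern.split("/"), topic.split("/")
    for i, p in enumerate(p_parts):
        if p == "#":                      # multi-level wildcard: match everything below
            return True
        if i >= len(t_parts):             # pattern longer than topic
            return False
        if p != "+" and p != t_parts[i]:  # literal level must match exactly
            return False
    return len(p_parts) == len(t_parts)

# A machine interested in every status update on the packaging line:
subscription = "line/packaging/+/status"
```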

  17. Profiling Fast Healthcare Interoperability Resources (FHIR) of Family Health History based on the Clinical Element Models.

    PubMed

    Lee, Jaehoon; Hulse, Nathan C; Wood, Grant M; Oniki, Thomas A; Huff, Stanley M

    2016-01-01

    In this study we developed a Fast Healthcare Interoperability Resources (FHIR) profile to support exchanging a full pedigree based family health history (FHH) information across multiple systems and applications used by clinicians, patients, and researchers. We used previously developed clinical element models (CEMs) that are capable of representing the FHH information, and derived essential data elements including attributes, constraints, and value sets. We analyzed gaps between the FHH CEM elements and existing FHIR resources. Based on the analysis, we developed a profile that consists of 1) FHIR resources for essential FHH data elements, 2) extensions for additional elements that were not covered by the resources, and 3) a structured definition to integrate patient and family member information in a FHIR message. We implemented the profile using an open-source based FHIR framework and validated it using patient-entered FHH data that was captured through a locally developed FHH tool.
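The third element of the profile, integrating patient and family-member information in a single FHIR message, can be sketched as a Bundle. The resource contents below are a hedged illustration under standard FHIR resource names, not the published profile.

```python
# A Bundle collects the Patient resource and a FamilyMemberHistory
# resource that references it, so one message carries the pedigree link.
bundle = {
    "resourceType": "Bundle",
    "type": "collection",
    "entry": [
        {"resource": {"resourceType": "Patient", "id": "pat1"}},
        {"resource": {
            "resourceType": "FamilyMemberHistory",
            "status": "completed",
            "patient": {"reference": "Patient/pat1"},  # ties history to the patient
            "relationship": {"text": "mother"},
            "condition": [{"code": {"text": "type 2 diabetes"}}],
        }},
    ],
}
```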

  18. Robotics Systems Joint Project Office (RSJPO) Interoperability Profiles (IOPS) 101

    DTIC Science & Technology

    2012-07-01

    interoperability, although they are supported by some interoperability attributes. For example, stair climbing is not something that the IOPs need to specify; however, the mobility- and actuation-related interoperable messages can be used to provide stair climbing. Interoperability can also enable management of different poses or modes, one of which may be stair climbing.

  19. Implementing standards for the interoperability among healthcare providers in the public regionalized Healthcare Information System of the Lombardy Region.

    PubMed

    Barbarito, Fulvio; Pinciroli, Francesco; Mason, John; Marceglia, Sara; Mazzola, Luca; Bonacina, Stefano

    2012-08-01

    Information technologies (ITs) have now entered the everyday workflow of a variety of healthcare providers with a certain degree of independence. This independence may cause difficulties in interoperability between information systems, which can be overcome through the implementation and adoption of standards. Here we present the case of the Lombardy Region, in Italy, which has been able, over the last 10 years, to set up the Regional Social and Healthcare Information System, connecting all the healthcare providers within the region and providing full access to clinical and health-related documents independently of the healthcare organization that generated the document itself. This goal, in a region with almost 10 million citizens, was achieved through a twofold approach: first, a political and operative push towards the adoption of the Health Level 7 (HL7) standard within single hospitals and, second, the provision of a technological infrastructure for data sharing based on interoperability specifications recognized at the regional level for messages transmitted from healthcare providers to the central domain. The adoption of these regional interoperability specifications enabled communication among heterogeneous systems placed in different hospitals in Lombardy. Integrating the Healthcare Enterprise (IHE) integration profiles, which refer to HL7 standards, are adopted within hospitals for message exchange and for the definition of integration scenarios. The IHE patient administration management (PAM) profile, with its different workflows, is adopted for patient management, whereas the Scheduled Workflow (SWF), the Laboratory Testing Workflow (LTW), and the Ambulatory Testing Workflow (ATW) are adopted for order management. At present, the system manages 4,700,000 pharmacological e-prescriptions and 1,700,000 e-prescriptions for laboratory exams per month.
It produces, monthly, 490,000 laboratory medical reports, 180,000 radiology medical reports, 180,000 first aid medical reports, and 58,000 discharge summaries. Hence, although work is still in progress, the Lombardy Region healthcare system is a fully interoperable social healthcare system connecting patients, healthcare providers, healthcare organizations, and healthcare professionals in a large and heterogeneous territory through the implementation of international health standards. Copyright © 2012 Elsevier Inc. All rights reserved.
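The IHE PAM profile mentioned above exchanges HL7 v2 ADT messages for patient management. A sketch of an admit message (ADT^A01) built from pipe-delimited segments; all identifiers, names, and facility codes are invented.

```python
# Each HL7 v2 segment is one pipe-delimited line; segments are joined
# with carriage returns to form the message.
segments = [
    # MSH: message header (sending/receiving application and facility).
    "MSH|^~\\&|ADT_APP|HOSP_MI|REG_DOMAIN|LOMBARDY|20120801120000||ADT^A01^ADT_A01|MSG0001|P|2.5",
    # EVN: the trigger event (A01 = admit/visit notification).
    "EVN|A01|20120801120000",
    # PID: patient identification.
    "PID|1||123456^^^HOSP_MI^MR||Rossi^Mario||19700101|M",
    # PV1: patient visit (inpatient, assigned location).
    "PV1|1|I|WARD1^101^A",
]
adt_message = "\r".join(segments)
```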

  20. A hazard-independent approach for the standardised multi-channel dissemination of warning messages

    NASA Astrophysics Data System (ADS)

    Esbri Palomares, M. A.; Hammitzsch, M.; Lendholt, M.

    2012-04-01

    The tsunami disaster affecting the Indian Ocean region at Christmas 2004 demonstrated very clearly the shortcomings in tsunami detection and public warning processes, as well as in intergovernmental warning message exchange, in the Indian Ocean region. In that regard, early warning systems require that the dissemination of early warning messages be executed in a way that ensures that message delivery is timely and the message content is understandable, usable and accurate. To that end, diverse and multiple dissemination channels must be used to increase the chance of the messages reaching all affected persons in a hazard scenario. In addition, the use of internationally accepted standards for warning dissemination, such as the Common Alerting Protocol (CAP) and the Emergency Data Exchange Language (EDXL) Distribution Element specified by the Organization for the Advancement of Structured Information Standards (OASIS), increases the interoperability among different warning systems, thus enabling the concept of system-of-systems proposed by GEOSS. The project Distant Early Warning System (DEWS), co-funded by the European Commission under the 6th Framework Programme, aims at strengthening early warning capacities by building an innovative generation of interoperable tsunami early warning systems based on the above-mentioned concepts, following a Service-Oriented Architecture (SOA) approach. The project focuses on the downstream part of the hazard information processing, where customized, user-tailored warning messages and alerts flow from the warning centre to the responsible authorities and/or the public with their different needs and responsibilities. The information logistics services within DEWS generate tailored EDXL-DE/CAP warning messages for each user that must receive the message, according to their preferences, e.g., settings for language, areas of interest, dissemination channels, etc.
However, the significant differences in the implementation and capabilities of different dissemination channels, such as SMS, email and television, have a bearing on the information processing required for delivery and consumption of a DEWS EDXL-DE/CAP message over each dissemination channel. These messages may include additional information in the form of maps, graphs, documents, sensor observations, etc. Therefore, the generated messages are pre-processed by channel adaptors in the information dissemination services, converting them into a format that is suitable for end-to-end delivery over the dissemination channels without any semantic distortion. The approach followed by DEWS for disseminating warnings not only relies on the traditional communication channels used by already established early warning systems, such as the delivery of faxes and phone calls, but also takes into consideration other broadly used communication channels, such as SMS, email, narrowcast and broadcast television, instant messaging, Voice over IP, and radio. It also takes advantage of social media channels like RSS feeds, Facebook, Twitter, etc., enabling a multiplier effect, as in the case of radio and television, and thus making it possible to create mash-ups by aggregating other sources of information onto the original message. Finally, status information is also important in order to assess and understand whether the process of disseminating the warning to the message consumers has been completed successfully or has failed at some point in the dissemination chain. To that end, CAP-based messages generated within the information dissemination services provide the semantics for those fields that are of interest within the context of reporting the warning dissemination status in DEWS.
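A CAP alert of the kind DEWS disseminates can be sketched with a few of the standard's elements; the identifier, sender, and event values below are invented, and only a subset of a full CAP 1.2 message is shown.

```python
import xml.etree.ElementTree as ET

# CAP 1.2 namespace, registered as the default namespace for serialization.
CAP_NS = "urn:oasis:names:tc:emergency:cap:1.2"
ET.register_namespace("", CAP_NS)

alert = ET.Element(f"{{{CAP_NS}}}alert")
# Top-level elements identifying and scoping the alert.
for tag, text in [("identifier", "DEWS-2012-0001"),
                  ("sender", "warning-centre@example.org"),
                  ("sent", "2012-04-01T12:00:00+00:00"),
                  ("status", "Exercise"),
                  ("msgType", "Alert"),
                  ("scope", "Public")]:
    ET.SubElement(alert, f"{{{CAP_NS}}}{tag}").text = text

# One <info> block describes the event; channel adaptors would render
# this same content as SMS text, a TV crawl, a fax, and so on.
info = ET.SubElement(alert, f"{{{CAP_NS}}}info")
ET.SubElement(info, f"{{{CAP_NS}}}event").text = "Tsunami"
ET.SubElement(info, f"{{{CAP_NS}}}urgency").text = "Immediate"

cap_xml = ET.tostring(alert, encoding="unicode")
```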

  1. E-health and healthcare enterprise information system leveraging service-oriented architecture.

    PubMed

    Hsieh, Sung-Huai; Hsieh, Sheau-Ling; Cheng, Po-Hsun; Lai, Feipei

    2012-04-01

    To present the successful experiences of an integrated, collaborative, distributed, large-scale enterprise healthcare information system over a wired and wireless infrastructure in National Taiwan University Hospital (NTUH). In order to transfer smoothly and sequentially from the complex relations among the old (legacy) systems to the new-generation enterprise healthcare information system, we adopted a multitier framework based on service-oriented architecture to integrate the heterogeneous systems as well as to interoperate among many other components and multiple databases. We also present mechanisms for a logical-layer reusability approach and the data (message) exchange flow via Health Level 7 (HL7) middleware, the DICOM standard, and the Integrating the Healthcare Enterprise workflow. The architecture and protocols of the NTUH enterprise healthcare information system, especially the Inpatient Information System (IIS), are discussed in detail. The NTUH Inpatient Healthcare Information System is designed and deployed on service-oriented architecture middleware frameworks. The mechanisms of integration and interoperability among the components and the multiple databases apply the HL7 standards for data exchanges, which are embedded in XML formats, and Microsoft .NET Web services to integrate heterogeneous platforms. The preliminary performance of the currently operational IIS is evaluated and analyzed to verify the efficiency and effectiveness of the designed architecture; it shows reliability and robustness in the highly demanding traffic environment of NTUH. The newly developed NTUH IIS provides an open and flexible environment, not only to share medical information easily among other branch hospitals, but also to reduce the cost of maintenance. The HL7 message standard is widely adopted to cover all data exchanges in the system. All services are independent modules that enable the system to be deployed and configured with the highest degree of flexibility.
Furthermore, we can conclude that the multitier Inpatient Healthcare Information System has been designed successfully and in a collaborative manner, based on performance evaluations of central processing unit and memory utilization.

  2. Convergence of Health Level Seven Version 2 Messages to Semantic Web Technologies for Software-Intensive Systems in Telemedicine Trauma Care.

    PubMed

    Menezes, Pedro Monteiro; Cook, Timothy Wayne; Cavalini, Luciana Tricai

    2016-01-01

To present the technical background and the development of a procedure that enriches the semantics of Health Level Seven version 2 (HL7v2) messages for software-intensive systems in telemedicine trauma care. This study followed a multilevel model-driven approach for the development of semantically interoperable health information systems. The Pre-Hospital Trauma Life Support (PHTLS) ABCDE protocol was adopted as the use case. A prototype application embedded the semantics into an HL7v2 message as an eXtensible Markup Language (XML) file, which was validated against an XML schema that defines constraints on a common reference model. This message was exchanged with a second prototype application, developed on the Mirth middleware, which was also used to parse and validate both the original and the hybrid messages. Both versions of the data instance (one pure XML, one embedded in the HL7v2 message) were equally validated, and the RDF-based semantics were recovered by the receiving side of the prototype from the shared XML schema. This study demonstrated the semantic enrichment of HL7v2 messages for software-intensive telemedicine systems for trauma care, by validating components of extracts generated in various computing environments. The adoption of the method proposed in this study ensures compliance of the HL7v2 standard with Semantic Web technologies.
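A minimal sketch of the hybrid-message idea described above: an XML clinical extract is embedded in the OBX-5 field of an HL7v2 message, exchanged, then recovered and re-parsed on the receiving side. The field layout follows general HL7v2 conventions, but the segment contents and the XML payload here are invented for illustration, not taken from the study.

```python
import xml.etree.ElementTree as ET

def build_hybrid_message(xml_payload: str) -> str:
    """Wrap an XML document in a minimal HL7v2 ORU^R01-style message."""
    # HL7v2 uses '|' as the field separator and '\r' as the segment separator.
    escaped = xml_payload.replace("|", "\\F\\")  # escape any field separators
    segments = [
        "MSH|^~\\&|SENDER|FACILITY|RECEIVER|FACILITY|20160101120000||ORU^R01|MSG0001|P|2.5",
        "PID|1||12345^^^HOSP^MR||DOE^JOHN",
        "OBX|1|ED|EXTRACT^ClinicalExtract||" + escaped,
    ]
    return "\r".join(segments)

def extract_payload(message: str) -> ET.Element:
    """Pull the XML payload back out of OBX-5 and parse it."""
    for segment in message.split("\r"):
        fields = segment.split("|")
        if fields[0] == "OBX":
            xml_text = fields[5].replace("\\F\\", "|")
            return ET.fromstring(xml_text)  # raises ParseError if not well-formed
    raise ValueError("no OBX segment found")

payload = "<assessment><airway status='clear'/><breathing rate='18'/></assessment>"
msg = build_hybrid_message(payload)
root = extract_payload(msg)
```

Parsing the recovered payload with a standard XML parser stands in for the schema validation step: a receiver that shares the schema can validate the extract independently of the HL7v2 envelope.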

  3. Impact of coalition interoperability on PKI

    NASA Astrophysics Data System (ADS)

    Krall, Edward J.

    2003-07-01

    This paper examines methods for providing PKI interoperability among units of a coalition of armed forces drawn from different nations. The area in question is tactical identity management, for the purposes of confidentiality, integrity and non-repudiation in such a dynamic coalition. The interoperating applications under consideration range from email and other forms of store-and-forward messaging to TLS and IPSEC-protected real-time communications. Six interoperability architectures are examined with advantages and disadvantages of each described in the paper.

  4. Convergence of Health Level Seven Version 2 Messages to Semantic Web Technologies for Software-Intensive Systems in Telemedicine Trauma Care

    PubMed Central

    Cook, Timothy Wayne; Cavalini, Luciana Tricai

    2016-01-01

Objectives To present the technical background and the development of a procedure that enriches the semantics of Health Level Seven version 2 (HL7v2) messages for software-intensive systems in telemedicine trauma care. Methods This study followed a multilevel model-driven approach for the development of semantically interoperable health information systems. The Pre-Hospital Trauma Life Support (PHTLS) ABCDE protocol was adopted as the use case. A prototype application embedded the semantics into an HL7v2 message as an eXtensible Markup Language (XML) file, which was validated against an XML schema that defines constraints on a common reference model. This message was exchanged with a second prototype application, developed on the Mirth middleware, which was also used to parse and validate both the original and the hybrid messages. Results Both versions of the data instance (one pure XML, one embedded in the HL7v2 message) were equally validated, and the RDF-based semantics were recovered by the receiving side of the prototype from the shared XML schema. Conclusions This study demonstrated the semantic enrichment of HL7v2 messages for software-intensive telemedicine systems for trauma care, by validating components of extracts generated in various computing environments. The adoption of the method proposed in this study ensures compliance of the HL7v2 standard with Semantic Web technologies. PMID:26893947

  5. GMSEC Interface Specification Document 2016 March

    NASA Technical Reports Server (NTRS)

    Handy, Matthew

    2016-01-01

The GMSEC Interface Specification Document contains the standard set of defined messages. Each GMSEC standard message contains a GMSEC Information Bus Header section and a Message Contents section. Each message section identifies required fields, optional fields, data types, and recommended use of the fields. Additionally, this document includes the message subjects associated with the standard messages. The system design of the operations center should ensure that the selected components use both the API and the defined standard messages in order to achieve full interoperability from component to component.
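The required-versus-optional field distinction described above can be sketched as a simple compliance check. The field names and message shape below are illustrative assumptions, not taken from the actual GMSEC Interface Specification Document.

```python
# Hypothetical required header fields for a standard message.
REQUIRED_HEADER_FIELDS = {"MESSAGE-TYPE", "MESSAGE-SUBTYPE", "MISSION-ID"}

def validate_message(message: dict) -> list:
    """Return a sorted list of missing required header fields (empty if compliant)."""
    header = message.get("HEADER", {})
    return sorted(REQUIRED_HEADER_FIELDS - header.keys())

good = {"HEADER": {"MESSAGE-TYPE": "MSG", "MESSAGE-SUBTYPE": "LOG", "MISSION-ID": "SAT1"},
        "CONTENTS": {"SEVERITY": 1, "MSG-TEXT": "component started"}}
bad = {"HEADER": {"MESSAGE-TYPE": "MSG"}, "CONTENTS": {}}
```

Validating messages against a shared field specification at every component boundary is what lets independently selected components interoperate on the bus.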

  6. [HL7 standard--features, principles, and methodology].

    PubMed

    Koncar, Miroslav

    2005-01-01

The mission of the non-profit organization HL7 Inc. is to provide standards for the exchange, management and integration of data that support clinical patient care, and the management, delivery and evaluation of healthcare services. As the standards developed by HL7 Inc. represent the world's most influential standardization efforts in the field of medical informatics, the HL7 family of standards has been recognized by the technical and scientific community as the foundation for next-generation healthcare information systems. Versions 1 and 2 of the HL7 standard solved many issues, but also demonstrated the size and complexity of the health information sharing problem. As a solution, a completely new methodology was adopted, encompassed in the HL7 Version 3 recommendations. This approach standardizes the Reference Information Model (RIM), which is the source of all derived domain models and message structures. Message design is now defined in detail, enabling interoperability between loosely coupled systems that are designed by different vendors and deployed in various environments. At the start of the Primary Healthcare Information System project in the Republic of Croatia in 2002, the decision was made to go directly to Version 3. The target scope of work includes clinical, financial and administrative data management in the domain of healthcare processes. By using the standardized HL7v3 methodology we were able to completely map the Croatian primary healthcare domain to HL7v3 artefacts. Further refinement processes planned for the future will provide semantic interoperability and a detailed description of all elements in HL7 messages. Our HL7 Business Component is in a constant process of studying different legacy applications, building a solid foundation for their integration into an HL7-enabled communication environment.

  7. Feasibility of Representing a Danish Microbiology Model Using FHIR.

    PubMed

    Andersen, Mie Vestergaard; Kristensen, Ida Hvass; Larsen, Malene Møller; Pedersen, Claus Hougaard; Gøeg, Kirstine Rosenbeck; Pape-Haugaard, Louise B

    2017-01-01

Achieving interoperability in health is a challenge and requires standardization. The newly developed HL7 standard, Fast Healthcare Interoperability Resources (FHIR), promises both flexibility and interoperability. This study investigates the feasibility of expressing the content of a Danish microbiology message model in FHIR, to explore whether complex in-use legacy models can be migrated and what challenges this may pose. The Danish microbiology message model (the DMM) is used as a case to illustrate the challenges and opportunities associated with applying the FHIR standard. Content was mapped from the DMM to FHIR as closely as possible to the DMM, to minimize migration costs, except when the structure of the content did not fit into FHIR. From the DMM, a total of 183 elements were mapped to FHIR: 75 (40.9%) were modeled as existing FHIR elements, 96 (52.5%) were modeled as extensions, and 12 (6.6%) were deemed unnecessary because of built-in FHIR characteristics. In this study, it was possible to represent the content of a Danish message model using HL7 FHIR.
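The two mapping outcomes described above can be sketched as follows: a source element that fits an existing FHIR element maps directly, while one that does not is carried as a FHIR extension. The element names, the core-element list, and the extension URL are hypothetical, chosen only to illustrate the mechanism.

```python
def map_element(name: str, value,
                fhir_core_elements=("status", "code", "valueQuantity")) -> dict:
    """Map a source message element onto a FHIR Observation fragment."""
    if name in fhir_core_elements:
        return {name: value}                       # direct mapping to core FHIR
    return {"extension": [{                        # fall back to a FHIR extension
        "url": "http://example.org/fhir/StructureDefinition/" + name,
        "valueString": str(value),
    }]}

direct = map_element("status", "final")
extended = map_element("sampleTransportMedium", "boric acid")
```

Extensions keep legacy content round-trippable at the cost of weaker shared semantics, which is why the ratio of core mappings to extensions matters when judging migration feasibility.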

  8. An HLA-Based Approach to Quantify Achievable Performance for Tactical Edge Applications

    DTIC Science & Technology

    2011-05-01

in: Proceedings of the 2002 Fall Simulation Interoperability Workshop, 02F-SIW-068, Nov 2002. [16] P. Knight, et al., "WBT RTI Independent...Benchmark Tests: Design, Implementation, and Updated Results", in: Proceedings of the 2002 Spring Simulation Interoperability Workshop, 02S-SIW-081, March...Interoperability Workshop, 98F-SIW-085, Nov 1998. [18] S. Ferenci and R. Fujimoto, "RTI Performance on Shared Memory and Message Passing Architectures", in

  9. Asynchronous Message Service Reference Implementation

    NASA Technical Reports Server (NTRS)

    Burleigh, Scott C.

    2011-01-01

This software provides a library of middleware functions with a simple application programming interface, enabling implementation of distributed applications in conformance with the CCSDS AMS (Consultative Committee for Space Data Systems Asynchronous Message Service) specification. The AMS service, and its protocols, implement an architectural concept under which the modules of mission systems may be designed as if they were to operate in isolation, each one producing and consuming mission information without explicit awareness of which other modules are currently operating. Communication relationships among such modules are self-configuring; this tends to minimize complexity in the development and operations of modular data systems. A system built on this model is a society of generally autonomous, inter-operating modules that may fluctuate freely over time in response to changing mission objectives, module functional upgrades, and recovery from individual module failure. The purpose of AMS, then, is to reduce mission cost and risk by providing standard, reusable infrastructure for the exchange of information among data system modules in a manner that is simple to use, highly automated, flexible, robust, scalable, and efficient. The implementation is designed to spawn multiple threads of AMS functionality under the control of an AMS application program. These threads enable all members of an AMS-based, distributed application to discover one another in real time, subscribe to messages on specific topics, and publish messages on specific topics. The query/reply (client/server) communication model is also supported. Message exchange is optionally subject to encryption (to support confidentiality) and authorization. Fault tolerance measures in the discovery protocol minimize the likelihood of overall application failure due to any single operational error anywhere in the system.
The multi-threaded design simplifies processing while enabling application nodes to operate at high speeds; linked lists protected by mutex semaphores and condition variables are used for efficient, inter-thread communication. Applications may use a variety of transport protocols underlying AMS itself, including TCP (Transmission Control Protocol), UDP (User Datagram Protocol), and message queues.
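The inter-thread pattern described above (queues guarded by a mutex and condition variable, serving topic-based publish/subscribe) can be sketched in a few lines. This is a generic illustration of the design idea, not the CCSDS AMS API; the class and method names are invented.

```python
import threading
from collections import deque

class TopicBus:
    """Toy topic-based publish/subscribe bus for threads in one process."""
    def __init__(self):
        self._lock = threading.Lock()
        self._cond = threading.Condition(self._lock)
        self._queues = {}          # topic -> deque of pending messages

    def publish(self, topic, message):
        with self._cond:
            self._queues.setdefault(topic, deque()).append(message)
            self._cond.notify_all()    # wake any thread waiting on a topic

    def receive(self, topic, timeout=5.0):
        with self._cond:
            # Block until a message is available on this topic (or time out).
            self._cond.wait_for(lambda: self._queues.get(topic), timeout=timeout)
            return self._queues[topic].popleft()

bus = TopicBus()
result = []
consumer = threading.Thread(target=lambda: result.append(bus.receive("telemetry")))
consumer.start()
bus.publish("telemetry", {"voltage": 28.1})
consumer.join()
```

The condition variable lets consumers sleep until work arrives instead of polling, which is the efficiency property the reference implementation's mutex-protected linked lists aim for.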

  10. Experimenting with C2 Applications and Federated Infrastructures for Integrated Full-Spectrum Operational Environments in Support of Collaborative Planning and Interoperable Execution

    DTIC Science & Technology

    2004-06-01

Situation Understanding) Common Operational Pictures, Planning & Decision Support Capabilities, Message & Order Processing, Common Languages & Data Models, Modeling & Simulation Domain

  11. Combining Archetypes with Fast Health Interoperability Resources in Future-proof Health Information Systems.

    PubMed

    Bosca, Diego; Moner, David; Maldonado, Jose Alberto; Robles, Montserrat

    2015-01-01

Messaging standards, and specifically HL7 v2, are heavily used for the communication and interoperability of Health Information Systems. HL7 FHIR was created as an evolution of the messaging standards to achieve semantic interoperability. FHIR is somewhat similar to other approaches, such as the dual-model methodology, as both are based on the precise modeling of clinical information. In this paper, we demonstrate how the dual-model methodology can be applied to standards like FHIR. We show the usefulness of this approach for data transformation between FHIR and other specifications such as HL7 CDA, EN ISO 13606, and openEHR. We also discuss the advantages and disadvantages of defining archetypes over FHIR, and the consequences and outcomes of this approach. Finally, we exemplify this approach by creating a testing data server that supports both FHIR resources and archetypes.

  12. Interoperability for Space Mission Monitor and Control: Applying Technologies from Manufacturing Automation and Process Control Industries

    NASA Technical Reports Server (NTRS)

    Jones, Michael K.

    1998-01-01

    Various issues associated with interoperability for space mission monitor and control are presented in viewgraph form. Specific topics include: 1) Space Project Mission Operations Control Architecture (SuperMOCA) goals and methods for achieving them; 2) Specifics on the architecture: open standards ad layering, enhancing interoperability, and promoting commercialization; 3) An advertisement; 4) Status of the task - government/industry cooperation and architecture and technology demonstrations; and 5) Key features of messaging services and virtual devices.

  13. C2 Core and UCore Message Design Capstone: Interoperable Message Structure

    DTIC Science & Technology

    2009-09-01

there are sufficient resources to carry out a mission. The Team used the Theatre Battle Management Command System (TBMCS) to generate sample CMD...System (TBMCS) was used to generate CMD messages as inputs for both use cases. These were programmatically transformed into the three-layer message...used for the experiment was generated from the TBMCS in the form of a CMD XML document. The Capstone experiment included transforming that document to

  14. Electronic Information Standards to Support Obesity Prevention and Bridge Services Across Systems, 2010-2015.

    PubMed

    Wiltz, Jennifer L; Blanck, Heidi M; Lee, Brian; Kocot, S Lawrence; Seeff, Laura; McGuire, Lisa C; Collins, Janet

    2017-10-26

    Electronic information technology standards facilitate high-quality, uniform collection of data for improved delivery and measurement of health care services. Electronic information standards also aid information exchange between secure systems that link health care and public health for better coordination of patient care and better-informed population health improvement activities. We developed international data standards for healthy weight that provide common definitions for electronic information technology. The standards capture healthy weight data on the "ABCDs" of a visit to a health care provider that addresses initial obesity prevention and care: assessment, behaviors, continuity, identify resources, and set goals. The process of creating healthy weight standards consisted of identifying needs and priorities, developing and harmonizing standards, testing the exchange of data messages, and demonstrating use-cases. Healthy weight products include 2 message standards, 5 use-cases, 31 LOINC (Logical Observation Identifiers Names and Codes) question codes, 7 healthy weight value sets, 15 public-private engagements with health information technology implementers, and 2 technical guides. A logic model and action steps outline activities toward better data capture, interoperable systems, and information use. Sharing experiences and leveraging this work in the context of broader priorities can inform the development of electronic information standards for similar core conditions and guide strategic activities in electronic systems.
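The LOINC question codes mentioned above travel inside standard messages; a common carrier for such coded observations is an HL7v2 OBX segment. The sketch below shows the general shape of that encoding. 39156-5 is the LOINC code for body mass index; the rest of the segment content is illustrative, not drawn from the healthy weight message standards themselves.

```python
def obx_segment(set_id: int, loinc_code: str, text: str, value: float, units: str) -> str:
    """Build a single HL7v2-style OBX segment carrying a LOINC-coded observation."""
    return "|".join([
        "OBX", str(set_id), "NM",        # NM: numeric value type
        f"{loinc_code}^{text}^LN",       # LN identifies the LOINC coding system
        "",                              # observation sub-ID (unused here)
        str(value), units,
    ])

segment = obx_segment(1, "39156-5", "Body mass index", 24.3, "kg/m2")
```

Because the question is identified by a shared LOINC code rather than a local label, any receiving system that knows the value set can interpret the observation without site-specific mapping.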

  15. Electronic Information Standards to Support Obesity Prevention and Bridge Services Across Systems, 2010–2015

    PubMed Central

    Blanck, Heidi M.; Lee, Brian; Kocot, S. Lawrence; Seeff, Laura; McGuire, Lisa C.; Collins, Janet

    2017-01-01

    Electronic information technology standards facilitate high-quality, uniform collection of data for improved delivery and measurement of health care services. Electronic information standards also aid information exchange between secure systems that link health care and public health for better coordination of patient care and better-informed population health improvement activities. We developed international data standards for healthy weight that provide common definitions for electronic information technology. The standards capture healthy weight data on the “ABCDs” of a visit to a health care provider that addresses initial obesity prevention and care: assessment, behaviors, continuity, identify resources, and set goals. The process of creating healthy weight standards consisted of identifying needs and priorities, developing and harmonizing standards, testing the exchange of data messages, and demonstrating use-cases. Healthy weight products include 2 message standards, 5 use-cases, 31 LOINC (Logical Observation Identifiers Names and Codes) question codes, 7 healthy weight value sets, 15 public–private engagements with health information technology implementers, and 2 technical guides. A logic model and action steps outline activities toward better data capture, interoperable systems, and information use. Sharing experiences and leveraging this work in the context of broader priorities can inform the development of electronic information standards for similar core conditions and guide strategic activities in electronic systems. PMID:29072985

  16. A Service Oriented Architecture Approach to Achieve Interoperability between Immunization Information Systems in Iran

    PubMed Central

    Hosseini, Masoud; Ahmadi, Maryam; Dixon, Brian E.

    2014-01-01

Clinical decision support (CDS) systems can support vaccine forecasting and immunization reminders; however, immunization decision-making requires data from fragmented, independent systems. Interoperability and accurate data exchange between immunization information systems (IIS) is an essential factor in utilizing immunization CDS systems. Service-oriented architecture (SOA) and Health Level 7 (HL7) are dominant standards for web-based exchange of clinical information. We implemented a system based on SOA and HL7 v3 to support immunization CDS in Iran. We evaluated system performance by exchanging 1500 immunization records for roughly 400 infants between two IISs. System turnaround time is less than a minute for synchronous operation calls, and the retrieved immunization histories of infants were always identical in the different systems. CDS-generated reports accorded with immunization guidelines, and the calculations of next visit times were accurate. Interoperability is rare or nonexistent between IIS. Since inter-state data exchange is rare in the United States, this approach could be a good prototype for achieving interoperability of immunization information. PMID:25954452

  17. A Service Oriented Architecture Approach to Achieve Interoperability between Immunization Information Systems in Iran.

    PubMed

    Hosseini, Masoud; Ahmadi, Maryam; Dixon, Brian E

    2014-01-01

Clinical decision support (CDS) systems can support vaccine forecasting and immunization reminders; however, immunization decision-making requires data from fragmented, independent systems. Interoperability and accurate data exchange between immunization information systems (IIS) is an essential factor in utilizing immunization CDS systems. Service-oriented architecture (SOA) and Health Level 7 (HL7) are dominant standards for web-based exchange of clinical information. We implemented a system based on SOA and HL7 v3 to support immunization CDS in Iran. We evaluated system performance by exchanging 1500 immunization records for roughly 400 infants between two IISs. System turnaround time is less than a minute for synchronous operation calls, and the retrieved immunization histories of infants were always identical in the different systems. CDS-generated reports accorded with immunization guidelines, and the calculations of next visit times were accurate. Interoperability is rare or nonexistent between IIS. Since inter-state data exchange is rare in the United States, this approach could be a good prototype for achieving interoperability of immunization information.
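The consistency criterion above (the immunization history retrieved from each IIS must be identical) can be sketched as an order-insensitive record comparison. The record layout is hypothetical; the study's actual HL7 v3 payloads are richer than this.

```python
def histories_match(iis_a: list, iis_b: list) -> bool:
    """Compare two retrieved immunization histories, ignoring record order."""
    key = lambda r: (r["patient_id"], r["vaccine"], r["date"])
    return sorted(iis_a, key=key) == sorted(iis_b, key=key)

history_a = [{"patient_id": "P1", "vaccine": "MMR", "date": "2014-02-01"},
             {"patient_id": "P1", "vaccine": "DTaP", "date": "2014-04-01"}]
history_b = list(reversed(history_a))   # same records, delivered in another order
```

Sorting on a stable record key before comparing is what makes the check robust to the two systems returning records in different orders.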

  18. Message Received How to Bridge the Communication Gap and Save Lives

    DTIC Science & Technology

    2004-03-01

safety during an emergency depend on the ability of first responders to talk via radio, directly, without dispatch and in real time. Many technologies are...Keywords: interoperability, Coast Guard, first responders, procedures, interagency communications, policies...communication interoperability for public safety first responders entails far more than finding and emplacing a technology and training the operators. The

  19. On the formal definition of the systems' interoperability capability: an anthropomorphic approach

    NASA Astrophysics Data System (ADS)

    Zdravković, Milan; Luis-Ferreira, Fernando; Jardim-Goncalves, Ricardo; Trajanović, Miroslav

    2017-03-01

The extended view of enterprise information systems in the Internet of Things (IoT) introduces additional complexity to interoperability problems. In response to this, the problem of systems' interoperability is revisited by taking into account different aspects of philosophy, psychology, linguistics and artificial intelligence, namely by analysing the potential analogies between the processes of human and system communication. Then, the capability to interoperate, as a property of the system, is defined as a complex ability to seamlessly sense and perceive a stimulus from its environment (assumingly, a message from any other system), make an informed decision about this perception and, consequently, articulate a meaningful and useful action or response based on this decision. Although this capability is defined on the basis of existing interoperability theories, the proposed approach to its definition excludes the assumption of awareness of the co-existence of the two interoperating systems. Thus, it establishes links between research on the interoperability of systems and on intelligent software agents, as one of the systems' digital identities.
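The sense-perceive-decide-articulate capability defined above can be sketched as a toy pipeline. The stages follow the paper's definition, but the message format and behaviors here are invented purely for illustration.

```python
def sense(raw: bytes) -> str:
    """Sense: receive the raw stimulus from the environment."""
    return raw.decode("utf-8")

def perceive(text: str) -> dict:
    """Perceive: interpret the stimulus as a structured request."""
    verb, _, arg = text.partition(" ")
    return {"intent": verb, "argument": arg}

def decide_and_act(perception: dict) -> str:
    """Decide on the perception and articulate a meaningful response."""
    if perception["intent"] == "PING":
        return "PONG " + perception["argument"]
    return "UNSUPPORTED"

response = decide_and_act(perceive(sense(b"PING node-42")))
```

Note that nothing in the pipeline assumes the sender's existence is known in advance; the receiving system reacts to whatever stimulus arrives, which mirrors the paper's removal of the co-existence assumption.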

  20. Building a portable data and information interoperability infrastructure-framework for a standard Taiwan Electronic Medical Record Template.

    PubMed

    Jian, Wen-Shan; Hsu, Chien-Yeh; Hao, Te-Hui; Wen, Hsyien-Chia; Hsu, Min-Huei; Lee, Yen-Liang; Li, Yu-Chuan; Chang, Polun

    2007-11-01

Traditional electronic health record (EHR) data are produced by various hospital information systems. Before the advent of XML technology, such data could not exist independently of an information system. The interoperability of a healthcare system can be divided into two dimensions: functional interoperability and semantic interoperability. Currently, no single EHR standard exists that provides complete EHR interoperability. In order to establish a national EHR standard, we developed a set of local EHR templates. The Taiwan Electronic Medical Record Template (TMT) is a standard that aims to achieve semantic interoperability in EHR exchanges nationally. The TMT architecture is composed of forms, components, sections, and elements. Data are stored in the elements, which can be referenced by code set, data type, and narrative block. The TMT was established with the following requirements in mind: (1) transformable to international standards; (2) minimal impact on the existing healthcare system; (3) easy to implement and deploy; and (4) compliant with Taiwan's current laws and regulations. The TMT provides a basis for building a portable, interoperable information infrastructure for EHR exchange in Taiwan.
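The TMT layering described above (forms containing components, which contain sections of elements) can be sketched as nested XML. The tag and attribute names below are illustrative assumptions, not the actual TMT schema.

```python
import xml.etree.ElementTree as ET

def build_form(form_id: str, sections: dict) -> ET.Element:
    """Assemble a minimal form -> component -> section -> element hierarchy."""
    form = ET.Element("form", id=form_id)
    component = ET.SubElement(form, "component")
    for section_name, elements in sections.items():
        section = ET.SubElement(component, "section", name=section_name)
        for element_name, value in elements.items():
            el = ET.SubElement(section, "element", name=element_name)
            el.text = value          # data live in the leaf elements
    return form

form = build_form("discharge-summary", {"vitals": {"temperature": "36.8"}})
found = form.find("component/section[@name='vitals']/element[@name='temperature']")
```

Keeping the data in addressable leaf elements is what makes such a template transformable: an exporter can walk the same paths and emit an international format without touching the source systems.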

  1. GPS Timing Performance

    DTIC Science & Technology

    2014-01-01

termed the Galileo-GPS Time Offset (GGTO), and it will be Type 35 in the GPS CNAV message. Knowledge of the GGTO makes it possible for a properly...U.S. Naval Observatory (USNO) [1]. Interoperability with Galileo, and perhaps someday with other Global Navigation Satellite Systems (GNSS), is to be established through transmission of the

  2. An Approach to Semantic Interoperability for Improved Capability Exchanges in Federations of Systems

    ERIC Educational Resources Information Center

    Moschoglou, Georgios

    2013-01-01

    This study seeks an affirmative answer to the question whether a knowledge-based approach to system of systems interoperation using semantic web standards and technologies can provide the centralized control of the capability for exchanging data and services lacking in a federation of systems. Given the need to collect and share real-time…

  3. A framework for semantic interoperability in healthcare: a service oriented architecture based on health informatics standards.

    PubMed

    Ryan, Amanda; Eklund, Peter

    2008-01-01

    Healthcare information is composed of many types of varying and heterogeneous data. Semantic interoperability in healthcare is especially important when all these different types of data need to interact. Presented in this paper is a solution to interoperability in healthcare based on a standards-based middleware software architecture used in enterprise solutions. This architecture has been translated into the healthcare domain using a messaging and modeling standard which upholds the ideals of the Semantic Web (HL7 V3) combined with a well-known standard terminology of clinical terms (SNOMED CT).
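The combination described above (an HL7 V3 message model carrying SNOMED CT terminology) hinges on coded data elements that name both a concept and its coding system. The sketch below builds such a coded element; 2.16.840.1.113883.6.96 is the OID that identifies SNOMED CT, while the concept code shown is for illustration.

```python
SNOMED_CT_OID = "2.16.840.1.113883.6.96"   # OID identifying the SNOMED CT code system

def coded_value(code: str, display: str) -> dict:
    """Represent a clinical term as an HL7 V3-style coded data element (CD)."""
    return {"code": code, "codeSystem": SNOMED_CT_OID, "displayName": display}

finding = coded_value("386661006", "Fever")
```

Binding each clinical field to a concept code plus an explicit code system is what lets a receiving system resolve the meaning of the term without depending on the sender's local labels.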

  4. Implementation of Certified EHR, Patient Portal, and "Direct" Messaging Technology in a Radiology Environment Enhances Communication of Radiology Results to Both Referring Physicians and Patients.

    PubMed

    Reicher, Joshua Jay; Reicher, Murray Aaron

    2016-06-01

    Since 2009, the Federal government distributed over $29 billion to providers who were adopting compliant electronic health record (EHR) technology. With a focus on radiology, we explore how EHR technology impacts interoperability with referring clinicians' EHRs and patient engagement. We also discuss the high-level details of contributing supporting frameworks, specifically Direct messaging and health information service provider (HISP) technology. We characterized Direct messaging, a secure e-mail-like protocol built to allow exchange of encrypted health information online, and the new supporting HISP infrastructure. Statistics related to both the testing and active use of this framework were obtained from DirectTrust.org, an organization whose framework supports Direct messaging use by healthcare organizations. To evaluate patient engagement, we obtained usage data from a radiology-centric patient portal between 2014 and 2015, which in some cases included access to radiology reports. Statistics from 2013 to 2015 showed a rise in issued secure Direct addresses from 8724 to 752,496; a rise in the number of participating healthcare organizations from 667 to 39,751; and a rise in the secure messages sent from 122,842 to 27,316,438. Regarding patient engagement, an average of 234,679 patients per month were provided portal access, with 86,400 patients per month given access to radiology reports. Availability of radiology reports online was strongly associated with increased system usage, with a likelihood ratio of 2.63. The use of certified EHR technology and Direct messaging in the practice of radiology allows for the communication of patient information and radiology results with referring clinicians and increases patient use of patient portal technology, supporting bidirectional radiologist-patient communication.
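A likelihood ratio like the 2.63 figure above compares the probability of portal usage among patients with report access to the probability among those without. The counts below are invented for illustration; they are not the study's data.

```python
def likelihood_ratio(used_with: int, total_with: int,
                     used_without: int, total_without: int) -> float:
    """Ratio of usage probability with report access vs. without."""
    return (used_with / total_with) / (used_without / total_without)

# Hypothetical counts chosen to yield a ratio near the reported 2.63.
lr = likelihood_ratio(used_with=500, total_with=1000,
                      used_without=190, total_without=1000)
```

A ratio above 1 indicates the exposure (online report availability) is associated with higher usage; the further above 1, the stronger the association.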

  5. The NASA Scientific and Technical Information (STI) Program's Implementation of Open Archives Initiative (OAI) for Data Interoperability and Data Exchange.

    ERIC Educational Resources Information Center

    Rocker, JoAnne; Roncaglia, George J.; Heimerl, Lynn N.; Nelson, Michael L.

    Interoperability and data-exchange are critical for the survival of government information management programs. E-government initiatives are transforming the way the government interacts with the public. More information is to be made available through Web-enabled technologies. Programs such as the NASA's Scientific and Technical Information (STI)…

  6. FHIR Healthcare Directories: Adopting Shared Interfaces to Achieve Interoperable Medical Device Data Integration.

    PubMed

    Tyndall, Timothy; Tyndall, Ayami

    2018-01-01

    Healthcare directories are vital for interoperability among healthcare providers, researchers and patients. Past efforts at directory services have not provided the tools to allow integration of the diverse data sources. Many are overly strict, incompatible with legacy databases, and do not provide Data Provenance. A more architecture-independent system is needed to enable secure, GDPR-compatible (8) service discovery across organizational boundaries. We review our development of a portable Data Provenance Toolkit supporting provenance within Health Information Exchange (HIE) systems. The Toolkit has been integrated with client software and successfully leveraged in clinical data integration. The Toolkit validates provenance stored in a Blockchain or Directory record and creates provenance signatures, providing standardized provenance that moves with the data. This healthcare directory suite implements discovery of healthcare data by HIE and EHR systems via FHIR. Shortcomings of past directory efforts include the ability to map complex datasets and enabling interoperability via exchange endpoint discovery. By delivering data without dictating how it is stored we improve exchange and facilitate discovery on a multi-national level through open source, fully interoperable tools. With the development of Data Provenance resources we enhance exchange and improve security and usability throughout the health data continuum.
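One way to make a provenance signature "move with the data", as described above, is a keyed hash over a canonical serialization of the record. This is a generic HMAC construction offered as a sketch; it is not the toolkit's actual algorithm, and the key and record contents are invented.

```python
import hashlib
import hmac
import json

def sign_record(record: dict, key: bytes) -> str:
    """Sign a canonical (key-sorted) JSON serialization of the record."""
    canonical = json.dumps(record, sort_keys=True).encode()
    return hmac.new(key, canonical, hashlib.sha256).hexdigest()

def verify_record(record: dict, key: bytes, signature: str) -> bool:
    """Recompute the signature and compare in constant time."""
    return hmac.compare_digest(sign_record(record, key), signature)

key = b"shared-secret"
record = {"source": "lab-7", "observation": "HbA1c", "value": 6.1}
sig = sign_record(record, key)
```

Canonical serialization matters: without `sort_keys=True`, two semantically identical records could hash differently, and verification would fail even though the data never changed.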

  7. Implementing the HL7v3 standard in Croatian primary healthcare domain.

    PubMed

    Koncar, Miroslav

    2004-01-01

    The mission of HL7 Inc. is to provide standards for the exchange, management and integration of data that supports clinical patient care and the management, delivery and evaluation of healthcare services. The scope of this work includes the specifications of flexible, cost-effective approaches, standards, guidelines, methodologies, and related services for interoperability between healthcare information systems. In the field of medical information technologies, HL7 provides the world's most advanced information standards. Versions 1 and 2 of the HL7 standard have on the one hand solved many issues, but on the other demonstrated the size and complexity of the health information sharing problem. As the solution, a complete new methodology has been adopted, which is being encompassed in version 3 recommendations. This approach standardizes the Reference Information Model (RIM), which is the source of all domain models and message structures. Message design is now defined in detail, enabling interoperability between loosely-coupled systems that are designed by different vendors and deployed in various environments. At the start of the Primary Healthcare Information System project, we have decided to go directly to HL7v3. Implementing the HL7v3 standard in healthcare applications represents a challenging task. By using standardized refinement and localization methods we were able to define information models for Croatian primary healthcare domain. The scope of our work includes clinical, financial and administrative data management, where in some cases we were compelled to introduce new HL7v3-compliant models. All of the HL7v3 transactions are digitally signed, using the W3C XML Digital Signature standard.

  8. Design and management of public health outreach using interoperable mobile multimedia: an analysis of a national winter weather preparedness campaign.

    PubMed

    Bandera, Cesar

    2016-05-25

    The Office of Public Health Preparedness and Response (OPHPR) in the Centers for Disease Control and Prevention conducts outreach for public preparedness for natural and manmade incidents. In 2011, OPHPR conducted a nationwide mobile public health (m-Health) campaign that pushed brief videos on preparing for severe winter weather onto cell phones, with the objective of evaluating the interoperability of multimedia m-Health outreach with diverse cell phones (including handsets without Internet capability), carriers, and user preferences. Existing OPHPR outreach material on winter weather preparedness was converted into mobile-ready multimedia using mobile marketing best practices to improve audiovisual quality and relevance. Middleware complying with opt-in requirements was developed to push nine bi-weekly multimedia broadcasts onto subscribers' cell phones, and OPHPR promoted the campaign on its web site and to subscribers on its govdelivery.com notification platform. Multimedia, text, and voice messaging activity to/from the middleware was logged and analyzed. Adapting existing media into mobile video was straightforward using open source and commercial software, including web pages, PDF documents, and public service announcements. The middleware successfully delivered all outreach videos to all participants (a total of 504 videos) regardless of the participant's device. 54 % of videos were viewed on cell phones, 32 % on computers, and 14 % were retrieved by search engine web crawlers. 21 % of participating cell phones did not have Internet access, yet still received and displayed all videos. The time from media push to media viewing on cell phones was half that of push to viewing on computers. Video delivered through multimedia messaging can be as interoperable as text messages, while providing much richer information. This may be the only multimedia mechanism available to outreach campaigns targeting vulnerable populations impacted by the digital divide. 
Anti-spam laws preserve the integrity of mobile messaging, but complicate campaign promotion. Person-to-person messages may boost enrollment.

  9. Positive train control shared network.

    DOT National Transportation Integrated Search

    2015-05-01

    The Interoperable Train Control (ITC) Positive : Train Control (PTC) Shared Network (IPSN) : project investigated anticipated industry benefits : and the level of support for the development of : a hosted technological platform for PTC : messaging ac...

  10. Multi-disciplinary interoperability challenges (Ian McHarg Medal Lecture)

    NASA Astrophysics Data System (ADS)

    Annoni, Alessandro

    2013-04-01

    Global sustainability research requires multi-disciplinary efforts to address the key research challenges and increase our understanding of the complex relationships between environment and society. For this reason, dependence on ICT systems interoperability is growing rapidly, but although some relevant technological improvements can be observed, operational interoperable solutions are still lacking in practice. Among the causes is the absence of a generally accepted definition of "interoperability" in all its broader aspects. In fact, interoperability is just a concept, and the more popular definitions do not address all the challenges of realizing operational interoperable solutions. The problem becomes even more complex when multi-disciplinary interoperability is required, because in that case solutions for interoperability among different interoperable solutions must be envisaged. In this lecture the following definition will be used: "interoperability is the ability to exchange information and to use it". The main challenges in addressing multi-disciplinary interoperability will be presented, and a set of proposed approaches and solutions briefly introduced.

  11. 2015 ESGF Progress Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, D. N.

    2015-06-22

    The Earth System Grid Federation (ESGF) is a multi-agency, international collaboration whose purpose is to develop the software infrastructure needed to facilitate and empower the study of climate change on a global scale. ESGF's architecture employs a system of geographically distributed peer nodes that are independently administered yet united by common federation protocols and application programming interfaces. The cornerstones of its interoperability are the peer-to-peer messaging, which is continuously exchanged among all nodes in the federation; a shared architecture for search and discovery; and a security infrastructure based on industry standards. ESGF integrates popular application engines available from the open-source community with custom components (for data publishing, searching, user interface, security, and messaging) that were developed collaboratively by the team. The full ESGF infrastructure has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the Coupled Model Intercomparison Project (CMIP) output used by the Intergovernmental Panel on Climate Change assessment reports. ESGF is a successful example of integration of disparate open-source technologies into a cohesive functional system that serves the needs of the global climate science community.

  12. Auditing Consistency and Usefulness of LOINC Use among Three Large Institutions - Using Version Spaces for Grouping LOINC Codes

    PubMed Central

    Lin, M.C.; Vreeman, D.J.; Huff, S.M.

    2012-01-01

    Objectives We wanted to develop a method for evaluating the consistency and usefulness of LOINC code use across different institutions, and to evaluate the degree of interoperability that can be attained when using LOINC codes for laboratory data exchange. Our specific goals were to: 1) Determine if any contradictory knowledge exists in LOINC. 2) Determine how many LOINC codes were used in a truly interoperable fashion between systems. 3) Provide suggestions for improving the semantic interoperability of LOINC. Methods We collected Extensional Definitions (EDs) of LOINC usage from three institutions. The version space approach was used to divide LOINC codes into small sets, which made auditing of LOINC use across the institutions feasible. We then compared pairings of LOINC codes from the three institutions for consistency and usefulness. Results The numbers of LOINC codes evaluated were 1,917, 1,267 and 1,693 as obtained from ARUP, Intermountain and Regenstrief respectively. There were 2,022, 2,030, and 2,301 version spaces among ARUP & Intermountain, Intermountain & Regenstrief and ARUP & Regenstrief respectively. Using the EDs as the gold standard, there were 104, 109 and 112 pairs containing contradictory knowledge, and there were 1,165, 765 and 1,121 semantically interoperable pairs. The interoperable pairs were classified into three levels: 1) Level I – No loss of meaning; complete information was exchanged by identical codes. 2) Level II – No loss of meaning, but processing of data was needed to make the data completely comparable. 3) Level III – Some loss of meaning. For example, tests with a specific ‘method’ could be rolled up with tests that were ‘methodless’. Conclusions There are variations in the way LOINC is used for data exchange that result in some data not being truly interoperable across different enterprises. To improve its semantic interoperability, we need to detect and correct any contradictory knowledge within LOINC and add computable relationships that can be used for making reliable inferences about the data. The LOINC committee should also provide detailed guidance on best practices for mapping from local codes to LOINC codes and for using LOINC codes in data exchange. PMID:22306382
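    The three-level classification can be illustrated with a toy classifier; the codes, attributes, and decision rules below are invented for demonstration and are far simpler than real LOINC content and the version-space auditing described above.

```python
# Toy sketch of the three-level interoperability classification of mapped pairs.
# Test definitions are modeled as small dicts; all values are invented.

def classify_pair(local, remote):
    """Return the interoperability level for a pair of mapped test definitions."""
    if local == remote:
        return "Level I"    # identical codes: complete information exchanged
    if local["analyte"] == remote["analyte"] and local["units"] != remote["units"]:
        return "Level II"   # same meaning, but unit conversion (processing) needed
    if local["analyte"] == remote["analyte"] and remote["method"] is None:
        return "Level III"  # method-specific test rolled up to a methodless code
    return "not interoperable"

a = {"analyte": "glucose", "units": "mg/dL", "method": "enzymatic"}
b = {"analyte": "glucose", "units": "mmol/L", "method": "enzymatic"}
c = {"analyte": "glucose", "units": "mg/dL", "method": None}

print(classify_pair(a, dict(a)))  # Level I
print(classify_pair(a, b))        # Level II
print(classify_pair(a, c))        # Level III
```

    The point of the levels is that only Level I exchange is lossless without extra work; Level II needs deterministic post-processing, and Level III discards information (here, the method) that one side recorded.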

  13. Assessment of Life Cycle Information Exchanges (LCie): Understanding the Value-Added Benefit of a COBie Process

    DTIC Science & Technology

    2013-10-01

    exchange (COBie), Building Information Modeling (BIM), value-added analysis, business processes, project management ... equipment. The innovative aspect of Building Information Modeling (BIM) is that it creates a computable building description. The ability to use a ... interoperability. In order for the building information to be interoperable, it must also conform to a common data model, or schema, that defines the class

  14. Designing learning management system interoperability in semantic web

    NASA Astrophysics Data System (ADS)

    Anistyasari, Y.; Sarno, R.; Rochmawati, N.

    2018-01-01

    The extensive adoption of learning management systems (LMS) has put the focus on the interoperability requirement. Interoperability is the ability of different computer systems, applications or services to communicate, share and exchange data, information, and knowledge in a precise, effective and consistent way. Semantic web technology and the use of ontologies are able to provide the required computational semantics and interoperability for the automation of tasks in an LMS. The purpose of this study is to design learning management system interoperability in the semantic web, which has not yet been investigated deeply. Moodle is utilized to design the interoperability. Several database tables of Moodle are enhanced and some features are added. Semantic web interoperability is provided by exploiting an ontology of the content materials. The ontology is further utilized as a searching tool to match users' queries against available courses. It is concluded that LMS interoperability in the semantic web is feasible.

  15. Turning Interoperability Operational with GST

    NASA Astrophysics Data System (ADS)

    Schaeben, Helmut; Gabriel, Paul; Gietzel, Jan; Le, Hai Ha

    2013-04-01

    GST - Geosciences in space and time - is being developed and implemented as a hub to facilitate the exchange of spatially and temporally indexed multi-dimensional geoscience data and corresponding geomodels amongst partners. It originates from TUBAF's contribution to the EU project "ProMine", and its prospective extensions are TUBAF's contribution to the current EU project "GeoMol". As of today, it provides the basic components of a geodata infrastructure as required to establish interoperability with respect to geosciences. Generally, interoperability means the facilitation of cross-border and cross-sector information exchange, taking into account legal, organisational, semantic and technical aspects, cf. Interoperability Solutions for European Public Administrations (ISA), http://ec.europa.eu/isa/. Practical interoperability for partners of a joint geoscience project, say European Geological Surveys acting in a border region, means in particular the provision of IT technology to exchange spatially (and perhaps also temporally) indexed multi-dimensional geoscience data and corresponding models, i.e. the objects composing geomodels that capture the geometry, topology, and various geoscience contents. Geodata Infrastructure (GDI) and interoperability are objectives of several initiatives, e.g. INSPIRE, OneGeology-Europe, and most recently EGDI-SCOPE, to name just the most prominent ones. There are also quite a few markup languages (ML) related to geographical or geological information, like GeoSciML, EarthResourceML, BoreholeML, and ResqML for reservoir characterization, earth and reservoir models, among many others featuring geoscience information. Several web services are focused on geographical or geoscience information. The Open Geospatial Consortium (OGC) promotes specifications of a Web Feature Service (WFS), a Web Map Service (WMS), a Web Coverage Service (WCS), a Web 3D Service (W3DS), and many more. It will be clarified how GST is related to these initiatives, especially how it complies with existing or developing standards and quasi-standards, and how it applies and extends services towards interoperability in the Earth sciences.

  16. Architectural approaches for HL7-based health information systems implementation.

    PubMed

    López, D M; Blobel, B

    2010-01-01

    Information systems integration is hard, especially when semantic and business process interoperability requirements need to be met. To succeed, a unified methodology approaching different aspects of systems architecture, such as the business, information, computational, engineering and technology viewpoints, has to be considered. The paper contributes an analysis and demonstration of how the HL7 standard set can support health information systems integration. Based on the Health Information Systems Development Framework (HIS-DF), common architectural models for HIS integration are analyzed. The framework is a standards-based, consistent, comprehensive, customizable, scalable methodology that supports the design of semantically interoperable health information systems and components. Three main architectural models for system integration are analyzed: the point-to-point interface, the message server and the mediator models. The point-to-point interface and message server models are completely supported by traditional HL7 version 2 and version 3 messaging. The HL7 v3 standard specification, combined with the service-oriented, model-driven approaches provided by HIS-DF, makes the mediator model possible. The different integration scenarios are illustrated by describing a proof-of-concept implementation of an integrated public health surveillance system based on Enterprise Java Beans technology. Selecting the appropriate integration architecture is a fundamental issue in any software development project. HIS-DF provides a unique methodological approach guiding the development of healthcare integration projects. The mediator model, offered by HIS-DF and supported by HL7 v3 artifacts, is the most promising one, promoting the development of open, reusable, flexible, semantically interoperable, platform-independent, service-oriented and standards-based health information systems.

  17. Archive interoperability in the Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Genova, Françoise

    2003-02-01

    The main goals of Virtual Observatory projects are to build interoperability between astronomical on-line services, observatory archives, databases and results published in journals, and to develop tools permitting the best scientific usage of the very large data sets stored in observatory archives and produced by large surveys. The different Virtual Observatory projects collaborate to define common exchange standards, which are the key to a truly International Virtual Observatory: for instance, their first common milestone was a standard allowing the exchange of tabular data, called VOTable. The Interoperability Work Area of the European Astrophysical Virtual Observatory project aims at networking European archives by building a prototype using the CDS VizieR and Aladin tools, and at defining basic rules to help archive providers with interoperability implementation. The prototype is accessible for scientific usage, to get user feedback (and science results!) at an early stage of the project. The ISO archive participates very actively in this endeavour, and more generally in information networking. The on-going inclusion of the ISO log in SIMBAD will allow higher-level links for users.

  18. Incorporating Brokers within Collaboration Environments

    NASA Astrophysics Data System (ADS)

    Rajasekar, A.; Moore, R.; de Torcy, A.

    2013-12-01

    A collaboration environment, such as the integrated Rule Oriented Data System (iRODS - http://irods.diceresearch.org), provides interoperability mechanisms for accessing storage systems, authentication systems, messaging systems, information catalogs, networks, and policy engines from a wide variety of clients. The interoperability mechanisms function as brokers, translating actions requested by clients to the protocol required by a specific technology. The iRODS data grid is used to enable collaborative research within hydrology, seismology, earth science, climate, oceanography, plant biology, astronomy, physics, and genomics disciplines. Although each domain has unique resources, data formats, semantics, and protocols, the iRODS system provides a generic framework that is capable of managing collaborative research initiatives that span multiple disciplines. Each interoperability mechanism (broker) is linked to a name space that enables unified access across the heterogeneous systems. The collaboration environment provides not only support for brokers, but also support for virtualization of name spaces for users, files, collections, storage systems, metadata, and policies. The broker enables access to data or information in a remote system using the appropriate protocol, while the collaboration environment provides a uniform naming convention for accessing and manipulating each object. Within the NSF DataNet Federation Consortium project (http://www.datafed.org), three basic types of interoperability mechanisms have been identified and applied: 1) drivers for managing manipulation at the remote resource (such as data subsetting), 2) micro-services that execute the protocol required by the remote resource, and 3) policies for controlling the execution. For example, drivers have been written for manipulating NetCDF and HDF formatted files within THREDDS servers. 
Micro-services have been written that manage interactions with the CUAHSI data repository, the DataONE information catalog, and the GeoBrain broker. Policies have been written that manage transfer of messages between an iRODS message queue and the Advanced Message Queuing Protocol. Examples of these brokering mechanisms will be presented. The DFC collaboration environment serves as the intermediary between community resources and compute grids, enabling reproducible data-driven research. It is possible to create an analysis workflow that retrieves data subsets from a remote server, assemble the required input files, automate the execution of the workflow, automatically track the provenance of the workflow, and share the input files, workflow, and output files. A collaborator can re-execute a shared workflow, compare results, change input files, and re-execute an analysis.

  19. Seeking the Path to Metadata Nirvana

    NASA Astrophysics Data System (ADS)

    Graybeal, J.

    2008-12-01

    Scientists have always found reusing other scientists' data challenging. Computers did not fundamentally change the problem, but enabled more and larger instances of it. In fact, by removing human mediation and time delays from the data sharing process, computers emphasize the contextual information that must be exchanged in order to exchange and reuse data. This requirement for contextual information has two faces: "interoperability" when talking about systems, and "the metadata problem" when talking about data. As much as any single organization, the Marine Metadata Interoperability (MMI) project has been tagged with the mission "Solve the metadata problem." Of course, if that goal is achieved, then sustained, interoperable data systems for interdisciplinary observing networks can be easily built -- pesky metadata differences, like which protocol to use for data exchange, or what the data actually measures, will be a thing of the past. Alas, as you might imagine, there will always be complexities and incompatibilities that are not addressed, and data systems that are not interoperable, even within a science discipline. So should we throw up our hands and surrender to the inevitable? Not at all. Rather, we try to minimize metadata problems as much as we can, and in this we are making steady progress, despite natural forces that pull in the other direction. Computer systems let us work with more complexity, build community knowledge and collaborations, and preserve and publish our progress and (dis-)agreements. Funding organizations, science communities, and technologists see the importance of interoperable systems and metadata, and direct resources toward them. With these new approaches and resources, projects like IPY and MMI can simultaneously define, display, and promote effective strategies for sustainable, interoperable data systems. This presentation will outline the role metadata plays in durable interoperable data systems, for better or worse.
It will describe times when "just choosing a standard" can work, and when it probably won't work. And it will point out signs that suggest a metadata storm is coming to your community project, and how you might avoid it. From these lessons we will seek a path to producing interoperable, interdisciplinary, metadata-enlightened environment observing systems.

  20. Study and validation of tools interoperability in JPSEC

    NASA Astrophysics Data System (ADS)

    Conan, V.; Sadourny, Y.; Jean-Marie, K.; Chan, C.; Wee, S.; Apostolopoulos, J.

    2005-08-01

    Digital imagery is important in many applications today, and its security is likely to gain further importance in the near future. The emerging international standard ISO/IEC JPEG-2000 Security (JPSEC) is designed to provide security for digital imagery, and in particular digital imagery coded with the JPEG-2000 image coding standard. One of the primary goals of a standard is to ensure interoperability between creator and consumer implementations from different manufacturers. The JPSEC standard, similar to the popular JPEG and MPEG families of standards, specifies only the bitstream syntax and the receiver's processing, and not how the bitstream is created or the details of how it is consumed. This paper examines interoperability for the JPSEC standard, and presents an example JPSEC consumption process which can provide insights into the design of JPSEC consumers. Initial interoperability tests between different groups with independently created implementations of JPSEC creators and consumers have been successful in providing the JPSEC security services of confidentiality (via encryption) and authentication (via message authentication codes, or MACs). Further interoperability work is on-going.
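    The MAC-based authentication service can be sketched with a generic HMAC over the protected bytes, assuming a pre-shared key; the key, payload, and choice of SHA-256 below are illustrative placeholders, not the parameters JPSEC itself specifies.

```python
import hashlib
import hmac

# Creator side: compute a MAC over the protected codestream bytes.
# Key and payload are illustrative placeholders.
key = b"shared-secret-key"
codestream = b"\xff\x4f\xff\x51" + b"...payload bytes..."  # JPEG-2000-like prefix

tag = hmac.new(key, codestream, hashlib.sha256).digest()

# Consumer side: recompute the MAC and compare in constant time. Any change
# to the codestream (or a wrong key) makes verification fail.
authentic = hmac.compare_digest(
    tag, hmac.new(key, codestream, hashlib.sha256).digest())
print(authentic)
```

    Interoperability here means both sides must agree, via the signaled JPSEC syntax, on which bytes are protected, which MAC algorithm is used, and where the tag is carried.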

  1. "Media-On-Demand" multimedia electronic mail: A tool for collaboration on the web

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tsoi, Kei Nam; Rahman, S.M.

    1996-12-31

    Undoubtedly, multimedia electronic mail has many advantages for exchanging information electronically in collaborative work. The existing design of e-mail system architectures is inefficient for exchanging multimedia messages, which have much larger volumes and require more bandwidth and storage space than text-only messages. This paper presents an innovative method for exchanging multimedia mail messages in a heterogeneous environment to support collaborative work over the WWW on the Internet. We propose a "Parcel Collection" approach for exchanging multimedia electronic mail messages, which integrates current WWW technologies with existing electronic mail systems.

  2. Report on the Second Catalog Interoperability Workshop

    NASA Technical Reports Server (NTRS)

    Thieman, James R.; James, Mary E.

    1988-01-01

    The events, resolutions, and recommendations of the Second Catalog Interoperability Workshop, held at JPL in January 1988, are discussed. This workshop dealt with the issues of standardization and communication among directories, catalogs, and inventories in the earth and space science data management environment. The Directory Interchange Format, being constructed as a standard for the exchange of directory information among participating data systems, is discussed. Involvement in the interoperability effort by NASA, NOAA, USGS, and NSF is described, and plans for future interoperability are considered. The NASA Master Directory prototype is presented and critiqued, and options for additional capabilities are debated.

  3. Joining the yellow hub: Uses of the Simple Application Messaging Protocol in Space Physics analysis tools

    NASA Astrophysics Data System (ADS)

    Génot, V.; André, N.; Cecconi, B.; Bouchemit, M.; Budnik, E.; Bourrel, N.; Gangloff, M.; Dufourg, N.; Hess, S.; Modolo, R.; Renard, B.; Lormant, N.; Beigbeder, L.; Popescu, D.; Toniutti, J.-P.

    2014-11-01

    The interest in data communication between analysis tools in planetary sciences and space physics is illustrated in this paper via several examples of the uses of SAMP. The Simple Application Messaging Protocol was developed in the frame of the IVOA from an earlier protocol called PLASTIC. SAMP enables easy communication and interoperability between astronomy software, stand-alone and web-based; it is now increasingly adopted by the planetary sciences and space physics community. Its attractiveness is based, on the one hand, on the use of common file formats for exchange and, on the other hand, on established messaging models. Examples of uses at the CDPP and elsewhere are presented. The CDPP (Centre de Données de la Physique des Plasmas, http://cdpp.eu/), the French data center for plasma physics, has been engaged for more than a decade in the archiving and dissemination of data products from space missions and ground observatories. Besides these activities, the CDPP has developed services like AMDA (Automated Multi Dataset Analysis, http://amda.cdpp.eu/), which enables in-depth analysis of large amounts of data through dedicated functionalities such as visualization, conditional search and cataloging. Besides AMDA, the 3DView (http://3dview.cdpp.eu/) tool provides immersive visualizations and is being further developed to include simulation and observational data. These tools and their interactions with each other, notably via SAMP, are presented via science cases of interest to the planetary sciences and space physics communities.
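    A SAMP message itself is simple: an MType string naming the semantics, plus a map of parameters. The sketch below builds the standard "table.load.votable" message as a plain dictionary; actually delivering it requires a connection to a running SAMP hub (e.g. via astropy.samp), which is omitted here, and the URL and table name are placeholders.

```python
# A SAMP message is a map carrying an MType and its parameters.
# "table.load.votable" is a standard MType asking a peer to load a VOTable.
def make_samp_message(mtype, **params):
    return {"samp.mtype": mtype, "samp.params": params}

msg = make_samp_message(
    "table.load.votable",
    url="http://example.org/catalog.xml",  # placeholder table URL
    name="example catalog",                # human-readable label
)
print(msg["samp.mtype"])
```

    Because tools only need to agree on the MType vocabulary and the exchanged file format (here, VOTable), a plotting tool and a catalog browser from different projects can interoperate without knowing anything else about each other.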

  4. A Domain-Specific Language for Aviation Domain Interoperability

    ERIC Educational Resources Information Center

    Comitz, Paul

    2013-01-01

    Modern information systems require a flexible, scalable, and upgradeable infrastructure that allows communication and collaboration between heterogeneous information processing and computing environments. Aviation systems from different organizations often use differing representations and distribution policies for the same data and messages,…

  5. An adaptive semantic based mediation system for data interoperability among Health Information Systems.

    PubMed

    Khan, Wajahat Ali; Khattak, Asad Masood; Hussain, Maqbool; Amin, Muhammad Bilal; Afzal, Muhammad; Nugent, Christopher; Lee, Sungyoung

    2014-08-01

    Heterogeneity in the management of complex medical data obstructs the attainment of data-level interoperability among Health Information Systems (HIS). This diversity is dependent on the compliance of HISs with different healthcare standards. Its solution demands a mediation system for the accurate interpretation of data in different heterogeneous formats for achieving data interoperability. We propose an adaptive AdapteR Interoperability ENgine mediation system called ARIEN, which arbitrates between HISs compliant with different healthcare standards for accurate and seamless information exchange to achieve data interoperability. ARIEN stores the semantic mapping information between different standards in the Mediation Bridge Ontology (MBO) using ontology matching techniques. These mappings are provided by our System for Parallel Heterogeneity (SPHeRe) matching system and the Personalized-Detailed Clinical Model (P-DCM) approach to guarantee the accuracy of the mappings. The effectiveness of the mappings stored in the MBO is realized by evaluating the accuracy of the transformation process among different standard formats. We evaluated our proposed system on the transformation of medical records between the Clinical Document Architecture (CDA) and Virtual Medical Record (vMR) standards. The transformation process achieved over 90 % accuracy in the conversion between the CDA and vMR standards using a pattern-oriented approach based on the MBO. The proposed mediation system improves the overall communication process between HISs. It provides accurate and seamless medical information exchange to ensure data interoperability and timely healthcare services to patients.
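    The mediation idea can be sketched as a table-driven path rewrite; the mapping entries and field paths below are invented placeholders, far simpler than real CDA or vMR structures or the ontology matching that the MBO actually uses.

```python
# Toy mapping table in the spirit of the MBO: source-standard paths mapped to
# target-standard paths. All paths and values are invented for illustration.
MBO_MAPPINGS = {
    "ClinicalDocument/code": "clinicalStatement/observationEvent/code",
    "ClinicalDocument/effectiveTime": "clinicalStatement/observationEvent/eventTime",
}

def transform(cda_record):
    """Rewrite CDA-style paths into vMR-style paths using the mapping table;
    unmapped fields are dropped (a real mediator would flag them)."""
    return {MBO_MAPPINGS[k]: v for k, v in cda_record.items() if k in MBO_MAPPINGS}

vmr = transform({"ClinicalDocument/code": "8480-6",
                 "ClinicalDocument/effectiveTime": "20140801"})
print(vmr)
```

    The design point is that the transformation logic stays generic while all standard-specific knowledge lives in the mapping store, so supporting a new standard pair means adding mappings, not code.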

  6. Generic Educational Knowledge Representation for Adaptive and Cognitive Systems

    ERIC Educational Resources Information Center

    Caravantes, Arturo; Galan, Ramon

    2011-01-01

    The interoperability of educational systems, encouraged by the development of specifications, standards and tools related to the Semantic Web is limited to the exchange of information in domain and student models. High system interoperability requires that a common framework be defined that represents the functional essence of educational systems.…

  7. Knowledge sharing in the health scenario

    PubMed Central

    2014-01-01

    The understanding of certain data often requires the collection of similar data from different places to be analysed and interpreted. Interoperability standards and ontologies are facilitating data interchange around the world. However, beyond the existing networks and advances in data transfer, data sharing protocols that support multilateral agreements are useful to exploit the knowledge of distributed Data Warehouses. Access to a certain data set in a federated Data Warehouse may be constrained by the requirement to deliver another specific data set. When bilateral agreements between two nodes of a network are not enough to resolve the constraints on accessing a certain data set, multilateral agreements for data exchange are needed. We present the implementation of a Multi-Agent System for multilateral exchange agreements of clinical data, and evaluate how those multilateral agreements increase the percentage of data collected by a single node from the total amount of data available in the network. Different strategies to reduce the number of messages needed to achieve an agreement are also considered. The results show that with this collaborative sharing scenario the percentage of data collected improves dramatically from bilateral agreements to multilateral ones, reaching almost all the data available in the network. PMID:25471452
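    A minimal sketch of why multilateral agreements help: with three nodes whose wants and holdings form a cycle, no bilateral swap satisfies the access constraints, but a three-way agreement satisfies all of them. The node names and data sets below are invented for illustration.

```python
# Each node holds one data set and wants one it does not hold.
wants = {"A": "b_data", "B": "c_data", "C": "a_data"}
holds = {"A": "a_data", "B": "b_data", "C": "c_data"}

def bilateral_possible(x, y):
    """A two-node swap works only if each side holds what the other wants."""
    return wants[x] == holds[y] and wants[y] == holds[x]

pairs = [("A", "B"), ("B", "C"), ("A", "C")]
print(any(bilateral_possible(x, y) for x, y in pairs))  # no bilateral deal works

# A multilateral cycle (A receives from B, B from C, C from A) satisfies all.
cycle_ok = all(wants[x] == holds[y]
               for x, y in [("A", "B"), ("B", "C"), ("C", "A")])
print(cycle_ok)
```

    Discovering such cycles among many nodes is what the agent negotiation handles, and the messaging strategies in the paper aim to find them with as few exchanged messages as possible.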

  8. A piloted simulation study of data link ATC message exchange

    NASA Technical Reports Server (NTRS)

    Waller, Marvin C.; Lohr, Gary W.

    1989-01-01

    Data link Air Traffic Control (ATC) and Air Traffic Service (ATS) message and data exchange offers the potential benefits of increased flight safety and efficiency by reducing communication errors and allowing more information to be transferred between aircraft and ground facilities. Digital communication also presents an opportunity to relieve the overloading of ATC radio frequencies which hampers message exchange during peak traffic hours in many busy terminal areas. A piloted simulation study to develop pilot factor guidelines and assess potential flight crew benefits and liabilities from using data link ATC message exchange was completed. The data link ATC message exchange concept, implemented on an existing navigation computer Control Display Unit (CDU) required maintaining a voice radio telephone link with an appropriate ATC facility. Flight crew comments, scanning behavior, and measurements of time spent in ATC communication activities for data link ATC message exchange were compared to similar measures for simulated conventional voice radio operations. The results show crew preference for the quieter flight deck environment and a perception of lower communication workload.

  9. The OGC Sensor Web Enablement framework

    NASA Astrophysics Data System (ADS)

    Cox, S. J.; Botts, M.

    2006-12-01

    Sensor observations are at the core of natural sciences. Improvements in data-sharing technologies offer the promise of much greater utilisation of observational data. A key to this is interoperable data standards. The Open Geospatial Consortium's (OGC) Sensor Web Enablement initiative (SWE) is developing open standards for web interfaces for the discovery, exchange and processing of sensor observations, and tasking of sensor systems. The goal is to support the construction of complex sensor applications through real-time composition of service chains from standard components. The framework is based around a suite of standard interfaces, and standard encodings for the messages transferred between services. The SWE interfaces include: Sensor Observation Service (SOS)-parameterized observation requests (by observation time, feature of interest, property, sensor); Sensor Planning Service (SPS)-tasking a sensor system to undertake future observations; Sensor Alert Service (SAS)-subscription to an alert, usually triggered by a sensor result exceeding some value. The interface design generally follows the pattern established in the OGC Web Map Service (WMS) and Web Feature Service (WFS) interfaces, where the interaction between a client and service follows a standard sequence of requests and responses. The first request obtains a general description of the service capabilities, the next obtains the detail required to formulate a data request, and the last requests a data instance or stream. These may be implemented in a stateless "REST" idiom, or using conventional "web-services" (SOAP) messaging. In a deployed system, the SWE interfaces are supplemented by Catalogue, data (WFS) and portrayal (WMS) services, as well as authentication and rights management.
The standard SWE data formats are Observations and Measurements (O&M) which encodes observation metadata and results, Sensor Model Language (SensorML) which describes sensor-systems, Transducer Model Language (TML) which covers low-level data streams, and domain-specific GML Application Schemas for definitions of the target feature types. The SWE framework has been demonstrated in several interoperability testbeds. These were based around emergency management, security, contamination and environmental monitoring scenarios.
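
    The three-step request sequence described above (capabilities, then request detail, then data) can be sketched as OGC-style key-value-pair URLs. The endpoint, offering, and procedure identifiers below are placeholders, not a real SOS deployment.

```python
from urllib.parse import urlencode

SOS = "https://example.org/sos"  # placeholder endpoint, not a live service


def kvp(request, **params):
    """Build a key-value-pair SOS request URL in the WMS/WFS-style idiom."""
    return SOS + "?" + urlencode({"service": "SOS", "version": "1.0.0",
                                  "request": request, **params})


# 1. Obtain a general description of the service capabilities.
caps_url = kvp("GetCapabilities")

# 2. Obtain the detail needed to formulate a data request.
desc_url = kvp("DescribeSensor", procedure="urn:example:sensor:42")

# 3. Request an observation, parameterized as the abstract describes
#    (observation time, offering, observed property).
obs_url = kvp("GetObservation", offering="TEMPERATURE",
              observedProperty="urn:example:property:temperature",
              eventTime="2006-12-01T00:00:00Z/2006-12-02T00:00:00Z")
```

    The same sequence can equally be carried as SOAP message bodies; the KVP form shown here is the stateless "REST" idiom the abstract mentions.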

  10. Joint Interoperability of Tactical Command and Control Systems (JINTACCS) Technical Interface Design Plan (Test Edition). Appendix C. Glossary

    DTIC Science & Technology

    1981-03-01

    ...responsibilities to the Defense Intelligence Agency (DIA), the Services, and the unified and specified commands for carrying out that guidance. 3 - JCS...Testing 1-2 1.4 JINTACCS Message Text Formats (MTF) and TADIL Messages 1-2 1.4.1 JINTACCS Message Text Formats (MTF) 1-3 1.4.2 Developmental TADIL

  11. Data Quality and Interoperability Challenges for eHealth Exchange Participants: Observations from the Department of Veterans Affairs' Virtual Lifetime Electronic Record Health Pilot Phase.

    PubMed

    Botts, Nathan; Bouhaddou, Omar; Bennett, Jamie; Pan, Eric; Byrne, Colene; Mercincavage, Lauren; Olinger, Lois; Hunolt, Elaine; Cullen, Theresa

    2014-01-01

    Authors studied the United States (U.S.) Department of Veterans Affairs' (VA) Virtual Lifetime Electronic Record (VLER) Health pilot phase relative to two attributes of data quality: the adoption of eHealth Exchange data standards, and the clinical content exchanged. The VLER Health pilot was an early effort in testing implementation of eHealth Exchange standards and technology. Testing included evaluation of exchange data from the VLER Health pilot site partners: VA, U.S. Department of Defense (DoD), and private sector health care organizations. The assessed domains covered data quality and interoperability as they relate to: 1) conformance with data standards related to the underlying structure of C32 Summary Documents (C32) produced by eHealth Exchange partners; and 2) the types of C32 clinical content exchanged. This analysis identified several standards non-conformance issues in sample C32 files and informed further discourse on the methods needed to effectively monitor Health Information Exchange (HIE) data content and standards conformance.

  12. The NASA Scientific and Technical Information (STI) Program's Implementation of Open Archives Initiation (OAI) for Data Interoperability and Data Exchange

    NASA Technical Reports Server (NTRS)

    Rocker, JoAnne; Roncaglia, George J.; Heimerl, Lynn N.; Nelson, Michael L.

    2002-01-01

    Interoperability and data-exchange are critical for the survival of government information management programs. E-government initiatives are transforming the way the government interacts with the public. More information is to be made available through web-enabled technologies. Programs such as the NASA Scientific and Technical Information (STI) Program Office are tasked to find more effective ways to disseminate information to the public. The NASA STI Program is an agency-wide program charged with gathering, organizing, storing, and disseminating NASA-produced information for research and public use. The program is investigating the use of a new protocol called the Open Archives Initiative (OAI) as a means to improve data interoperability and data collection. OAI promotes the use of the OAI harvesting protocol as a simple way for data sharing among repositories. In two separate initiatives, the STI Program is implementing OAI. In the first, in collaboration with the Air Force, Department of Energy, and Old Dominion University, the NASA STI Program has funded research on implementing the OAI to exchange data between the three organizations. The second initiative is the deployment of OAI for the NASA technical report server (TRS) environment. The NASA TRS environment is comprised of distributed technical report servers with a centralized search interface. This paper focuses on the implementation of OAI to promote interoperability among diverse data repositories.
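
    The harvesting pattern the STI Program describes is simple in practice: a harvester issues an OAI-PMH `ListRecords` request and parses Dublin Core metadata out of the XML response. A minimal sketch follows; the repository URL is a placeholder and the sample response is hand-made for illustration.

```python
import xml.etree.ElementTree as ET
from urllib.parse import urlencode

OAI = "http://oai.example.org/oai"  # placeholder repository base URL

DC_TITLE = "{http://purl.org/dc/elements/1.1/}title"


def list_records_url(metadata_prefix="oai_dc", **kw):
    """Build an OAI-PMH ListRecords request URL."""
    return OAI + "?" + urlencode({"verb": "ListRecords",
                                  "metadataPrefix": metadata_prefix, **kw})


def titles(response_xml):
    """Pull the Dublin Core titles out of a ListRecords response."""
    root = ET.fromstring(response_xml)
    return [t.text for t in root.iter(DC_TITLE)]


# Hand-made sample response (abbreviated to the elements we parse):
sample = """<OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/">
 <ListRecords><record><metadata>
  <dc xmlns="http://purl.org/dc/elements/1.1/"><title>Report A</title></dc>
 </metadata></record></ListRecords></OAI-PMH>"""
```

    A real harvester would fetch `list_records_url()` over HTTP and follow `resumptionToken` elements across batches; the parsing step is as shown.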

  13. A SOA-Based Platform to Support Clinical Data Sharing.

    PubMed

    Gazzarata, R; Giannini, B; Giacomini, M

    2017-01-01

    The eSource Data Interchange Group, part of the Clinical Data Interchange Standards Consortium, proposed five scenarios to guide stakeholders in the development of solutions for the capture of eSource data. The fifth scenario was subdivided into four tiers to adapt the functionality of electronic health records to support clinical research. In order to develop a system belonging to the "Interoperable" Tier, the authors decided to adopt the service-oriented architecture paradigm to support technical interoperability, Health Level Seven Version 3 messages combined with LOINC (Logical Observation Identifiers Names and Codes) vocabulary to ensure semantic interoperability, and Healthcare Services Specification Project standards to provide process interoperability. The developed architecture enhances the integration between patient-care practice and medical research, allowing clinical data sharing between two hospital information systems and four clinical data management systems/clinical registries. The core is formed by a set of standardized cloud services connected through standardized interfaces, involving client applications. The system was approved by a medical staff, since it reduces the workload for the management of clinical trials. Although this architecture can realize the "Interoperable" Tier, the current solution actually covers the "Connected" Tier, due to local hospital policy restrictions.

  14. DIMP: an interoperable solution for software integration and product data exchange

    NASA Astrophysics Data System (ADS)

    Wang, Xi Vincent; Xu, Xun William

    2012-08-01

    Today, globalisation has become one of the main trends of manufacturing business that has led to a world-wide decentralisation of resources amongst not only individual departments within one company but also business partners. However, despite the development and improvement in the last few decades, difficulties in information exchange and sharing still exist in heterogeneous applications environments. This article is divided into two parts. In the first part, related research work and integrating solutions are reviewed and discussed. The second part introduces a collaborative environment called distributed interoperable manufacturing platform, which is based on a module-based, service-oriented architecture (SOA). In the platform, the STEP-NC data model is used to facilitate data-exchange among heterogeneous CAD/CAM/CNC systems.

  15. An observational study of the relationship between meaningful use-based electronic health information exchange, interoperability, and medication reconciliation capabilities.

    PubMed

    Elysee, Gerald; Herrin, Jeph; Horwitz, Leora I

    2017-10-01

    Stagnation in hospitals' adoption of data integration functionalities coupled with reduction in the number of operational health information exchanges could become a significant impediment to hospitals' adoption of 3 critical capabilities: electronic health information exchange, interoperability, and medication reconciliation, in which electronic systems are used to assist with resolving medication discrepancies and improving patient safety. Against this backdrop, we assessed the relationships between the 3 capabilities. We conducted an observational study applying the partial least squares-structural equation modeling technique to 27 variables obtained from the 2013 American Hospital Association annual survey Information Technology (IT) supplement, which describes health IT capabilities. We included 1330 hospitals. In confirmatory factor analysis, 15 of the 27 variables achieved loading values greater than 0.548 at P < .001 and, as such, were validated as the building blocks of the 3 capabilities. Subsequent path analysis showed a significant, positive, and cyclic relationship between the capabilities, in that decreases in the hospitals' adoption of one would lead to decreases in the adoption of the others. These results show that capability for high quality medication reconciliation may be impeded by lagging adoption of interoperability and health information exchange capabilities. Policies focused on improving one or more of these capabilities may have ancillary benefits.

  16. Chief Information Officer's Role in Adopting an Interoperable Electronic Health Record System for Medical Data Exchange

    ERIC Educational Resources Information Center

    Akpabio, Akpabio Enebong Ema

    2013-01-01

    Despite huge growth in hospital technology systems, there remains a dearth of literature examining health care administrator's perceptions of the efficacy of interoperable EHR systems. A qualitative research methodology was used in this multiple-case study to investigate the application of diffusion of innovations theory and the technology…

  17. 75 FR 3773 - Self-Regulatory Organizations; Chicago Board Options Exchange, Inc.; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-22

    ... on the Exchange's website ( http://www.cboe.org/legal ), at the Exchange's Office of the Secretary... message traffic. The Exchange believes liquidity providers generally are quoting more efficiently in... capacity to handle greater quote message traffic. Accordingly, the Exchange believes it would be...

  18. An open repositories network development for medical teaching resources.

    PubMed

    Soula, Gérard; Darmoni, Stefan; Le Beux, Pierre; Renard, Jean-Marie; Dahamna, Badisse; Fieschi, Marius

    2010-01-01

    The lack of interoperability between repositories of heterogeneous and geographically widespread data is an obstacle to the diffusion, sharing and reutilization of those data. We present the development of an open repositories network taking into account both the syntactic and semantic interoperability of the different repositories and based on international standards in this field. The network is used by the medical community in France for the diffusion and sharing of digital teaching resources. The syntactic interoperability of the repositories is managed using the OAI-PMH protocol for the exchange of metadata describing the resources. Semantic interoperability is based, on one hand, on the LOM standard for the description of resources and on MESH for the indexing of the latter and, on the other hand, on semantic interoperability management designed to optimize compliance with standards and the quality of the metadata.

  19. Real-time reference: the use of chat technology to improve point of need assistance.

    PubMed

    Connor, Elizabeth

    2002-01-01

    Chat reference refers to the use of instant messaging and call center software to support interactive text or voice communication with library patrons. Instant messaging has been integrated into many e-commerce environments, and into the social lives of many teenagers and young adults, affording a level of immediacy and intimacy not possible with e-mail applications. The convergence and interoperability of new and emerging technologies can be used to develop new communities of users that view libraries as being essential to their education, patient care, and research activities.

  20. Advanced radiology information system.

    PubMed

    Kolovou, L; Vatousi, M; Lymperopoulos, D; Koukias, M

    2005-01-01

    The innovative features of an advanced Radiology Information System (RIS) are presented in this paper. The interoperability of the RIS with the other intra-hospital information systems it interacts with, addressing compatibility and open-architecture issues, is accomplished by two novel mechanisms [1]. The first one is the particular message handling system that is applied for the exchange of information, according to the Health Level Seven (HL7) protocol's specifications, and serves the transfer of medical and administrative data among the RIS applications and data store unit. The same mechanism allows the secure and HL7-compatible interactions with the Hospital Information System (HIS) too. The second one implements the translation of information between the formats that the HL7 and Digital Imaging and Communication in Medicine (DICOM) protocols specify, providing the communication between the RIS and the Picture Archiving and Communication System (PACS). The whole structure ensures the automation of the every-day procedures that the 'medical protocol' specifies and provides its services through a friendly and easy to manage graphical user interface.
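
    The HL7 v2 messages such a handling mechanism moves between the RIS and HIS are pipe-delimited segments. A minimal parsing sketch is shown below; the message content is invented for illustration, and the sketch ignores HL7 conventions such as the field separator counting as MSH-1.

```python
def parse_hl7(message):
    """Split an HL7 v2 message into {segment_id: [list of field lists]}.
    Segments are separated by carriage returns, fields by pipes."""
    segments = {}
    for line in filter(None, message.strip().split("\r")):
        fields = line.split("|")
        segments.setdefault(fields[0], []).append(fields[1:])
    return segments


# Invented example of an admission message a RIS might receive from a HIS:
msg = ("MSH|^~\\&|HIS|HOSPITAL|RIS|HOSPITAL|200501011200||ADT^A01|123|P|2.3\r"
       "PID|1||12345^^^HOSP||DOE^JOHN")
parsed = parse_hl7(msg)
# parsed["PID"][0][2] is the patient identifier field, "12345^^^HOSP"
```

    A production message handler would additionally split composite fields on `^`, honor the separators declared in MSH, and generate acknowledgement (ACK) messages.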

  1. An Assessment of Information Exchange Practices, Challenges, and Opportunities to Support US Disease Surveillance in 3 States.

    PubMed

    Garcia, Macarena C; Garrett, Nedra Y; Singletary, Vivian; Brown, Sheereen; Hennessy-Burt, Tamara; Haney, Gillian; Link, Kimberly; Tripp, Jennifer; Mac Kenzie, William R; Yoon, Paula

    2017-12-07

    State and local public health agencies collect and use surveillance data to identify outbreaks, track cases, investigate causes, and implement measures to protect the public's health through various surveillance systems and data exchange practices. The purpose of this assessment was to better understand current practices at state and local public health agencies for collecting, managing, processing, reporting, and exchanging notifiable disease surveillance information. Over an 18-month period (January 2014-June 2015), we evaluated the process of data exchange between surveillance systems, reporting burdens, and challenges within 3 states (California, Idaho, and Massachusetts) that were using 3 different reporting systems. All 3 states use a combination of paper-based and electronic information systems for managing and exchanging data on reportable conditions within the state. The flow of data from local jurisdictions to the state health departments varies considerably. When state and local information systems are not interoperable, manual duplicative data entry and other work-arounds are often required. The results of the assessment show the complexity of disease reporting at the state and local levels and the multiple systems, processes, and resources engaged in preparing, processing, and transmitting data that limit interoperability and decrease efficiency. Through this structured assessment, the Centers for Disease Control and Prevention (CDC) has a better understanding of the complexities of surveillance using commercial off-the-shelf data systems (California and Massachusetts) and the CDC-developed National Electronic Disease Surveillance System Base System. More efficient data exchange and use of data will help facilitate interoperability between National Notifiable Diseases Surveillance Systems.

  2. Interoperability and models for exchange of data between information systems in public administration

    NASA Astrophysics Data System (ADS)

    Glavev, Victor

    2016-12-01

    The types of software applications used by public administrations can be divided in three main groups: document management systems, record management systems and business process systems. Each one of them generates outputs that can be used as input data to the others. This is the main reason that requires exchange of data between these three groups and well defined models that should be followed. There are also many other reasons that will be discussed in the paper. Interoperability is a key aspect when those models are implemented, especially when there are different manufactures of systems in the area of software applications used by public authorities. The report includes examples of implementation of models for exchange of data between software systems deployed in one of the biggest administration in Bulgaria.

  3. Breaking barriers to interoperability: assigning spatially and temporally unique identifiers to spaces and buildings.

    PubMed

    Pyke, Christopher R; Madan, Isaac

    2013-08-01

    The real estate industry routinely uses specialized information systems for functions including design, construction, facilities management, brokerage, tax assessment, and utilities. These systems are mature and effective within vertically integrated market segments. However, new questions are reaching across these traditional information silos. For example, buyers may be interested in evaluating the design, energy efficiency characteristics, and operational performance of a commercial building. This requires the integration of information across multiple databases held by different institutions. Today, this type of data integration is difficult to automate and prone to errors due, in part, to the lack of generally accepted building and space identifiers. Moving forward, the real estate industry needs a new mechanism to assign identifiers for whole buildings and interior spaces for the purpose of interoperability, data exchange, and integration. This paper describes a systematic process to identify activities occurring at buildings or within interior spaces to provide a foundation for exchange and interoperability. We demonstrate the application of the approach with a prototype Web application. This concept and demonstration illustrate the elements of a practical interoperability framework that can increase productivity, create new business opportunities, and reduce errors, waste, and redundancy. © 2013 New York Academy of Sciences.

  4. A review on digital ECG formats and the relationships between them.

    PubMed

    Trigo, Jesús Daniel; Alesanco, Alvaro; Martínez, Ignacio; García, José

    2012-05-01

    A plethora of digital ECG formats have been proposed and implemented. This heterogeneity hinders the design and development of interoperable systems and entails critical integration issues for the healthcare information systems. This paper aims at performing a comprehensive overview on the current state of affairs of the interoperable exchange of digital ECG signals. This includes 1) a review on existing digital ECG formats, 2) a collection of applications and cardiology settings using such formats, 3) a compilation of the relationships between such formats, and 4) a reflection on the current situation and foreseeable future of the interoperable exchange of digital ECG signals. The objectives have been approached by completing and updating previous reviews on the topic through appropriate database mining. 39 digital ECG formats, 56 applications, tools or implantation experiences, 47 mappings/converters, and 6 relationships between such formats have been found in the literature. The creation and generalization of a single standardized ECG format is a desirable goal. However, this unification requires political commitment and international cooperation among different standardization bodies. Ongoing ontology-based approaches covering ECG domain have recently emerged as a promising alternative for reaching fully fledged ECG interoperability in the near future.

  5. Interoperable and standard e-Health solution over Bluetooth.

    PubMed

    Martinez, I; Del Valle, P; Munoz, P; Trigo, J D; Escayola, J; Martínez-Espronceda, M; Muñoz, A; Serrano, L; Garcia, J

    2010-01-01

    The new paradigm of e-Health demands open sensors and middleware components that permit transparent integration and end-to-end interoperability of new personal health devices. The use of standards seems to be the internationally adopted way to solve these problems. This paper presents the implementation of an end-to-end standards-based e-Health solution. This includes ISO/IEEE11073 standard for the interoperability of the medical devices in the patient environment and EN13606 standard for the interoperable exchange of the Electronic Healthcare Record. The design strictly fulfills all the technical features of the most recent versions of both standards. The implemented prototype has been tested in a laboratory environment to demonstrate its feasibility for its further transfer to the healthcare system.

  6. The value of health care information exchange and interoperability.

    PubMed

    Walker, Jan; Pan, Eric; Johnston, Douglas; Adler-Milstein, Julia; Bates, David W; Middleton, Blackford

    2005-01-01

    In this paper we assess the value of electronic health care information exchange and interoperability (HIEI) between providers (hospitals and medical group practices) and independent laboratories, radiology centers, pharmacies, payers, public health departments, and other providers. We have created an HIEI taxonomy and combined published evidence with expert opinion in a cost-benefit model. Fully standardized HIEI could yield a net value of $77.8 billion per year once fully implemented. Nonstandardized HIEI offers smaller positive financial returns. The clinical impact of HIEI for which quantitative estimates cannot yet be made would likely add further value. A compelling business case exists for national implementation of fully standardized HIEI.

  7. Interoperability prototype between hospitals and general practitioners in Switzerland.

    PubMed

    Alves, Bruno; Müller, Henning; Schumacher, Michael; Godel, David; Abu Khaled, Omar

    2010-01-01

    Interoperability in data exchange has the potential to improve the care processes and decrease costs of the health care system. Many countries have related eHealth initiatives in preparation or already implemented. In this area, Switzerland has yet to catch up. Its health system is fragmented, because of the federated nature of cantons. It is thus more difficult to coordinate efforts between the existing healthcare actors. In the Medicoordination project a pragmatic approach was selected: integrating several partners in healthcare on a regional scale in French speaking Switzerland. In parallel with the Swiss eHealth strategy, currently being elaborated by the Swiss confederation, particularly medium-sized hospitals and general practitioners were targeted in Medicoordination to implement concrete scenarios of information exchange between hospitals and general practitioners with a high added value. In this paper we focus our attention on a prototype implementation of one chosen scenario: the discharge summary. Although simple in concept, exchanging release letters shows small, hidden difficulties due to the multi-partner nature of the project. The added value of such a prototype is potentially high and it is now important to show that interoperability can work in practice.

  8. A cloud-based approach for interoperable electronic health records (EHRs).

    PubMed

    Bahga, Arshdeep; Madisetti, Vijay K

    2013-09-01

    We present a cloud-based approach for the design of interoperable electronic health record (EHR) systems. Cloud computing environments provide several benefits to all the stakeholders in the healthcare ecosystem (patients, providers, payers, etc.). Lack of data interoperability standards and solutions has been a major obstacle in the exchange of healthcare data between different stakeholders. We propose an EHR system - cloud health information systems technology architecture (CHISTAR) - that achieves semantic interoperability through the use of a generic design methodology which uses a reference model that defines a general purpose set of data structures and an archetype model that defines the clinical data attributes. CHISTAR application components are designed using the cloud component model approach that comprises loosely coupled components that communicate asynchronously. In this paper, we describe the high-level design of CHISTAR and the approaches for semantic interoperability, data integration, and security.

  9. TMDD & MS/ETMCC guide : standard for functional level traffic management data dictionary (TMDD) and message sets for external traffic management center communications

    DOT National Transportation Integrated Search

    2000-01-01

    These draft standards are intended to work together to provide a high level of interoperability among regional and local traffic control centers. They provide consistent names, definitions and concepts similar to spelling and parts of speech to world...

  10. Do interoperable national information systems enhance availability of data to assess the effect of scale-up of HIV services on health workforce deployment in resource-limited countries?

    PubMed

    Oluoch, Tom; Muturi, David; Kiriinya, Rose; Waruru, Anthony; Lanyo, Kevin; Nguni, Robert; Ojwang, James; Waters, Keith P; Richards, Janise

    2015-01-01

    Sub-Saharan Africa (SSA) bears the heaviest burden of the HIV epidemic. Health workers play a critical role in the scale-up of HIV programs. SSA also has the weakest information and communication technology (ICT) infrastructure globally. Implementing interoperable national health information systems (HIS) is a challenge, even in developed countries. Countries in resource-limited settings have yet to demonstrate that interoperable systems can be achieved, and can improve quality of healthcare through enhanced data availability and use in the deployment of the health workforce. We established interoperable HIS integrating a Master Facility List (MFL), District Health Information Software (DHIS2), and Human Resources Information Systems (HRIS) through application programming interfaces (APIs). We abstracted data on HIV care, health worker deployment, and health facility geo-coordinates. Over 95% of data elements were exchanged between the MFL-DHIS and HRIS-DHIS. The correlations between the number of HIV-positive clients and the numbers of nurses and clinical officers in 2013 were R2=0.251 and R2=0.261, respectively. Wrong MFL codes, data-type mismatches, and hyphens in legacy data were key causes of data transmission errors. Lack of information exchange standards for aggregate data made programming time-consuming.
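
    The transmission errors the abstract names (wrong MFL codes, data-type mismatches, hyphens in legacy identifiers) are exactly the kind that pre-transmission validation catches. The sketch below illustrates that idea; the five-digit code format and field names are assumptions for illustration, not the actual MFL specification.

```python
import re

MFL_CODE = re.compile(r"^\d{5}$")  # assumed format: five-digit numeric code


def clean_record(record):
    """Normalize a legacy record and flag problems that would otherwise
    fail during exchange between the MFL, DHIS2, and HRIS endpoints.
    Returns (cleaned_record, None) on success or (None, error) on failure."""
    # Strip hyphens left over in legacy identifiers before validating.
    code = str(record.get("mfl_code", "")).replace("-", "")
    if not MFL_CODE.match(code):
        return None, f"invalid MFL code: {record.get('mfl_code')!r}"
    try:
        clients = int(record["hiv_clients"])  # catch type mismatches early
    except (KeyError, ValueError):
        return None, "hiv_clients is missing or not an integer"
    return {"mfl_code": code, "hiv_clients": clients}, None
```

    Running every record through a gate like this before the API push converts silent mid-transfer failures into an explicit error report.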

  11. Extending the Wireshark Network Protocol Analyser to Decode Link 16 Tactical Data Link Messages

    DTIC Science & Technology

    2014-01-01

    Interoperability Workshop 2003, Paper No. 03F-SIW-002. 6. Boardman, B., (2008), Introduction to Tactical Data Links in the ADF, accessed from <http...2008, Paper No. 08E-SIW-046. 19. Lamping, U., (2013), Wireshark Developer's Guide for Wireshark 1.11, accessed from <http://www.wireshark.org/docs

  12. Standardized exchange of clinical documents--towards a shared care paradigm in glaucoma treatment.

    PubMed

    Gerdsen, F; Müller, S; Jablonski, S; Prokosch, H-U

    2006-01-01

    The exchange of medical data from research and clinical routine across institutional borders is essential to establish an integrated healthcare platform. In this project we want to realize the standardized exchange of medical data between different healthcare institutions to implement an integrated and interoperable information system supporting clinical treatment and research of glaucoma. The central point of our concept is a standardized communication model based on the Clinical Document Architecture (CDA). Further, a communication concept between different health care institutions applying the developed document model has been defined. With our project we have been able to prove that standardized communication between an Electronic Medical Record (EMR), an Electronic Health Record (EHR) and the Erlanger Glaucoma Register (EGR) based on the established conceptual models, which rely on CDA rel.1 level 1 and SCIPHOX, could be implemented. The HL7-tool-based deduction of a suitable CDA rel.2 compliant schema showed significant differences when compared with the manually created schema. Finally fundamental requirements, which have to be implemented for an integrated health care platform, have been identified. An interoperable information system can enhance both clinical treatment and research projects. By automatically transferring screening findings from a glaucoma research project to the electronic medical record of our ophthalmology clinic, clinicians could benefit from the availability of a longitudinal patient record. The CDA as a standard for exchanging clinical documents has demonstrated its potential to enhance interoperability within a future shared care paradigm.

  13. Semantically Interoperable XML Data

    PubMed Central

    Vergara-Niedermayr, Cristobal; Wang, Fusheng; Pan, Tony; Kurc, Tahsin; Saltz, Joel

    2013-01-01

    XML is ubiquitously used as an information exchange platform for web-based applications in healthcare, life sciences, and many other domains. Proliferating XML data are now managed through the latest native XML database technologies. XML data sources conforming to common XML schemas can be shared and integrated with syntactic interoperability. Semantic interoperability can be achieved through semantic annotations of data models using common data elements linked to concepts from ontologies. In this paper, we present a framework and software system to support the development of semantically interoperable XML-based data sources that can be shared through a Grid infrastructure. We also present our work on supporting semantically validated XML data through semantic annotations for XML Schema, semantic validation, and semantic authoring of XML data. We demonstrate the use of the system for a biomedical database of medical image annotations and markups. PMID:25298789
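
    The distinction the paper draws between syntactic and semantic validation can be illustrated simply: a document can be schema-valid yet carry codes that resolve to no concept in the shared ontology. The element name, value set, and document below are invented examples, not the paper's common data elements.

```python
import xml.etree.ElementTree as ET

# Hypothetical common-data-element registry: element name -> concept codes
# allowed by the ontology-backed value set.
VALUE_SETS = {"diagnosis": {"C50.9", "C61"}}


def semantic_errors(xml_text):
    """Return a list of semantic-validation errors: coded elements whose
    code attribute is not a concept in the registered value set."""
    errors = []
    root = ET.fromstring(xml_text)
    for name, allowed in VALUE_SETS.items():
        for el in root.iter(name):
            if el.get("code") not in allowed:
                errors.append(f"{name}: code {el.get('code')!r} not in value set")
    return errors


# Syntactically well-formed, but the second diagnosis code is meaningless:
doc = '<record><diagnosis code="C50.9"/><diagnosis code="XYZ"/></record>'
```

    `semantic_errors(doc)` flags only the second element, which an XML Schema check alone would accept.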

  14. Architecture for interoperable software in biology.

    PubMed

    Bare, James Christopher; Baliga, Nitin S

    2014-07-01

    Understanding biological complexity demands a combination of high-throughput data and interdisciplinary skills. One way to bring to bear the necessary combination of data types and expertise is by encapsulating domain knowledge in software and composing that software to create a customized data analysis environment. To this end, simple flexible strategies are needed for interconnecting heterogeneous software tools and enabling data exchange between them. Drawing on our own work and that of others, we present several strategies for interoperability and their consequences, in particular, a set of simple data structures--list, matrix, network, table and tuple--that have proven sufficient to achieve a high degree of interoperability. We provide a few guidelines for the development of future software that will function as part of an interoperable community of software tools for biological data analysis and visualization. © The Author 2012. Published by Oxford University Press.

  15. The Long Road to Semantic Interoperability in Support of Public Health: Experiences from Two States

    PubMed Central

    Vreeman, Daniel J.; Grannis, Shaun J.

    2014-01-01

    Proliferation of health information technologies creates opportunities to improve clinical and public health, including high quality, safer care and lower costs. To maximize such potential benefits, health information technologies must readily and reliably exchange information with other systems. However, evidence from public health surveillance programs in two states suggests that operational clinical information systems often fail to use available standards, a barrier to semantic interoperability. Furthermore, analysis of existing policies incentivizing semantic interoperability suggests they have limited impact and are fragmented. In this essay, we discuss three approaches for increasing semantic interoperability to support national goals for using health information technologies. A clear, comprehensive strategy requiring collaborative efforts by clinical and public health stakeholders is suggested as a guide for the long road towards better population health data and outcomes. PMID:24680985

  16. An Evaluation of Two Methods for Generating Synthetic HL7 Segments Reflecting Real-World Health Information Exchange Transactions

    PubMed Central

    Mwogi, Thomas S.; Biondich, Paul G.; Grannis, Shaun J.

    2014-01-01

    Motivated by the need for readily available data for testing an open-source health information exchange platform, we developed and evaluated two methods for generating synthetic messages. The methods used HL7 version 2 messages obtained from the Indiana Network for Patient Care. Data from both methods were analyzed to assess how effectively the output reflected the original ‘real-world’ data. The Markov Chain method (MCM) used an algorithm based on a transitional probability matrix, while the Music Box model (MBM) randomly selected messages of a particular trigger type from the original data to generate new messages. The MBM was faster, generated shorter messages, and exhibited less variation in message length. The MCM required more computational power and generated longer messages with greater variability in message length. Both methods exhibited adequate coverage, producing a high proportion of messages consistent with the original messages. Both methods yielded similar rates of valid messages. PMID:25954458
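    A minimal sketch of the Markov Chain method, assuming transition probabilities learned over HL7 v2 segment types: fit transition counts from real segment sequences, then walk the chain to emit a synthetic sequence. The tiny training set here is invented; real messages would supply the sequences:

```python
import random

# Invented segment-type sequences standing in for real HL7 v2 messages.
training = [
    ["MSH", "PID", "PV1", "OBX"],
    ["MSH", "PID", "OBX", "OBX"],
    ["MSH", "PID", "PV1", "OBX", "OBX"],
]

def fit_transitions(sequences):
    """Count segment-to-segment transitions (the transitional probability
    matrix, kept as raw counts; a terminal END state marks message end)."""
    counts = {}
    for seq in sequences:
        for a, b in zip(seq, seq[1:] + ["END"]):
            counts.setdefault(a, {}).setdefault(b, 0)
            counts[a][b] += 1
    return counts

def generate(counts, start="MSH", rng=None):
    """Walk the chain from the start segment until END is drawn."""
    rng = rng or random.Random(42)
    seq, state = [start], start
    while state != "END":
        state = rng.choices(list(counts[state]),
                            weights=counts[state].values())[0]
        if state != "END":
            seq.append(state)
    return seq

model = fit_transitions(training)
print(generate(model))
```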

  17. Building Future Transatlantic Interoperability Around a Robust NATO Response Force

    DTIC Science & Technology

    2012-10-01

    than already traveled. However, this accrued wealth of interoperable capability may be at its apogee, soon to decline as the result of two looming...and Bydgoszcz, Poland, as well as major national training centers such as the bilateral U.S.-Romanian Joint Task Force–East at Kogalniceanu...operations. Increase U.S. and Allied Exchange Students at National and NATO military schools. Austerity measures may eventually affect the investment

  18. The National Opportunity for Interoperability and its Benefits for a Reliable, Robust, and Future Grid Realized Through Buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None, None

    Today, increasing numbers of intermittent generation sources (e.g., wind and photovoltaic) and new mobile intermittent loads (e.g., electric vehicles) can significantly affect traditional utility business practices and operations. At the same time, a growing number of technologies and devices, from appliances to lighting systems, are being deployed at consumer premises that have more sophisticated controls and information that remain underused for anything beyond basic building equipment operations. The intersection of these two drivers is an untapped opportunity and underused resource that, if appropriately configured and realized in open standards, can provide significant energy efficiency and commensurate savings on utility bills, enhanced and lower cost reliability to utilities, and national economic benefits in the creation of new markets, sectors, and businesses being fueled by the seamless coordination of energy and information through device and technology interoperability. Or, as the Quadrennial Energy Review puts it, “A plethora of both consumer-level and grid-level devices are either in the market, under development, or at the conceptual stage. When tied together through the information technology that is increasingly being deployed on electric utilities’ distribution grids, they can be an important enabling part of the emerging grid of the future. However, what is missing is the ability for all of these devices to coordinate and communicate their operations with the grid, and among themselves, in a common language — an open standard.” In this paper, we define interoperability as the ability to exchange actionable information between two or more systems within a home or building, or across and within organizational boundaries. Interoperability relies on the shared meaning of the exchanged information, with agreed-upon expectations and consequences, for the response to the information exchange.

  19. Military Interoperable Digital Hospital Testbed (MIDHT)

    DTIC Science & Technology

    2013-01-01

    users of the PACS system in terms of viewing images originating from Miners and Meyersdale are Emergency Medicine and Trauma physicians. This...conditions, over-the-counter/herbal medications, physician list, and emergency contacts. Through secure messaging with their physician, patients...et al. (1999). Impact of a patient-centered, computer-based health information/support system. American Journal of Preventive Medicine, 16(1), 1-9

  20. The Earth System Grid Federation: An Open Infrastructure for Access to Distributed Geospatial Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ananthakrishnan, Rachana; Bell, Gavin; Cinquini, Luca

    2013-01-01

    The Earth System Grid Federation (ESGF) is a multi-agency, international collaboration that aims at developing the software infrastructure needed to facilitate and empower the study of climate change on a global scale. The ESGF's architecture employs a system of geographically distributed peer nodes, which are independently administered yet united by the adoption of common federation protocols and application programming interfaces (APIs). The cornerstones of its interoperability are the peer-to-peer messaging that is continuously exchanged among all nodes in the federation; a shared architecture and API for search and discovery; and a security infrastructure based on industry standards (OpenID, SSL, GSI and SAML). The ESGF software is developed collaboratively across institutional boundaries and made available to the community as open source. It has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the entire model output used for the next international assessment report on climate change (IPCC-AR5) and a suite of satellite observations (obs4MIPs) and reanalysis data sets (ANA4MIPs).

  1. The Earth System Grid Federation: An Open Infrastructure for Access to Distributed Geo-Spatial Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cinquini, Luca; Crichton, Daniel; Miller, Neill

    2012-01-01

    The Earth System Grid Federation (ESGF) is a multi-agency, international collaboration that aims at developing the software infrastructure needed to facilitate and empower the study of climate change on a global scale. The ESGF's architecture employs a system of geographically distributed peer nodes, which are independently administered yet united by the adoption of common federation protocols and application programming interfaces (APIs). The cornerstones of its interoperability are the peer-to-peer messaging that is continuously exchanged among all nodes in the federation; a shared architecture and API for search and discovery; and a security infrastructure based on industry standards (OpenID, SSL, GSI and SAML). The ESGF software is developed collaboratively across institutional boundaries and made available to the community as open source. It has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the entire model output used for the next international assessment report on climate change (IPCC-AR5) and a suite of satellite observations (obs4MIPs) and reanalysis data sets (ANA4MIPs).

  2. The Earth System Grid Federation : an Open Infrastructure for Access to Distributed Geospatial Data

    NASA Technical Reports Server (NTRS)

    Cinquini, Luca; Crichton, Daniel; Mattmann, Chris; Harney, John; Shipman, Galen; Wang, Feiyi; Ananthakrishnan, Rachana; Miller, Neill; Denvil, Sebastian; Morgan, Mark

    2012-01-01

    The Earth System Grid Federation (ESGF) is a multi-agency, international collaboration that aims at developing the software infrastructure needed to facilitate and empower the study of climate change on a global scale. The ESGF's architecture employs a system of geographically distributed peer nodes, which are independently administered yet united by the adoption of common federation protocols and application programming interfaces (APIs). The cornerstones of its interoperability are the peer-to-peer messaging that is continuously exchanged among all nodes in the federation; a shared architecture and API for search and discovery; and a security infrastructure based on industry standards (OpenID, SSL, GSI and SAML). The ESGF software is developed collaboratively across institutional boundaries and made available to the community as open source. It has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the entire model output used for the next international assessment report on climate change (IPCC-AR5) and a suite of satellite observations (obs4MIPs) and reanalysis data sets (ANA4MIPs).

  3. Dynamic Business Networks: A Headache for Sustainable Systems Interoperability

    NASA Astrophysics Data System (ADS)

    Agostinho, Carlos; Jardim-Goncalves, Ricardo

    Collaborative networked environments emerged with the spread of the internet, helping to overcome past communication barriers and identifying interoperability as an essential property. When interoperability is achieved seamlessly, efficiency increases across the entire product life cycle. Nowadays, most organizations try to attain interoperability by establishing peer-to-peer mappings with their different partners or, in optimized networks, by using international standard models as the core for information exchange. In current industrial practice, mappings are defined only once, and the morphisms that represent them are hardcoded in the enterprise systems. This solution has been effective for static environments, where enterprise and product models are valid for decades. However, with an increasingly complex and dynamic global market, models change frequently to answer new customer requirements. This paper draws concepts from complex systems science and proposes a framework for sustainable systems interoperability in dynamic networks, enabling different organizations to evolve at their own rate.

  4. Coordination of Mobile Devices : Technology and Standards Scan.

    DOT National Transportation Integrated Search

    2015-06-19

    The connected vehicle environment was envisioned as a means of exchanging messages through a connected vehicle fleet. The majority of the current connected vehicle environment focuses on the vehicle, by supporting the exchange of messages from vehicl...

  5. An approach for message exchange using archetypes.

    PubMed

    Moraes, João L C; Souza, Wanderley L; Cavalini, Luciana T; Pires, Luís F; Prado, Antonio F

    2013-01-01

    The application of ICT to the whole range of health sector activities, known as e-health, can simplify access to health care services, but it will only be acceptable in realistic scenarios if it supports efficient information exchange amongst caregivers and their patients. The aim of this paper is to present an approach for message exchange in such realistic scenarios.

  6. Smart Grid Interoperability Maturity Model Beta Version

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Widergren, Steven E.; Drummond, R.; Giroti, Tony

    The GridWise Architecture Council was formed by the U.S. Department of Energy to promote and enable interoperability among the many entities that interact with the electric power system. This balanced team of industry representatives proposes principles for the development of interoperability concepts and standards. The Council provides industry guidance and tools that make it an available resource for smart grid implementations. In the spirit of advancing interoperability of an ecosystem of smart grid devices and systems, this document presents a model for evaluating the maturity of the artifacts and processes that specify the agreement of parties to collaborate across an information exchange interface. You are expected to have a solid understanding of large, complex system integration concepts and experience in dealing with software component interoperation. Those without this technical background should read the Executive Summary for a description of the purpose and contents of the document. Other documents, such as checklists, guides, and whitepapers, exist for targeted purposes and audiences. Please see the www.gridwiseac.org website for more products of the Council that may be of interest to you.

  7. Design and implementation of a health data interoperability mediator.

    PubMed

    Kuo, Mu-Hsing; Kushniruk, Andre William; Borycki, Elizabeth Marie

    2010-01-01

    The objective of this study is to design and implement a common-gateway oriented mediator to solve the health data interoperability problems that exist among heterogeneous health information systems. The proposed mediator has three main components: (1) a Synonym Dictionary (SD) that stores a set of global metadata and terminologies to serve as the mapping intermediary, (2) a Semantic Mapping Engine (SME) that can be used to map metadata and instance semantics, and (3) a DB-to-XML module that translates source health data stored in a database into XML format and back. A routine admission notification data exchange scenario is used to test the efficiency and feasibility of the proposed mediator. The study results show that the proposed mediator can make health information exchange more efficient.
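    The Synonym Dictionary and Semantic Mapping Engine described above can be sketched as a key-renaming step that normalizes local field names to shared global metadata before exchange. The field names and mappings below are invented for illustration, not the paper's actual tables:

```python
# Hypothetical Synonym Dictionary: local names -> global metadata names.
SYNONYM_DICTIONARY = {
    "pt_name": "patient_name",
    "patientName": "patient_name",
    "adm_dt": "admission_date",
    "admitDate": "admission_date",
}

def to_global(record):
    """Semantic Mapping Engine (simplified): rename local keys to the
    shared global metadata; unknown keys pass through unchanged."""
    return {SYNONYM_DICTIONARY.get(k, k): v for k, v in record.items()}

# Two heterogeneous systems describing the same admission event.
hospital_a = {"pt_name": "Kim", "adm_dt": "2010-03-01"}
hospital_b = {"patientName": "Lee", "admitDate": "2010-03-02"}
print(to_global(hospital_a))
print(to_global(hospital_b))
```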

  8. Harmonizing clinical terminologies: driving interoperability in healthcare.

    PubMed

    Hamm, Russell A; Knoop, Sarah E; Schwarz, Peter; Block, Aaron D; Davis, Warren L

    2007-01-01

    Internationally, there are countless initiatives to build National Healthcare Information Networks (NHIN) that electronically interconnect healthcare organizations by enhancing and integrating current information technology (IT) capabilities. The realization of such NHINs will enable the simple and immediate exchange of appropriate and vital clinical data among participating organizations. In order for institutions to accurately and automatically exchange information, the electronic clinical documents must make use of established clinical codes, such as those of SNOMED-CT, LOINC and ICD-9 CM. However, there does not exist one universally accepted coding scheme that encapsulates all pertinent clinical information for the purposes of patient care, clinical research and population health reporting. In this paper, we propose a combination of methods and standards that target the harmonization of clinical terminologies and encourage sustainable, interoperable infrastructure for healthcare.

  9. Personalized-detailed clinical model for data interoperability among clinical standards.

    PubMed

    Khan, Wajahat Ali; Hussain, Maqbool; Afzal, Muhammad; Amin, Muhammad Bilal; Saleem, Muhammad Aamir; Lee, Sungyoung

    2013-08-01

    Data interoperability among health information exchange (HIE) systems is a major concern for healthcare practitioners to enable provisioning of telemedicine-related services. Heterogeneity exists in these systems not only at the data level but also among different heterogeneous healthcare standards with which these are compliant. The relationship between healthcare organization data and different heterogeneous standards is necessary to achieve the goal of data level interoperability. We propose a personalized-detailed clinical model (P-DCM) approach for the generation of customized mappings that creates the necessary linkage between organization-conformed healthcare standards concepts and clinical model concepts to ensure data interoperability among HIE systems. We consider electronic health record (EHR) standards, openEHR, and HL7 CDA instances transformation using P-DCM. P-DCM concepts associated with openEHR and HL7 CDA help in transformation of instances among these standards. We investigated two datasets: (1) data of 100 diabetic patients, including 50 each of type 1 and type 2, from a local hospital in Korea and (2) data of a single Alzheimer's disease patient. P-DCMs were created for both scenarios, which provided the basis for deriving instances for HL7 CDA and openEHR standards. For proof of concept, we present case studies of encounter information for type 2 diabetes mellitus patients and monitoring of daily routine activities of an Alzheimer's disease patient. These reflect P-DCM-based customized mappings generation with openEHR and HL7 CDA standards. Customized mappings are generated based on the relationship of P-DCM concepts with CDA and openEHR concepts. The objective of this work is to achieve semantic data interoperability among heterogeneous standards. This would lead to effective utilization of resources and allow timely information exchange among healthcare systems.
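    The customized-mapping idea can be sketched as a lookup from one clinical-model concept to the path it takes in each target standard, so a single value can be rendered for either. The concept name and paths below are illustrative only, not actual P-DCM, openEHR, or CDA definitions:

```python
# Hypothetical P-DCM-style mapping table: clinical concept -> per-standard path.
P_DCM_MAPPINGS = {
    "blood_glucose": {
        "openEHR": "items[at0013]/value/magnitude",
        "HL7_CDA": "observation/value/@value",
    },
}

def transform(concept, value, target):
    """Render a clinical-model value at its mapped location in the target
    standard (returned here as a flat path->value pair for brevity)."""
    path = P_DCM_MAPPINGS[concept][target]
    return {path: value}

print(transform("blood_glucose", 7.2, "openEHR"))
print(transform("blood_glucose", 7.2, "HL7_CDA"))
```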

  10. Personalized-Detailed Clinical Model for Data Interoperability Among Clinical Standards

    PubMed Central

    Khan, Wajahat Ali; Hussain, Maqbool; Afzal, Muhammad; Amin, Muhammad Bilal; Saleem, Muhammad Aamir

    2013-01-01

    Abstract Objective: Data interoperability among health information exchange (HIE) systems is a major concern for healthcare practitioners to enable provisioning of telemedicine-related services. Heterogeneity exists in these systems not only at the data level but also among different heterogeneous healthcare standards with which these are compliant. The relationship between healthcare organization data and different heterogeneous standards is necessary to achieve the goal of data level interoperability. We propose a personalized-detailed clinical model (P-DCM) approach for the generation of customized mappings that creates the necessary linkage between organization-conformed healthcare standards concepts and clinical model concepts to ensure data interoperability among HIE systems. Materials and Methods: We consider electronic health record (EHR) standards, openEHR, and HL7 CDA instances transformation using P-DCM. P-DCM concepts associated with openEHR and HL7 CDA help in transformation of instances among these standards. We investigated two datasets: (1) data of 100 diabetic patients, including 50 each of type 1 and type 2, from a local hospital in Korea and (2) data of a single Alzheimer's disease patient. P-DCMs were created for both scenarios, which provided the basis for deriving instances for HL7 CDA and openEHR standards. Results: For proof of concept, we present case studies of encounter information for type 2 diabetes mellitus patients and monitoring of daily routine activities of an Alzheimer's disease patient. These reflect P-DCM-based customized mappings generation with openEHR and HL7 CDA standards. Customized mappings are generated based on the relationship of P-DCM concepts with CDA and openEHR concepts. Conclusions: The objective of this work is to achieve semantic data interoperability among heterogeneous standards. This would lead to effective utilization of resources and allow timely information exchange among healthcare systems. 
PMID:23875730

  11. CCSDS Overview

    NASA Technical Reports Server (NTRS)

    Kearney, Mike

    2013-01-01

    The primary goal of the Consultative Committee for Space Data Systems (CCSDS) is interoperability between the communications and data systems of space agencies' vehicles, facilities, missions, and programs. Of all the technologies used in spaceflight, standardization of communications and data systems brings the most benefit to multi-agency interoperability. CCSDS started in 1982, developing standards at the lower layers of the protocol stack. The CCSDS scope has since grown to cover standards throughout the entire ISO communications stack, plus other data systems areas (architecture, archive, security, XML exchange formats, etc.).

  12. Using Ontologies to Formalize Services Specifications in Multi-Agent Systems

    NASA Technical Reports Server (NTRS)

    Breitman, Karin Koogan; Filho, Aluizio Haendchen; Haeusler, Edward Hermann

    2004-01-01

    One key issue in multi-agent systems (MAS) is their ability to interact and exchange information autonomously across applications. To secure agent interoperability, designers must rely on a communication protocol that allows software agents to exchange meaningful information. In this paper we propose using ontologies as such a communication protocol. Ontologies capture the semantics of the operations and services provided by agents, allowing interoperability and information exchange in a MAS. Ontologies are a formal, machine-processable representation that captures the semantics of a domain and allows meaningful information to be derived by way of logical inference. In our proposal we use a formal knowledge representation language (OWL) that translates into Description Logics (a subset of first-order logic), thus eliminating ambiguities and providing a solid base for machine-based inference. The main contribution of this approach is to make the requirements explicit and centralize the specification in a single document (the ontology itself), while providing a formal, unambiguous representation that can be processed by automated inference machines.
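    A tiny sketch of the kind of inference such an ontology enables: transitive class subsumption, the core reasoning service of Description Logics, lets an agent recognize that a specific service offer satisfies a more general request. The class hierarchy below is invented for the example:

```python
# Invented subclass axioms, as an OWL ontology would state them.
SUBCLASS_OF = {
    "WeatherForecastService": "InformationService",
    "InformationService": "Service",
}

def is_a(cls, ancestor):
    """Transitive subclass check: walk up the hierarchy until the
    ancestor is found or the chain ends."""
    while cls is not None:
        if cls == ancestor:
            return True
        cls = SUBCLASS_OF.get(cls)
    return False

# An agent advertising WeatherForecastService satisfies a request for Service.
print(is_a("WeatherForecastService", "Service"))  # prints True
```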

  13. Exchange of Computable Patient Data between the Department of Veterans Affairs (VA) and the Department of Defense (DoD): Terminology Mediation Strategy

    PubMed Central

    Bouhaddou, Omar; Warnekar, Pradnya; Parrish, Fola; Do, Nhan; Mandel, Jack; Kilbourne, John; Lincoln, Michael J.

    2008-01-01

    Complete patient health information that is available where and when it is needed is essential to providers and patients and improves healthcare quality and patient safety. VA and DoD have built on their previous experience in patient data exchange to establish data standards and terminology services to enable real-time bi-directional computable (i.e., encoded) data exchange and achieve semantic interoperability in compliance with recommended national standards and the eGov initiative. The project uses RxNorm, UMLS, and SNOMED CT terminology standards to mediate codified pharmacy and allergy data with greater than 92 and 60 percent success rates respectively. Implementation of the project has been well received by users and is being expanded to multiple joint care sites. Stable and mature standards, mediation strategies, and a close relationship between healthcare institutions and Standards Development Organizations are recommended to achieve and maintain semantic interoperability in a clinical setting. PMID:18096911

  14. Interoperability And Value Added To Earth Observation Data

    NASA Astrophysics Data System (ADS)

    Gasperi, J.

    2012-04-01

    Geospatial web services technology has provided a new means for geospatial data interoperability. Open Geospatial Consortium (OGC) services such as Web Map Service (WMS) to request maps on the Internet, Web Feature Service (WFS) to exchange vectors or Catalog Service for the Web (CSW) to search for geospatialized data have been widely adopted in the Geosciences community in general and in the remote sensing community in particular. These services make Earth Observation data available to a wider range of public users than ever before. The mapshup web client offers an innovative and efficient user interface that takes advantage of the power of interoperability. This presentation will demonstrate how mapshup can be effectively used in the context of natural disasters management.
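    As a concrete illustration of this interoperability, any OGC-compliant client can assemble the same WMS GetMap request against any compliant server. The endpoint and layer name below are hypothetical; the query parameters follow the WMS 1.3.0 convention:

```python
from urllib.parse import urlencode

# WMS 1.3.0 GetMap parameters; the layer and bounding box are invented.
params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "flood_extent",
    "STYLES": "",
    "CRS": "EPSG:4326",
    "BBOX": "43.0,1.0,44.0,2.0",
    "WIDTH": 512,
    "HEIGHT": 512,
    "FORMAT": "image/png",
}

# Hypothetical server endpoint; a real client would fetch this URL.
url = "https://example.org/wms?" + urlencode(params)
print(url)
```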

  15. Development of an electronic claim system based on an integrated electronic health record platform to guarantee interoperability.

    PubMed

    Kim, Hwa Sun; Cho, Hune; Lee, In Keun

    2011-06-01

    We design and develop an electronic claim system based on an integrated electronic health record (EHR) platform. This system is designed to be used for ambulatory care by office-based physicians in the United States. This is achieved by integrating various medical standard technologies for interoperability between heterogeneous information systems. The developed system serves as a simple clinical data repository; it automatically fills out the Centers for Medicare and Medicaid Services (CMS)-1500 form based on information regarding the patients and physicians' clinical activities. It supports electronic insurance claims by creating reimbursement charges. It also contains an HL7 interface engine to exchange clinical messages between heterogeneous devices. The system partially prevents physician malpractice by suggesting proper treatments according to patient diagnoses, and it supports physicians by easily preparing documents for reimbursement and submitting claim documents to insurance organizations electronically, without additional effort by the user. To show the usability of the developed system, we performed an experiment that compares the time spent filling out the CMS-1500 form directly with the time required to create electronic claim data using the developed system. From the experimental results, we conclude that the system could save considerable time for physicians in making claim documents. The developed system might be particularly useful for those who need a reimbursement-specialized EHR system, even though the proposed system does not completely satisfy all criteria requested by the CMS and Office of the National Coordinator for Health Information Technology (ONC). This is because the criteria are a necessary but not sufficient condition for the implementation of EHR systems. The system will be upgraded continuously to implement the criteria and to offer more stable and transparent transmission of electronic claim data.

  16. Implementation of Medical Information Exchange System Based on EHR Standard

    PubMed Central

    Han, Soon Hwa; Kim, Sang Guk; Jeong, Jun Yong; Lee, Bi Na; Choi, Myeong Seon; Kim, Il Kon; Park, Woo Sung; Ha, Kyooseob; Cho, Eunyoung; Kim, Yoon; Bae, Jae Bong

    2010-01-01

    Objectives To develop effective ways of sharing patients' medical information, we developed a new medical information exchange system (MIES) based on a registry server, which enabled us to exchange different types of data generated by various systems. Methods To assure that patient's medical information can be effectively exchanged under different system environments, we adopted the standardized data transfer methods and terminologies suggested by the Center for Interoperable Electronic Healthcare Record (CIEHR) of Korea in order to guarantee interoperability. Regarding information security, MIES followed the security guidelines suggested by the CIEHR of Korea. This study aimed to develop essential security systems for the implementation of online services, such as encryption of communication, server security, database security, protection against hacking, contents, and network security. Results The registry server managed information exchange as well as the registration information of the clinical document architecture (CDA) documents, and the CDA Transfer Server was used to locate and transmit the proper CDA document from the relevant repository. The CDA viewer showed the CDA documents via connection with the information systems of related hospitals. Conclusions This research chooses transfer items and defines document standards that follow CDA standards, such that exchange of CDA documents between different systems became possible through ebXML. The proposed MIES was designed as an independent central registry server model in order to guarantee the essential security of patients' medical information. PMID:21818447

  17. Implementation of Medical Information Exchange System Based on EHR Standard.

    PubMed

    Han, Soon Hwa; Lee, Min Ho; Kim, Sang Guk; Jeong, Jun Yong; Lee, Bi Na; Choi, Myeong Seon; Kim, Il Kon; Park, Woo Sung; Ha, Kyooseob; Cho, Eunyoung; Kim, Yoon; Bae, Jae Bong

    2010-12-01

    To develop effective ways of sharing patients' medical information, we developed a new medical information exchange system (MIES) based on a registry server, which enabled us to exchange different types of data generated by various systems. To assure that patient's medical information can be effectively exchanged under different system environments, we adopted the standardized data transfer methods and terminologies suggested by the Center for Interoperable Electronic Healthcare Record (CIEHR) of Korea in order to guarantee interoperability. Regarding information security, MIES followed the security guidelines suggested by the CIEHR of Korea. This study aimed to develop essential security systems for the implementation of online services, such as encryption of communication, server security, database security, protection against hacking, contents, and network security. The registry server managed information exchange as well as the registration information of the clinical document architecture (CDA) documents, and the CDA Transfer Server was used to locate and transmit the proper CDA document from the relevant repository. The CDA viewer showed the CDA documents via connection with the information systems of related hospitals. This research chooses transfer items and defines document standards that follow CDA standards, such that exchange of CDA documents between different systems became possible through ebXML. The proposed MIES was designed as an independent central registry server model in order to guarantee the essential security of patients' medical information.

  18. Inter-organizational future proof EHR systems. A review of the security and privacy related issues.

    PubMed

    van der Linden, Helma; Kalra, Dipak; Hasman, Arie; Talmon, Jan

    2009-03-01

    Identification and analysis of privacy and security related issues that occur when health information is exchanged between health care organizations. Based on a generic scenario, questions were formulated to reveal the issues that occur. Possible answers were verified in the literature. Ensuring secure health information exchange across organizations requires a standardization of security measures that goes beyond organizational boundaries, such as global definitions of professional roles, global standards for patient consent, and semantically interoperable audit logs. To fully address the privacy and security issues in interoperable EHRs and the long-life virtual EHR, it is necessary to realize a paradigm shift from storing all incoming information in a local system to retrieving information from external systems whenever that information is deemed necessary for the care of the patient.

  19. Implementation and Evaluation of Four Interoperable Open Standards for the Internet of Things.

    PubMed

    Jazayeri, Mohammad Ali; Liang, Steve H L; Huang, Chih-Yuan

    2015-09-22

    Recently, researchers have been focusing on a new use of the Internet called the Internet of Things (IoT), in which enabled electronic devices can be remotely accessed over the Internet. As the realization of the IoT concept is still in its early stages, manufacturers of Internet-connected devices and IoT web service providers are defining their own proprietary protocols based on their targeted applications. Consequently, the IoT is becoming heterogeneous in terms of hardware capabilities and communication protocols. Addressing these heterogeneities by following open standards is a necessary step toward communicating with various IoT devices. In this research, we assess the feasibility of applying existing open standards to resource-constrained IoT devices. The standard protocols developed in this research are OGC PUCK over Bluetooth, TinySOS, SOS over CoAP, and OGC SensorThings API. We believe that by hosting open standard protocols on IoT devices, not only do the devices become self-describable, self-contained, and interoperable, but innovative applications can also be easily developed with standardized interfaces. In addition, we use memory consumption, request message size, response message size, and response latency to benchmark the efficiency of the implemented protocols. In all, this research presents and evaluates standards-based solutions to better understand the feasibility of applying existing standards to the IoT vision.
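    One of the paper's benchmark dimensions, message size, can be illustrated with a small sketch comparing a verbose XML encoding of a sensor observation against a compact JSON encoding. This is a simplified illustration of why encoding choice matters on constrained devices, not a reproduction of the paper's protocols; the element and field names are assumptions.

    ```python
    import json
    import xml.etree.ElementTree as ET

    def xml_size(reading):
        """Bytes needed to encode one observation as simple XML."""
        obs = ET.Element("Observation")
        for key, value in reading.items():
            ET.SubElement(obs, key).text = str(value)
        return len(ET.tostring(obs))

    def json_size(reading):
        """Bytes needed for a compact (no-whitespace) JSON encoding."""
        return len(json.dumps(reading, separators=(",", ":")).encode())

    reading = {"sensor": "temp-01", "value": 21.5, "unit": "Cel"}
    # XML's paired tags cost more bytes per field than JSON's key:value form
    print(xml_size(reading) > json_size(reading))  # -> True
    ```

    Binary encodings such as CoAP's compact headers push the same trade-off further, which is why the paper measures request and response sizes per protocol.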

  20. Achieving interoperability for metadata registries using comparative object modeling.

    PubMed

    Park, Yu Rang; Kim, Ju Han

    2010-01-01

    Achieving data interoperability between organizations relies upon agreed meaning and representation (metadata) of data. For managing and registering metadata, many organizations have built metadata registries (MDRs) in various domains based on the international standard for the MDR framework, ISO/IEC 11179. Following this trend, two public MDRs in the biomedical domain have been created, the United States Health Information Knowledgebase (USHIK) and the cancer Data Standards Registry and Repository (caDSR), from the U.S. Department of Health & Human Services and the National Cancer Institute (NCI), respectively. Most MDRs have been implemented with indiscriminate extensions to satisfy organization-specific needs and to work around the semantic and structural limitations of ISO/IEC 11179. As a result, it is difficult to achieve interoperability among multiple MDRs. In this paper, we propose an integrated metadata object model for achieving interoperability among multiple MDRs. To evaluate this model, we developed an XML Schema Definition (XSD)-based metadata exchange format. We created an XSD-based metadata exporter, supporting both the integrated metadata object model and organization-specific MDR formats.
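    An exporter of the kind described would serialize registry entries into a shared XML exchange format. The sketch below shows the general shape for one ISO/IEC 11179-style data element; the element and attribute names are illustrative assumptions, not the paper's actual schema.

    ```python
    import xml.etree.ElementTree as ET

    def export_data_element(identifier, name, definition, datatype):
        """Serialize one data element (ISO/IEC 11179 style) as exchange XML."""
        root = ET.Element("DataElement", identification=identifier)
        ET.SubElement(root, "Name").text = name
        ET.SubElement(root, "Definition").text = definition
        ET.SubElement(root, "ValueDomain", datatype=datatype)
        return ET.tostring(root, encoding="unicode")

    xml_out = export_data_element(
        "DE-001",
        "BloodPressureSystolic",
        "Systolic arterial pressure in mmHg",
        "integer",
    )
    print(xml_out)
    ```

    In practice the output would be validated against the shared XSD, which is what lets each organization keep its internal MDR structure while exchanging a common representation.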

  1. Methods and apparatus for distributed resource discovery using examples

    NASA Technical Reports Server (NTRS)

    Chang, Yuan-Chi (Inventor); Li, Chung-Sheng (Inventor); Smith, John Richard (Inventor); Hill, Matthew L. (Inventor); Bergman, Lawrence David (Inventor); Castelli, Vittorio (Inventor)

    2005-01-01

    Distributed resource discovery is an essential step for information retrieval and/or providing information services. This step is usually used for determining the location of an information or data repository which has relevant information. The most fundamental challenge is the usual lack of semantic interoperability of the requested resource. In accordance with the invention, a method is disclosed where distributed repositories achieve semantic interoperability through the exchange of examples and, optionally, classifiers. The outcome of the inventive method can be used to determine whether common labels are referring to the same semantic meaning.
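    The core idea of this patent, deciding whether two repositories' labels mean the same thing by exchanging examples, can be illustrated with a small sketch. The Jaccard overlap measure and the threshold are assumptions chosen for illustration, not the patented method itself.

    ```python
    def labels_match(examples_a, examples_b, threshold=0.5):
        """Judge two labels equivalent if their exchanged example sets
        overlap strongly (Jaccard index >= threshold)."""
        a, b = set(examples_a), set(examples_b)
        if not a or not b:
            return False
        jaccard = len(a & b) / len(a | b)
        return jaccard >= threshold

    # Repository A tags these items "lake"; repository B tags an
    # overlapping set "water body". High overlap -> same semantics.
    repo_a = {"img1", "img2", "img3", "img4"}
    repo_b = {"img2", "img3", "img4", "img5"}
    print(labels_match(repo_a, repo_b))  # -> True (Jaccard 3/5 = 0.6)
    ```

    The invention's optional classifier exchange would replace the raw example comparison with each side classifying the other's examples and comparing the resulting label assignments.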

  2. EUTELTRACS: The European experience on mobile satellite services

    NASA Technical Reports Server (NTRS)

    Colcy, Jean-Noel; Steinhaeuser, Rafael

    1993-01-01

    EUTELTRACS is Europe's first commercially operated Mobile Satellite Service. Under the overall network operation of EUTELSAT, the European Telecommunications Satellite Organization, EUTELTRACS provides an integrated message exchange and position reporting service. This paper describes the EUTELTRACS system architecture, the message exchange and the position reporting services, including the result of recent analysis of message delivery time and positioning accuracy. It also provides an overview of the commercial deployment, the regulatory situation for its operation within Europe and new applications outside its target market, the international road transportation.

  3. Usage of the hybrid encryption in a cloud instant messages exchange system

    NASA Astrophysics Data System (ADS)

    Kvyetnyy, Roman N.; Romanyuk, Olexander N.; Titarchuk, Evgenii O.; Gromaszek, Konrad; Mussabekov, Nazarbek

    2016-09-01

    A new approach for constructing cloud instant messaging, presented in this article, allows users to encrypt data locally by using the Diffie-Hellman key exchange protocol. The described approach makes it possible to construct a cloud service that operates only on users' encrypted messages; encryption and decryption take place locally on the user side using symmetric AES encryption. A feature of the service is support for conferences without the need to re-encrypt messages for each participant. The article gives an example of the protocol implementation based on the ECC and RSA encryption algorithms, as well as a comparison of these implementations.

  4. The Role of Memorable Messages in the Process of Organizational Socialization.

    ERIC Educational Resources Information Center

    Stohl, Cynthia

    1986-01-01

    Examines the structure, form, and nature of the content and context of memorable messages exchanged within an organization. Discusses how these features enhance the socializing and memorable nature of such messages. (MS)

  5. A Simple XML Producer-Consumer Protocol

    NASA Technical Reports Server (NTRS)

    Smith, Warren; Gunter, Dan; Quesnel, Darcy; Biegel, Bryan (Technical Monitor)

    2001-01-01

    There are many different projects from government, academia, and industry that provide services for delivering events in distributed environments. The problem with these event services is that they are not general enough to support all uses and they speak different protocols so that they cannot interoperate. We require such interoperability when we, for example, wish to analyze the performance of an application in a distributed environment. Such an analysis might require performance information from the application, computer systems, networks, and scientific instruments. In this work we propose and evaluate a standard XML-based protocol for the transmission of events in distributed systems. One recent trend in government and academic research is the development and deployment of computational grids. Computational grids are large-scale distributed systems that typically consist of high-performance compute, storage, and networking resources. Examples of such computational grids are the DOE Science Grid, the NASA Information Power Grid (IPG), and the NSF Partnerships for Advanced Computing Infrastructure (PACIs). The major effort to deploy these grids is in the area of developing the software services to allow users to execute applications on these large and diverse sets of resources. These services include security, execution of remote applications, managing remote data, access to information about resources and services, and so on. There are several toolkits for providing these services such as Globus, Legion, and Condor. As part of these efforts to develop computational grids, the Global Grid Forum is working to standardize the protocols and APIs used by various grid services. This standardization will allow interoperability between the client and server software of the toolkits that are providing the grid services. 
The goal of the Performance Working Group of the Grid Forum is to standardize protocols and representations related to the storage and distribution of performance data. These standard protocols and representations must support tasks such as profiling parallel applications, monitoring the status of computers and networks, and monitoring the performance of services provided by a computational grid. This paper describes a proposed protocol and data representation for the exchange of events in a distributed system. The protocol exchanges messages formatted in XML and it can be layered atop any low-level communication protocol such as TCP or UDP. Further, we describe Java and C++ implementations of this protocol and discuss their performance. The next section will provide some further background information. Section 3 describes the main communication patterns of our protocol. Section 4 describes how we represent events and related information using XML. Section 5 describes our protocol and Section 6 discusses the performance of two implementations of the protocol. Finally, an appendix provides the XML Schema definition of our protocol and event information.
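    The shape of an XML-encoded performance event can be sketched as below. The schema here (element and attribute names) is a hypothetical stand-in for illustration, not the Grid Forum proposal described in the paper.

    ```python
    import xml.etree.ElementTree as ET

    def make_event(source, event_type, value, timestamp):
        """Serialize one performance event as an XML message."""
        evt = ET.Element("event", source=source, type=event_type)
        ET.SubElement(evt, "timestamp").text = str(timestamp)
        ET.SubElement(evt, "value").text = str(value)
        return ET.tostring(evt, encoding="unicode")

    def parse_event(xml_text):
        """Consumer side: decode a received event message."""
        evt = ET.fromstring(xml_text)
        return {
            "source": evt.get("source"),
            "type": evt.get("type"),
            "timestamp": float(evt.find("timestamp").text),
            "value": float(evt.find("value").text),
        }

    msg = make_event("node42", "cpu.load", 0.73, 1000000000)
    print(parse_event(msg)["value"])  # -> 0.73
    ```

    Because the message is self-describing text, it can be carried over TCP, UDP, or any other transport, which is exactly the layering the paper proposes.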

  6. From Many to Many More: Instant Interoperability Through the Integrated Ocean Observing System Data Assembly Center

    NASA Astrophysics Data System (ADS)

    Burnett, W.; Bouchard, R.; Hervey, R.; Crout, R.; Luke, R.

    2008-12-01

    As the Integrated Ocean Observing System (IOOS) Data Assembly Center (DAC), NOAA's National Data Buoy Center (NDBC) collects data from many ocean observing systems, quality controls the data, and distributes them nationally and internationally. The DAC capabilities provide instant interoperability of any ocean observatory with the national and international agencies responsible for critical forecasts and warnings and with the national media. This interoperability is an important milestone in an observing system's designation as an operational system. Data collection begins with NDBC's own observing systems - Meteorological and Oceanographic Buoys and Coastal Stations, the Tropical Atmosphere Ocean Array, and the NOAA tsunameter network. Leveraging the data management functions that support NDBC systems, the DAC can support data partners including ocean observations from IOOS Regional Observing Systems, the meteorological observations from the National Water Level Observing Network, meteorological and oceanographic observations from the National Estuarine Research Reserve System, the Integrated Coral Observing Network, merchant ship observations from the Voluntary Observing Ship program, and ocean current measurements from oil and gas platforms in the Gulf of Mexico and from Coastal HF Radars. The DAC monitors and quality controls IOOS Partner data, alerting the data provider to outages and quality discrepancies. After performing automated and manual quality control procedures, the DAC prepares the observations for distribution. The primary means of data distribution is in standard World Meteorological Organization alphanumeric coded messages distributed via the Global Telecommunications System, NOAAPort, and Family of Services. Observing systems provide their data via FTP to an NDBC server using a simple XML format.
The DAC also posts data in real-time to the NDBC webpages in columnar text format and data plots that maritime interests (e.g., surfing, fishing, boating) widely use. The webpage text feeds the Dial-A-Buoy capability that reads the latest data from webpages and the latest NWS forecast for the station to a user via telephone. The DAC also operates a DODS/OPenDAP server to provide data in netCDF. Recently the DAC implemented the NOAA IOOS Data Integration Framework, which facilitates the exchange of data between IOOS Regional Observing Systems by standardizing data exchange formats and incorporating needed metadata for the correct application of the data. The DAC has become an OceanSITES Global Data Assembly Center - part of the Initial Global Observing System for Climate. Supported by the NOAA IOOS Program, the DAC provides round-the-clock monitoring, quality control, and data distribution to ensure that its IOOS Partners can conduct operations that meet the NOAA definition of: Sustained, systematic, reliable, and robust mission activities with an institutional commitment to deliver appropriate, cost-effective products and services.

  7. Phenomenological Behavior-Exchange Models of Marital Success.

    ERIC Educational Resources Information Center

    Gottman, John; And Others

    The objective of two studies was to devise an assessment procedure for the evaluation of therapy with distressed marriages. An extension of behavior exchange theory was proposed to include phenomenological ratings by the couple of the intent of messages sent and the impact of messages received. Convergent criteria were used to select 14…

  8. Wide Area Recovery & Resiliency Program (WARRP) Transition Manager Series, Coalition Warrior Interoperability Demonstration (CWID) 2011: SSC Pacific Civilian Message Systems in Trial 2.32

    DTIC Science & Technology

    2011-07-01

    displayed sensor readings and maps: an Apple iPad and a Samsung Galaxy Tab. • Mobile Wi-Fi Hotspot: A 3G AT&T MiFi integrated the sensors, laptops and... 7 NEXT STEPS...Developing “apps” for both, an Apple IOS (IPAD) tablet and Android ( Galaxy ) tablet to display the common operating picture (COP). • Providing

  9. Semantic Interoperability of Health Risk Assessments

    PubMed Central

    Rajda, Jay; Vreeman, Daniel J.; Wei, Henry G.

    2011-01-01

    The health insurance and benefits industry has administered Health Risk Assessments (HRAs) at an increasing rate. These are used to collect data on modifiable health risk factors for wellness and disease management programs. However, there is significant variability in the semantics of these assessments, making it difficult to compare data sets from the output of 2 different HRAs. There is also an increasing need to exchange this data with Health Information Exchanges and Electronic Medical Records. To standardize the data and concepts from these tools, we outline a process to determine presence of certain common elements of modifiable health risk extracted from these surveys. This information is coded using concept identifiers, which allows cross-survey comparison and analysis. We propose that using LOINC codes or other universal coding schema may allow semantic interoperability of a variety of HRA tools across the industry, research, and clinical settings. PMID:22195174
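    The cross-survey comparison the authors describe amounts to re-keying each HRA's answers by a shared concept identifier. The sketch below uses invented placeholder concept ids rather than verified LOINC codes; the question wordings and mapping are illustrative assumptions.

    ```python
    # Hypothetical crosswalk from survey-specific question text to a shared
    # concept id (placeholders, not real LOINC codes).
    CONCEPT_MAP = {
        "Do you currently smoke tobacco?": "tobacco-use",
        "Are you a smoker?": "tobacco-use",
        "How many days per week do you exercise?": "physical-activity",
    }

    def normalize_responses(survey):
        """Re-key one HRA's answers by shared concept so surveys compare."""
        return {CONCEPT_MAP[q]: ans for q, ans in survey.items() if q in CONCEPT_MAP}

    # Two different HRAs asking the same modifiable risk factor differently:
    hra_a = {"Do you currently smoke tobacco?": "no"}
    hra_b = {"Are you a smoker?": "no"}
    print(normalize_responses(hra_a) == normalize_responses(hra_b))  # -> True
    ```

    With a genuine LOINC-coded crosswalk, the same normalization would let data from different vendors' HRAs flow into a Health Information Exchange as comparable observations.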

  10. Iris: Constructing and Analyzing Spectral Energy Distributions with the Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Laurino, O.; Budynkiewicz, J.; Busko, I.; Cresitello-Dittmar, M.; D'Abrusco, R.; Doe, S.; Evans, J.; Pevunova, O.

    2014-05-01

    We present Iris 2.0, the latest release of the Virtual Astronomical Observatory application for building and analyzing Spectral Energy Distributions (SEDs). With Iris, users may read in and display SEDs, inspect and edit any selection of SED data, fit models to SEDs in arbitrary spectral ranges, and calculate confidence limits on best-fit parameters. SED data may be loaded into the application from VOTable and FITS files compliant with the International Virtual Observatory Alliance interoperable data models, or retrieved directly from NED or the Italian Space Agency Science Data Center; data in non-standard formats may also be converted within the application. Users may seamlessly exchange data between Iris and other Virtual Observatory tools using the Simple Application Messaging Protocol. Iris 2.0 also provides a tool for redshifting, interpolating, and measuring integrated fluxes, and allows simple aperture corrections for individual points and SED segments. Custom Python functions, template models and template libraries may be imported into Iris for fitting SEDs. Iris may be extended through Java plugins; users can install third-party packages, or develop their own plugin using Iris' Software Development Kit. Iris 2.0 is available for Linux and Mac OS X systems.

  11. Patient Centeredness in Electronic Communication: Evaluation of Patient-to-Health Care Team Secure Messaging

    PubMed Central

    Luger, Tana M; Volkman, Julie E; Rocheleau, Mary; Mueller, Nora; Barker, Anna M; Nazi, Kim M; Houston, Thomas K; Bokhour, Barbara G

    2018-01-01

    Background As information and communication technology is becoming more widely implemented across health care organizations, patient-provider email or asynchronous electronic secure messaging has the potential to support patient-centered communication. Within the medical home model of the Veterans Health Administration (VA), secure messaging is envisioned as a means to enhance access and strengthen the relationships between veterans and their health care team members. However, despite previous studies that have examined the content of electronic messages exchanged between patients and health care providers, less research has focused on the socioemotional aspects of the communication enacted through those messages. Objective Recognizing the potential of secure messaging to facilitate the goals of patient-centered care, the objectives of this analysis were to not only understand why patients and health care team members exchange secure messages but also to examine the socioemotional tone engendered in these messages. Methods We conducted a cross-sectional coding evaluation of a corpus of secure messages exchanged between patients and health care team members over 6 months at 8 VA facilities. We identified patients whose medical records showed secure messaging threads containing at least 2 messages and compiled a random sample of these threads. Drawing on previous literature regarding the analysis of asynchronous, patient-provider electronic communication, we developed a coding scheme comprising a series of a priori patient and health care team member codes. Three team members tested the scheme on a subset of the messages and then independently coded the sample of messaging threads. Results Of the 711 messages coded from the 384 messaging threads, 52.5% (373/711) were sent by patients and 47.5% (338/711) by health care team members. 
Patient and health care team member messages included logistical content (82.6%, 308/373 vs 89.1%, 301/338), were neutral in tone (70.2%, 262/373 vs 82.0%, 277/338), and respectful in nature (25.7%, 96/373 vs 33.4%, 113/338). Secure messages from health care team members sometimes appeared hurried (25.4%, 86/338) but also displayed friendliness or warmth (18.9%, 64/338) and reassurance or encouragement (18.6%, 63/338). Most patient messages involved either providing or seeking information; however, the majority of health care team member messages involved information provision in response to patient questions. Conclusions This evaluation is an important step toward understanding the content and socioemotional tone that is part of the secure messaging exchanges between patients and health care team members. Our findings were encouraging; however, there are opportunities for improvement. As health care organizations seek to supplement traditional encounters with virtual care, they must reexamine their use of secure messaging, including the patient centeredness of the communication, and the potential for more proactive use by health care team members. PMID:29519774

  12. Smart Grid Demonstration Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, Craig; Carroll, Paul; Bell, Abigail

    The National Rural Electric Cooperative Association (NRECA) organized the NRECA-U.S. Department of Energy (DOE) Smart Grid Demonstration Project (DE-OE0000222) to install and study a broad range of advanced smart grid technologies in a demonstration that spanned 23 electric cooperatives in 12 states. More than 205,444 pieces of electronic equipment and more than 100,000 minor items (brackets, labels, mounting hardware, fiber optic cable, etc.) were installed to upgrade and enhance the efficiency, reliability, and resiliency of the power networks at the participating co-ops. The objective of this project was to build a path for other electric utilities, and particularly electric cooperatives, to adopt emerging smart grid technology when it can improve utility operations, thus advancing the co-ops' familiarity and comfort with such technology. Specifically, the project executed multiple subprojects employing a range of emerging smart grid technologies to test their cost-effectiveness and, where the technology demonstrated value, provided case studies that will enable other electric utilities, particularly electric cooperatives, to use these technologies. NRECA structured the project according to the following three areas: demonstration of smart grid technology; advancement of standards to enable the interoperability of components; and improvement of grid cyber security. We termed these three areas Technology Deployment Study, Interoperability, and Cyber Security. Although the deployment of technology and studying the demonstration projects at co-ops accounted for the largest portion of the project budget by far, we see our accomplishments in each of the areas as critical to advancing the smart grid. All project deliverables have been published. Technology Deployment Study: The deliverable was a set of 11 single-topic technical reports in areas related to the listed technologies.
Each of these reports has already been submitted to DOE, distributed to co-ops, and posted for universal access at www.nreca.coop/smartgrid. This research is available for widespread distribution to both cooperative members and non-members. These reports are listed in Table 1.2. Interoperability: The deliverable in this area was the advancement of the MultiSpeak™ interoperability standard from version 4.0 to version 5.0, and improvement of the MultiSpeak™ documentation to include more than 100 use cases. This deliverable substantially expanded the scope and usability of MultiSpeak™, the most widely deployed utility interoperability standard, now in use by more than 900 utilities. MultiSpeak™ documentation can be accessed at www.multispeak.org. Cyber Security: NRECA's starting point was to develop cyber security tools that incorporated succinct guidance on best practices. The deliverables were: cyber security extensions to MultiSpeak™, which allow more secure message exchanges; a Guide to Developing a Cyber Security and Risk Mitigation Plan; a Cyber Security Risk Mitigation Checklist; a Cyber Security Plan Template that co-ops can use to create their own cyber security plans; and Security Questions for Smart Grid Vendors.

  13. Complete exchange on the iPSC-860

    NASA Technical Reports Server (NTRS)

    Bokhari, Shahid H.

    1991-01-01

    The implementation of complete exchange on the circuit switched Intel iPSC-860 hypercube is described. This pattern, also known as all-to-all personalized communication, is the densest requirement that can be imposed on a network. On the iPSC-860, care needs to be taken to avoid edge contention, which can have a disastrous impact on communication time. There are basically two classes of algorithms that achieve contention-free complete exchange. The first contains the classical standard exchange algorithm that is generally useful for small message sizes. The second includes a number of optimal or near-optimal algorithms that are best for large messages. Measurements of communication overhead on the iPSC-860 are given and a notation for analyzing communication link usage is developed. It is shown that for the two classes of algorithms, there is substantial variation in performance with synchronization technique and choice of message protocol. Timings of six implementations are given; each of these is useful over a particular range of message size and cube dimension. Since the complete exchange is a superset of communication patterns, these timings represent upper bounds on the time required by an arbitrary communication requirement. These results indicate that the programmer needs to evaluate several possibilities before finalizing an implementation - a careful choice can lead to very significant savings in time.
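    The classical standard exchange algorithm mentioned here routes every item to its destination in d rounds on a d-dimensional hypercube: in round i, each node pairs with the neighbor whose address differs in bit i and forwards the items whose destination differs from the node in that bit. A small simulation (not the iPSC-860 code, just the routing logic) illustrates it:

    ```python
    def complete_exchange(d, data):
        """Simulate standard-exchange all-to-all personalized communication
        on a d-dimensional hypercube. data[p][q] is the item node p must
        deliver to node q."""
        n = 1 << d
        # Each buffer entry is (dest, src, item); node p starts with its own items.
        buffers = [[(q, p, data[p][q]) for q in range(n)] for p in range(n)]
        for i in range(d):
            bit = 1 << i
            new_buffers = [[] for _ in range(n)]
            for p in range(n):
                partner = p ^ bit  # neighbor across dimension i
                for dest, src, item in buffers[p]:
                    # Forward across dimension i when dest differs in bit i.
                    target = partner if (dest ^ p) & bit else p
                    new_buffers[target].append((dest, src, item))
            buffers = new_buffers
        # Every item now sits at its destination; order received items by source.
        return [[item for _dest, _src, item in sorted(buf)] for buf in buffers]

    d = 3  # 8-node hypercube
    data = [[f"{src}->{dst}" for dst in range(1 << d)] for src in range(1 << d)]
    result = complete_exchange(d, data)
    print(result[5][2])  # -> 2->5  (node 5 holds the item node 2 sent it)
    ```

    After round i an item's current node agrees with its destination in bits 0 through i, so d rounds suffice; on real hardware the challenge the paper addresses is scheduling these rounds so the physical links carrying each pairwise exchange never contend.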

  14. A SOA-Based Platform to Support Clinical Data Sharing

    PubMed

    Gazzarata, R; Giannini, B; Giacomini, M

    2017-01-01

    The eSource Data Interchange Group, part of the Clinical Data Interchange Standards Consortium, proposed five scenarios to guide stakeholders in the development of solutions for the capture of eSource data. The fifth scenario was subdivided into four tiers to adapt the functionality of electronic health records to support clinical research. In order to develop a system belonging to the “Interoperable” Tier, the authors decided to adopt the service-oriented architecture paradigm to support technical interoperability, Health Level Seven Version 3 messages combined with LOINC (Logical Observation Identifiers Names and Codes) vocabulary to ensure semantic interoperability, and Healthcare Services Specification Project standards to provide process interoperability. The developed architecture enhances the integration between patient-care practice and medical research, allowing clinical data sharing between two hospital information systems and four clinical data management systems/clinical registries. The core is formed by a set of standardized cloud services connected through standardized interfaces, involving client applications. The system was approved by a medical staff, since it reduces the workload for the management of clinical trials. Although this architecture can realize the “Interoperable” Tier, the current solution actually covers the “Connected” Tier, due to local hospital policy restrictions. © 2017 R. Gazzarata et al.

  15. Challenges of interoperability using HL7 v3 in Czech healthcare.

    PubMed

    Nagy, Miroslav; Preckova, Petra; Seidl, Libor; Zvarova, Jana

    2010-01-01

    The paper describes several classification systems that could improve patient safety through semantic interoperability among contemporary electronic health record systems (EHR-Ss) with support of the HL7 v3 standard. We describe a proposal and a pilot implementation of a semantic interoperability platform (SIP) interconnecting current EHR-Ss by using HL7 v3 messages and mappings of concepts onto the most widely used classification systems. The increasing number of classification systems and nomenclatures requires the design of various conversion tools for transfer between the main classification systems. We present the so-called LIM filler module and the HL7 broker, which are parts of the SIP, playing the role of such conversion tools. The analysis of suitability and usability of individual terminological thesauri has been started by mapping the clinical content of the Minimal Data Model for Cardiology (MDMC) to various terminological classification systems. A nationwide implementation of the SIP would include adopting and translating international coding systems and nomenclatures, and developing implementation guidelines facilitating the migration from national standards to international ones. Our research showed that creation of such a platform is feasible; however, it will require a huge effort to adapt the Czech healthcare system fully to the European environment.
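    At its simplest, a conversion tool of the kind the broker provides is a lookup from a (coding system, code) pair in one terminology to its counterpart in another, with unmapped codes surfaced for review. The sketch below is a minimal illustration of that idea; the system names and codes are invented placeholders, not real classification codes.

    ```python
    # Hypothetical crosswalk between a local and an international coding
    # system (all identifiers here are invented for illustration).
    LOCAL_TO_INTL = {
        ("local-cardiology", "K123"): ("intl-system", "I410"),
        ("local-cardiology", "K456"): ("intl-system", "I500"),
    }

    def convert(system, code):
        """Map a (system, code) pair to the target system, or None if unmapped."""
        return LOCAL_TO_INTL.get((system, code))

    print(convert("local-cardiology", "K123"))  # -> ('intl-system', 'I410')
    print(convert("local-cardiology", "X999"))  # -> None (needs manual mapping)
    ```

    The hard part the paper points to is not the lookup but building and maintaining the mapping tables themselves, since the source and target terminologies rarely align one-to-one.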

  16. A SOA-Based Platform to Support Clinical Data Sharing

    PubMed Central

    Gazzarata, R.; Giannini, B.

    2017-01-01

    The eSource Data Interchange Group, part of the Clinical Data Interchange Standards Consortium, proposed five scenarios to guide stakeholders in the development of solutions for the capture of eSource data. The fifth scenario was subdivided into four tiers to adapt the functionality of electronic health records to support clinical research. In order to develop a system belonging to the “Interoperable” Tier, the authors decided to adopt the service-oriented architecture paradigm to support technical interoperability, Health Level Seven Version 3 messages combined with the LOINC (Logical Observation Identifiers Names and Codes) vocabulary to ensure semantic interoperability, and Healthcare Services Specification Project standards to provide process interoperability. The developed architecture enhances the integration between patient-care practice and medical research, allowing clinical data sharing between two hospital information systems and four clinical data management systems/clinical registries. The core is formed by a set of standardized cloud services connected through standardized interfaces, involving client applications. The system was approved by the medical staff, since it reduces the workload for the management of clinical trials. Although this architecture can realize the “Interoperable” Tier, the current solution actually covers the “Connected” Tier, due to local hospital policy restrictions. PMID:29065576

  17. 76 FR 26777 - Self-Regulatory Organizations; Notice of Filing and Immediate Effectiveness of Proposed Rule...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-09

    ...) System Event Messages (e.g., start of messages, start of system hours, start of quoting, start of opening... assignments, to take advantage of the proposed $500 cap and thereby limit costs. \\11\\ Typically, a smaller... of the Exchange that is typically a small proprietary market maker doing business on the Exchange's...

  18. Efficiently passing messages in distributed spiking neural network simulation.

    PubMed

    Thibeault, Corey M; Minkovich, Kirill; O'Brien, Michael J; Harris, Frederick C; Srinivasa, Narayan

    2013-01-01

    Efficiently passing spiking messages in a neural model is an important aspect of high-performance simulation. As the scale of networks has increased, so has the size of the computing systems required to simulate them. In addition, the information exchange among these resources has become more of an impediment to performance. In this paper we explore spike message passing using different mechanisms provided by the Message Passing Interface (MPI). A specific implementation, MVAPICH, designed for high-performance clusters with InfiniBand hardware, is employed. The focus is on providing information about these mechanisms for users of commodity high-performance spiking simulators. In addition, a novel hybrid method for spike exchange was implemented and benchmarked.
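    A common preliminary to any MPI spike exchange is batching: each process collects the spikes fired in a timestep into one message per destination rank rather than sending events individually. The sketch below shows only that batching step in plain Python (no MPI); the partition mapping is an illustrative assumption.

    ```python
    from collections import defaultdict

    def pack_spikes(spikes, partition):
        """Group outgoing spike events into one message per destination
        process, mimicking how distributed simulators batch spikes before
        an MPI exchange.

        spikes:    iterable of neuron ids that fired this timestep
        partition: maps neuron id -> rank of the process owning its targets
        """
        outbox = defaultdict(list)
        for neuron in spikes:
            outbox[partition[neuron]].append(neuron)
        return dict(outbox)

    # Neurons 0-1 project to rank 0, neurons 2-3 to rank 1 (assumed layout).
    partition = {0: 0, 1: 0, 2: 1, 3: 1}
    print(pack_spikes([0, 2, 3], partition))  # -> {0: [0], 1: [2, 3]}
    ```

    In an actual simulator each per-rank list would then be handed to an MPI call (point-to-point sends, a collective such as all-to-all, or the paper's hybrid of mechanisms); the batching is what keeps message counts from scaling with spike counts.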

  19. Leveraging standards to support patient-centric interdisciplinary plans of care.

    PubMed

    Dykes, Patricia C; DaDamio, Rebecca R; Goldsmith, Denise; Kim, Hyeon-eui; Ohashi, Kumiko; Saba, Virginia K

    2011-01-01

    As health care systems and providers move towards meaningful use of electronic health records, the once distant vision of collaborative patient-centric, interdisciplinary plans of care, generated and updated across organizations and levels of care, may soon become a reality. Effective care planning is included in the proposed Stages 2-3 Meaningful Use quality measures. To facilitate interoperability, standardization of plan of care messaging, content, information and terminology models are needed. This degree of standardization requires local and national coordination. The purpose of this paper is to review some existing standards that may be leveraged to support development of interdisciplinary patient-centric plans of care. Standards are then applied to a use case to demonstrate one method for achieving patient-centric and interoperable interdisciplinary plan of care documentation. Our pilot work suggests that existing standards provide a foundation for adoption and implementation of patient-centric plans of care that are consistent with federal requirements.

  20. Implementation and Evaluation of Four Interoperable Open Standards for the Internet of Things

    PubMed Central

    Jazayeri, Mohammad Ali; Liang, Steve H. L.; Huang, Chih-Yuan

    2015-01-01

    Recently, researchers have been focusing on a new use of the Internet called the Internet of Things (IoT), in which enabled electronic devices can be remotely accessed over the Internet. As the realization of the IoT concept is still in its early stages, manufacturers of Internet-connected devices and IoT web service providers are defining their own proprietary protocols based on their targeted applications. Consequently, the IoT is heterogeneous in terms of hardware capabilities and communication protocols. Addressing these heterogeneities by following open standards is a necessary step toward communicating with various IoT devices. In this research, we assess the feasibility of applying existing open standards on resource-constrained IoT devices. The standard protocols developed in this research are OGC PUCK over Bluetooth, TinySOS, SOS over CoAP, and the OGC SensorThings API. We believe that by hosting open standard protocols on IoT devices, not only do the devices become self-describable, self-contained, and interoperable, but innovative applications can also be easily developed with standardized interfaces. In addition, we use memory consumption, request message size, response message size, and response latency to benchmark the efficiency of the implemented protocols. In all, this research presents and evaluates standards-based solutions to better understand the feasibility of applying existing standards to the IoT vision. PMID:26402683
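    The message-size metric used in such benchmarks can be illustrated with a toy comparison. The observation fields below are assumptions, not the paper's actual payloads; the point is only that a verbose self-describing encoding costs more bytes than a compact positional one, which matters on resource-constrained devices:

    ```python
    import json

    # Hypothetical sensor observation; field names are illustrative assumptions.
    observation = {"sensor": "thermometer-01",
                   "time": "2015-06-01T12:00:00Z",
                   "property": "air_temperature",
                   "value": 21.4,
                   "unit": "Cel"}

    # Self-describing JSON payload, as a standards-based web API might return.
    json_payload = json.dumps(observation).encode("utf-8")

    # Compact positional encoding, as a constrained device protocol might emit.
    csv_payload = "thermometer-01,2015-06-01T12:00:00Z,air_temperature,21.4,Cel".encode("utf-8")

    size_overhead = len(json_payload) - len(csv_payload)  # bytes saved by the compact form
    ```

    Real evaluations would also measure response latency and device memory use, as the paper does.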

  1. Mobile platform for treatment of stroke: A case study of tele-assistance

    PubMed Central

    Torres Zenteno, Arturo Henry; Fernández, Francisco; Palomino-García, Alfredo; Moniche, Francisco; Escudero, Irene; Jiménez-Hernández, M Dolores; Caballero, Auxiliadora; Escobar-Rodriguez, Germán; Parra, Carlos

    2015-01-01

    This article presents the technological solution for a tele-assistance process for stroke patients in the acute phase in the Seville metropolitan area. The main objective of this process is to reduce the time from symptom onset to treatment for acute-phase stroke patients by means of telemedicine, covering mobility between an intensive care unit ambulance and an expert center and activating the pre-hospital care phase. The technological platform covering the process has been defined following an interoperability model based on standards and with a focus on service-oriented architecture. Messaging has been defined according to the reference model of CEN/ISO 13606, and message content follows the structure of archetypes. An XDS-b (Cross-Enterprise Document Sharing-b) transaction messaging has been designed according to the Integrating the Healthcare Enterprise profile for archetype notifications and update enquiries. This research has been performed by a multidisciplinary group. The Virgen del Rocío University Hospital acts as the reference hospital and the Public Company for Healthcare as the mobility environment. PMID:25975806

  2. STAR Online Framework: from Metadata Collection to Event Analysis and System Control

    NASA Astrophysics Data System (ADS)

    Arkhipkin, D.; Lauret, J.

    2015-05-01

    In preparation for the new era of RHIC running (the RHIC-II upgrades and, possibly, the eRHIC era), the STAR experiment is expanding its modular Message Interface and Reliable Architecture framework (MIRA). MIRA allowed STAR to integrate metadata collection, monitoring, and online QA components in a very agile and efficient manner using a messaging infrastructure approach. In this paper, we briefly summarize our past achievements, provide an overview of recent development activities focused on messaging patterns, and describe our experience with the complex event processor (CEP) recently integrated into the MIRA framework. The CEP was used in the recent RHIC Run 14, which provided practical use cases. Finally, we present our requirements and expectations for the planned expansion of our systems, which will allow our framework to acquire features typically associated with detector control systems. Special attention is given to aspects related to latency, scalability, and interoperability within the heterogeneous set of services and the various data and metadata acquisition components coexisting in the STAR online domain.

  3. Geoscience Information Network (USGIN) Solutions for Interoperable Open Data Access Requirements

    NASA Astrophysics Data System (ADS)

    Allison, M. L.; Richard, S. M.; Patten, K.

    2014-12-01

    The geosciences are leading development of free, interoperable open access to data. The US Geoscience Information Network (USGIN) is a freely available data integration framework, jointly developed by the USGS and the Association of American State Geologists (AASG), in compliance with international standards and protocols to provide easy discovery, access, and interoperability for geoscience data. USGIN standards include the geologic exchange language GeoSciML (v3.2, which enables instant interoperability of geologic formation data), which is also the base standard used by the 117-nation OneGeology consortium. The USGIN deployment of NGDS serves as a continent-scale operational demonstration of the expanded OneGeology vision to provide access to all geoscience data worldwide. USGIN is developed to accommodate a variety of applications; for example, the International Renewable Energy Agency streams data live to the Global Atlas of Renewable Energy. Alternatively, users without robust data sharing systems can download and implement a free software package, "GINstack", to easily deploy web services for exposing data online for discovery and access. The White House Open Data Access Initiative requires all federally funded research projects and federal agencies to make their data publicly accessible in an open source, interoperable format, with metadata. USGIN currently incorporates all aspects of the Initiative, as it emphasizes interoperability. The system is successfully deployed as the National Geothermal Data System (NGDS), officially launched at the White House Energy Datapalooza in May 2014. The USGIN Foundation has been established to ensure this technology continues to be accessible and available.

  4. Patient-clinician mobile communication: analyzing text messaging between adolescents with asthma and nurse case managers.

    PubMed

    Yoo, Woohyun; Kim, Soo Yun; Hong, Yangsun; Chih, Ming-Yuan; Shah, Dhavan V; Gustafson, David H

    2015-01-01

    With the increasing penetration of digital mobile devices among adolescents, mobile text messaging is emerging as a new channel for patient-clinician communication for this population. In particular, it can promote active communication between healthcare clinicians and adolescents with asthma. However, little is known about the content of the messages exchanged in medical encounters via mobile text messaging. Therefore, this study explored the content of text messaging between clinicians and adolescents with asthma. We collected a total of 2,953 text messages exchanged between 5 nurse case managers and 131 adolescents with asthma through a personal digital assistant. The text messages were coded using a scheme developed by adapting categories from the Roter Interaction Analysis System. Nurse case managers sent more text messages (n=2,639) than adolescents with asthma. Most messages sent by nurse case managers were targeted messages (n=2,475) directed at all adolescents with asthma, whereas there were relatively few tailored messages (n=164) that were created personally for an individual adolescent. In addition, both targeted and tailored messages emphasized task-focused behaviors over socioemotional behaviors. Likewise, text messages (n=314) sent by adolescents also emphasized task-focused over socioemotional behaviors. Mobile text messaging has the potential to play an important role in patient-clinician communication. It promotes not only active interaction, but also patient-centered communication with clinicians. In order to achieve this potential, healthcare clinicians may need to focus on socioemotional communication as well as task-oriented communication.

  5. Patient Centeredness in Electronic Communication: Evaluation of Patient-to-Health Care Team Secure Messaging.

    PubMed

    Hogan, Timothy P; Luger, Tana M; Volkman, Julie E; Rocheleau, Mary; Mueller, Nora; Barker, Anna M; Nazi, Kim M; Houston, Thomas K; Bokhour, Barbara G

    2018-03-08

    As information and communication technology is becoming more widely implemented across health care organizations, patient-provider email or asynchronous electronic secure messaging has the potential to support patient-centered communication. Within the medical home model of the Veterans Health Administration (VA), secure messaging is envisioned as a means to enhance access and strengthen the relationships between veterans and their health care team members. However, despite previous studies that have examined the content of electronic messages exchanged between patients and health care providers, less research has focused on the socioemotional aspects of the communication enacted through those messages. Recognizing the potential of secure messaging to facilitate the goals of patient-centered care, the objectives of this analysis were to not only understand why patients and health care team members exchange secure messages but also to examine the socioemotional tone engendered in these messages. We conducted a cross-sectional coding evaluation of a corpus of secure messages exchanged between patients and health care team members over 6 months at 8 VA facilities. We identified patients whose medical records showed secure messaging threads containing at least 2 messages and compiled a random sample of these threads. Drawing on previous literature regarding the analysis of asynchronous, patient-provider electronic communication, we developed a coding scheme comprising a series of a priori patient and health care team member codes. Three team members tested the scheme on a subset of the messages and then independently coded the sample of messaging threads. Of the 711 messages coded from the 384 messaging threads, 52.5% (373/711) were sent by patients and 47.5% (338/711) by health care team members. 
Patient and health care team member messages included logistical content (82.6%, 308/373 vs 89.1%, 301/338), were neutral in tone (70.2%, 262/373 vs 82.0%, 277/338), and were respectful in nature (25.7%, 96/373 vs 33.4%, 113/338). Secure messages from health care team members sometimes appeared hurried (25.4%, 86/338) but also displayed friendliness or warmth (18.9%, 64/338) and reassurance or encouragement (18.6%, 63/338). Most patient messages involved either providing or seeking information; however, the majority of health care team member messages involved information provision in response to patient questions. This evaluation is an important step toward understanding the content and socioemotional tone that is part of the secure messaging exchanges between patients and health care team members. Our findings were encouraging; however, there are opportunities for improvement. As health care organizations seek to supplement traditional encounters with virtual care, they must reexamine their use of secure messaging, including the patient centeredness of the communication and the potential for more proactive use by health care team members. © Timothy P Hogan, Tana M Luger, Julie E Volkman, Mary Rocheleau, Nora Mueller, Anna M Barker, Kim M Nazi, Thomas K Houston, Barbara G Bokhour. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 08.03.2018.

  6. Economic impact of a nationwide interoperable e-Health system using the PENG evaluation tool.

    PubMed

    Parv, L; Saluse, J; Aaviksoo, A; Tiik, M; Sepper, R; Ross, P

    2012-01-01

    The aim of this paper is to evaluate the costs and benefits of the Estonian interoperable health information exchange system. In addition, a framework will be built for follow-up monitoring and analysis of a nationwide HIE system. The PENG evaluation tool was used to map and quantify the costs and benefits arising from type II diabetic patient management for patients, providers, and society. The analysis concludes with a quantification based on real costs and potential benefits identified by a panel of experts. Setting up a countrywide interoperable eHealth system incurs a large initial investment. However, if the system is working seamlessly, benefits will surpass costs within three years. The results show that while society stands to benefit the most, the costs will be borne mainly by the healthcare providers. Therefore, new government policies should be devised to encourage providers to invest, in order to ensure society-wide benefits.

  7. Nuclear Forensic Lab Interoperability and Criminal Investigation

    DTIC Science & Technology

    2014-08-01

    34 Hydrometallurgy (3-4): 175-180 13. Kolodynska, D., H. Hubicka, et al. (2008). " Sorption of heavy metal ions from aqueous solutions in the presence of...EDTA on monodisperse anion exchangers." Desalination (1-3): 150-166 14. Kolodynska, D. (2009). "Polyacrylate anion exchangers in sorption of heavy...F. D. and A. H. Martins (2004). "Selective sorption of nickel and cobalt from sulphate solutions using chelating resins." International Journal of

  8. Online Social Support for Patients with Multiple Sclerosis: A Thematic Analysis of Messages Posted to a Virtual Support Community.

    PubMed

    Shavazi, Masoumeh Abbasi; Morowatisharifabad, Mohammad Ali; Shavazi, Mohammad Taghi Abbasi; Mirzaei, Masoud; Ardekani, Ali Mellat

    2016-07-01

    Currently, with the emergence of the Internet, patients have an opportunity to exchange social support online. However, little attention has been devoted to the different dimensions of online social support exchanged in virtual support communities for patients with multiple sclerosis (MS). To provide a rich insight, the aim of this qualitative study was to explore and categorize different dimensions of online social support in messages exchanged in a virtual support community for patients with MS. A total of 548 posted messages created during a one-year period were selected using purposive sampling to consider maximum variation sampling. Prior-research-driven thematic analysis was then conducted, using Cutrona and Suhr's coding system. The messages that could not be categorized with this coding system were thematically analyzed to explore additional social support themes. The results showed that various forms of social support, including informational, emotional, network, esteem, and tangible support, were exchanged. Moreover, new additional social support themes, including sharing personal experiences, sharing coping strategies, and spiritual support, emerged in this virtual support community. The wide range of online social support exchanged in the virtual support community can be regarded as a supplementary source of social support for patients with MS. Future research can examine online social support more comprehensively, considering the additional social support themes emerging in the present study.

  9. Workshop report: Identifying opportunities for global integration of toxicogenomics databases, 26-27 June 2013, Research Triangle Park, NC, USA.

    PubMed

    Hendrickx, Diana M; Boyles, Rebecca R; Kleinjans, Jos C S; Dearry, Allen

    2014-12-01

    A joint US-EU workshop on enhancing data sharing and exchange in toxicogenomics was held at the National Institute of Environmental Health Sciences. Currently, efficient reuse of data is hampered by problems related to public data availability, data quality, database interoperability (the ability to exchange information), standardization, and sustainability. At the workshop, experts from universities and research institutes presented databases, studies, organizations, and tools that attempt to deal with these problems. Furthermore, a case study was presented showing that combining toxicogenomics data from multiple resources leads to more accurate predictions in risk assessment. All participants agreed that there is a need for a web portal describing the diverse, heterogeneous data resources relevant for toxicogenomics research. Furthermore, there was agreement that linking more data resources would improve toxicogenomics data analysis. To outline a roadmap for enhancing interoperability between data resources, the participants recommended collecting user stories from the toxicogenomics research community on the barriers in data sharing and exchange that currently hamper answering certain research questions. These user stories may guide the prioritization of steps to be taken to enhance the integration of toxicogenomics databases.

  10. A Framework for Integration of Heterogeneous Medical Imaging Networks

    PubMed Central

    Viana-Ferreira, Carlos; Ribeiro, Luís S; Costa, Carlos

    2014-01-01

    Medical imaging is of increasing importance in medical diagnosis and treatment support. Much of this is due to computers, which have revolutionized medical imaging not only in the acquisition process but also in the way images are visualized, stored, exchanged, and managed. Picture Archiving and Communication Systems (PACS) are an example of how medical imaging takes advantage of computers. To solve problems of interoperability between PACS and medical imaging equipment, the Digital Imaging and Communications in Medicine (DICOM) standard was defined and widely implemented in current solutions. More recently, the need to exchange medical data between distinct institutions resulted in the Integrating the Healthcare Enterprise (IHE) initiative, which contains a content profile conceived especially for medical imaging exchange: Cross-Enterprise Document Sharing for Imaging (XDS-I). Moreover, due to application requirements, many solutions developed private networks to support their services. For instance, some applications support enhanced query and retrieve over DICOM object metadata. This paper proposes an integration framework for medical imaging networks that provides protocol interoperability and data federation services. It is an extensible plugin system that supports standard approaches (DICOM and XDS-I) but is also capable of supporting private protocols. The framework is being used in the Dicoogle Open Source PACS. PMID:25279021

  11. A framework for integration of heterogeneous medical imaging networks.

    PubMed

    Viana-Ferreira, Carlos; Ribeiro, Luís S; Costa, Carlos

    2014-01-01

    Medical imaging is of increasing importance in medical diagnosis and treatment support. Much of this is due to computers, which have revolutionized medical imaging not only in the acquisition process but also in the way images are visualized, stored, exchanged, and managed. Picture Archiving and Communication Systems (PACS) are an example of how medical imaging takes advantage of computers. To solve problems of interoperability between PACS and medical imaging equipment, the Digital Imaging and Communications in Medicine (DICOM) standard was defined and widely implemented in current solutions. More recently, the need to exchange medical data between distinct institutions resulted in the Integrating the Healthcare Enterprise (IHE) initiative, which contains a content profile conceived especially for medical imaging exchange: Cross-Enterprise Document Sharing for Imaging (XDS-I). Moreover, due to application requirements, many solutions developed private networks to support their services. For instance, some applications support enhanced query and retrieve over DICOM object metadata. This paper proposes an integration framework for medical imaging networks that provides protocol interoperability and data federation services. It is an extensible plugin system that supports standard approaches (DICOM and XDS-I) but is also capable of supporting private protocols. The framework is being used in the Dicoogle Open Source PACS.
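    The extensible plugin idea described in this record can be sketched as follows; the class and method names are invented for illustration and are not Dicoogle's actual API:

    ```python
    # Hedged sketch of a protocol-plugin registry with data federation:
    # standard protocols (DICOM, XDS-I) and private ones share one query interface.

    class ProtocolPlugin:
        """Base interface every protocol plugin implements."""
        name = "base"

        def query(self, terms):
            raise NotImplementedError

    class DicomPlugin(ProtocolPlugin):
        name = "dicom"

        def query(self, terms):
            # A real plugin would issue a DICOM C-FIND; here we fake a hit.
            return [f"DICOM object matching {terms}"]

    class PrivateMetadataPlugin(ProtocolPlugin):
        name = "private-metadata"

        def query(self, terms):
            # Stand-in for an enhanced query over private object metadata.
            return [f"metadata hit for {terms}"]

    class PluginRegistry:
        def __init__(self):
            self._plugins = {}

        def register(self, plugin):
            self._plugins[plugin.name] = plugin

        def federated_query(self, terms):
            # Data federation: fan the query out to every registered protocol.
            return {name: p.query(terms) for name, p in self._plugins.items()}

    registry = PluginRegistry()
    registry.register(DicomPlugin())
    registry.register(PrivateMetadataPlugin())
    results = registry.federated_query("PatientID=123")
    ```

    The design choice mirrors the abstract: callers see one federation interface, while each protocol's specifics stay behind its plugin.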

  12. ACR Imaging IT Reference Guide: Image Sharing: Evolving Solutions in the Age of Interoperability

    PubMed Central

    Erickson, Bradley J.; Choy, Garry

    2014-01-01

    Interoperability is a major focus of the quickly evolving world of health information technology. Easy yet secure and confidential exchange of imaging exams and the associated reports must be part of the solutions that are implemented. The availability of historical exams is essential for providing a quality interpretation and reducing inappropriate utilization of imaging services. Today, exchange of imaging exams is most often achieved via CD. We describe the virtues of this solution as well as the challenges that have surfaced. Internet and cloud-based technologies employed for many consumer services can provide a better solution, and vendors are making such solutions available. Standards for internet-based exchange are emerging. Just as radiology converged on DICOM as a standard to store and view images, we need a common exchange standard. We review the existing standards and how they are organized into useful workflows through Integrating the Healthcare Enterprise (IHE) profiles; IHE and the standards development process are discussed. Healthcare, and the domain of radiology in particular, must stay current with quickly evolving internet standards. The successful use of the “cloud” will depend upon both the technologies we discuss and the policies put into place around these technologies; we discuss both aspects. The radiology community must lead the way and provide a solution that works for radiologists and clinicians in the electronic medical record (EMR). Lastly, we describe the features we believe radiologists should consider when adding internet-based exchange solutions to their practice. PMID:25467903

  13. A distributed framework for health information exchange using smartphone technologies.

    PubMed

    Abdulnabi, Mohamed; Al-Haiqi, Ahmed; Kiah, M L M; Zaidan, A A; Zaidan, B B; Hussain, Muzammil

    2017-05-01

    Nationwide health information exchange (NHIE) continues to be a persistent concern for government agencies, despite the many efforts made and the conceived benefits of sharing patient data among healthcare providers. Difficulties in ensuring global connectivity and interoperability, along with concerns about security, have always hampered the government from successfully deploying NHIE. By looking at NHIE from a fresh perspective, and bearing in mind the pervasiveness and power of modern mobile platforms, this paper proposes a new approach to NHIE that builds on the notion of consumer-mediated HIE, albeit without the focus on central health record banks. With the growing acceptance of smartphones as reliable, indispensable, and highly personal devices, we suggest taking the concept of mobile personal health records (PHRs installed on smartphones) to the next level. We envision mPHRs that take the form of distributed storage units for health information, under the full control and direct possession of patients, who can have ready access to their personal data whenever needed. However, for the actual exchange of data with health information systems managed by healthcare providers, the latter have to be interoperable with patient-carried mPHRs. The computer industry long ago solved a similar problem of interoperability between peripheral devices and operating systems. We borrow from that solution the idea of providing special interfaces between mPHRs and provider systems; this interface enables the two entities to communicate with no change to either end. The design and operation of the proposed approach are explained. Additional pointers on potential implementations are provided, and issues that pertain to any solution for implementing NHIE are discussed. Copyright © 2017 Elsevier Inc. All rights reserved.

  14. Personal communications: An extension to the mobile satellite

    NASA Technical Reports Server (NTRS)

    Epstein, Murray; Draper, Francois

    1990-01-01

    As time progresses, customer demands become far more universal, involving integrated, simple to operate, cost effective services, with technology virtually transparent to the operator. Industry will be in a position to provide the services necessary to meet subscribers' needs. Our resource-based industries, transportation services, and utilities in the more rural and unserviced areas will require quality, affordable services that can only be supplied via satellite. One answer to these needs will be one- and two-way interoperable data messaging.

  15. Patient–Clinician Mobile Communication: Analyzing Text Messaging Between Adolescents with Asthma and Nurse Case Managers

    PubMed Central

    Kim, Soo Yun; Hong, Yangsun; Chih, Ming-Yuan; Shah, Dhavan V.; Gustafson, David H.

    2015-01-01

    Abstract Background: With the increasing penetration of digital mobile devices among adolescents, mobile text messaging is emerging as a new channel for patient–clinician communication for this population. In particular, it can promote active communication between healthcare clinicians and adolescents with asthma. However, little is known about the content of the messages exchanged in medical encounters via mobile text messaging. Therefore, this study explored the content of text messaging between clinicians and adolescents with asthma. Materials and Methods: We collected a total of 2,953 text messages exchanged between 5 nurse case managers and 131 adolescents with asthma through a personal digital assistant. The text messages were coded using a scheme developed by adapting categories from the Roter Interaction Analysis System. Results: Nurse case managers sent more text messages (n=2,639) than adolescents with asthma. Most messages sent by nurse case managers were targeted messages (n=2,475) directed at all adolescents with asthma, whereas there were relatively few tailored messages (n=164) that were created personally for an individual adolescent. In addition, both targeted and tailored messages emphasized task-focused behaviors over socioemotional behaviors. Likewise, text messages (n=314) sent by adolescents also emphasized task-focused over socioemotional behaviors. Conclusions: Mobile text messaging has the potential to play an important role in patient–clinician communication. It promotes not only active interaction, but also patient-centered communication with clinicians. In order to achieve this potential, healthcare clinicians may need to focus on socioemotional communication as well as task-oriented communication. PMID:25401324

  16. Can openEHR archetypes be used in a national context? The Danish archetype proof-of-concept project.

    PubMed

    Bernstein, Knut; Tvede, Ida; Petersen, Jan; Bredegaard, Kirsten

    2009-01-01

    Semantic interoperability and secondary use of data are important informatics challenges in modern healthcare. Connected Digital Health Denmark is investigating whether the openEHR reference model, archetypes, and templates could be used for representing and exchanging clinical content specifications, and could become a candidate for a national logical infrastructure for semantic interoperability. The Danish archetype proof-of-concept project has tried out some elements of the openEHR methodology in cooperation with regions and vendors. The project has pointed out benefits and challenges of using archetypes, and has identified barriers that need to be addressed in the next steps.

  17. Application of portable CDA for secure clinical-document exchange.

    PubMed

    Huang, Kuo-Hsuan; Hsieh, Sung-Huai; Chang, Yuan-Jen; Lai, Feipei; Hsieh, Sheau-Ling; Lee, Hsiu-Hui

    2010-08-01

    The Health Level Seven (HL7) organization published the Clinical Document Architecture (CDA) for exchanging documents among heterogeneous systems and improving medical quality based on the design method in the CDA. In practice, although the HL7 organization has tried to make medical messages exchangeable, it is still hard to exchange medical messages. Many issues arise when two hospitals want to exchange clinical documents, such as patient privacy, network security, budget, and the strategies of the hospital. In this article, we propose a method for the exchange and sharing of clinical documents in an offline model based on the CDA: the Portable CDA. This allows the physician to retrieve the patient's medical record stored on a portable device, rather than through the Internet in real time. The security and privacy of CDA data are also considered.

  18. An Overview of Genomic Sequence Variation Markup Language (GSVML)

    PubMed Central

    Nakaya, Jun; Hiroi, Kaei; Ido, Keisuke; Yang, Woosung; Kimura, Michio

    2006-01-01

    Internationally accumulated genomic sequence variation data on humans require an interoperable data exchange format. We developed GSVML as such a format. GSVML is oriented toward human health and has three categories. Analyses of use cases in the human health domain and an investigation of existing databases and markup languages were conducted. The ability to interface with the Health Level Seven Genotype Model was examined. GSVML provides a sharable platform for both clinical and research applications.

  19. MRML: an extensible communication protocol for interoperability and benchmarking of multimedia information retrieval systems

    NASA Astrophysics Data System (ADS)

    Mueller, Wolfgang; Mueller, Henning; Marchand-Maillet, Stephane; Pun, Thierry; Squire, David M.; Pecenovic, Zoran; Giess, Christoph; de Vries, Arjen P.

    2000-10-01

    While in the area of relational databases interoperability is ensured by common communication protocols (e.g., ODBC/JDBC using SQL), content-based image retrieval systems (CBIRSs) and other multimedia retrieval systems lack both a common query language and a common communication protocol. Besides its obvious short-term convenience, interoperability of systems is crucial for the exchange and analysis of user data. In this paper, we present and describe an extensible XML-based query markup language called MRML (Multimedia Retrieval Markup Language). MRML is primarily designed to ensure interoperability between different content-based multimedia retrieval systems. Further, MRML allows researchers to preserve their freedom to extend their systems as needed. MRML encapsulates multimedia queries in a way that enables multimedia (MM) query languages, MM content descriptions, MM query engines, and MM user interfaces to grow independently of each other, reaching a maximum of interoperability while ensuring a maximum of freedom for the developer. To benefit from this, only a few simple design principles have to be respected when extending MRML for one's private needs. The design of extensions within the MRML framework is described in detail in the paper. MRML has been implemented and tested for the CBIRS Viper, using the user interface Snake Charmer. Both are part of the GNU project and can be downloaded at our site.
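    A query-by-example message in the spirit of MRML can be sketched with the standard library; the element and attribute names below are illustrative guesses, not the published MRML schema:

    ```python
    import xml.etree.ElementTree as ET

    # Hedged sketch: build an XML query envelope of the kind MRML describes,
    # where a relevance-marked example image drives a content-based search.
    msg = ET.Element("mrml", {"session-id": "s1"})
    query = ET.SubElement(msg, "query-step", {"result-size": "10"})
    relevance = ET.SubElement(query, "relevance-list")
    # One positively weighted example image (attribute names are assumptions).
    ET.SubElement(relevance, "element", {"image-location": "img/0042.jpg",
                                         "relevance": "1"})

    xml_text = ET.tostring(msg, encoding="unicode")
    ```

    The appeal of such an envelope, as the abstract argues, is that the query engine and the user interface only have to agree on this wire format, not on each other's internals.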

  20. Interoperability at ESA Heliophysics Science Archives: IVOA, HAPI and other implementations

    NASA Astrophysics Data System (ADS)

    Martinez-Garcia, B.; Cook, J. P.; Perez, H.; Fernandez, M.; De Teodoro, P.; Osuna, P.; Arnaud, M.; Arviset, C.

    2017-12-01

    The data of ESA heliophysics science missions are preserved at the ESAC Science Data Centre (ESDC). The ESDC aims for the long-term preservation of data from missions such as Ulysses, SOHO, Proba-2, Cluster, Double Star and, in the future, Solar Orbiter. Scientists have access to these data through web services and command-line and graphical user interfaces for each of the corresponding science mission archives. The International Virtual Observatory Alliance (IVOA) provides technical standards that allow interoperability among the different systems that implement them. By adopting some IVOA standards, the ESA heliophysics archives are able to share their data with tools and services that are VO-compatible. Implementations of those standards can be found in the existing archives: the Ulysses Final Archive (UFA) and the SOHO Science Archive (SSA). They already make use of the VOTable format definition and the Simple Application Messaging Protocol (SAMP). For re-engineered or new archives, the implementation of services through the Table Access Protocol (TAP) or the Universal Worker Service (UWS) will leverage this interoperability. This will be the case for the Proba-2 Science Archive (P2SA) and the Solar Orbiter Archive (SOAR). We present here the IVOA standards already used by the ESA heliophysics archives and the work ongoing.

  1. Rollout Strategy to Implement Interoperable Traceability in the Seafood Industry.

    PubMed

    Gooch, Martin; Dent, Benjamin; Sylvia, Gilbert; Cusack, Christopher

    2017-08-01

    Verifying the accuracy and rigor of data exchanged within and between businesses for the purposes of traceability rests on the existence of effective and efficient interoperable information systems that meet users' needs. Interoperability, particularly given the complexities intrinsic to the seafood industry, requires that the systems used by businesses operating along the supply chain share a common technology architecture that is robust, resilient, and evolves as industry needs change. Technology architectures are developed through engaging industry stakeholders in understanding why an architecture is required, the benefits provided to the industry and individual businesses and supply chains, and how the architecture will translate into practical results. This article begins by reiterating the benefits that the global seafood industry can capture by implementing interoperable chain-length traceability, and the reasons for basing the architecture on a peer-to-peer networked database concept rather than on more traditional centralized or linear approaches. A summary of capabilities that already exist within the seafood industry and that the proposed architecture draws on is then discussed, and a strategy for implementing the architecture is presented. The 6-step strategy is presented in the form of a critical path. © 2017 Institute of Food Technologists®.

  2. Online Social Support for Patients with Multiple Sclerosis: A Thematic Analysis of Messages Posted to a Virtual Support Community

    PubMed Central

    Shavazi, Masoumeh Abbasi; Morowatisharifabad, Mohammad Ali; Shavazi, Mohammad Taghi Abbasi; Mirzaei, Masoud; Ardekani, Ali Mellat

    2016-01-01

    Background: Currently, with the emergence of the Internet, patients have an opportunity to exchange social support online. However, little attention has been devoted to the different dimensions of online social support exchanged in virtual support communities for patients with multiple sclerosis (MS). Methods: To provide rich insight, the aim of this qualitative study was to explore and categorize different dimensions of online social support in messages exchanged in a virtual support community for patients with MS. A total of 548 posted messages created during a one-year period were selected using purposive sampling to allow for maximum variation sampling. Prior-research-driven thematic analysis was then conducted. In this regard, we used Cutrona and Suhr’s coding system. The messages that could not be categorized with this coding system were thematically analyzed to explore additional social support themes. Results: The results showed that various forms of social support, including informational, emotional, network, esteem and tangible support, were exchanged. Moreover, new additional social support themes, including sharing personal experiences, sharing coping strategies and spiritual support, emerged in this virtual support community. Conclusion: The wide range of online social support exchanged in the virtual support community can be regarded as a supplementary source of social support for patients with MS. Future research can examine online social support more comprehensively, considering the additional social support themes emerging in the present study. PMID:27382585

  3. Development of a Ground Water Data Portal for Interoperable Data Exchange within the U.S. National Ground Water Monitoring Network and Beyond

    NASA Astrophysics Data System (ADS)

    Booth, N. L.; Brodaric, B.; Lucido, J. M.; Kuo, I.; Boisvert, E.; Cunningham, W. L.

    2011-12-01

    The need for a national groundwater monitoring network within the United States is profound and has been recognized by organizations outside government as a major data gap for managing ground-water resources. Our country's communities, industries, agriculture, energy production and critical ecosystems rely on water being available in adequate quantity and suitable quality. To meet this need the Subcommittee on Ground Water, established by the Federal Advisory Committee on Water Information, created a National Ground Water Monitoring Network (NGWMN) envisioned as a voluntary, integrated system of data collection, management and reporting that will provide the data needed to address present and future ground-water management questions raised by Congress, Federal, State and Tribal agencies and the public. The NGWMN Data Portal is the means by which policy makers, academics and the public will be able to access ground water data through one seamless web-based application from disparate data sources. Data systems in the United States exist at many organizational and geographic levels and differing vocabulary and data structures have prevented data sharing and reuse. The data portal will facilitate the retrieval of and access to groundwater data on an as-needed basis from multiple, dispersed data repositories allowing the data to continue to be housed and managed by the data provider while being accessible for the purposes of the national monitoring network. This work leverages Open Geospatial Consortium (OGC) data exchange standards and information models. To advance these standards for supporting the exchange of ground water information, an OGC Interoperability Experiment was organized among international participants from government, academia and the private sector. The experiment focused on ground water data exchange across the U.S. / Canadian border. 
WaterML2.0, an evolving international standard for water observations, encodes ground water levels and is exchanged using the OGC Sensor Observation Service (SOS) standard. Ground Water Markup Language (GWML) encodes well log, lithology and construction information and is exchanged using the OGC Web Feature Service (WFS) standard. Within the NGWMN Data Portal, data exchange between distributed data provider repositories is achieved through the use of these web services and a central mediation hub, which performs both format (syntactic) and nomenclature (semantic) mediation, conforming heterogeneous inputs into common standards-based outputs. Through these common standards, interoperability between the U.S. NGWMN and Canada's Groundwater Information Network (GIN) is achieved, advancing a ground water virtual observatory across North America.
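To illustrate the kind of standards-based output the mediation hub conforms data into, the following sketch parses groundwater-level time-value pairs from a minimal, hand-written WaterML2.0-style fragment; it is far from a complete WaterML2.0 document and serves only to show the namespace-qualified time-value-pair structure:

```python
import xml.etree.ElementTree as ET

WML2 = "http://www.opengis.net/waterml/2.0"

# Minimal, hand-written WaterML2.0-style fragment (illustrative only;
# a real document carries much more metadata)
doc = f"""
<wml2:MeasurementTimeseries xmlns:wml2="{WML2}">
  <wml2:point><wml2:MeasurementTVP>
    <wml2:time>2011-06-01T12:00:00Z</wml2:time>
    <wml2:value>7.32</wml2:value>
  </wml2:MeasurementTVP></wml2:point>
</wml2:MeasurementTimeseries>
"""

root = ET.fromstring(doc)
# Collect (timestamp, level) pairs from every namespace-qualified TVP element
pairs = [(tvp.find(f"{{{WML2}}}time").text,
          float(tvp.find(f"{{{WML2}}}value").text))
         for tvp in root.iter(f"{{{WML2}}}MeasurementTVP")]
# pairs → [("2011-06-01T12:00:00Z", 7.32)]
```

Because both the U.S. and Canadian services emit the same namespace-qualified encoding, one such parser suffices for data from either side of the border.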

  4. Bringing Health and Fitness Data Together for Connected Health Care: Mobile Apps as Enablers of Interoperability

    PubMed Central

    2015-01-01

    Background A transformation is underway regarding how we deal with our health. Mobile devices make it possible to have continuous access to personal health information. Wearable devices, such as Fitbit and Apple’s smartwatch, can collect data continuously and provide insights into our health and fitness. However, lack of interoperability and the presence of data silos prevent users and health professionals from getting an integrated view of health and fitness data. To provide better health outcomes, a complete picture is needed which combines informal health and fitness data collected by the user with official health records collected by health professionals. Mobile apps are well positioned to play an important role in this aggregation since they can tap into both the official and the informal health data silos. Objective The objective of this paper is to demonstrate that a mobile app can be used to aggregate health and fitness data and can enable interoperability. It discusses various technical interoperability challenges encountered while integrating data into one place. Methods For 8 years, we have worked with third-party partners, including wearable device manufacturers, electronic health record providers, and app developers, to connect an Android app to their (wearable) devices, back-end servers, and systems. Results The result of this research is a health and fitness app called myFitnessCompanion, which enables users to aggregate their data in one place. Over 6000 users worldwide use the app to aggregate their health and fitness data. It demonstrates that mobile apps can be used to enable interoperability. Challenges encountered in the research process included the different wireless protocols and standards used to communicate with wireless devices, the diversity of security and authorization protocols used to exchange data with servers, and the lack of use of standards, such as Health Level Seven, for medical information exchange. 
Conclusions By limiting the negative effects of health data silos, mobile apps can offer a better holistic view of health and fitness data. Data can then be analyzed to offer better and more personalized advice and care. PMID:26581920
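As an example of the kind of standardized payload that would mitigate such silos, the sketch below assembles a minimal HL7 FHIR-style Observation for a heart-rate reading. The abstract mentions Health Level Seven only generically, so casting the message as a FHIR JSON resource is an assumption made here for illustration:

```python
import json

def heart_rate_observation(bpm, when):
    # Minimal HL7 FHIR-style Observation resource; the LOINC code 8867-4
    # denotes heart rate, and UCUM "/min" encodes beats per minute.
    return {
        "resourceType": "Observation",
        "status": "final",
        "code": {"coding": [{"system": "http://loinc.org",
                             "code": "8867-4", "display": "Heart rate"}]},
        "effectiveDateTime": when,
        "valueQuantity": {"value": bpm, "unit": "beats/minute",
                          "system": "http://unitsofmeasure.org",
                          "code": "/min"},
    }

payload = json.dumps(heart_rate_observation(72, "2015-11-18T10:00:00Z"))
```

A wearable vendor, an aggregator app, and an electronic health record that all emit and accept such resources no longer need pairwise custom integrations.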

  5. Bringing Health and Fitness Data Together for Connected Health Care: Mobile Apps as Enablers of Interoperability.

    PubMed

    Gay, Valerie; Leijdekkers, Peter

    2015-11-18

    A transformation is underway regarding how we deal with our health. Mobile devices make it possible to have continuous access to personal health information. Wearable devices, such as Fitbit and Apple's smartwatch, can collect data continuously and provide insights into our health and fitness. However, lack of interoperability and the presence of data silos prevent users and health professionals from getting an integrated view of health and fitness data. To provide better health outcomes, a complete picture is needed which combines informal health and fitness data collected by the user with official health records collected by health professionals. Mobile apps are well positioned to play an important role in this aggregation since they can tap into both the official and the informal health data silos. The objective of this paper is to demonstrate that a mobile app can be used to aggregate health and fitness data and can enable interoperability. It discusses various technical interoperability challenges encountered while integrating data into one place. For 8 years, we have worked with third-party partners, including wearable device manufacturers, electronic health record providers, and app developers, to connect an Android app to their (wearable) devices, back-end servers, and systems. The result of this research is a health and fitness app called myFitnessCompanion, which enables users to aggregate their data in one place. Over 6000 users worldwide use the app to aggregate their health and fitness data. It demonstrates that mobile apps can be used to enable interoperability. Challenges encountered in the research process included the different wireless protocols and standards used to communicate with wireless devices, the diversity of security and authorization protocols used to exchange data with servers, and the lack of use of standards, such as Health Level Seven, for medical information exchange. 
By limiting the negative effects of health data silos, mobile apps can offer a better holistic view of health and fitness data. Data can then be analyzed to offer better and more personalized advice and care.

  6. Reference architecture and interoperability model for data mining and fusion in scientific cross-domain infrastructures

    NASA Astrophysics Data System (ADS)

    Haener, Rainer; Waechter, Joachim; Grellet, Sylvain; Robida, Francois

    2017-04-01

    Interoperability is the key factor in establishing scientific research environments and infrastructures, as well as in bringing together heterogeneous, geographically distributed risk management, monitoring, and early warning systems. Based on developments within the European Plate Observing System (EPOS), a reference architecture has been devised that comprises architectural blueprints and interoperability models regarding the specification of business processes and logic as well as the encoding of data, metadata, and semantics. The architectural blueprint is developed on the basis of the so-called service-oriented architecture (SOA) 2.0 paradigm, which combines the intelligence and proactiveness of event-driven architectures with service-oriented architectures. SOA 2.0 supports analysing (Data Mining) both static and real-time data in order to find correlations of disparate information that do not at first appear intuitively obvious: analysed data (e.g., seismological monitoring) can be enhanced with relationships discovered by associating them (Data Fusion) with other data (e.g., creepmeter monitoring), with digital models of geological structures, or with the simulation of geological processes. The interoperability model describes the information, communication (conversations) and interactions (choreographies) of all participants involved, as well as the processes for registering, providing, and retrieving information. It is based on the principles of functional integration, implemented via dedicated services communicating over service-oriented and message-driven infrastructures. The services provide their functionality via standardised interfaces: instead of requesting data directly, users share data via services that are built upon specific adapters. This approach replaces tight coupling at the data level with a flexible dependency on loosely coupled services. 
    The main component of the interoperability model is the comprehensive semantic description of the information, business logic and processes on the basis of a minimal set of well-known, established standards. It implements the representation of knowledge by applying domain-controlled vocabularies to statements about resources, information, facts, and complex matters (ontologies). Seismic experts, for example, would be interested in geological models or borehole measurements at a certain depth, on the basis of which it is possible to correlate and verify seismic profiles. The entire model is built upon standards from the Open Geospatial Consortium (Dictionaries, Service Layer), the International Organisation for Standardisation (Registries, Metadata), and the World Wide Web Consortium (Resource Description Framework, Spatial Data on the Web Best Practices). It has to be emphasised that this approach is scalable to the greatest possible extent: all information necessary in the context of cross-domain infrastructures is referenced via vocabularies and knowledge bases containing statements that provide either the information itself or the resources (service endpoints) from which the information can be retrieved. The entire infrastructure communication is subject to a broker-based business-logic integration platform in which the information exchanged between the involved participants is managed on the basis of standardised dictionaries, repositories, and registries. This approach also enables the development of Systems-of-Systems (SoS), which allow the collaboration of autonomous, large-scale, concurrent, and distributed systems that nevertheless interact cooperatively as a collective in a common environment.

  7. Controlled Bidirectional Quantum Secure Direct Communication

    PubMed Central

    Chou, Yao-Hsin; Lin, Yu-Ting; Zeng, Guo-Jyun; Lin, Fang-Jhu; Chen, Chi-Yuan

    2014-01-01

    We propose a novel protocol for controlled bidirectional quantum secure communication based on a nonlocal swap gate scheme. Our proposed protocol would be applied to a system in which a controller (supervisor/Charlie) controls the bidirectional communication with quantum information or secret messages between legitimate users (Alice and Bob). In this system, the legitimate users must obtain permission from the controller in order to exchange their respective quantum information or secret messages simultaneously; the controller is unable to obtain any quantum information or secret messages from the decoding process. Moreover, the presence of the controller also avoids the problem of one legitimate user receiving the quantum information or secret message before the other, and then refusing to help the other user decode the quantum information or secret message. Our proposed protocol is aimed at protecting against external and participant attacks on such a system, and the cost of transmitting quantum bits using our protocol is less than that achieved in other studies. Based on the nonlocal swap gate scheme, the legitimate users exchange their quantum information or secret messages without transmission in a public channel, thus protecting against eavesdroppers stealing the secret messages. PMID:25006596

  8. CSlib, a library to couple codes via Client/Server messaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plimpton, Steve

    The CSlib is a small, portable library which enables two (or more) independent simulation codes to be coupled by exchanging messages with each other. Both codes link to the library when they are built, and can then communicate with each other as they run. The messages contain data or instructions that the two codes send back and forth to each other. The messaging can take place via files, sockets, or MPI; the latter is a standard distributed-memory message-passing library.
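The coupling pattern described above can be sketched with plain sockets: two "codes" exchange length-prefixed messages over a connected pair. This is an illustration of the messaging idea only, not the CSlib API, which also supports file and MPI transports:

```python
import socket
import struct

def send_msg(sock, msg_id, payload: bytes):
    # Length-prefixed framing: a (msg_id, nbytes) header, then the payload
    sock.sendall(struct.pack("!II", msg_id, len(payload)) + payload)

def recv_msg(sock):
    header = b""
    while len(header) < 8:                 # read the fixed 8-byte header
        header += sock.recv(8 - len(header))
    msg_id, n = struct.unpack("!II", header)
    data = b""
    while len(data) < n:                   # then exactly n payload bytes
        data += sock.recv(n - len(data))
    return msg_id, data

# Two coupled "codes" sharing one process via a socket pair (illustration only)
a, b = socket.socketpair()
send_msg(a, 1, b"coords:1.0,2.0,3.0")
msg_id, data = recv_msg(b)
a.close(); b.close()
```

In a real coupling the two endpoints would live in separate programs, each linking the messaging library and agreeing only on the message IDs and framing.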

  9. Socializing Messages in Blue-Collar Families: Communicative Pathways to Social Mobility and Reproduction

    ERIC Educational Resources Information Center

    Lucas, Kristen

    2011-01-01

    This study explicitly links processes of anticipatory socialization to social mobility and reproduction. An examination of the socializing messages exchanged between blue-collar parents (n = 41) and their children (n = 25) demonstrates that family-based messages about work and career seldom occur in straightforward, unambiguous ways. Instead,…

  10. Hemodynamic signals of mixed messages during a social exchange.

    PubMed

    Zucker, Nancy L; Green, Steven; Morris, James P; Kragel, Philip; Pelphrey, Kevin A; Bulik, Cynthia M; LaBar, Kevin S

    2011-06-22

    This study used functional magnetic resonance imaging to characterize the hemodynamic activation patterns recruited when participants viewed mixed social communicative messages during a common interpersonal exchange. Mixed messages were defined as conflicting sequences of biological motion and facial affect signals that are unexpected within a particular social context (e.g. observing the reception of a gift). Across four social vignettes, valenced facial expressions were crossed with rejecting and accepting gestures in a virtual avatar responding to the presentation of a gift from the participant. The results indicate that conflicting facial affect and gesture activated the superior temporal sulcus, a region implicated in expectancy violations, as well as the inferior frontal gyrus and putamen. Scenarios conveying rejection differentially activated the insula and putamen, regions implicated in embodied cognition and motivated learning, as well as frontoparietal cortex. Characterizing how meaning is inferred from the integration of conflicting nonverbal communicative cues is essential to understanding the nuances and complexities of human exchange.

  11. 78 FR 14793 - Advancing Interoperability and Health Information Exchange

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-07

    ...: Designing Data Stewardship Entities and Advancing Data Access,'' Health Services Research 2010 45:5, Part II... business information that could be considered to be proprietary. We will post all comments received before... office-based physicians as well as hospitals. 5 6 For example, physician adoption of five core Meaningful...

  12. Opening up Library Automation Software

    ERIC Educational Resources Information Center

    Breeding, Marshall

    2009-01-01

    Throughout the history of library automation, the author has seen a steady advancement toward more open systems. In the early days of library automation, when proprietary systems dominated, the need for standards was paramount since other means of inter-operability and data exchange weren't possible. Today's focus on Application Programming…

  13. Mobile platform for treatment of stroke: A case study of tele-assistance.

    PubMed

    Torres Zenteno, Arturo Henry; Fernández, Francisco; Palomino-García, Alfredo; Moniche, Francisco; Escudero, Irene; Jiménez-Hernández, M Dolores; Caballero, Auxiliadora; Escobar-Rodriguez, Germán; Parra, Carlos

    2016-09-01

    This article presents the technological solution of a tele-assistance process for stroke patients in the acute phase in the Seville metropolitan area. The main objective of this process is to reduce the time from symptom onset to treatment of acute-phase stroke patients by means of telemedicine, covering mobility between an intensive care unit ambulance and an expert center and activating the pre-hospital care phase. The technological platform covering the process has been defined following an interoperability model based on standards and with a service-oriented architecture focus. The messaging definition has been designed according to the reference model of CEN/ISO 13606; message content follows the structure of archetypes. An XDS-b (Cross-Enterprise Document Sharing-b) transaction messaging has been designed according to the Integrating the Healthcare Enterprise profile for archetype notifications and update enquiries. This research has been performed by a multidisciplinary group. The Virgen del Rocío University Hospital acts as the reference hospital and the Public Company for Healthcare as the mobility setting. © The Author(s) 2015.

  14. Physical Forces between Humans and How Humans Attract and Repel Each Other Based on Their Social Interactions in an Online World

    PubMed Central

    Thurner, Stefan; Fuchs, Benedikt

    2015-01-01

    Physical interactions between particles are the result of the exchange of gauge bosons. Human interactions are mediated by the exchange of messages, goods, money, promises, hostilities, etc. While in the physical world interactions and their associated forces have immediate dynamical consequences (Newton’s laws) the situation is not clear for human interactions. Here we quantify the relative acceleration between humans who interact through the exchange of messages, goods and hostilities in a massive multiplayer online game. For this game we have complete information about all interactions (exchange events) between about 430,000 players, and about their trajectories (movements) in the metric space of the game universe at any point in time. We use this information to derive “interaction potentials" for communication, trade and attacks and show that they are harmonic in nature. Individuals who exchange messages and trade goods generally attract each other and start to separate immediately after exchange events end. The form of the interaction potential for attacks mirrors the usual “hit-and-run" tactics of aggressive players. By measuring interaction intensities as a function of distance, velocity and acceleration, we show that “forces" between players are directly related to the number of exchange events. We find an approximate power-law decay of the likelihood for interactions as a function of distance, which is in accordance with previous real world empirical work. We show that the obtained potentials can be understood with a simple model assuming an exchange-driven force in combination with a distance-dependent exchange rate. PMID:26196505

  15. Physical Forces between Humans and How Humans Attract and Repel Each Other Based on Their Social Interactions in an Online World.

    PubMed

    Thurner, Stefan; Fuchs, Benedikt

    2015-01-01

    Physical interactions between particles are the result of the exchange of gauge bosons. Human interactions are mediated by the exchange of messages, goods, money, promises, hostilities, etc. While in the physical world interactions and their associated forces have immediate dynamical consequences (Newton's laws) the situation is not clear for human interactions. Here we quantify the relative acceleration between humans who interact through the exchange of messages, goods and hostilities in a massive multiplayer online game. For this game we have complete information about all interactions (exchange events) between about 430,000 players, and about their trajectories (movements) in the metric space of the game universe at any point in time. We use this information to derive "interaction potentials" for communication, trade and attacks and show that they are harmonic in nature. Individuals who exchange messages and trade goods generally attract each other and start to separate immediately after exchange events end. The form of the interaction potential for attacks mirrors the usual "hit-and-run" tactics of aggressive players. By measuring interaction intensities as a function of distance, velocity and acceleration, we show that "forces" between players are directly related to the number of exchange events. We find an approximate power-law decay of the likelihood for interactions as a function of distance, which is in accordance with previous real world empirical work. We show that the obtained potentials can be understood with a simple model assuming an exchange-driven force in combination with a distance-dependent exchange rate.
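The reported power-law decay of interaction likelihood with distance can be recovered from data by a simple least-squares fit in log-log space, as this sketch shows (the data here are synthetic, generated with exponent 1.5, not the game data from the study):

```python
import math

def fit_power_law(distances, likelihoods):
    # Least-squares fit of log(p) = log(c) - alpha * log(d),
    # i.e. p ≈ c * d**(-alpha); returns the exponent alpha.
    xs = [math.log(d) for d in distances]
    ys = [math.log(p) for p in likelihoods]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope

# Synthetic data following p = d**(-1.5) exactly
d = [1, 2, 4, 8, 16]
p = [x ** -1.5 for x in d]
alpha = fit_power_law(d, p)  # → 1.5 (up to floating point)
```

On real interaction counts one would bin by distance and normalize to likelihoods first; the log-log regression step is the same.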

  16. Design and Implementation of a Lunar Communications Satellite and Server for the 2012 SISO Smackdown

    NASA Technical Reports Server (NTRS)

    Bulgatz, Dennis; Heater, Daniel; O'Neal, Daniel A.; Norris, Bryan; Schricker, Bradley C.

    2012-01-01

    Last year, the Simulation Interoperability Standards Organization (SISO) inaugurated the now annual High Level Architecture (HLA) Smackdown at the Spring Simulation Interoperability Workshop (SIW). A primary objective of the Smackdown event is to provide college students with hands-on experience in HLA. The University of Alabama in Huntsville (UAHuntsville) fielded teams in 2011 and 2012; both Smackdown scenarios were a lunar resupply mission. In 2012, the UAHuntsville team fielded four federates: a communications network federate called the Lunar Communications and Navigation Satellite Service (LCANServ) for sending and receiving messages, a Lunar Satellite Constellation (LCANSat) to put in place the radios needed by the communications network for line-of-sight communication calculations, a 3D graphical display of the orbiting satellites, and a 3D visualization of the lunar surface activities. This paper concentrates on the first two federates by describing their functions, algorithms, the modular FOM, experiences, lessons learned, and recommendations for future Smackdown events.

  17. 78 FR 46395 - Self-Regulatory Organizations; Chicago Board Options Exchange, Incorporated; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-31

    ... after logon 10:00:020--CAS receives a message from Client Application --Counter re-starts 10:00:070--No... receives a message from Client Application --Counter restarts (2) 10:00:000--Heartbeat Request sent to Client Application within login 10:00:020--CAS receives a message from Client Application --Counter re...

  18. Structured electronic physiotherapy records.

    PubMed

    Buyl, Ronald; Nyssen, Marc

    2009-07-01

    With the introduction of the electronic health record, physiotherapists too are encouraged to store their patient records in a structured digital format. The typical nature of a physiotherapy treatment requires a specific record structure to be implemented, with special attention to user-friendliness and communication with other healthcare providers. The objective of this study was to establish a framework for the electronic physiotherapy record and to define a model for the interoperability with the other healthcare providers involved in the patients' care. Although we started from the Belgian context, we used a generic approach so that the results can easily be extrapolated to other countries. The framework we establish here defines not only the different building blocks of the electronic physiotherapy record, but also describes the structure and the content of the exchanged data elements. Through a combined effort by all involved parties, we elaborated an eight-level structure for the electronic physiotherapy record. Furthermore we designed a server-based model for the exchange of data between electronic record systems held by physicians and those held by physiotherapists. Two newly defined XML messages enable data interchange: the physiotherapy prescription and the physiotherapy report. We succeeded in defining a solid, structural model for electronic physiotherapist record systems. Recent wide scale implementation of operational elements such as the electronic registry has proven to make the administrative work easier for the physiotherapist. Moreover, within the proposed framework all the necessary building blocks are present for further data exchange and communication with other healthcare parties in the future. Although we completed the design of the structure and already implemented some new aspects of the electronic physiotherapy record, the real challenge lies in persuading the end-users to start using these electronic record systems. 
    Via a quality-label certification procedure based on adequate criteria, the Ministry of Health tries to promote the use of electronic physiotherapy records. We must keep in mind that physiotherapists will show an interest in electronic record keeping only if it leads to a positive return for them.
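One of the two message types described, the physiotherapy prescription, might look like the following sketch; the element names are hypothetical, since the abstract does not reproduce the actual Belgian XML message definitions:

```python
import xml.etree.ElementTree as ET

def prescription_message(patient_id, diagnosis, sessions):
    """Sketch of a physiotherapy-prescription exchange message.

    All element names here are hypothetical placeholders for the
    actual XML message definition.
    """
    root = ET.Element("physiotherapyPrescription")
    ET.SubElement(root, "patient", {"id": patient_id})
    ET.SubElement(root, "diagnosis").text = diagnosis
    ET.SubElement(root, "prescribedSessions").text = str(sessions)
    return ET.tostring(root, encoding="unicode")

msg = prescription_message("pat-001", "lumbar strain", 9)
```

A physician's record system would emit such a message, and the physiotherapist's system would answer with a structurally similar report message once treatment concludes.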

  19. The DBCLS BioHackathon: standardization and interoperability for bioinformatics web services and workflows. The DBCLS BioHackathon Consortium*.

    PubMed

    Katayama, Toshiaki; Arakawa, Kazuharu; Nakao, Mitsuteru; Ono, Keiichiro; Aoki-Kinoshita, Kiyoko F; Yamamoto, Yasunori; Yamaguchi, Atsuko; Kawashima, Shuichi; Chun, Hong-Woo; Aerts, Jan; Aranda, Bruno; Barboza, Lord Hendrix; Bonnal, Raoul Jp; Bruskiewich, Richard; Bryne, Jan C; Fernández, José M; Funahashi, Akira; Gordon, Paul Mk; Goto, Naohisa; Groscurth, Andreas; Gutteridge, Alex; Holland, Richard; Kano, Yoshinobu; Kawas, Edward A; Kerhornou, Arnaud; Kibukawa, Eri; Kinjo, Akira R; Kuhn, Michael; Lapp, Hilmar; Lehvaslaiho, Heikki; Nakamura, Hiroyuki; Nakamura, Yasukazu; Nishizawa, Tatsuya; Nobata, Chikashi; Noguchi, Tamotsu; Oinn, Thomas M; Okamoto, Shinobu; Owen, Stuart; Pafilis, Evangelos; Pocock, Matthew; Prins, Pjotr; Ranzinger, René; Reisinger, Florian; Salwinski, Lukasz; Schreiber, Mark; Senger, Martin; Shigemoto, Yasumasa; Standley, Daron M; Sugawara, Hideaki; Tashiro, Toshiyuki; Trelles, Oswaldo; Vos, Rutger A; Wilkinson, Mark D; York, William; Zmasek, Christian M; Asai, Kiyoshi; Takagi, Toshihisa

    2010-08-21

    Web services have become a key technology for bioinformatics, since life science databases are globally decentralized and the exponential increase in the amount of available data demands efficient systems without the need to transfer entire databases for every step of an analysis. However, various incompatibilities among database resources and analysis services make it difficult to connect and integrate these into interoperable workflows. To resolve this situation, we invited domain specialists from web service providers, client software developers, Open Bio* projects, the BioMoby project, and researchers of emerging areas where a standard exchange data format is not well established to an intensive collaboration entitled the BioHackathon 2008. The meeting was hosted by the Database Center for Life Science (DBCLS) and the Computational Biology Research Center (CBRC) and was held in Tokyo from February 11th to 15th, 2008. In this report we highlight the work accomplished and the common issues arising from this event, including the standardization of data exchange formats and services in the emerging fields of glycoinformatics, biological interaction networks, text mining, and phyloinformatics. In addition, common shared object development based on BioSQL, as well as technical challenges in large data management, asynchronous services, and security, are discussed. Consequently, we improved the interoperability of web services in several fields; however, further cooperation among major database centers and continued collaborative efforts between service providers and software developers are still necessary for effective advances in bioinformatics web service technologies.

  20. Development of NATO's recognized environmental picture

    NASA Astrophysics Data System (ADS)

    Teufert, John F.; Trabelsi, Mourad

    2006-05-01

    An important element for the fielding of a viable, effective NATO Response Force (NRF) is access to meteorological, oceanographic, and geospatial data (GEOMETOC) and imagery. Currently, the available GEOMETOC information suffers from being very fragmented. NATO defines the Recognised Environmental Picture (REP) as a controlled information base for GEOMETOC data. The NATO REP proposes an architecture that is both flexible and open. The focus lies on enabling a network-centric approach. The key to achieving this is reliance on open, well-recognized standards that apply to both the data exchange protocols and the data formats. Communication and information exchange based on open standards enables system interoperability. Diverse systems, each with unique, specialized contributions to an increased understanding of the battlespace, can now cooperate within a manageable information sphere. By clearly defining responsibilities in the generation of information, a reduction in data transfer overhead is achieved. REP identifies three main stages in the dissemination of GEOMETOC data: Collection, Fusion (and Analysis), and Publication. A REP architecture was successfully deployed during the NATO Coalition Warrior Interoperability Demonstration (CWID) in Lillehammer, Norway, in June 2005. CWID is an annual event to validate and improve the interoperability of NATO and national consultation, command, control, communications, computers, intelligence, surveillance, and reconnaissance (C4ISR) systems. With a test case success rate of 84%, it was able to provide relevant GEOMETOC support to the main NRF component headquarters. In 2006, the REP architecture will be deployed and validated during the NATO NRF Steadfast live exercises.

  1. The DBCLS BioHackathon: standardization and interoperability for bioinformatics web services and workflows. The DBCLS BioHackathon Consortium*

    PubMed Central

    2010-01-01

    Web services have become a key technology for bioinformatics, since life science databases are globally decentralized and the exponential increase in the amount of available data demands efficient systems that do not need to transfer entire databases for every step of an analysis. However, various incompatibilities among database resources and analysis services make it difficult to connect and integrate these into interoperable workflows. To resolve this situation, we invited domain specialists from web service providers, client software developers, Open Bio* projects, the BioMoby project and researchers from emerging areas where a standard exchange data format is not well established, for an intensive collaboration entitled the BioHackathon 2008. The meeting was hosted by the Database Center for Life Science (DBCLS) and Computational Biology Research Center (CBRC) and was held in Tokyo from February 11th to 15th, 2008. In this report we highlight the work accomplished and the common issues that arose from this event, including the standardization of data exchange formats and services in the emerging fields of glycoinformatics, biological interaction networks, text mining, and phyloinformatics. In addition, common shared object development based on BioSQL, as well as technical challenges in large data management, asynchronous services, and security, are discussed. Consequently, we improved the interoperability of web services in several fields; however, further cooperation among major database centers and continued collaborative efforts between service providers and software developers are still necessary for effective advances in bioinformatics web service technologies. PMID:20727200

  2. Common Data Model for Neuroscience Data and Data Model Exchange

    PubMed Central

    Gardner, Daniel; Knuth, Kevin H.; Abato, Michael; Erde, Steven M.; White, Thomas; DeBellis, Robert; Gardner, Esther P.

    2001-01-01

    Objective: Generalizing the data models underlying two prototype neurophysiology databases, the authors describe and propose the Common Data Model (CDM) as a framework for federating a broad spectrum of disparate neuroscience information resources. Design: Each component of the CDM derives from one of five superclasses—data, site, method, model, and reference—or from relations defined between them. A hierarchic attribute-value scheme for metadata enables interoperability with variable tree depth to serve specific intra- or broad inter-domain queries. To mediate data exchange between disparate systems, the authors propose a set of XML-derived schema for describing not only data sets but data models. These include biophysical description markup language (BDML), which mediates interoperability between data resources by providing a meta-description for the CDM. Results: The set of superclasses potentially spans data needs of contemporary neuroscience. Data elements abstracted from neurophysiology time series and histogram data represent data sets that differ in dimension and concordance. Site elements transcend neurons to describe subcellular compartments, circuits, regions, or slices; non-neuroanatomic sites range from sequences to patients. Methods and models are highly domain-dependent. Conclusions: True federation of data resources requires explicit public description, in a metalanguage, of the contents, query methods, data formats, and data models of each data resource. Any data model that can be derived from the defined superclasses is potentially conformant, and interoperability can be enabled by recognition of BDML-described compatibilities. Such metadescriptions can buffer technologic changes. PMID:11141510
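    The hierarchic attribute-value metadata scheme described above can be pictured as a nested tree queried at variable depth. The sketch below is hypothetical: the attribute names, values, and dotted-path query helper are invented for illustration and are not drawn from BDML or the CDM itself.

```python
# Hypothetical sketch of a hierarchic attribute-value metadata tree with
# variable-depth query, in the spirit of the CDM's scheme. All attribute
# names and values here are invented for the example.
metadata = {
    "site": {
        "region": "cortex",
        "cell": {"type": "pyramidal", "compartment": "soma"},
    },
    "method": {"recording": {"mode": "patch-clamp"}},
}

def query(tree: dict, path: str):
    """Walk a dotted path like 'site.cell.type'; return None if absent."""
    node = tree
    for key in path.split("."):
        if not isinstance(node, dict) or key not in node:
            return None
        node = node[key]
    return node

# A shallow query serves broad inter-domain use; a deep one serves
# domain-specific queries over the same tree.
print(query(metadata, "site.region"), query(metadata, "site.cell.type"))
```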

  3. Ocean Data Interoperability Platform (ODIP): developing a common framework for global marine data management

    NASA Astrophysics Data System (ADS)

    Glaves, H. M.

    2015-12-01

    In recent years marine research has become increasingly multidisciplinary in its approach, with a corresponding rise in the demand for large quantities of high-quality interoperable data. This requirement for easily discoverable and readily available marine data is currently being addressed by a number of regional initiatives, with projects such as SeaDataNet in Europe, Rolling Deck to Repository (R2R) in the USA and the Integrated Marine Observing System (IMOS) in Australia having implemented local infrastructures to facilitate the exchange of standardised marine datasets. However, each of these systems has been developed to address local requirements and was created in isolation from those in other regions. Multidisciplinary marine research on a global scale necessitates a common framework for marine data management which is based on existing data systems. The Ocean Data Interoperability Platform project is seeking to address this requirement by bringing together selected regional marine e-infrastructures for the purposes of developing interoperability across them. By identifying the areas of commonality and incompatibility between these data infrastructures, and leveraging the development activities and expertise of these individual systems, three prototype interoperability solutions are being created which demonstrate the effective sharing of marine data and associated metadata across the participating regional data infrastructures as well as with other target international systems such as GEO, COPERNICUS, etc. These interoperability solutions, combined with agreed best practice and approved standards, form the basis of a common global approach to marine data management which can be adopted by the wider marine research community. To encourage implementation of these interoperability solutions by other regional marine data infrastructures, an impact assessment is being conducted to determine both the technical and financial implications of deploying them alongside existing services. The associated best practice and common standards are also being disseminated to the user community through relevant accreditation processes and related initiatives such as the Research Data Alliance and the Belmont Forum.

  4. Realization of Real-Time Clinical Data Integration Using Advanced Database Technology

    PubMed Central

    Yoo, Sooyoung; Kim, Boyoung; Park, Heekyong; Choi, Jinwook; Chun, Jonghoon

    2003-01-01

    As information & communication technologies have advanced, interest in mobile health care systems has grown. In order to obtain information seamlessly from distributed and fragmented clinical data from heterogeneous institutions, we need solutions that integrate data. In this article, we introduce a method for information integration based on real-time message communication using trigger and advanced database technologies. Messages were devised to conform to HL7, a standard for electronic data exchange in healthcare environments. The HL7 based system provides us with an integrated environment in which we are able to manage the complexities of medical data. We developed this message communication interface to generate and parse HL7 messages automatically from the database point of view. We discuss how easily real time data exchange is performed in the clinical information system, given the requirement for minimum loading of the database system. PMID:14728271
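    The interface described above generates and parses HL7 messages automatically; the record does not give message details, so as a rough illustration of the kind of pipe-delimited HL7 v2 message such an interface handles, here is a minimal parsing sketch. The sample message content (facility names, patient, result) is invented, and a production interface would use a full parser that handles escaping, repetition, and acknowledgements.

```python
# Minimal illustration of splitting a pipe-delimited HL7 v2 message into
# segments and fields. The sample ORU^R01 result message below is invented.
raw = "\r".join([
    "MSH|^~\\&|LAB|HOSP|EMR|HOSP|202401150830||ORU^R01|MSG0001|P|2.3",
    "PID|1||123456||DOE^JOHN",
    "OBX|1|NM|GLU^Glucose||105|mg/dL|70-110|N",
])

def parse_hl7(message: str) -> dict:
    """Split an HL7 v2 message into {segment_id: [field lists]}."""
    segments = {}
    for line in message.split("\r"):          # segments are CR-separated
        fields = line.split("|")              # fields are pipe-separated
        segments.setdefault(fields[0], []).append(fields[1:])
    return segments

msg = parse_hl7(raw)
obx = msg["OBX"][0]
print(obx[2], obx[4], obx[5])  # observation id, value, units
```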

  5. Clinical data exchange standards and vocabularies for messages.

    PubMed Central

    Huff, S. M.

    1998-01-01

    Motivation for the creation of electronic data interchange (message) standards is discussed. The ISO Open Systems Interconnection model is described. Clinical information models, message syntax and structure, and the need for a standardized coded vocabulary are explained. The HIPAA legislation and subsequent HHS transaction recommendations are reviewed. The history and mission statements of six of the most popular message development organizations (MDOs) are summarized, and the data exchange standards developed by these organizations are listed. The organizations described include Health Level Seven (HL7), American Society for Testing and Materials (ASTM) E31, Digital Imaging and Communications in Medicine (DICOM), the European Committee for Standardization (Comité Européen de Normalisation) Technical Committee for Health Informatics (CEN/TC 251), the National Council for Prescription Drug Programs (NCPDP), and the Accredited Standards Committee X12 Insurance Subcommittee (X12N). The locations of Internet web sites for the six organizations are provided as resources for further information. PMID:9929183

  6. Gaps Analysis of Integrating Product Design, Manufacturing, and Quality Data in The Supply Chain Using Model-Based Definition.

    PubMed

    Trainer, Asa; Hedberg, Thomas; Feeney, Allison Barnard; Fischer, Kevin; Rosche, Phil

    2016-01-01

    Advances in information technology triggered a digital revolution that holds the promise of reduced costs, improved productivity, and higher quality. To ride this wave of innovation, manufacturing enterprises are changing how product definitions are communicated - from paper to models. To achieve industry's vision of the Model-Based Enterprise (MBE), the MBE strategy must include model-based data interoperability from design to manufacturing and quality in the supply chain. The Model-Based Definition (MBD) is created by the original equipment manufacturer (OEM) using Computer-Aided Design (CAD) tools. This information is then shared with the supplier so that they can manufacture and inspect the physical parts. Today, suppliers predominantly use Computer-Aided Manufacturing (CAM) and Coordinate Measuring Machine (CMM) models for these tasks. Traditionally, the OEM has provided design data to the supplier in the form of two-dimensional (2D) drawings, but may also include a three-dimensional (3D)-shape-geometry model, often in a standards-based format such as ISO 10303-203:2011 (STEP AP203). The supplier then creates the respective CAM and CMM models and machine programs to produce and inspect the parts. In the MBE vision for model-based data exchange, the CAD model must include product-and-manufacturing information (PMI) in addition to the shape geometry. Today's CAD tools can generate models with embedded PMI. And, with the emergence of STEP AP242, a standards-based model with embedded PMI can now be shared downstream. The ongoing research detailed in this paper investigates three concepts: first, that using a STEP AP242 model with embedded PMI for CAD-to-CAM and CAD-to-CMM data exchange is possible and valuable to the overall goal of a more efficient process; second, that gaps in tools, standards, and processes inhibit industry's ability to cost-effectively achieve model-based data interoperability in pursuit of the MBE vision; and finally, how CAD and CMM processes interact and whether feedback from CAM and CMM back to CAD is feasible. The main goal of our study is to test the hypothesis that model-based data interoperability from CAD-to-CAM and CAD-to-CMM is feasible through standards-based integration. This paper presents several barriers to model-based data interoperability. Overall, the project team demonstrated the exchange of product definition data between CAD, CAM, and CMM systems using standards-based methods. While gaps in standards coverage were identified, these gaps should not stop industry's progress toward MBE. The results of our study provide evidence in support of an open-standards method for model-based data interoperability, which would provide maximum value and impact to industry.

  7. Combining the CIDOC CRM and MPEG-7 to Describe Multimedia in Museums.

    ERIC Educational Resources Information Center

    Hunter, Jane

    This paper describes a proposal for an interoperable metadata model, based on international standards, that has been designed to enable the description, exchange and sharing of multimedia resources both within and between cultural institutions. Domain-specific ontologies have been developed by two different ISO Working Groups to standardize the…

  8. A "Simple Query Interface" Adapter for the Discovery and Exchange of Learning Resources

    ERIC Educational Resources Information Center

    Massart, David

    2006-01-01

    Developed as part of CEN/ISSS Workshop on Learning Technology efforts to improve interoperability between learning resource repositories, the Simple Query Interface (SQI) is an Application Program Interface (API) for querying heterogeneous repositories of learning resource metadata. In the context of the ProLearn Network of Excellence, SQI is used…

  9. A tale of three cities--where RHIOS meet the NHIN.

    PubMed

    DeBor, Greg; Diamond, Carol; Grodecki, Don; Halamka, John; Overhage, J Marc; Shirky, Clay

    2006-01-01

    Regional health information exchanges in California, Indiana, and Massachusetts have been collaborating on a prototype for a nationwide health information network, first under the auspices of the Markle Foundation's Connecting for Health program and now under contract to the Department of Health and Human Services' Office of the National Coordinator for Health Information Technology. Since mid-2004, this collaboration has evolved from a collection of regional efforts to a standards-driven cooperative and now to one of four prototype national networks fostered by federal efforts. This development reflects a maturing market for interoperability and integration in healthcare information technology, starting with RHIOs, and suggests one response to the industry's need for the type of plug-and-play information exchange available in other industries. The authors share their experiences and their views of how RHIOs and a Nationwide Health Information Network will further develop to make interoperable electronic health records a reality in coming years. The content of this article is solely the responsibility of the authors and does not necessarily represent the official view of the Office of the National Coordinator for Health Information Technology.

  10. Creating a Connected Community: Lessons Learned from the Western New York Beacon Community

    PubMed Central

    Maloney, Nancy; Heider, Arvela R.; Rockwood, Amy; Singh, Ranjit

    2014-01-01

    Introduction: Secure exchange of clinical data among providers has the potential to improve quality, safety, efficiency, and reduce duplication. Many communities are experiencing challenges in building effective health information exchanges (HIEs). Previous studies have focused on financial and technical issues regarding HIE development. This paper describes the Western New York (WNY) HIE growth and lessons learned about accelerating progress to become a highly connected community. Methods: HEALTHeLINK, with funding from the Office of the National Coordinator for Health Information Technology (ONC) under the Beacon Community Program, expanded HIE usage in eight counties. The communitywide transformation process used three main drivers: (1) a communitywide Electronic Health Record (EHR) adoption program; (2) clinical transformation partners; and (3) HIE outreach and infrastructure development. Results: ONC Beacon Community funding allowed WNY to achieve a new level in the use of interoperable HIE. Electronic delivery of results into the EHR expanded from 23 practices in 2010 to 222 practices in 2013, a tenfold increase. There were more than 12.5 million results delivered electronically (HL7 messages) to 222 practices’ EHRs via the HIE in 2013. Use of a secure portal and Virtual Health Record (VHR) to access reports (those not delivered directly to the EHR) also increased significantly, from 13,344 report views in 2010 to over 600,000 in 2013. Discussion and Conclusion: The WNY Beacon successfully expanded the sharing of clinical information among different sources of data and providers, creating a highly connected community to improve the quality and continuity of care. Technical, organizational, and community lessons described in this paper should prove beneficial to others as they pursue efforts to create connected communities. PMID:25848618

  11. Vehicle Information Exchange Needs for Mobility Applications

    DOT National Transportation Integrated Search

    2012-02-13

    Connected Vehicle to Vehicle (V2V) safety applications heavily rely on the BSM, which is one of the messages defined in the Society of Automotive standard J2735, Dedicated Short Range Communications (DSRC) Message Set Dictionary, November 2009. The B...

  12. 47 CFR 36.379 - Message processing expense.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... JURISDICTIONAL SEPARATIONS PROCEDURES; STANDARD PROCEDURES FOR SEPARATING TELECOMMUNICATIONS PROPERTY COSTS... Customer Operations Expenses § 36.379 Message processing expense. (a) This classification includes the... exchange operation. (1) Effective July 1, 2001 through June 30, 2011, study areas subject to price cap...

  13. Creating a data exchange strategy for radiotherapy research: towards federated databases and anonymised public datasets.

    PubMed

    Skripcak, Tomas; Belka, Claus; Bosch, Walter; Brink, Carsten; Brunner, Thomas; Budach, Volker; Büttner, Daniel; Debus, Jürgen; Dekker, Andre; Grau, Cai; Gulliford, Sarah; Hurkmans, Coen; Just, Uwe; Krause, Mechthild; Lambin, Philippe; Langendijk, Johannes A; Lewensohn, Rolf; Lühr, Armin; Maingon, Philippe; Masucci, Michele; Niyazi, Maximilian; Poortmans, Philip; Simon, Monique; Schmidberger, Heinz; Spezi, Emiliano; Stuschke, Martin; Valentini, Vincenzo; Verheij, Marcel; Whitfield, Gillian; Zackrisson, Björn; Zips, Daniel; Baumann, Michael

    2014-12-01

    Disconnected cancer research data management and lack of information exchange about planned and ongoing research are complicating the utilisation of internationally collected medical information for improving cancer patient care. Rapidly collecting/pooling data can accelerate translational research in radiation therapy and oncology. The exchange of study data is one of the fundamental principles behind data aggregation and data mining. The possibilities of reproducing the original study results, performing further analyses on existing research data to generate new hypotheses or developing computational models to support medical decisions (e.g. risk/benefit analysis of treatment options) represent just a fraction of the potential benefits of medical data-pooling. Distributed machine learning and knowledge exchange from federated databases can be considered one among other attractive approaches for knowledge generation within "Big Data". Data interoperability between research institutions should be the major concern behind a wider collaboration. Information captured in electronic patient records (EPRs) and study case report forms (eCRFs), linked together with medical imaging and treatment planning data, are deemed to be fundamental elements for large multi-centre studies in the field of radiation therapy and oncology. To fully utilise the captured medical information, the study data have to be more than just an electronic version of a traditional (un-modifiable) paper CRF. Challenges that have to be addressed are data interoperability, utilisation of standards, data quality and privacy concerns, data ownership, rights to publish, data pooling architecture and storage. This paper discusses a framework for conceptual packages of ideas focused on a strategic development for international research data exchange in the field of radiation therapy and oncology. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  14. Creating a data exchange strategy for radiotherapy research: Towards federated databases and anonymised public datasets

    PubMed Central

    Skripcak, Tomas; Belka, Claus; Bosch, Walter; Brink, Carsten; Brunner, Thomas; Budach, Volker; Büttner, Daniel; Debus, Jürgen; Dekker, Andre; Grau, Cai; Gulliford, Sarah; Hurkmans, Coen; Just, Uwe; Krause, Mechthild; Lambin, Philippe; Langendijk, Johannes A.; Lewensohn, Rolf; Lühr, Armin; Maingon, Philippe; Masucci, Michele; Niyazi, Maximilian; Poortmans, Philip; Simon, Monique; Schmidberger, Heinz; Spezi, Emiliano; Stuschke, Martin; Valentini, Vincenzo; Verheij, Marcel; Whitfield, Gillian; Zackrisson, Björn; Zips, Daniel; Baumann, Michael

    2015-01-01

    Disconnected cancer research data management and lack of information exchange about planned and ongoing research are complicating the utilisation of internationally collected medical information for improving cancer patient care. Rapidly collecting/pooling data can accelerate translational research in radiation therapy and oncology. The exchange of study data is one of the fundamental principles behind data aggregation and data mining. The possibilities of reproducing the original study results, performing further analyses on existing research data to generate new hypotheses or developing computational models to support medical decisions (e.g. risk/benefit analysis of treatment options) represent just a fraction of the potential benefits of medical data-pooling. Distributed machine learning and knowledge exchange from federated databases can be considered one among other attractive approaches for knowledge generation within “Big Data”. Data interoperability between research institutions should be the major concern behind a wider collaboration. Information captured in electronic patient records (EPRs) and study case report forms (eCRFs), linked together with medical imaging and treatment planning data, are deemed to be fundamental elements for large multi-centre studies in the field of radiation therapy and oncology. To fully utilise the captured medical information, the study data have to be more than just an electronic version of a traditional (un-modifiable) paper CRF. Challenges that have to be addressed are data interoperability, utilisation of standards, data quality and privacy concerns, data ownership, rights to publish, data pooling architecture and storage. This paper discusses a framework for conceptual packages of ideas focused on a strategic development for international research data exchange in the field of radiation therapy and oncology. PMID:25458128

  15. Decentralized operating procedures for orchestrating data and behavior across distributed military systems and assets

    NASA Astrophysics Data System (ADS)

    Peach, Nicholas

    2011-06-01

    In this paper, we present a highly decentralized yet structured and flexible approach to achieving systems interoperability by orchestrating data and behavior across distributed military systems and assets, with security considerations addressed from the beginning. We describe an architecture of a tool-based design of business processes called Decentralized Operating Procedures (DOPs) and the deployment of DOPs onto run-time nodes, supporting the parallel execution of each DOP at multiple implementation nodes (fixed locations, vehicles, sensors and soldiers) throughout a battlefield to achieve flexible and reliable interoperability. The described method allows the architecture to: a) provide fine-grained control of the collection and delivery of data between systems; b) allow the definition of a DOP at a strategic (or doctrine) level by defining required system behavior through process syntax at an abstract level, agnostic of implementation details; c) deploy a DOP into heterogeneous environments by the nomination of actual system interfaces and roles at a tactical level; d) rapidly deploy new DOPs in support of new tactics and systems; e) support multiple instances of a DOP in support of multiple missions; f) dynamically add or remove run-time nodes from a specific DOP instance as mission requirements change; g) model the passage of, and business reasons for, the transmission of each data message to a specific DOP instance to support accreditation; h) run on low-powered computers with lightweight tactical messaging. This approach is designed to extend the capabilities of existing standards, such as the Generic Vehicle Architecture (GVA).

  16. Mapping Department of Defense laboratory results to Logical Observation Identifiers Names and Codes (LOINC).

    PubMed

    Lau, Lee Min; Banning, Pam D; Monson, Kent; Knight, Elva; Wilson, Pat S; Shakib, Shaun C

    2005-01-01

    The Department of Defense (DoD) has used a common application, Composite Health Care System (CHCS), throughout all DoD facilities. However, the master files used to encode patient data in CHCS are not identical across DoD facilities. The encoded data is thus not interoperable from one DoD facility to another. To enable data interoperability in the next-generation system, CHCS II, and for the DoD to exchange laboratory results with external organizations such as the Veterans Administration (VA), the disparate master file codes for laboratory results are mapped to Logical Observation Identifier Names and Codes (LOINC) wherever possible. This paper presents some findings from our experience mapping DoD laboratory results to LOINC.
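    The mapping task described above, linking disparate facility-local master-file codes to a common LOINC code, can be sketched as a simple lookup table. The facility names and local codes below are invented for the example (the paper does not publish its mapping tables), and real mapping work requires matching component, property, units, specimen and method, not just names.

```python
# Illustrative sketch of mapping facility-local lab result codes to LOINC.
# Facility names and local codes are made up; the two LOINC codes shown
# are standard serum/plasma chemistry codes used here as plausible targets.
local_to_loinc = {
    ("FACILITY_A", "GLU-SER"): "2345-7",  # Glucose [Mass/volume] in Serum/Plasma
    ("FACILITY_B", "GLUCOSE"): "2345-7",  # same test, different local code
    ("FACILITY_A", "K-SER"):   "2823-3",  # Potassium [Moles/volume] in Serum/Plasma
}

def to_loinc(facility: str, local_code: str):
    """Return the LOINC code for a facility-local code, or None if unmapped."""
    return local_to_loinc.get((facility, local_code))

# Two facilities' different local codes resolve to one LOINC code, making
# their results comparable when exchanged with external organizations.
print(to_loinc("FACILITY_A", "GLU-SER"), to_loinc("FACILITY_B", "GLUCOSE"))
```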

  17. Integrating radiology information systems with healthcare delivery environments using DICOM and HL7 standards.

    PubMed

    Blazona, Bojan; Koncar, Miroslav

    2006-01-01

    Integration based on open standards, in order to achieve communication and information interoperability, is one of the key aspects of modern health care information systems. Interoperability entails interchange at both the data and communication layers. In this context we identified the HL7 standard as the world's leading medical information and communication technology (ICT) standard for the business layer in healthcare information systems, and we explored the ability to exchange clinical documents with minimal change to integrated healthcare information systems (IHCIS). We explored the ability of the HL7 Clinical Document Architecture (CDA) to integrate radiology information systems (DICOM) with IHCIS (HL7). We introduced the use of WADO services to interconnect with IHCIS and, finally, CDA rendering in widely used web browsers.
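    The WADO interconnection mentioned above retrieves DICOM objects over plain HTTP. As a sketch, the snippet below builds a classic WADO-URI request (requestType, studyUID, seriesUID, objectUID query parameters); the endpoint URL and UID values are placeholders, not taken from the paper.

```python
# Sketch of constructing a WADO-URI request for a DICOM object.
# Endpoint and UIDs are placeholders; parameter names follow the
# classic WADO-URI query interface.
from urllib.parse import urlencode

def wado_url(endpoint: str, study_uid: str, series_uid: str,
             object_uid: str, content_type: str = "image/jpeg") -> str:
    params = {
        "requestType": "WADO",
        "studyUID": study_uid,
        "seriesUID": series_uid,
        "objectUID": object_uid,
        "contentType": content_type,  # e.g. application/dicom for raw objects
    }
    return endpoint + "?" + urlencode(params)

url = wado_url("https://pacs.example.org/wado",
               "1.2.840.1", "1.2.840.1.1", "1.2.840.1.1.1")
print(url)
```

Rendering the returned JPEG (or a CDA document referencing it) in a browser is then an ordinary HTTP GET, which is what makes WADO convenient for IHCIS integration.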

  18. Reference Architecture for MNE 5 Technical System

    DTIC Science & Technology

    2007-05-30

    …of being available in most experiments. Core Services: a core set of applications (directories, web portal and collaboration applications, etc.); security classifications; messages (XML, JMS, content level); metadata filtering (who can initiate services); web browsing; collaboration and messaging; border exchange; audit logging (person and machine; data-level objects, web services, messages).

  19. Vehicle Information Exchange Needs for Mobility Applications : Version 2.0

    DOT National Transportation Integrated Search

    2012-08-01

    Connected Vehicle to Vehicle (V2V) safety applications heavily rely on the BSM, which is one of the messages defined in the Society of Automotive standard J2735, Dedicated Short Range Communications (DSRC) Message Set Dictionary, November 2009. The B...

  20. Secure and interoperable communication infrastructures for PPDR organisations

    NASA Astrophysics Data System (ADS)

    Müller, Wilmuth; Marques, Hugo; Pereira, Luis; Rodriguez, Jonathan; Brouwer, Frank; Bouwers, Bert; Politis, Ilias; Lykourgiotis, Asimakis; Ladas, Alexandros; Adigun, Olayinka; Jelenc, David

    2016-05-01

    The growing number of events affecting public safety and security (PS&S) on a regional scale, with the potential to grow into large-scale cross-border disasters, puts increased pressure on agencies and organisations responsible for PS&S. In order to respond timely and adequately to such events, Public Protection and Disaster Relief (PPDR) organisations need to cooperate, align their procedures and activities, share the needed information and be interoperable. Existing PPDR/PMR technologies such as TETRA, TETRAPOL or P25 do not currently provide broadband capability, nor are such technologies expected to be upgraded in the future. This presents a major limitation in supporting new services and information flows. Furthermore, there is no known standard that addresses interoperability of these technologies. In this contribution, the design of a next-generation communication infrastructure for PPDR organisations which fulfills the requirements of secure and seamless end-to-end communication and interoperable information exchange within the deployed communication networks is presented. Based on the Enterprise Architecture of PPDR organisations, a next-generation PPDR network that is backward compatible with legacy communication technologies is designed and implemented, capable of providing security, privacy, seamless mobility, QoS and reliability support for mission-critical Private Mobile Radio (PMR) voice and broadband data services. The designed solution provides a robust, reliable, and secure mobile broadband communications system for a wide variety of PMR applications and services on PPDR broadband networks, including the ability of inter-system, inter-agency and cross-border operations, with emphasis on interoperability between users in PMR and LTE.

  1. Building Relationships through Exchange

    ERIC Educational Resources Information Center

    Primavera, Angi; Hall, Ellen

    2011-01-01

    From the moment of birth, children form and develop relationships with others in their world based on exchange. Children recognize that engaging in such encounters offers them the opportunity to enter into a relationship with another individual and to nurture that relationship through the exchange of messages and gifts, items and ideas. At Boulder…

  2. Ontological modeling of electronic health information exchange.

    PubMed

    McMurray, J; Zhu, L; McKillop, I; Chen, H

    2015-08-01

    Investments of resources to purposively improve the movement of information between health system providers are currently made with imperfect information. No inventories of system-level electronic health information flows currently exist, nor do measures of inter-organizational electronic information exchange. Using Protégé 4, an open-source OWL Web ontology language editor and knowledge-based framework, we formalized a model that decomposes inter-organizational electronic health information flow into derivative concepts such as diversity, breadth, volume, structure, standardization and connectivity. The ontology was populated with data from a regional health system and the flows were measured. Individual instances' properties were inferred from their class associations as determined by their data and object property rules. It was also possible to visualize interoperability activity for regional analysis and planning purposes. A property called Impact was created from the total number of patients or clients that a health entity in the region served in a year, and the total number of health service providers or organizations with whom it exchanged information in support of clinical decision-making, diagnosis or treatment. Identifying providers with a high Impact but low Interoperability score could assist planners and policy-makers to optimize technology investments intended to electronically share patient information across the continuum of care. Finally, we demonstrated how linked ontologies were used to identify logical inconsistencies in self-reported data for the study. Copyright © 2015 Elsevier Inc. All rights reserved.
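
    The screening rule the abstract describes can be sketched in a few lines. The abstract names the two inputs to Impact but not how they are combined, so the product used below, like the field names, is an assumption for illustration only:

```python
def flag_investment_targets(providers, impact_cutoff, interop_cutoff):
    """Flag health entities with high Impact but a low Interoperability
    score, as candidates for technology investment.

    Impact is taken here as patients served x exchange partners -- an
    assumed combination; the source defines only the two inputs.
    """
    flagged = []
    for p in providers:
        impact = p["patients_served"] * p["exchange_partners"]
        if impact >= impact_cutoff and p["interoperability"] < interop_cutoff:
            flagged.append(p["name"])
    return flagged
```

Applied to an inventory of regional providers, the rule surfaces exactly the entities whose information-sharing capability lags furthest behind their reach.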

  3. Health Information Exchange as a Complex and Adaptive Construct: Scoping Review.

    PubMed

    Akhlaq, Ather; Sheikh, Aziz; Pagliari, Claudia

    2017-01-25

    To understand how the concept of Health Information Exchange (HIE) has evolved over time.  Supplementary analysis of data from a systematic scoping review of definitions of HIE from 1900 to 2014, involving temporal analysis of underpinning themes. The search identified 268 unique definitions of HIE dating from 1957 onwards; 103 in scientific databases and 165 in Google. These contained consistent themes, representing the core concept of exchanging health information electronically, as well as fluid themes, reflecting the evolving policy, business, organisational and technological context of HIE (including the emergence of HIE as an organisational 'entity'). These are summarised graphically to show how the concept has evolved around the world with the passage of time.  The term HIE emerged in 1957 with the establishment of Occupational HIE, evolving through the 1990s with concepts such as electronic data interchange and mobile computing technology; then from 2006-10 largely aligning with the US Government's health information technology strategy and the creation of HIEs as organisational entities, alongside the broader interoperability imperative, and continuing to evolve today as part of a broader international agenda for sustainable, information-driven health systems. The concept of HIE is an evolving and adaptive one, reflecting the ongoing quest for integrated and interoperable information to improve the efficiency and effectiveness of health systems, in a changing technological and policy environment.

  4. GPULife

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelly, Priscilla N.

    2016-08-12

    The code runs the Game of Life among several processors. Each processor uses CUDA to set up the grid's buffer on the GPU, and that buffer is fed to other GPU languages to apply the rules of the game of life. Only the halo is copied off the buffer and exchanged using MPI. This code looks at the interoperability of GPU languages among current platforms.
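
    The halo-exchange pattern the abstract relies on can be sketched without MPI or CUDA: each rank keeps its block of the grid local and communicates only the boundary rows. A minimal illustration in Python, with the rank layout and field names invented for the sketch; the real code would replace the row copies with MPI send/receive calls:

```python
# Two simulated "ranks", each owning two interior rows of a 4x4 grid
# that is split by rows, with periodic boundaries.
ranks = [
    {"cells": [[0, 1, 0, 0], [0, 1, 0, 0]]},
    {"cells": [[0, 1, 0, 0], [0, 0, 0, 0]]},
]

def exchange_halos(ranks):
    """Give every rank copies of its neighbours' boundary rows.

    Only these halo rows cross the "network"; the bulk of each grid
    block never moves, which is the point of a halo exchange (in
    GPULife, the interior stays resident on the GPU).
    """
    n = len(ranks)
    for r, g in enumerate(ranks):
        g["halo_up"] = list(ranks[(r - 1) % n]["cells"][-1])   # from rank above
        g["halo_down"] = list(ranks[(r + 1) % n]["cells"][0])  # from rank below
    return ranks
```

After the exchange, each rank has enough neighbouring state to apply the Game of Life rules to all of its own rows without further communication.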

  5. XDS-I outsourcing proxy: ensuring confidentiality while preserving interoperability.

    PubMed

    Ribeiro, Luís S; Viana-Ferreira, Carlos; Oliveira, José Luís; Costa, Carlos

    2014-07-01

    The interoperability of services and the sharing of health data have been a continuous goal for health professionals, patients, institutions, and policy makers. However, several issues have been hindering this goal, such as incompatible implementations of standards (e.g., HL7, DICOM), multiple ontologies, and security constraints. Cross-enterprise document sharing (XDS) workflows were proposed by Integrating the Healthcare Enterprise (IHE) to address current limitations in exchanging clinical data among organizations. To ensure data protection, XDS actors must be placed in trustworthy domains, which are normally inside such institutions. However, due to rapidly growing IT requirements, the outsourcing of resources in the Cloud is becoming very appealing. This paper presents a software proxy that enables the outsourcing of XDS architectural parts while preserving the interoperability, confidentiality, and searchability of clinical information. A key component in our architecture is a new searchable encryption (SE) scheme-Posterior Playfair Searchable Encryption (PPSE)-which, besides keeping the same confidentiality levels of the stored data, hides the search patterns to the adversary, bringing improvements when compared to the remaining practical state-of-the-art SE schemes.

  6. Camouflage Traffic: Minimizing Message Delay for Smart Grid Applications under Jamming

    DTIC Science & Technology

    2014-04-01

    technologies. To facilitate efficient information exchange, wireless networks have been proposed to be widely used in the smart grid. However, the jamming...attack that constantly broadcasts radio interference is a primary security threat to prevent the deployment of wireless networks in the smart grid. Hence... wireless communications, while at the same time providing latency guarantee for control messages. An open question is how to minimize message delay for

  7. OGC and Grid Interoperability in enviroGRIDS Project

    NASA Astrophysics Data System (ADS)

    Gorgan, Dorian; Rodila, Denisa; Bacu, Victor; Giuliani, Gregory; Ray, Nicolas

    2010-05-01

    EnviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is a 4-year FP7 project aiming to address the subjects of ecologically unsustainable development and inadequate resource management. The project develops a Spatial Data Infrastructure of the Black Sea Catchment region. Geospatial technologies offer highly specialized functionality for Earth Science oriented applications, while Grid technology is able to support distributed and parallel processing. One challenge of the enviroGRIDS project is the interoperability between geospatial and Grid infrastructures, providing the basic and the extended features of both technologies. The geospatial interoperability technology has been promoted as a way of dealing with large volumes of geospatial data in distributed environments through the development of interoperable Web service specifications proposed by the Open Geospatial Consortium (OGC), with applications spread across multiple fields but especially in Earth observation research. Due to the huge volumes of data available in the geospatial domain and the additional issues this introduces (data management, secure data transfer, data distribution and data computation), the need for an infrastructure capable of managing all those problems becomes an important aspect. The Grid promotes and facilitates the secure interoperation of heterogeneous distributed geospatial data within a distributed environment, the creation and management of large distributed computational jobs, and assures a security level for communication and transfer of messages based on certificates. This presentation analyses and discusses the most significant use cases for enabling OGC Web services interoperability with the Grid environment and focuses on the description and implementation of the most promising one.
In these use cases we give special attention to issues such as: the relations between the computational Grid and the OGC Web service protocols, the advantages offered by Grid technology - such as providing secure interoperability between distributed geospatial resources - and the issues introduced by the integration of distributed geospatial data in a secure environment: data and service discovery, management, access and computation. The enviroGRIDS project proposes a new architecture which allows a flexible and scalable approach for integrating the geospatial domain, represented by the OGC Web services, with the Grid domain, represented by the gLite middleware. The parallelism offered by Grid technology is discussed and explored at the data level, management level and computation level. The analysis is carried out for OGC Web service interoperability in general, but specific details are emphasized for the Web Map Service (WMS), Web Feature Service (WFS), Web Coverage Service (WCS), Web Processing Service (WPS) and Catalogue Service for the Web (CSW). Issues regarding the mapping and the interoperability between the OGC and the Grid standards and protocols are analyzed, as they are the basis for solving the communication problems between the two environments: Grid and geospatial. The presentation mainly highlights how the Grid environment and Grid application capabilities can be extended and utilized in geospatial interoperability. Interoperability between geospatial and Grid infrastructures provides features such as the specific complex geospatial functionality and the high-power computation and security of the Grid, high spatial model resolution and geographical area coverage, and flexible combination and interoperability of the geographical models.
In accordance with Service Oriented Architecture concepts and the requirements of interoperability between geospatial and Grid infrastructures, each main piece of functionality is exposed through the enviroGRIDS Portal and, consequently, through end-user applications such as Decision Maker/Citizen oriented applications. The enviroGRIDS portal is the user's single point of entry into the system, and the portal presents a uniform graphical user interface. Main reference for further information: [1] enviroGRIDS Project, http://www.envirogrids.net/

  8. A Proposed Operational Concept for the Defense Communications Operations Support System.

    DTIC Science & Technology

    1986-01-01

    Artificial Intelligence AMA Automatic Message Accounting AMIE AUTODIN Management Index System AMPE Automated Message Processing Exchange ANCS AUTOVON Network...Support IMPRESS Impact/Restoral System INFORM Information Retrieval System IOC Initial Operational Capability IRU Intelligent Remote Unit I-S/A AMPE

  9. Analysis of Relational Communication in Dyads: New Measurement Procedures.

    ERIC Educational Resources Information Center

    Rogers, L. Edna; Farace, Richard

    Relational communication refers to the control or dominance aspects of message exchange in dyads--distinguishing it from the report or referential aspects of communication. In relational communicational analysis, messages as transactions are emphasized; major theoretical concepts which emerge are symmetry, transitoriness, and complementarity of…

  10. Group Centric Networking: A new Approach for Wireless Multi-Hop Networking to Enable the Internet of Things

    DTIC Science & Technology

    2015-11-11

    reliable data message delivery. The basic mechanism of link-based routing schemes is the broadcasting of a control message (called a “hello”) to all of its...shortest path route to a destination by using the set of exchanged hello messages between users of the network. With sufficiently high frequency...hello messages are successfully exchanged across a high error link, and since this link is of longer distance, it gets used to build a shortest path

  11. Group Centric Networking: A new Approach for Wireless Multi-Hop Networking to Enable the Internet of Things

    DTIC Science & Technology

    2015-09-07

    reliable data message delivery. The basic mechanism of link-based routing schemes is the broadcasting of a control message (called a “hello”) to all of its...shortest path route to a destination by using the set of exchanged hello messages between users of the network. With sufficiently high frequency...hello messages are successfully exchanged across a high error link, and since this link is of longer distance, it gets used to build a shortest path

  12. CCR+: Metadata Based Extended Personal Health Record Data Model Interoperable with the ASTM CCR Standard.

    PubMed

    Park, Yu Rang; Yoon, Young Jo; Jang, Tae Hun; Seo, Hwa Jeong; Kim, Ju Han

    2014-01-01

    Extension of the standard model while retaining compliance with it is a challenging issue because there is currently no method for semantically or syntactically verifying an extended data model. A metadata-based extended model, named CCR+, was designed and implemented to achieve interoperability between standard and extended models. Furthermore, a multilayered validation method was devised to validate the standard and extended models. The American Society for Testing and Materials (ASTM) Continuity of Care Record (CCR) standard was selected to evaluate the CCR+ model; two CCR and one CCR+ XML files were evaluated. In total, 188 metadata elements were extracted from the ASTM CCR standard; these metadata are semantically interconnected and registered in the metadata registry. An extended-data-model-specific validation file was generated from these metadata. This file can be used in a smartphone application (Health Avatar CCR+) as part of a multilayered validation. The new CCR+ model was successfully evaluated via a patient-centric exchange scenario involving multiple hospitals, with the results supporting both syntactic and semantic interoperability between the standard CCR and the extended CCR+ model. A feasible method for delivering an extended model that complies with the standard model is presented herein. There is a great need to extend static standard models such as the ASTM CCR in various domains; the methods presented here represent an important reference for achieving interoperability between standard and extended models.

  13. Data distribution service-based interoperability framework for smart grid testbed infrastructure

    DOE PAGES

    Youssef, Tarek A.; Elsayed, Ahmed T.; Mohammed, Osama A.

    2016-03-02

    This study presents the design and implementation of a communication and control infrastructure for smart grid operation. The proposed infrastructure enhances the reliability of the measurements and control network. The advantages of utilizing the data-centric over the message-centric communication approach are discussed in the context of smart grid applications. The data distribution service (DDS) is used to implement a data-centric common data bus for the smart grid. This common data bus improves communication reliability, enabling distributed control and smart load management. These enhancements are achieved by avoiding a single point of failure while enabling peer-to-peer communication and an automatic discovery feature for dynamically participating nodes. The infrastructure and ideas presented in this paper were implemented and tested on the smart grid testbed. A toolbox and application programming interface for the testbed infrastructure are developed in order to facilitate interoperability and remote access to the testbed. This interface allows control, monitoring, and performing of experiments remotely. Furthermore, it could be used to integrate multidisciplinary testbeds to study complex cyber-physical systems (CPS).
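
    The data-centric idea contrasted with message-centric designs above can be reduced to a small sketch: publishers write samples to named topics, subscribers declare interest in topics rather than in peers, and late joiners are brought up to date from a last-value cache. This is an illustration of the pattern only, not the DDS API, and all names in it are invented:

```python
from collections import defaultdict

class DataBus:
    """Tiny topic-based bus illustrating data-centric publish/subscribe.

    There is no point-to-point addressing and no broker that participants
    must name: nodes couple only through topic names, which is what lets a
    data-centric design avoid a single point of failure and admit
    dynamically discovered participants.
    """
    def __init__(self):
        self._subs = defaultdict(list)  # topic name -> subscriber callbacks
        self._last = {}                 # topic name -> most recent sample

    def publish(self, topic, sample):
        self._last[topic] = sample
        for callback in self._subs[topic]:
            callback(sample)

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)
        if topic in self._last:
            callback(self._last[topic])  # late joiner receives current state
```

A controller that subscribes to a measurement topic after publishing has started still receives the latest sample immediately, which is the behaviour that makes dynamic participation workable.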

  14. Reasoning about Resources and Hierarchical Tasks Using OWL and SWRL

    NASA Astrophysics Data System (ADS)

    Elenius, Daniel; Martin, David; Ford, Reginald; Denker, Grit

    Military training and testing events are highly complex affairs, potentially involving dozens of legacy systems that need to interoperate in a meaningful way. There are superficial interoperability concerns (such as two systems not sharing the same messaging formats), but also substantive problems such as different systems not sharing the same understanding of the terrain, positions of entities, and so forth. We describe our approach to facilitating such events: describe the systems and requirements in great detail using ontologies, and use automated reasoning to automatically find and help resolve problems. The complexity of our problem took us to the limits of what one can do with OWL, and we needed to introduce some innovative techniques of using and extending it. We describe our novel ways of using SWRL and discuss its limitations as well as extensions to it that we found necessary or desirable. Another innovation is our representation of hierarchical tasks in OWL, and an engine that reasons about them. Our task ontology has proved to be a very flexible and expressive framework to describe requirements on resources and their capabilities in order to achieve some purpose.

  15. A Proposed Information Architecture for Telehealth System Interoperability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Craft, R.L.; Funkhouser, D.R.; Gallagher, L.K.

    1999-04-20

    We propose an object-oriented information architecture for telemedicine systems that promotes secure 'plug-and-play' interaction between system components through standardized interfaces, communication protocols, messaging formats, and data definitions. In this architecture, each component functions as a black box, and components plug together in a 'lego-like' fashion to achieve the desired device or system functionality. Telemedicine systems today rely increasingly on distributed, collaborative information technology during the care delivery process. While these leading-edge systems are bellwethers for highly advanced telemedicine, most are custom-designed and do not interoperate with other commercial offerings. Users are limited to the set of functionality that a single vendor provides and must often pay high prices to obtain this functionality, since vendors in this marketplace must deliver entire systems in order to compete. Besides increasing corporate research and development costs, this inhibits the ability of the user to make intelligent purchasing decisions regarding best-of-breed technologies. This paper proposes a reference architecture for plug-and-play telemedicine systems that addresses these issues.

  16. Using IHE and HL7 conformance to specify consistent PACS interoperability for a large multi-center enterprise.

    PubMed

    Henderson, Michael L; Dayhoff, Ruth E; Titton, Csaba P; Casertano, Andrew

    2006-01-01

    As part of its patient care mission, the U.S. Veterans Health Administration (VHA) performs diagnostic imaging procedures at 141 medical centers and 850 outpatient clinics. VHA's VistA Imaging package provides a full archival, display, and communications infrastructure and interfaces to radiology and other HIS modules, as well as to modalities and a worklist provider. In addition, various medical center entities within VHA have elected to install commercial picture archiving and communications systems (PACS) to enable image organization and interpretation. To evaluate interfaces between commercial PACS, the VistA hospital information system, and imaging modalities, VHA has built a fully constrained specification based on the Radiology Technical Framework (Rad-TF) of Integrating the Healthcare Enterprise (IHE). The Health Level Seven normative conformance mechanism was applied to the IHE Rad-TF and agency requirements to arrive at a baseline set of message specifications. VHA provides a thorough implementation and testing process to promote the adoption of standards-based interoperability by all PACS vendors that want to interface with VistA Imaging.

  17. Auto-Generated Semantic Processing Services

    NASA Technical Reports Server (NTRS)

    Davis, Rodney; Hupf, Greg

    2009-01-01

    Auto-Generated Semantic Processing (AGSP) Services is a suite of software tools for automated generation of other computer programs, denoted cross-platform semantic adapters, that support interoperability of computer-based communication systems that utilize a variety of both new and legacy communication software running in a variety of operating- system/computer-hardware combinations. AGSP has numerous potential uses in military, space-exploration, and other government applications as well as in commercial telecommunications. The cross-platform semantic adapters take advantage of common features of computer- based communication systems to enforce semantics, messaging protocols, and standards of processing of streams of binary data to ensure integrity of data and consistency of meaning among interoperating systems. The auto-generation aspect of AGSP Services reduces development time and effort by emphasizing specification and minimizing implementation: In effect, the design, building, and debugging of software for effecting conversions among complex communication protocols, custom device mappings, and unique data-manipulation algorithms is replaced with metadata specifications that map to an abstract platform-independent communications model. AGSP Services is modular and has been shown to be easily integrable into new and legacy NASA flight and ground communication systems.

  18. Don't bother to wrap it: online Giftgiver and Bugchaser newsgroups, the social impact of gift exchanges and the 'carnivalesque'.

    PubMed

    Graydon, Michael

    2007-01-01

    At online internet newsgroups, members who self-identify as Giftgivers and Bugchasers post messages describing exchanging HIV as a gift, as the Gift. Using the literature on the social function of gift exchanges, this paper considers how Giftgiver and Bugchaser messages mobilize the language of gifts. In doing so, newsgroup members generate an ontological narrative wherein HIV, as the Gift, promotes social bonds, the creation and maintenance of self identity and social roles, and the meeting of particular goals. Thus the Gift appears to fulfill (at least discursively) many of the social functions of gift exchanges as described in the literature. Online newsgroups function as fora for the expression of a contrarian, transgressive HIV/AIDS narrative and act as 'carnivalesque' spaces in which normative social roles and meaning are inverted in the establishment of a realm whereby HIV becomes a gift.

  19. Gaps Analysis of Integrating Product Design, Manufacturing, and Quality Data in The Supply Chain Using Model-Based Definition

    PubMed Central

    Trainer, Asa; Hedberg, Thomas; Feeney, Allison Barnard; Fischer, Kevin; Rosche, Phil

    2017-01-01

    Advances in information technology triggered a digital revolution that holds promise of reduced costs, improved productivity, and higher quality. To ride this wave of innovation, manufacturing enterprises are changing how product definitions are communicated – from paper to models. To achieve industry's vision of the Model-Based Enterprise (MBE), the MBE strategy must include model-based data interoperability from design to manufacturing and quality in the supply chain. The Model-Based Definition (MBD) is created by the original equipment manufacturer (OEM) using Computer-Aided Design (CAD) tools. This information is then shared with the supplier so that they can manufacture and inspect the physical parts. Today, suppliers predominantly use Computer-Aided Manufacturing (CAM) and Coordinate Measuring Machine (CMM) models for these tasks. Traditionally, the OEM has provided design data to the supplier in the form of two-dimensional (2D) drawings, but may also include a three-dimensional (3D)-shape-geometry model, often in a standards-based format such as ISO 10303-203:2011 (STEP AP203). The supplier then creates the respective CAM and CMM models and machine programs to produce and inspect the parts. In the MBE vision for model-based data exchange, the CAD model must include product-and-manufacturing information (PMI) in addition to the shape geometry. Today's CAD tools can generate models with embedded PMI. And, with the emergence of STEP AP242, a standards-based model with embedded PMI can now be shared downstream. The on-going research detailed in this paper seeks to investigate three concepts. First, that the ability to utilize a STEP AP242 model with embedded PMI for CAD-to-CAM and CAD-to-CMM data exchange is possible and valuable to the overall goal of a more efficient process. 
Second, the research identifies gaps in tools, standards, and processes that inhibit industry's ability to cost-effectively achieve model-based-data interoperability in the pursuit of the MBE vision. Finally, it also seeks to explore the interaction between CAD and CMM processes and determine if the concept of feedback from CAM and CMM back to CAD is feasible. The main goal of our study is to test the hypothesis that model-based-data interoperability from CAD-to-CAM and CAD-to-CMM is feasible through standards-based integration. This paper presents several barriers to model-based-data interoperability. Overall, the project team demonstrated the exchange of product definition data between CAD, CAM, and CMM systems using standards-based methods. While gaps in standards coverage were identified, the gaps should not stop industry's progress toward MBE. The results of our study provide evidence in support of an open-standards method to model-based-data interoperability, which would provide maximum value and impact to industry. PMID:28691120

  20. CCSDS Spacecraft Monitor and Control Mission Operations Interoperability Prototype

    NASA Technical Reports Server (NTRS)

    Lucord, Steve; Martinez, Lindolfo

    2009-01-01

    We are entering a new era in space exploration. Reduced operating budgets require innovative solutions to leverage existing systems to implement the capabilities of future missions. Custom solutions to fulfill mission objectives are no longer viable. Can NASA adopt international standards to reduce costs and increase interoperability with other space agencies? Can legacy systems be leveraged in a service oriented architecture (SOA) to further reduce operations costs? The Operations Technology Facility (OTF) at the Johnson Space Center (JSC) is collaborating with Deutsches Zentrum für Luft- und Raumfahrt (DLR) to answer these very questions. The Mission Operations and Information Management Services Area (MOIMS) Spacecraft Monitor and Control (SM&C) Working Group within the Consultative Committee for Space Data Systems (CCSDS) is developing the Mission Operations standards to address this problem space. The set of proposed standards presents a service oriented architecture to increase the level of interoperability among space agencies. The OTF and DLR are developing independent implementations of the standards as part of an interoperability prototype. This prototype will address three key components: validation of the SM&C Mission Operations protocol, exploration of the Object Management Group (OMG) Data Distribution Service (DDS), and the incorporation of legacy systems in a SOA. The OTF will implement the service providers described in the SM&C Mission Operations standards to create a portal for interaction with a spacecraft simulator. DLR will implement the service consumers to perform the monitoring and control of the spacecraft. The specifications insulate the applications from the underlying transport layer. We will gain experience with a DDS transport layer as we delegate responsibility to the middleware and explore transport bridges to connect disparate middleware products. A SOA facilitates the reuse of software components.
The prototype will leverage the capabilities of existing legacy systems. Various custom applications and middleware solutions will be combined into one system providing the illusion of a set of homogenous services. This paper will document our journey as we implement the interoperability prototype. The team consists of software engineers with experience on the current command, telemetry and messaging systems that support the International Space Station (ISS) and Space Shuttle programs. Emphasis will be on the objectives, results and potential cost saving benefits.

  1. Nurses' use of mobile instant messaging applications: A uses and gratifications perspective.

    PubMed

    Bautista, John Robert; Lin, Trisha T C

    2017-10-01

    To explore how and why mobile instant messaging applications are used by Filipino nurses as part of their work. Guided by the uses and gratifications theory, in-depth interviews with 20 staff nurses working in 9 hospitals (ie, 4 private and 5 public hospitals) in the Philippines were conducted in July 2015. Interview data were analysed through a phenomenological perspective to thematic analysis. Results show that mobile instant messaging applications such as Facebook Messenger and Viber were mostly used by staff nurses and were accessed using their own smartphones. Thematic analysis indicates that they were used to meet staff nurses' need for information exchange, socialization, and catharsis. Moreover, user interactions vary depending on the members within a chat group. For instance, communication via mobile instant messaging applications is more formal when superiors are included in a chat group. In general, the results show that mobile instant messaging applications are routinely used by Filipino staff nurses not only for clinical purposes (ie, information exchange) but also for non-clinical purposes (ie, socialization and catharsis). This paper ends with several practical and theoretical implications, including future research directions. © 2017 John Wiley & Sons Australia, Ltd.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perry, Marcia

    The IRCD is an IRC server that was originally distributed by the IRCD Hybrid developer team for use as a server for IRC messaging over the public Internet. By supporting the IRC protocol defined in the IRC RFC, IRCD allows users to create and join channels for group or one-to-one text-based instant messaging. It stores information about channels (e.g., whether a channel is public, secret, or invite-only, the topic set, and its membership) and users (who is online and what channels they are members of). It receives messages addressed to a specific user or channel and forwards them to the targeted destination. Since server-to-server communication is also supported, these targeted destinations may be connected to different IRC servers. Messages are exchanged over TCP connections that remain open between the client and the server. The IRCD is being used within the Pervasive Computing Collaboration Environment (PCCE) as the 'chat server' for message exchange over public and private channels. After an LBNLSecureMessaging (PCCE chat) client has been authenticated, the client connects to IRCD with its assigned nickname or 'nick.' The client can then create or join channels for group discussions or one-to-one conversations. These channels can have an initial mode of public or invite-only, and the mode may be changed after creation. If a channel is public, anyone online can join the discussion; if a channel is invite-only, users can only join if existing members of the channel explicitly invite them. Users can be invited to any type of channel, and users may be members of multiple channels simultaneously. For use with the PCCE environment, the IRCD application (which was written in C) was ported to Linux and has been tested and installed under Red Hat Linux 7.2. The source code was also modified with SSL so that all messages exchanged over the network are encrypted. This modified IRC server also verifies with an authentication server that the client is who he or she claims to be and that this user is authorized to gain access to the IRCD.
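
    The channel and messaging operations described above map onto single CRLF-terminated text lines over the persistent TCP connection. The sketch below illustrates that client-side framing in Python; the helper names (`build_command`, `parse_privmsg`) are illustrative only and are not part of the IRCD codebase.

```python
# Sketch of client-side IRC (RFC 1459) message framing as used by an
# IRCD-style chat server: each command is one CRLF-terminated text line
# sent over the open TCP stream.

def build_command(verb, *params):
    """Serialize an IRC command line; a final parameter containing spaces
    is sent as a ':'-prefixed trailing parameter."""
    parts = [verb.upper(), *params[:-1]]
    if params:
        last = params[-1]
        parts.append(":" + last if (" " in last or not last) else last)
    return " ".join(parts) + "\r\n"

def parse_privmsg(line):
    """Extract (sender_nick, target, text) from a server-relayed PRIVMSG line."""
    prefix, verb, rest = line.rstrip("\r\n").split(" ", 2)
    assert verb == "PRIVMSG"
    target, text = rest.split(" :", 1)
    return prefix.lstrip(":").split("!", 1)[0], target, text

# Registering a nick, joining a channel, and sending a group message:
print(build_command("NICK", "alice"), end="")          # NICK alice
print(build_command("JOIN", "#pcce"), end="")          # JOIN #pcce
print(build_command("PRIVMSG", "#pcce", "hello all"), end="")
print(parse_privmsg(":bob!u@host PRIVMSG #pcce :hi alice\r\n"))
```

    In the PCCE deployment described above, these same lines would simply travel over an SSL-wrapped socket instead of a plain TCP connection.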

  3. Understanding the "Other Side": Intercultural learning in a Spanish-English E-Mail Exchange.

    ERIC Educational Resources Information Center

    O'Dowd, Robert

    2003-01-01

    Reviews recent research on intercultural learning and reports on a yearlong e-mail exchange between Spanish and English second year university students. Identifies key characteristics of e-mail exchanges that helped to develop learners' intercultural communicative competence. Outlines elements of e-mail messages that may enable students to develop…

  4. Multiphase complete exchange on Paragon, SP2 and CS-2

    NASA Technical Reports Server (NTRS)

    Bokhari, Shahid H.

    1995-01-01

    The overhead of interprocessor communication is a major factor in limiting the performance of parallel computer systems. The complete exchange is the severest communication pattern in that it requires each processor to send a distinct message to every other processor. This pattern is at the heart of many important parallel applications. On hypercubes, multiphase complete exchange has been developed and shown to provide optimal performance over varying message sizes. Most commercial multicomputer systems do not have a hypercube interconnect. Instead, they use special-purpose hardware and dedicated communication processors to achieve very high performance communication and can be made to emulate the hypercube quite well. Multiphase complete exchange has been implemented on three contemporary parallel architectures: the Intel Paragon, IBM SP2 and Meiko CS-2. The essential features of these machines are described and their basic interprocessor communication overheads are discussed. The performance of multiphase complete exchange is evaluated on each machine. It is shown that the theoretical ideas developed for hypercubes are also applicable in practice to these machines and that multiphase complete exchange can lead to major savings in execution time over traditional solutions.
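
    The pairwise-exchange schedule that underlies complete exchange on a hypercube can be sketched in a few lines. The code below shows only the base schedule, in which node i exchanges with node i XOR s at step s; multiphase variants (the subject of the abstract) combine such steps to trade start-up latency against bandwidth, which is not modeled here.

```python
# Pairwise-exchange schedule for complete exchange on a p = 2^d node
# hypercube: in step s (1 <= s < p), node i exchanges with node i XOR s.
# After p-1 steps every node has exchanged a distinct message with every
# other node exactly once.

def exchange_schedule(p):
    """Return a list of steps; each step is a set of disjoint node pairs."""
    assert p & (p - 1) == 0, "p must be a power of two"
    steps = []
    for s in range(1, p):
        steps.append({(i, i ^ s) for i in range(p) if i < i ^ s})
    return steps

schedule = exchange_schedule(8)
all_pairs = [pair for step in schedule for pair in step]
# 7 steps; each step pairs all 8 nodes into 4 disjoint couples,
# covering all 28 node pairs exactly once.
print(len(schedule), len(all_pairs))
```

    The pair (a, b) appears only in the step where s = a XOR b, which is why the schedule is contention-free on a hypercube interconnect.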

  5. Improving Groundwater Data Interoperability: Results of the Second OGC Groundwater Interoperability Experiment

    NASA Astrophysics Data System (ADS)

    Lucido, J. M.; Booth, N.

    2014-12-01

    Interoperable sharing of groundwater data across international borders is essential for the proper management of global water resources. However, storage and management of groundwater data are often distributed across many agencies or organizations. Furthermore, these data may be represented in disparate proprietary formats, posing a significant challenge for integration. For this reason, standard data models are required to achieve interoperability across geographical and political boundaries. The GroundWater Markup Language 1.0 (GWML1) was developed in 2010 as an extension of the Geography Markup Language (GML) in order to support groundwater data exchange within Spatial Data Infrastructures (SDI). In 2013, development of GWML2 was initiated under the sponsorship of the Open Geospatial Consortium (OGC) for intended adoption by the international community as the authoritative standard for the transfer of groundwater feature data, including data about water wells, aquifers, and related entities. GWML2 harmonizes GWML1 and the EU's INSPIRE models related to geology and hydrogeology. Additionally, an interoperability experiment was initiated to test the model for commercial, technical, scientific, and policy use cases. The scientific use case focuses on the delivery of data required as input to computational flow modeling software used to determine the flow of groundwater within a particular aquifer system. It involves the delivery of properties associated with hydrogeologic units, observations related to those units, and information about the related aquifers. To test this use case, web services are being implemented using GWML2 and WaterML2, which is the authoritative standard for water time series observations, in order to serve USGS water well and hydrogeologic data via standard OGC protocols. Furthermore, integration of these data into a computational groundwater flow model will be tested.
This submission will present the GWML2 information model and results of an interoperability experiment with a particular emphasis on the scientific use case.

  6. Enhancing security and improving interoperability in healthcare information systems.

    PubMed

    Gritzalis, D A

    1998-01-01

    Security is a key issue in healthcare information systems, since most aspects of security become of considerable or even critical importance when handling healthcare information. In addition, the intense need for information exchange has revealed interoperability of systems and applications as another key issue. Standardization can play an important role in addressing both of these issues. In this paper, relevant standardization activities are briefly presented, and existing and emerging healthcare information security standards are identified and critically analysed. The analysis is based on a framework which has been developed for this purpose. The main results of this paper are the identification of gaps and inconsistencies in current standardization, the description of conflicts between standards and legislation, and the analysis of the implications of these standards for user organizations.

  7. Interoperability science cases with the CDPP tools

    NASA Astrophysics Data System (ADS)

    Nathanaël, J.; Cecconi, B.; André, N.; Bouchemit, M.; Gangloff, M.; Budnik, E.; Jacquey, C.; Pitout, F.; Durand, J.; Rouillard, A.; Lavraud, B.; Genot, V. N.; Popescu, D.; Beigbeder, L.; Toniutti, J. P.; Caussarieu, S.

    2017-12-01

    Data exchange protocols are never as efficient as when they are invisible to the end user, who is then able to discover data, cross-compare observations and modeled data, and finally perform in-depth analysis. Over the years these protocols, including SAMP from the IVOA and EPN-TAP from the Europlanet 2020 RI community, backed by standard web services, have been deployed in tools designed by the French Centre de Données de la Physique des Plasmas (CDPP), including AMDA, the Propagation Tool, 3DView, ... . This presentation will focus on science cases which show the capability of interoperability in the planetary and heliophysics contexts, involving both CDPP and companion tools. Europlanet 2020 RI has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 654208.

  8. Moving Beyond the 10,000 Ways That Don't Work

    NASA Astrophysics Data System (ADS)

    Bermudez, L. E.; Arctur, D. K.; Rueda, C.

    2009-12-01

    From his research in developing light bulb filaments, Thomas Edison provided us with a good lesson for advancing any venture. He said, "I have not failed, I've just found 10,000 ways that won't work." Advancing data and access interoperability is one of those ventures that is difficult to achieve because of the differences among the participating communities. Even within the marine domain, different communities exist, and with them different technologies (formats and protocols) to publish data and their descriptions, and different vocabularies to name things (e.g. parameters, sensor types). Simplifying the heterogeneity of technologies is accomplished not only by adopting standards, but by creating profiles and advancing tools that use those standards. In some cases, standards are advanced by building from existing tools. But what is the best strategy? Edison could provide us a hint. Prototypes and test beds are essential to achieve interoperability among geospatial communities. The Open Geospatial Consortium (OGC) calls them interoperability experiments. The World Wide Web Consortium (W3C) calls them incubator projects. Prototypes help test and refine specifications. The Marine Metadata Interoperability (MMI) Initiative, which is advancing marine data integration and re-use by promoting community solutions, understood this strategy and started an interoperability demonstration with the SURA Coastal Ocean Observing and Prediction (SCOOP) program. This interoperability demonstration transformed into the OGC Ocean Science Interoperability Experiment (Oceans IE). The Oceans IE brings together the ocean-observing community to advance interoperability of ocean observing systems by using OGC standards. The Oceans IE Phase I investigated the use of the OGC Web Feature Service (WFS) and OGC Sensor Observation Service (SOS) standards for representing and exchanging point data records from fixed in-situ marine platforms.
The Oceans IE Phase I produced an engineering best practices report, advanced reference implementations, and submitted various change requests that are now being considered by the OGC SOS working group. Building on Phase I, and with a focus on semantically-enabled services, Oceans IE Phase II will continue the use and improvement of OGC specifications in the marine community. We will present the lessons learned and in particular the strategy of experimenting with technologies to advance standards to publish data in marine communities, which could also help advance interoperability in other geospatial communities. We will also discuss the growing collaborations among ocean-observing standards organizations that will bring about the institutional acceptance needed for these technologies and practices to gain traction globally.

  9. Three-pass protocol scheme for bitmap image security by using vernam cipher algorithm

    NASA Astrophysics Data System (ADS)

    Rachmawati, D.; Budiman, M. A.; Aulya, L.

    2018-02-01

    Confidentiality, integrity, and efficiency are crucial aspects of data security. Among digital data, image data is especially prone to abuses such as duplication and modification. There are several data security techniques, one of which is cryptography. The security of the Vernam cipher cryptography algorithm depends heavily on the key exchange process: if the key is leaked, the security of the algorithm collapses. Therefore, a method that minimizes key leakage during the exchange of messages is required. The method used here is known as the Three-Pass Protocol. This protocol enables the message delivery process without a key exchange, so messages can reach the receiver safely without fear of key leakage. The system is built using the Java programming language. The materials used for system testing are images of size 200×200, 300×300, 500×500, 800×800 and 1000×1000 pixels. The experiments showed that the Vernam cipher algorithm in the Three-Pass Protocol scheme could restore the original image.
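
    The key-free exchange works because XOR encryption commutes: the two parties' encryption layers can be removed in either order. A minimal sketch of the three passes (variable names are illustrative; the paper's system is in Java and operates on bitmap pixels):

```python
import os

# Three-Pass Protocol with a Vernam (XOR) cipher: each party keeps a
# private key, no key is ever transmitted, and XOR's commutativity lets
# the encryption layers be removed in any order.

def xor(data, key):
    return bytes(a ^ b for a, b in zip(data, key))

message = b"bitmap pixel data"
ka = os.urandom(len(message))   # sender's private key, never sent
kb = os.urandom(len(message))   # receiver's private key, never sent

pass1 = xor(message, ka)        # sender -> receiver: E_ka(m)
pass2 = xor(pass1, kb)          # receiver -> sender: E_kb(E_ka(m))
pass3 = xor(pass2, ka)          # sender -> receiver: ka removed, E_kb(m)
recovered = xor(pass3, kb)      # receiver removes kb
print(recovered)                # b'bitmap pixel data'
```

    A caution not raised in the abstract: with XOR, the same commutativity is also a weakness, since an eavesdropper who captures all three passes can XOR them together to recover the message. Practical three-pass protocols therefore use commutative ciphers with more structure, such as Shamir's exponentiation-based scheme; the sketch above only illustrates the message flow.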

  10. 78 FR 60947 - Self-Regulatory Organizations; C2 Options Exchange, Incorporated; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-02

    ... Rule Change Relating to Message Types, Connectivity and Bandwidth Allowance September 26, 2013... definitions, practices and requirements related to System connectivity, message types and bandwidth allowance... types and bandwidth allowance to promote transparency and maintain clarity in the rules. Specifically...

  11. Voice-Based Technology for Parent Involvement: Results and Effects.

    ERIC Educational Resources Information Center

    Bauch, Jerold P.

    A study was conducted to implement and evaluate the Transparent School Model for improving parent involvement in nine Indiana schools. The Model uses computer-based voice messaging to exchange information between teachers and parents. Teachers record a brief message for parents that describes what was taught, special learning events, homework…

  12. College Students' Perceptions of Short Message Service-Supported Collaborative Learning

    ERIC Educational Resources Information Center

    Zamani-Miandashti, Naser; Ataei, Pouria

    2015-01-01

    Interaction is a major success factor that affects collaborative learning. This study examined the perceptions of college students about short message service (SMS) supported collaborative learning. Seventy-five BSc students from three classes were asked to cooperate on group assignments. The participants used their mobile phones to exchange text…

  13. Patient Privacy, Consent, and Identity Management in Health Information Exchange

    PubMed Central

    Hosek, Susan D.; Straus, Susan G.

    2013-01-01

    Abstract The Military Health System (MHS) and the Veterans Health Administration (VHA) have been among the nation's leaders in health information technology (IT), including the development of health IT systems and electronic health records that summarize patients' care from multiple providers. Health IT interoperability within MHS and across MHS partners, including VHA, is one of ten goals in the current MHS Strategic Plan. As a step toward achieving improved interoperability, the MHS is seeking to develop a research roadmap to better coordinate health IT research efforts, address IT capability gaps, and reduce programmatic risk for its enterprise projects. This article contributes to that effort by identifying gaps in research, policy, and practice involving patient privacy, consent, and identity management that need to be addressed to bring about improved quality and efficiency of care through health information exchange. Major challenges include (1) designing a meaningful patient consent procedure, (2) recording patients' consent preferences and designing procedures to implement restrictions on disclosures of protected health information, and (3) advancing knowledge regarding the best technical approaches to performing patient identity matches and how best to monitor results over time. Using a sociotechnical framework, this article suggests steps for overcoming these challenges and topics for future research. PMID:28083296

  14. PACS viewer interoperability for teleconsultation based on DICOM

    NASA Astrophysics Data System (ADS)

    Salant, Eliot; Shani, Uri

    2000-05-01

    Real-time teleconsultation in radiology enables physicians to perform same-time consultation between remote peers, based on medical images. Since digital medical images are commonly viewed on PACS workstations, it is possible to use one of several methods for remote sharing of the computer screen; for instance, software products such as Microsoft NetMeeting or IBM SameTime can be used. However, the amount of image data transmitted can be very high, since even minute changes to an image window/level require re-transmitting the entire image again and again, which is too inefficient. When the problem is restricted to the same hardware and software from a single vendor, it is easier to develop a solution that employs a proprietary, specialized protocol to coordinate the visualization process. We developed such a solution, which demonstrated an excellent performance advantage by transmitting only the graphical events between the machines rather than the image pixels. However, our solution did not inter-operate with other viewers: it worked only on X11/Motif systems, and only between compatible versions of the same viewer application. Our purpose in this paper is to enable inter-operability between viewers on different platforms and from different vendors. We distinguish three parts: session control, audiovisual (multimedia) data exchange, and medical image sharing. We deal only with the third component, assuming the use of existing standards for the first two. After a session between two or more parties is established, and optional audiovisual data channels are set up, the medical consultation is considered as the coordinated exchange of medical image contents. Some requirements for the content exchange protocol: in the first stage, the parties negotiate the actual set of capabilities to be used during the consultation, using a formal description of these capabilities. The capabilities that one station lacks relative to the other (such as specific image processing algorithms) can be 'borrowed.' In the second stage, when interaction starts, the protocol should assume that the graphical user interfaces of the stations may differ, as may working procedures. During the consultation, data is exchanged based on DICOM for the data model of medical image folders and the data format of image objects.

  15. Interoperability of medical device information and the clinical applications: an HL7 RMIM based on the ISO/IEEE 11073 DIM.

    PubMed

    Yuksel, Mustafa; Dogac, Asuman

    2011-07-01

    Medical devices are essential to the practice of modern healthcare services. Their benefits will increase if clinical software applications can seamlessly acquire the medical device data. The need to represent medical device observations in a format that can be consumed by clinical applications has already been recognized by the industry. Yet, the solutions proposed involve bilateral mappings from the ISO/IEEE 11073 Domain Information Model (DIM) to specific message or document standards. Considering that there are many different types of clinical applications, such as electronic health record and personal health record systems, clinical workflows, and clinical decision support systems, each conforming to different standard interfaces, detailing a mapping mechanism for every one of them introduces significant work and thus limits the potential health benefits of medical devices. In this paper, to facilitate the interoperability of clinical applications and medical device data, we use the ISO/IEEE 11073 DIM to derive an HL7 v3 Refined Message Information Model (RMIM) of the medical device domain from the HL7 v3 Reference Information Model (RIM). This makes it possible to trace the medical device data back to a standard common denominator, that is, the HL7 v3 RIM from which all the other medical domains under HL7 v3 are derived. Hence, once the medical device data are obtained in the RMIM format, they can easily be transformed into HL7-based standard interfaces through XML transformations, because these interfaces all have their building blocks from the same RIM. To demonstrate this, we provide the mappings from the developed RMIM to some of the widely used HL7 v3-based standard interfaces.
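
    To make the idea of rendering a device observation as HL7-v3-style XML concrete, here is a deliberately simplified sketch. The element and attribute names below are generic stand-ins chosen for illustration (the `classCode`/`moodCode` attribute pattern is common in HL7 v3), not the actual RMIM schema developed in the paper.

```python
import xml.etree.ElementTree as ET

# Illustrative only: serialize a device observation (e.g. from an
# ISO/IEEE 11073 pulse oximeter) as a generic HL7-v3-style XML
# observation element.

def observation_to_xml(code, value, unit):
    obs = ET.Element("observation", classCode="OBS", moodCode="EVN")
    ET.SubElement(obs, "code", code=code, codeSystem="MDC")
    ET.SubElement(obs, "value", value=str(value), unit=unit)
    return ET.tostring(obs, encoding="unicode")

doc = observation_to_xml("MDC_PULS_OXIM_SAT_O2", 97, "%")
print(doc)
```

    Because every HL7 v3 interface shares RIM building blocks of this shape, a transformation of this kind is (per the paper's argument) a purely structural XML rewrite rather than a semantic re-mapping.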

  16. A lightweight messaging-based distributed processing and workflow execution framework for real-time and big data analysis

    NASA Astrophysics Data System (ADS)

    Laban, Shaban; El-Desouky, Aly

    2014-05-01

    To achieve rapid, simple and reliable parallel processing of different types of tasks and big data processing on any compute cluster, a lightweight messaging-based distributed application processing and workflow execution framework model is proposed. The framework is based on Apache ActiveMQ and the Simple (or Streaming) Text Oriented Message Protocol (STOMP). ActiveMQ, a popular and powerful open-source persistent messaging and integration patterns server with scheduler capabilities, acts as the message broker in the framework. STOMP provides an interoperable wire format that allows framework programs to talk and interact with each other and with ActiveMQ easily. In order to use the message broker efficiently, a unified message and topic naming pattern is utilized. Only three Python programs and a simple library, used to unify and simplify the use of ActiveMQ and the STOMP protocol, are needed to use the framework. A watchdog program is used to monitor, remove, add, start and stop any machine and/or its different tasks when necessary. For every machine, a single dedicated zookeeper program starts the different functions or tasks (stompShell programs) needed to execute the user's required workflow. The stompShell instances execute workflow jobs based on received messages. A well-defined, simple and flexible message structure, based on JavaScript Object Notation (JSON), is used to build complex workflow systems; JSON is also used for configuration and for communication between machines and programs. The framework is platform independent, and although it is built in Python, the actual workflow programs or jobs can be implemented in any programming language. The generic framework can be used in small national data centres for processing seismological and radionuclide data received from the International Data Centre (IDC) of the Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). It is also possible to extend the use of the framework to monitoring the IDC pipeline. The detailed design, implementation, conclusions and future work of the proposed framework will be presented.
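
    The abstract's two conventions, a unified topic naming pattern and a JSON job message consumed by stompShell instances, can be sketched as follows. The field and function names here are assumptions made for illustration, not the framework's actual schema.

```python
import json

# Hypothetical JSON job message and topic naming in the spirit of the
# framework described above: one broker destination per machine/task
# pair, and a self-describing job payload dispatched to a stompShell.

def make_job_message(workflow, task, command, depends_on=()):
    return json.dumps({
        "workflow": workflow,          # workflow this job belongs to
        "task": task,                  # task name within the workflow
        "command": command,            # shell command a stompShell runs
        "depends_on": list(depends_on) # tasks that must finish first
    })

def topic_for(machine, task):
    """Unified naming pattern: one queue per machine/task pair."""
    return f"/queue/{machine}.{task}"

msg = make_job_message("seismic-daily", "filter", "python filter.py",
                       depends_on=["ingest"])
print(topic_for("node01", "filter"))   # /queue/node01.filter
print(json.loads(msg)["task"])         # filter
```

    In a real deployment the message body would be sent to the ActiveMQ broker over STOMP (e.g. with a SEND frame addressed to that destination); only the message construction is shown here.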

  17. An interoperability experiment for sharing hydrological rating tables

    NASA Astrophysics Data System (ADS)

    Lemon, D.; Taylor, P.; Sheahan, P.

    2013-12-01

    The increasing demand on freshwater resources is requiring authorities to produce more accurate and timely estimates of their available water. Calculation of continuous time series of river discharge and storage volumes generally requires rating tables. These approximate the relationship between two phenomena, such as river level and discharge, and allow us to produce continuous estimates of a phenomenon that may be impractical or impossible to measure directly. Standardised information models and access mechanisms for rating tables are required to support sharing and exchange of water flow data. An Interoperability Experiment (IE) is underway to test an information model that describes rating tables, the observations made to build these ratings, and river cross-section data. The IE is an initiative of the joint World Meteorological Organisation/Open Geospatial Consortium Hydrology Domain Working Group (HydroDWG), and the model will be published as WaterML2.0 part 2. Interoperability Experiments are low-overhead, multiple-member projects run under the OGC's interoperability program to test existing and emerging standards. The HydroDWG has previously run IEs to test early versions of OGC WaterML2.0 part 1 (time series). This IE focuses on two key exchange scenarios: (1) sharing rating tables and gauging observations between water agencies: through the use of standard OGC web services, rating tables and associated data will be made available from water agencies, and the (Australian) Bureau of Meteorology will retrieve rating tables on demand from water authorities, allowing the Bureau to run conversions of data within its own systems; and (2) exposing rating tables and gaugings for online analysis and educational purposes: a web client will be developed to enable exploration and visualization of rating tables, gaugings and related metadata for monitoring points. The client gives a quick view into available rating tables, their periods of applicability and the standard deviation of observations against the relationship. An example of this client running can be seen at the link provided. The results of the IE will form the basis for the standardisation of WaterML2.0 part 2. The use of the standard will lead to increased transparency and accessibility of rating tables, while also improving general understanding of this important hydrological concept.
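
    A rating table, once exchanged, is applied by interpolating between its points to convert a measured stage into a discharge estimate. The sketch below uses log-log linear interpolation, a common hydrological convention (an assumption for illustration; the WaterML2.0 part 2 model describes the table, not the interpolation method), and the table values are invented.

```python
import math

# Stage-discharge rating table: (stage in m, discharge in m^3/s).
# Values are illustrative, not from any real gauging station.
RATING = [(0.5, 1.2), (1.0, 6.8), (2.0, 41.0), (3.0, 118.0)]

def discharge(stage):
    """Estimate discharge for a stage via log-log linear interpolation
    between adjacent rating points."""
    for (s1, q1), (s2, q2) in zip(RATING, RATING[1:]):
        if s1 <= stage <= s2:
            f = (math.log(stage) - math.log(s1)) / (math.log(s2) - math.log(s1))
            return math.exp(math.log(q1) + f * (math.log(q2) - math.log(q1)))
    raise ValueError("stage outside rating table range")

print(round(discharge(1.0), 1))  # 6.8 (exact table point)
```

    Shifting to a new rating table (e.g. after a flood reshapes the channel) changes only the `RATING` data, which is exactly why agencies need to exchange the tables themselves alongside the time series they were applied to.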

  18. AMS Prototyping Activities

    NASA Technical Reports Server (NTRS)

    Burleigh, Scott

    2008-01-01

    This slide presentation reviews activity around the Asynchronous Message Service (AMS) prototype. An AMS reference implementation has been available since late 2005. It is aimed at supporting message exchange both in on-board environments and over space links. The implementation incorporates all mandatory elements of the draft recommendation from July 2007: (1) the MAMS, AMS, and RAMS protocols; (2) failover, heartbeats, and resync; and (3) 'hooks' for security, but no cipher suites included in the distribution. The performance is reviewed, and a benchmark latency test over VxWorks message queues is shown as histograms of count versus microseconds per 1000-byte message.

  19. MO-AB-204-00: Interoperability in Radiation Oncology: IHE-RO Committee Update

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    You’ve experienced the frustration: vendor A’s device claims to work with vendor B’s device, but the practice doesn’t match the promise. Getting devices working together is the hidden art that Radiology and Radiation Oncology staff have to master. To assist with that difficult process, the Integrating the Healthcare Enterprise (IHE) effort was established in 1998, with the coordination of the Radiological Society of North America. IHE is a consortium of healthcare professionals and industry partners focused on improving the way computer systems interconnect and exchange information. This is done by coordinating the use of published standards like DICOM and HL7. Several clinical and operational IHE domains exist in the healthcare arena, including Radiology and Radiation Oncology. The ASTRO-sponsored IHE Radiation Oncology (IHE-RO) domain focuses on radiation-oncology-specific information exchange. This session will explore the IHE Radiology and IHE-RO processes for: soliciting new profiles; improving the way computer systems interconnect and exchange information in the healthcare enterprise; supporting interconnectivity descriptions and proof of adherence by vendors; testing and assuring the vendor solutions to connectivity problems; and including IHE profiles in RFPs for future software and hardware purchases. Learning Objectives: Understand the IHE role in improving interoperability in health care. Understand the process of profile development and implementation. Understand how vendors prove adherence to IHE-RO profiles. S. Hadley, ASTRO Supported Activity.

  20. Communication Interactions: It Takes Two

    ERIC Educational Resources Information Center

    Stremel, Kathleen

    2008-01-01

    Communication is the exchange of a message between two or more people. Every one communicates in many different ways and for many different reasons. Communication can be expressive or receptive. Children who are deaf-blind may never learn to talk. However, they can express themselves to you. They can receive the messages you send them. Through…

  1. Interoperability in healthcare: major challenges in the creation of the enterprise environment

    NASA Astrophysics Data System (ADS)

    Lindsköld, L.; Wintell, M.; Lundberg, N.

    2009-02-01

    There is today a lack of interoperability in healthcare although the need for it is obvious. A new healthcare enterprise environment has been deployed for secure healthcare interoperability in the Western Region in Sweden (WRS). This paper is an empirical overview of the new enterprise environment supporting regional shared and transparent radiology domain information in the WRS. The enterprise environment comprises 17 radiology departments serving 1.5 million inhabitants, using different RIS and PACS in a joint work-oriented network, plus additional cardiology, dentistry and clinical physiology departments. More than 160 terabytes of information are stored in the enterprise repository. Interoperability is developed according to the IHE mission, i.e. applying standards such as Digital Imaging and Communication in Medicine (DICOM) and Health Level 7 (HL7) to address specific clinical communication needs and support optimal patient care. The entire enterprise environment is implemented and used daily in WRS. The central prerequisites in the development of the enterprise environment in the western region of Sweden were: 1) information harmonization, 2) reuse of standardized messages, e.g. HL7 v2.x and v3.x, 3) development of a holistic information domain including both text and images, and 4) creation of a continuous and dynamic update functionality. The central challenges in this project were: 1) the many different vendors acting in the region and the negotiations with them to apply communication roles/profiles such as HL7 (CDA, CCR), DICOM, and XML, 2) the question of who owns the data, and 3) incomplete technical standards. This study concludes that to create a workflow that runs within an enterprise environment, a number of central prerequisites and challenges need to be in place. This calls for negotiations at international, national and regional levels with standardization organizations, vendors, health management and health personnel.

  2. Patients in transition--improving hospital-home care collaboration through electronic messaging: providers' perspectives.

    PubMed

    Melby, Line; Brattheim, Berit J; Hellesø, Ragnhild

    2015-12-01

    To explore how the use of electronic messages support hospital and community care nurses' collaboration and communication concerning patients' admittance to and discharges from hospitals. Nurses in hospitals and in community care play a crucial role in the transfer of patients between the home and the hospital. Several studies have shown that transition situations are challenging due to a lack of communication and information exchange. Information and communication technologies may support nurses' work in these transition situations. An electronic message system was introduced in Norway to support patient transitions across the health care sector. A descriptive, qualitative interview study was conducted. One hospital and three adjacent communities were included in the study. We conducted semi-structured interviews with hospital nurses and community care nurses. In total, 41 persons were included in the study. The analysis stemmed from three main topics related to the aims of e-messaging: efficiency, quality and safety. These were further divided into sub-themes. All informants agreed that electronic messaging is more efficient, i.e. less time-consuming than previous means of communication. The shift from predominantly oral communication to writing electronic messages has brought attention to the content of the information exchanged, thereby leading to more conscious communication. Electronic messaging enables improved information security, thereby enhancing patient safety, but this depends on nurses using the system as intended. Nurses consider electronic messaging to be a useful tool for communication and collaboration in patient transitions. Patient transitions are demanding situations both for patients and for the nurses who facilitate the transitions. The introduction of information and communication technologies can support nurses' work in the transition situations, and this is likely to benefit the patients. © 2015 John Wiley & Sons Ltd.

  3. Report of Defense Science Board Task Force on Industry-to-Industry International Armaments Cooperation. Phase II. Japan

    DTIC Science & Technology

    1984-06-01

    TEMPERATURE MAT’LS IMAGE RECOGNITION ROCKET PROPULSION SPEECH RECOGNITION/TRANSLATION COMPUTER-AIDED DESIGN ARTIFICIAL INTELLIGENCE PRODUCTION TECHNOLOGY...planning, intelligence exchange, and logistics. While not called out in the Guidelines, any further standardization in equipments and interoperability...COST AND TIME THAN DEVELOPING THEM -ESTABLISHMENT OF PRODUCTIVE LONG-TERM BUSINESS RELATIONSHIPS WITH JAPANESE COMPANIES * PROBLEM -POSSIBILITY OF

  4. Command and Control for Joint Air Operations

    DTIC Science & Technology

    2010-01-12

    systems, to include collaborative air planning tools such as the theater battle management core system (TBMCS). Operational level air planning occurs in...sight communications and data exchange equipment in order to respond to joint force requirements. For example, the TBMCS is often used. The use of ATO...generation and dissemination software portions of TBMCS has been standardized. This ATO feature allows the JAOC to be interoperable with other

  5. Open data models for smart health interconnected applications: the example of openEHR.

    PubMed

    Demski, Hans; Garde, Sebastian; Hildebrand, Claudia

    2016-10-22

    Smart Health is known as a concept that enhances networking, intelligent data processing and combining patient data with other parameters. Open data models can play an important role in creating a framework for providing interoperable data services that support the development of innovative Smart Health applications profiting from data fusion and sharing. This article describes a model-driven engineering approach based on standardized clinical information models and explores its application for the development of interoperable electronic health record systems. The following possible model-driven procedures were considered: provision of data schemes for data exchange, automated generation of artefacts for application development and native platforms that directly execute the models. The applicability of the approach in practice was examined using the openEHR framework as an example. A comprehensive infrastructure for model-driven engineering of electronic health records is presented using the example of the openEHR framework. It is shown that data schema definitions to be used in common practice software development processes can be derived from domain models. The capabilities for automatic creation of implementation artefacts (e.g., data entry forms) are demonstrated. Complementary programming libraries and frameworks that foster the use of open data models are introduced. Several compatible health data platforms are listed. They provide standard based interfaces for interconnecting with further applications. Open data models help build a framework for interoperable data services that support the development of innovative Smart Health applications. Related tools for model-driven application development foster semantic interoperability and interconnected innovative applications.

  6. CCR+: Metadata Based Extended Personal Health Record Data Model Interoperable with the ASTM CCR Standard

    PubMed Central

    Park, Yu Rang; Yoon, Young Jo; Jang, Tae Hun; Seo, Hwa Jeong

    2014-01-01

    Objectives Extension of the standard model while retaining compliance with it is a challenging issue because there is currently no method for semantically or syntactically verifying an extended data model. A metadata-based extended model, named CCR+, was designed and implemented to achieve interoperability between standard and extended models. Methods Furthermore, a multilayered validation method was devised to validate the standard and extended models. The American Society for Testing and Materials (ASTM) Continuity of Care Record (CCR) standard was selected to evaluate the CCR+ model; two CCR and one CCR+ XML files were evaluated. Results In total, 188 metadata were extracted from the ASTM CCR standard; these metadata are semantically interconnected and registered in the metadata registry. An extended-data-model-specific validation file was generated from these metadata. This file can be used in a smartphone application (Health Avatar CCR+) as a part of a multilayered validation. The new CCR+ model was successfully evaluated via a patient-centric exchange scenario involving multiple hospitals, with the results supporting both syntactic and semantic interoperability between the standard CCR and the extended CCR+ model. Conclusions A feasible method for delivering an extended model that complies with the standard model is presented herein. There is a great need to extend static standard models such as the ASTM CCR in various domains: the methods presented here represent an important reference for achieving interoperability between standard and extended models. PMID:24627817

  7. Mobile Assisted Security in Wireless Sensor Networks

    DTIC Science & Technology

    2015-08-03

    server from Google’s DNS, Chromecast and the content server does the 3-way TCP Handshake which is followed by Client Hello and Server Hello TLS messages...utilized TLS v1.2, except the NTP servers and Google’s DNS server. In the TLS v1.2 handshake, the client and server exchange Client Hello and Server Hello messages in order. In the Client Hello message, the client offers a list of Cipher Suites that it supports. Each Cipher Suite defines the key exchange algorithm
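
    The cipher-suite list carried in a Client Hello can be inspected locally. As a minimal sketch (not part of the study above), Python's ssl module exposes the suites a default client context would offer; the exact list depends on the local OpenSSL build:

    ```python
    import ssl

    # Enumerate the cipher suites a default client context would offer in
    # its Client Hello; each entry also names the protocol version it is for.
    ctx = ssl.create_default_context()
    for suite in ctx.get_ciphers():
        print(suite["name"], "-", suite["protocol"])
    ```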

  8. Virtualization of open-source secure web services to support data exchange in a pediatric critical care research network

    PubMed Central

    Sward, Katherine A; Newth, Christopher JL; Khemani, Robinder G; Cryer, Martin E; Thelen, Julie L; Enriquez, Rene; Shaoyu, Su; Pollack, Murray M; Harrison, Rick E; Meert, Kathleen L; Berg, Robert A; Wessel, David L; Shanley, Thomas P; Dalton, Heidi; Carcillo, Joseph; Jenkins, Tammara L; Dean, J Michael

    2015-01-01

    Objectives To examine the feasibility of deploying a virtual web service for sharing data within a research network, and to evaluate the impact on data consistency and quality. Material and Methods Virtual machines (VMs) encapsulated an open-source, semantically and syntactically interoperable secure web service infrastructure along with a shadow database. The VMs were deployed to 8 Collaborative Pediatric Critical Care Research Network Clinical Centers. Results Virtual web services could be deployed in hours. The interoperability of the web services reduced format misalignment from 56% to 1% and demonstrated that 99% of the data transferred consistently using the data dictionary, while 1% needed human curation. Conclusions Use of virtualized open-source secure web service technology could enable direct electronic abstraction of data from hospital databases for research purposes. PMID:25796596

  9. Empirical analysis of knowledge bases to support structured output in the Arden syntax.

    PubMed

    Jenders, Robert A

    2013-01-01

    Structured output has been suggested for the Arden Syntax to facilitate interoperability. Tabulate the components of WRITE statements in a corpus of medical logic modules (MLMs) in order to validate requiring structured output. WRITE statements were tabulated in 258 MLMs from 2 organizations. In a total of 351 WRITE statements, email destinations (226) predominated, and 39 orders and 40 coded output elements also were tabulated. Free-text strings predominated as the message data. Arden WRITE statements contain considerable potentially structured data now included as free text. A future, normative structured WRITE statement must address a variety of data types and destinations.

  10. Multiphase complete exchange: A theoretical analysis

    NASA Technical Reports Server (NTRS)

    Bokhari, Shahid H.

    1993-01-01

    Complete Exchange requires each of N processors to send a unique message to each of the remaining N-1 processors. For a circuit switched hypercube with N = 2^d processors, the Direct and Standard algorithms for Complete Exchange are optimal for very large and very small message sizes, respectively. For intermediate sizes, a hybrid Multiphase algorithm is better. This carries out Direct exchanges on a set of subcubes whose dimensions are a partition of the integer d. The best such algorithm for a given message size m could hitherto only be found by enumerating all partitions of d. The Multiphase algorithm is analyzed assuming a high performance communication network. It is proved that only algorithms corresponding to equipartitions of d (partitions in which the maximum and minimum elements differ by at most 1) can possibly be optimal. The run times of these algorithms plotted against m form a hull of optimality. It is proved that, although there is an exponential number of partitions, (1) the number of faces on this hull is Theta(square root of d), (2) the hull can be found in Theta(square root of d) time, and (3) once it has been found, the optimal algorithm for any given m can be found in Theta(log d) time. These results provide a very fast technique for minimizing communication overhead in many important applications, such as matrix transpose, Fast Fourier transform, and ADI.
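
    The equipartitions singled out here are cheap to enumerate: for each part count k there is exactly one partition of d whose parts differ by at most 1, so only d candidates need be checked rather than all partitions. A minimal sketch of that enumeration (illustrative, not code from the paper):

    ```python
    def equipartitions(d):
        """One equipartition of d per part count k: parts differ by at most 1."""
        result = []
        for k in range(1, d + 1):
            q, r = divmod(d, k)
            # r parts of size q + 1 and (k - r) parts of size q sum to d
            result.append([q + 1] * r + [q] * (k - r))
        return result

    # d = 6 yields only 6 equipartitions, versus 11 partitions of 6 in total.
    print(equipartitions(6))
    ```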

  11. A Multilayer Secure Biomedical Data Management System for Remotely Managing a Very Large Number of Diverse Personal Healthcare Devices.

    PubMed

    Park, KeeHyun; Lim, SeungHyeon

    2015-01-01

    In this paper, a multilayer secure biomedical data management system for managing a very large number of diverse personal health devices (PHDs) is proposed. The system has the following characteristics: the system supports international standard communication protocols to achieve interoperability. The system is integrated in the sense that both a PHD communication system and a remote PHD management system work together as a single system. Finally, the system proposed in this paper provides user/message authentication processes to securely transmit biomedical data measured by PHDs based on the concept of a biomedical signature. Some experiments, including a stress test, have been conducted to show that the system proposed/constructed in this study performs very well even when a very large number of PHDs are used. For the stress test, up to 1,200 threads are made to represent the same number of PHD agents. The loss ratio of ISO/IEEE 11073 messages in the normal system is as high as 14% when 1,200 PHD agents are connected. On the other hand, no message loss occurs in the multilayered system proposed in this study, which demonstrates the superiority of the multilayered system to the normal system with regard to heavy traffic.

  12. A Multilayer Secure Biomedical Data Management System for Remotely Managing a Very Large Number of Diverse Personal Healthcare Devices

    PubMed Central

    Lim, SeungHyeon

    2015-01-01

    In this paper, a multilayer secure biomedical data management system for managing a very large number of diverse personal health devices (PHDs) is proposed. The system has the following characteristics: the system supports international standard communication protocols to achieve interoperability. The system is integrated in the sense that both a PHD communication system and a remote PHD management system work together as a single system. Finally, the system proposed in this paper provides user/message authentication processes to securely transmit biomedical data measured by PHDs based on the concept of a biomedical signature. Some experiments, including a stress test, have been conducted to show that the system proposed/constructed in this study performs very well even when a very large number of PHDs are used. For the stress test, up to 1,200 threads are made to represent the same number of PHD agents. The loss ratio of ISO/IEEE 11073 messages in the normal system is as high as 14% when 1,200 PHD agents are connected. On the other hand, no message loss occurs in the multilayered system proposed in this study, which demonstrates the superiority of the multilayered system to the normal system with regard to heavy traffic. PMID:26247034

  13. Applying secret sharing for HIS backup exchange.

    PubMed

    Kuroda, Tomohiro; Kimura, Eizen; Matsumura, Yasushi; Yamashita, Yoshinori; Hiramatsu, Haruhiko; Kume, Naoto; Sato, Atsushi

    2013-01-01

    Securing business continuity is indispensable for hospitals to fulfill their social responsibility under disasters. Although backing up the data of the hospital information system (HIS) at multiple remote sites is a key strategy of a business continuity plan (BCP), the requirements for handling privacy-sensitive data jack up the cost of such backups. Secret sharing is a method of splitting an original secret message into pieces so that each individual piece is meaningless, while putting a sufficient number of pieces together reveals the original message. The secret sharing method allows multiple hospitals to exchange HIS backups with each other. This paper evaluated the feasibility of a commercial secret sharing solution for HIS backup through several simulations. The results show that the commercial solution is feasible for realizing a reasonable HIS backup exchange platform once a template contract between participating hospitals is ready.
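
    The splitting described above is classically realized by Shamir's threshold scheme: the secret is the constant term of a random polynomial over a prime field, shares are points on that polynomial, and any k points recover the constant term by Lagrange interpolation. A minimal sketch (illustrative only, not the commercial solution evaluated in the paper):

    ```python
    import random

    PRIME = 2**127 - 1  # field modulus; must exceed the secret value

    def split_secret(secret, n, k):
        """Split `secret` into n shares; any k of them reconstruct it."""
        coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
        shares = []
        for x in range(1, n + 1):
            y = 0
            for c in reversed(coeffs):  # Horner evaluation of the polynomial at x
                y = (y * x + c) % PRIME
            shares.append((x, y))
        return shares

    def recover_secret(shares):
        """Lagrange interpolation at x = 0 over the prime field."""
        secret = 0
        for xi, yi in shares:
            num, den = 1, 1
            for xj, _ in shares:
                if xj != xi:
                    num = (num * -xj) % PRIME
                    den = (den * (xi - xj)) % PRIME
            secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
        return secret

    shares = split_secret(123456789, n=5, k=3)
    assert recover_secret(shares[:3]) == 123456789
    ```

    Fewer than k shares reveal nothing about the secret, which is what makes distributing HIS backup fragments across multiple hospitals privacy-safe.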

  14. The development of MML (Medical Markup Language) version 3.0 as a medical document exchange format for HL7 messages.

    PubMed

    Guo, Jinqiu; Takada, Akira; Tanaka, Koji; Sato, Junzo; Suzuki, Muneou; Suzuki, Toshiaki; Nakashima, Yusei; Araki, Kenji; Yoshihara, Hiroyuki

    2004-12-01

    Medical Markup Language (MML), as a set of standards, has been developed over the last 8 years to allow the exchange of medical data between different medical information providers. MML Version 2.21 used XML as a metalanguage and was announced in 1999. In 2001, MML was updated to Version 2.3, which contained 12 modules. The latest version--Version 3.0--is based on the HL7 Clinical Document Architecture (CDA). During the development of this new version, the structure of MML Version 2.3 was analyzed, subdivided into several categories, and redefined so the information defined in MML could be described in HL7 CDA Level One. As a result of this development, it has become possible to exchange MML Version 3.0 medical documents via HL7 messages.

  15. Computer Networking for Collegial Exchange among Teachers: A Summary of Findings and Recommendations. Technical Report.

    ERIC Educational Resources Information Center

    West, Mary Maxwell; McSwiney, Eileen

    Asynchronous computer-based conferencing offers several unique capabilities as a medium. Participants can read and write messages at whatever time is convenient for them, groups can interact even though participants are geographically separated, and messages are available to readers almost instantly. Because the medium has served for over a decade…

  16. Use of an Electronic Discussion Group for High School Publications Advisers: A Descriptive Pilot Study.

    ERIC Educational Resources Information Center

    Blick, Eddie

    This study aimed to catalog the nature of written message exchanges on a network computer bulletin board, HSJOURN, which caters mainly to high school journalism teachers and publications advisers. The study analyzed the content of messages between December 1993 and January 1995 and cataloged them in the following categories: announcements;…

  17. Young People's Everyday Literacies: The Language Features of Instant Messaging

    ERIC Educational Resources Information Center

    Haas, Christina; Takayoshi, Pamela

    2011-01-01

    In this article, we examine writing in the context of new communication technologies as a kind of everyday literacy. Using an inductive approach developed from grounded theory, we analyzed a 32,000-word corpus of college students' Instant Messaging (IM) exchanges. Through our analysis of this corpus, we identify a fifteen-item taxonomy of IM…

  18. A Symposium in Rhetoric.

    ERIC Educational Resources Information Center

    Tanner, William E., Ed.; And Others

    The six articles in this collection explore the following topics relating to rhetoric: the distinction between the truth value and the exchange value of a message and between the signifier and the signified in a message; the rhetoric of silence in modern fiction; the way in which readers are influenced not only by what writers say but by how they…

  19. Text messaging to support a perinatal collaborative care model for depression: A multi-methods inquiry.

    PubMed

    Bhat, Amritha; Mao, Johnny; Unützer, Jürgen; Reed, Susan; Unger, Jennifer

    Mental health care integrated into obstetric settings improves access to perinatal depression treatments. Digital interactions such as text messaging between patient and provider can further improve access. We describe the use of text messaging within a perinatal Collaborative Care (CC) program, and explore the association of text messaging content with perinatal depression outcomes. We analyzed data from an open treatment trial of perinatal CC in a rural obstetric clinic. Twenty-five women with a Patient Health Questionnaire-9 score of ≥10 enrolled in CC and used text messaging to communicate with their Care Manager (CM). We assessed the acceptability of text messaging with surveys and focus groups. We calculated the number of text messages exchanged, and analyzed content to understand usage patterns. We explored the association between text messaging content and depression outcomes. CMs initiated 85.4% of messages, and patients responded to 86.9% of messages. CMs used text messaging for appointment reminders, and patients used it to obtain obstetric and parenting information. CMs had concerns about the likelihood of boundary violations. Patients appreciated the asynchronous nature of text messaging. Text messaging is feasible and acceptable within a perinatal CC program. We need further research into the effectiveness of text messaging content and response protocols. Copyright © 2018 Elsevier Inc. All rights reserved.

  20. Cultural Exchange Through BD.

    ERIC Educational Resources Information Center

    Lawlor, Patricia M.

    1985-01-01

    Discusses the development of and the popularity of comic strips (the BD or "bande dessinee" in French) in both France and the United States. Argues that comic strips play a major role in French-American cultural exchange because they express each culture and convey their message both visually and linguistically. (SED)

  1. Education On-Line.

    ERIC Educational Resources Information Center

    Andres, Yvonne Marie

    1993-01-01

    Estimated 50,000 teachers worldwide are using the Internet to tap university computerized library catalogs, exchange E-mail, read news bulletins, and join "chat" groups on various topics. FrEdMail (Free Educational Electronic Mail) is a nonprofit chain of computer bulletin boards giving schools the ability to exchange messages,…

  2. An analysis of four error detection and correction schemes for the proposed Federal standard 1024 (land mobile radio)

    NASA Astrophysics Data System (ADS)

    Lohrmann, Carol A.

    1990-03-01

    Interoperability of commercial Land Mobile Radios (LMR) and the military's tactical LMR is highly desirable if the U.S. government is to respond effectively in a national emergency or in a joint military operation. This ability to talk securely and immediately across agency and military service boundaries is often overlooked. One way to ensure interoperability is to develop and promote Federal communication standards (FS). This thesis surveys one area of the proposed FS 1024 for LMRs; namely, the error detection and correction (EDAC) of the message indicator (MI) bits used for cryptographic synchronization. Several EDAC codes are examined (Hamming, Quadratic Residue, hard decision Golay and soft decision Golay), tested on three FORTRAN programmed channel simulations (INMARSAT, Gaussian and constant burst width), compared and analyzed (based on bit error rates and percent of error-free super-frame runs) so that a best code can be recommended. Out of the four codes under study, the soft decision Golay code (24,12) is evaluated to be the best. This finding is based on the code's ability to detect and correct errors as well as the relative ease of implementation of the algorithm.
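
    Of the codes surveyed, the Hamming code is the simplest to illustrate: parity bits sit at the power-of-two positions, and on decode the failed parity checks sum to the 1-based position of a single flipped bit. A minimal Hamming (7,4) sketch (illustrative, not the thesis' FORTRAN simulation code):

    ```python
    # Hamming (7,4): 4 data bits, 3 parity bits, corrects any single bit flip.
    def encode(d):
        d1, d2, d3, d4 = d
        p1 = d1 ^ d2 ^ d4          # covers positions 1, 3, 5, 7
        p2 = d1 ^ d3 ^ d4          # covers positions 2, 3, 6, 7
        p3 = d2 ^ d3 ^ d4          # covers positions 4, 5, 6, 7
        return [p1, p2, d1, p3, d2, d3, d4]

    def decode(c):
        """Return the 4 data bits, correcting at most one flipped bit."""
        s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
        s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
        s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
        syndrome = s1 + 2 * s2 + 4 * s3   # 0 = clean, else error position
        if syndrome:
            c = list(c)
            c[syndrome - 1] ^= 1
        return [c[2], c[4], c[5], c[6]]
    ```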

  3. Roadmap for Testing and Validation of Electric Vehicle Communication Standards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pratt, Richard M.; Tuffner, Francis K.; Gowri, Krishnan

    Vehicle to grid communication standards are critical to the charge management and interoperability among plug-in electric vehicles (PEVs), charging stations and utility providers. The Society of Automotive Engineers (SAE), International Organization for Standardization (ISO), International Electrotechnical Commission (IEC) and the ZigBee Alliance are developing requirements for communication messages and protocols. While interoperability standards development has been in progress for more than two years, no definitive guidelines are available for the automobile manufacturers, charging station manufacturers or utility backhaul network systems. At present, there is a wide range of proprietary communication options developed and supported in the industry. Recent work by the Electric Power Research Institute (EPRI), in collaboration with SAE and automobile manufacturers, has identified performance requirements and developed a test plan based on possible communication pathways using power line communication (PLC). Though the communication pathways and power line communication technology options are identified, much work needs to be done in developing application software and testing of communication modules before these can be deployed in production vehicles. This paper presents a roadmap and results from testing power line communication modules developed to meet the requirements of the SAE J2847/1 standard.

  4. Multisensor interoperability for persistent surveillance and FOB protection with multiple technologies during the TNT exercise at Camp Roberts, California

    NASA Astrophysics Data System (ADS)

    Murarka, Naveen; Chambers, Jon

    2012-06-01

    Multiple sensors, providing actionable intelligence to the war fighter, often have difficulty interoperating with each other. Northrop Grumman (NG) is dedicated to solving these problems and providing complete solutions for persistent surveillance. In August, 2011, NG was invited to participate in the Tactical Network Topology (TNT) Capabilities Based Experimentation at Camp Roberts, CA to demonstrate integrated system capabilities providing Forward Operating Base (FOB) protection. This experiment was an opportunity to leverage previous efforts from NG's Rotorcraft Avionics Innovation Laboratory (RAIL) to integrate five prime systems with widely different capabilities. The five systems included a Hostile Fire and Missile Warning Sensor System, SCORPION II Unattended Ground Sensor system, Smart Integrated Vehicle Area Network (SiVAN), STARLite Synthetic Aperture Radar (SAR)/Ground Moving Target Indications (GMTI) radar system, and a vehicle with Target Location Module (TLM) and Laser Designation Module (LDM). These systems were integrated with each other and a Tactical Operations Center (TOC) equipped with RaptorX and Falconview providing a Common Operational Picture (COP) via Cursor on Target (CoT) messages. This paper will discuss this exercise, and the lessons learned, by integrating these five prime systems for persistent surveillance and FOB protection.

  5. An Approach Using MIP Products for the Development of the Coalition Battle Management Language Standard

    DTIC Science & Technology

    2013-06-01

    ...Control Information Exchange Data Model (JC3IEDM). The Coalition Battle Management Language (CBML) being developed by the Simulation Interoperability

  6. Low-Cost Manufacturing, Usability, and Security: An Analysis of Bluetooth Simple Pairing and Wi-Fi Protected Setup

    NASA Astrophysics Data System (ADS)

    Kuo, Cynthia; Walker, Jesse; Perrig, Adrian

    Bluetooth Simple Pairing and Wi-Fi Protected Setup specify mechanisms for exchanging authentication credentials in wireless networks. Both Simple Pairing and Protected Setup support multiple setup mechanisms, which increases security risks and hurts the user experience. To improve the security and usability of these specifications, we suggest defining a common baseline for hardware features and a consistent, interoperable user experience across devices.

  7. Patient Privacy, Consent, and Identity Management in Health Information Exchange: Issues for the Military Health System.

    PubMed

    Hosek, Susan D; Straus, Susan G

    2013-01-01

    The Military Health System (MHS) and the Veterans Health Administration (VHA) have been among the nation's leaders in health information technology (IT), including the development of health IT systems and electronic health records that summarize patients' care from multiple providers. Health IT interoperability within MHS and across MHS partners, including VHA, is one of ten goals in the current MHS Strategic Plan. As a step toward achieving improved interoperability, the MHS is seeking to develop a research roadmap to better coordinate health IT research efforts, address IT capability gaps, and reduce programmatic risk for its enterprise projects. This article contributes to that effort by identifying gaps in research, policy, and practice involving patient privacy, consent, and identity management that need to be addressed to bring about improved quality and efficiency of care through health information exchange. Major challenges include (1) designing a meaningful patient consent procedure, (2) recording patients' consent preferences and designing procedures to implement restrictions on disclosures of protected health information, and (3) advancing knowledge regarding the best technical approaches to performing patient identity matches and how best to monitor results over time. Using a sociotechnical framework, this article suggests steps for overcoming these challenges and topics for future research.

  8. Ocean Data Interoperability Platform (ODIP): Developing a Common Framework for Marine Data Management on a Global Scale

    NASA Astrophysics Data System (ADS)

    Glaves, H. M.; Schaap, D.

    2014-12-01

    As marine research becomes increasingly multidisciplinary in its approach, there has been a corresponding rise in the demand for large quantities of high-quality interoperable data. A number of regional initiatives are already addressing this requirement through the establishment of e-infrastructures to improve the discovery and access of marine data. Projects such as Geo-Seas and SeaDataNet in Europe, Rolling Deck to Repository (R2R) in the USA and IMOS in Australia have implemented local infrastructures to facilitate the exchange of standardised marine datasets. However, each of these regional initiatives has been developed to address its own requirements and independently of other regions. To establish a common framework for marine data management on a global scale, there is a need to develop interoperability solutions that can be implemented across these initiatives. Through a series of workshops attended by the relevant domain specialists, the Ocean Data Interoperability Platform (ODIP) project has identified areas of commonality between the regional infrastructures and used these as the foundation for the development of three prototype interoperability solutions addressing: (1) the use of brokering services to provide access to the data available in the regional data discovery and access services, including via the GEOSS portal; (2) the development of interoperability between cruise summary reporting systems in Europe, the USA and Australia for routine harvesting of cruise data for delivery via the Partnership for Observation of Global Oceans (POGO) portal; and (3) the establishment of a Sensor Observation Service (SOS) for selected sensors installed on vessels and in real-time monitoring systems using sensor web enablement (SWE). These prototypes will be used to underpin the development of a common global approach to the management of marine data which can be promoted to the wider marine research community. ODIP is a community-led project that is currently focussed on regional initiatives in Europe, the USA and Australia but which is seeking to expand this framework to include other regional marine data infrastructures.

  9. 78 FR 29190 - Self-Regulatory Organizations; Financial Industry Regulatory Authority, Inc.; Order Approving the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-17

    ... entered into an order management system (including orders received via telephone or instant message) and... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-69561; File No. SR-FINRA-2013-013] Self..., 2013, Financial Industry Regulatory Authority, Inc. (``FINRA'') filed with the Securities and Exchange...

  10. Direct2Experts: a pilot national network to demonstrate interoperability among research-networking platforms.

    PubMed

    Weber, Griffin M; Barnett, William; Conlon, Mike; Eichmann, David; Kibbe, Warren; Falk-Krzesinski, Holly; Halaas, Michael; Johnson, Layne; Meeks, Eric; Mitchell, Donald; Schleyer, Titus; Stallings, Sarah; Warden, Michael; Kahlon, Maninder

    2011-12-01

    Research-networking tools use data-mining and social networking to enable expertise discovery, matchmaking and collaboration, which are important facets of team science and translational research. Several commercial and academic platforms have been built, and many institutions have deployed these products to help their investigators find local collaborators. Recent studies, though, have shown the growing importance of multiuniversity teams in science. Unfortunately, the lack of a standard data-exchange model and resistance of universities to share information about their faculty have presented barriers to forming an institutionally supported national network. This case report describes an initiative, which, in only 6 months, achieved interoperability among seven major research-networking products at 28 universities by taking an approach that focused on addressing institutional concerns and encouraging their participation. With this necessary groundwork in place, the second phase of this effort can begin, which will expand the network's functionality and focus on the end users.

  11. Restructuring an EHR system and the Medical Markup Language (MML) standard to improve interoperability by archetype technology.

    PubMed

    Kobayashi, Shinji; Kume, Naoto; Yoshihara, Hiroyuki

    2015-01-01

    In 2001, we developed an EHR system for regional healthcare information exchange and to provide individual patient data to patients. This system was adopted in three regions in Japan. We also developed the Medical Markup Language (MML) standard for inter- and intra-hospital communications. The system was built on a legacy platform, however, and had not been appropriately maintained or updated to meet clinical requirements. To reduce future maintenance costs, we reconstructed the EHR system using archetype technology on the Ruby on Rails platform, and generated MML-equivalent forms from archetypes. The system was deployed as a cloud-based system for preliminary use as a regional EHR. The system now has the capability to catch up with new requirements, maintaining semantic interoperability with archetype technology. It is also more flexible than the legacy EHR system.

  12. GéoSAS: A modular and interoperable Open Source Spatial Data Infrastructure for research

    NASA Astrophysics Data System (ADS)

    Bera, R.; Squividant, H.; Le Henaff, G.; Pichelin, P.; Ruiz, L.; Launay, J.; Vanhouteghem, J.; Aurousseau, P.; Cudennec, C.

    2015-05-01

    To date, the most common way to deal with geographical information and processes still appears to be to consume local resources, i.e. locally stored data processed on a local desktop or server. The maturity and subsequent growing use of OGC standards to exchange data on the World Wide Web, enhanced in Europe by the INSPIRE Directive, is bound to change the way people (and among them research scientists, especially in environmental sciences) make use of, and manage, spatial data. A clever use of OGC standards can help scientists to better store, share and use data, in particular for modelling. We propose a framework for online processing that makes intensive use of OGC standards. We illustrate it using the Spatial Data Infrastructure (SDI) GéoSAS, which is the SDI set up for researchers' needs in our department. It is based on the existing open source, modular and interoperable Spatial Data Architecture geOrchestra.
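
    The OGC service interfaces that such an SDI exposes are plain HTTP requests with standardized query parameters. As a hedged sketch (the endpoint below is a hypothetical placeholder, not GéoSAS's actual server), a WMS GetCapabilities request is assembled like this:

    ```python
    from urllib.parse import urlencode

    # An OGC WMS GetCapabilities request is an HTTP GET whose query
    # parameters are fixed by the WMS specification.
    base = "https://example.org/geoserver/ows"  # hypothetical endpoint
    params = {"service": "WMS", "version": "1.3.0", "request": "GetCapabilities"}
    url = base + "?" + urlencode(params)
    print(url)
    ```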

  13. Flight tests with a data link used for air traffic control information exchange

    NASA Technical Reports Server (NTRS)

    Knox, Charles E.; Scanlon, Charles H.

    1991-01-01

    Previous studies showed that air traffic control (ATC) message exchange with a data link offers the potential benefits of increased airspace system safety and efficiency. To accomplish these benefits, data link can be used to reduce communication errors and relieve overloaded ATC voice radio frequencies, which hamper efficient message exchange during peak traffic periods. Flight tests with commercial airline pilots as test subjects were conducted in the NASA Transport Systems Research Vehicle Boeing 737 airplane to contrast flight operations using current voice communications with flight operations using data link to transmit both strategic and tactical ATC clearances during a typical commercial airline flight from takeoff to landing. The results of these tests, which used data link as the primary means of communication with ATC, showed flight crew acceptance, a perceived reduction in crew workload, and a reduction in crew communication errors.

  14. Simulation of Controller Pilot Data Link Communications over VHF Digital Link Mode 3

    NASA Technical Reports Server (NTRS)

    Bretmersky, Steven C.; Murawski, Robert; Nguyen, Thanh C.; Raghavan, Rajesh S.

    2004-01-01

    The Federal Aviation Administration (FAA) has established an operational plan for the future Air Traffic Management (ATM) system, in which Controller Pilot Data Link Communications (CPDLC) is envisioned to evolve into digital messaging that will take on an ever-increasing role in controller-to-pilot communications, significantly changing the way the National Airspace System (NAS) operates. According to the FAA, CPDLC represents the first phase of the transition from the current analog voice system to an International Civil Aviation Organization (ICAO)-compliant system in which digital communication becomes the alternate, and perhaps primary, method of routine communication. The CPDLC application is an Air Traffic Service (ATS) application in which pilots and controllers exchange messages via an addressed data link. CPDLC includes a set of clearance, information, and request message elements that correspond to existing phraseology employed by current Air Traffic Control (ATC) procedures. These message elements encompass altitude assignments, crossing constraints, lateral deviations, route changes and clearances, speed assignments, radio frequency assignments, and various requests for information. The pilot is provided with the capability to respond to messages, to request clearances and information, to report information, and to declare/rescind an emergency. A 'free text' capability is also provided to exchange information not conforming to defined formats. This paper presents simulated results for the aeronautical telecommunication application Controller Pilot Data Link Communications over VHF Digital Link Mode 3 (VDL Mode 3). The objective of this simulation study was to determine the impact of CPDLC traffic loads on timely message delivery and on the capacity of the VDL Mode 3 subnetwork. The traffic model is used to generate air/ground messages with different priorities. Communication is modeled for the en route domain of the Cleveland Center air traffic (ZOB ARTCC).
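    The study's concern with prioritized message delivery can be made concrete with a small sketch. This is not the paper's simulation model (which targets the VDL Mode 3 subnetwork); it is only an illustrative priority queue in which urgent clearances pre-empt routine traffic, with all class and message names invented here:

    ```python
    import heapq
    from itertools import count

    class PriorityLink:
        """Toy model of an air/ground data link that always delivers the
        highest-priority pending message first (lower number = more urgent),
        breaking ties in FIFO order -- the behaviour a CPDLC-style traffic
        model needs when mixing urgent clearances with routine requests."""

        def __init__(self):
            self._queue = []
            self._seq = count()  # tie-breaker preserves FIFO order within a priority

        def submit(self, priority, message):
            heapq.heappush(self._queue, (priority, next(self._seq), message))

        def deliver(self):
            """Pop and return the next message, or None if the link is idle."""
            if not self._queue:
                return None
            _, _, message = heapq.heappop(self._queue)
            return message

    link = PriorityLink()
    link.submit(2, "REQUEST DIRECT TO WAYPOINT")   # routine
    link.submit(0, "CLIMB TO AND MAINTAIN FL350")  # urgent clearance
    link.submit(2, "REPORT POSITION")
    print(link.deliver())  # prints "CLIMB TO AND MAINTAIN FL350"
    ```

    Lower numbers win, and the `itertools.count` tie-breaker keeps equal-priority messages in FIFO order, so routine traffic is never reordered among itself.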

  15. Interoperability after deployment: persistent challenges and regional strategies in Denmark.

    PubMed

    Kierkegaard, Patrick

    2015-04-01

    The European Union has identified Denmark as one of the countries with the potential to provide leadership and inspiration for other countries in eHealth implementation and adoption. However, Denmark has historically struggled to facilitate data exchange between its public hospitals' electronic health records (EHRs). Furthermore, state-led projects failed to adequately address the challenges of interoperability after deployment. Changes in the organizational setup and division of responsibilities for future eHealth implementations in hospitals took place, granting the Danish regions full responsibility for all hospital systems, specifically the consolidation of EHRs to one system per region. The regions reduced the number of different EHRs to six systems by 2014. Additionally, the first version of the National Health Record was launched to provide health care practitioners with an overview of a patient's data stored in all EHRs across the regions and within the various health sectors. The governance of national eHealth implementation plays a crucial role in the development and diffusion of interoperable technologies. Changes in the organizational setup and redistribution of responsibilities between the Danish regions and the state play a pivotal role in producing viable and coherent solutions in a timely manner. Interoperability initiatives are best managed on a regional level or by the authorities responsible for the provision of local health care services. Cross-regional communication is essential during the initial phases of planning in order to set a common goal for countrywide harmonization, coherence and collaboration. © The Author 2015. Published by Oxford University Press in association with the International Society for Quality in Health Care; all rights reserved.

  16. Fair and optimistic quantum contract signing

    NASA Astrophysics Data System (ADS)

    Paunković, N.; Bouda, J.; Mateus, P.

    2011-12-01

    We present a fair and optimistic quantum-contract-signing protocol between two clients that requires no communication with the third trusted party during the exchange phase. We discuss its fairness and show that it is possible to design such a protocol for which the probability of a dishonest client to cheat becomes negligible and scales as N^(-1/2), where N is the number of messages exchanged between the clients. Our protocol is not based on the exchange of signed messages: Its fairness is based on the laws of quantum mechanics. Thus, it is abuse free, and the clients do not have to generate new keys for each message during the exchange phase. We discuss a real-life scenario when measurement errors and qubit-state corruption due to noisy channels and imperfect quantum memories occur and argue that for a real, good-enough measurement apparatus, transmission channels, and quantum memories, our protocol would still be fair. Apart from stable quantum memories, the other segments of our protocol could be implemented by today's technology, as they require in essence the same type of apparatus as the one needed for the Bennett-Brassard 1984 (BB84) cryptographic protocol. Finally, we briefly discuss two alternative versions of the protocol, one that uses only two states [based on the Bennett 1992 (B92) protocol] and the other that uses entangled pairs, and show that it is possible to generalize our protocol to an arbitrary number of clients.
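    A note on the quoted scaling (with $c$ an unspecified protocol-dependent constant, introduced here only for illustration): if the cheating probability falls off as

    ```latex
    P_{\text{cheat}}(N) \approx c\, N^{-1/2},
    ```

    then keeping it below a tolerance $\epsilon$ requires on the order of $N \gtrsim (c/\epsilon)^2$ exchanged messages, so halving the tolerated cheating probability roughly quadruples the length of the exchange phase.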

  17. Simple Algorithms for Distributed Leader Election in Anonymous Synchronous Rings and Complete Networks Inspired by Neural Development in Fruit Flies.

    PubMed

    Xu, Lei; Jeavons, Peter

    2015-11-01

    Leader election in anonymous rings and complete networks is a very practical problem in distributed computing. Previous algorithms for this problem are generally designed for a classical message-passing model in which complex messages are exchanged. However, the need to send and receive complex messages makes such algorithms less practical for some real applications. We present some simple synchronous algorithms for distributed leader election in anonymous rings and complete networks that are inspired by the development of the neural system of the fruit fly. Our leader election algorithms all assume that only one-bit messages are broadcast by nodes in the network and that processors are only able to distinguish between silence and the arrival of one or more messages. These restrictions allow implementations to use a simpler message-passing architecture. Even with these harsh restrictions, our algorithms are shown to achieve good time and message complexity, both analytically and experimentally.
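    The restrictions described above (anonymous nodes, one-bit broadcasts, and sensing only silence versus "at least one message") can be illustrated with a simulation. This is a generic coin-flipping scheme written for this summary, not the authors' fly-inspired algorithm; the function name and parameters are assumptions:

    ```python
    import random

    def elect_leader(n, p=0.5, seed=0, max_rounds=10_000):
        """Simulate one-bit leader election in an anonymous, synchronous
        complete network of n nodes. Each round, every active node
        broadcasts a one-bit signal with probability p. Nodes sense only
        silence vs. 'at least one message from someone else':
          * a node that broadcast and heard silence is the unique leader;
          * a node that stayed silent but heard a broadcast goes passive;
          * if nobody broadcast, the round is void and all stay active.
        Returns (leader_id, rounds_used); node ids exist only in the
        simulator, not in the anonymous nodes themselves."""
        rng = random.Random(seed)
        active = set(range(n))
        for round_no in range(1, max_rounds + 1):
            broadcasters = {v for v in active if rng.random() < p}
            if not broadcasters:
                continue                   # silent round: no progress
            if len(broadcasters) == 1:     # lone broadcaster hears silence
                (leader,) = broadcasters
                return leader, round_no
            active = broadcasters          # silent listeners drop out
        raise RuntimeError("no leader elected within max_rounds")

    leader, rounds = elect_leader(100, seed=42)
    ```

    Each non-silent round keeps only the broadcasters active, so the active set shrinks geometrically in expectation, and a lone broadcaster, hearing silence from everyone else, can safely declare itself leader.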

  18. Oklahoma Retailers' Perspectives on Mutual Benefit Exchange to Limit Point-of-Sale Tobacco Advertisements.

    PubMed

    Chan, Andie; Douglas, Malinda Reddish; Ling, Pamela M

    2015-09-01

    Businesses changing their practices in ways that support tobacco control efforts recently have gained interest, as demonstrated by CVS Health's voluntary policy to end tobacco sales. Point-of-sale (POS) advertisements are associated with youth smoking initiation, increased tobacco consumption, and reduced quit attempts among smokers. There is interest in encouraging retailers to limit tobacco POS advertisements voluntarily. This qualitative exploratory study describes Oklahoma tobacco retailers' perspectives on a mutual benefit exchange approach, and preferred message and messenger qualities that would entice them to take voluntary action to limit tobacco POS advertisements. This study found that mutual benefit exchange could be a viable option along with education and law as strategies to create behavior change among tobacco retailers. Many retailers stated that they would be willing to remove noncontractual POS advertisements for a 6-month commitment period when presented with mutual exchange benefit, tailored message, and appropriate messenger. Mutual benefit exchange, as a behavior change strategy to encourage voluntary removal of POS tobacco advertisements, was acceptable to retailers, could enhance local tobacco control in states with preemption, and may contribute to setting the foundation for broader legislative efforts. © 2015 Society for Public Health Education.

  19. Oklahoma Retailers’ Perspectives on Mutual Benefit Exchange to Limit Point-of-Sale Tobacco Advertisements

    PubMed Central

    Chan, Andie; Douglas, Malinda Reddish; Ling, Pamela M.

    2015-01-01

    Businesses changing their practices in ways that support tobacco control efforts recently have gained interest, as demonstrated by CVS Health’s voluntary policy to end tobacco sales. Point of sale (POS) advertisements are associated with youth smoking initiation, increased tobacco consumption, and reduced quit attempts among smokers. There is interest in encouraging retailers to limit tobacco POS advertisements voluntarily. This qualitative exploratory study describes Oklahoma tobacco retailers’ perspectives on a mutual benefit exchange approach, and preferred message and messenger qualities that would entice them to take voluntary action to limit tobacco POS advertisements. This study found mutual benefit exchange could be a viable option along with education and law as strategies to create behavior change among tobacco retailers. Many retailers stated that they would be willing to remove non-contractual POS advertisements for a six-month commitment period when presented with mutual exchange benefit, tailored message, and appropriate messenger. Mutual benefit exchange, as a behavior change strategy to encourage voluntary removal of POS tobacco advertisements, was acceptable to retailers, could enhance local tobacco control in states with preemption, and may contribute to setting the foundation for broader legislative efforts. PMID:25767197

  20. Future Interoperability of Camp Protection Systems (FICAPS)

    NASA Astrophysics Data System (ADS)

    Caron, Sylvie; Gündisch, Rainer; Marchand, Alain; Stahl, Karl-Hermann

    2013-05-01

    The FICAPS Project was established as a project of the European Defence Agency, based on an initiative of Germany and France. The goal of this project was to derive guidelines which, when properly implemented in future developments, improve Camp Protection Systems (CPS) by enabling and improving interoperability between the camp protection systems and equipment of the different nations involved in multinational missions. These guidelines shall allow for: • real-time information exchange between equipment and systems of different suppliers and nations (even via SatCom); • quick and easy replacement of equipment (even of different nations) at run time in the field by means of plug-and-play capability, thus lowering operational and logistic costs and making the system highly available; • enhancement of system capabilities (open and modular systems) by adding new equipment with new capabilities (just plug in, with automatic adjustment of the Human Machine Interface, HMI) without costly and time-consuming validation and testing at the system level (validation and testing can be done at the equipment level). Four scenarios were identified to summarize the interoperability requirements from an operational viewpoint. To prove the definitions given in the Guideline Document, a French and a German demonstration system, based on existing national assets, were realized. Demonstrations showing the capabilities provided by the defined interoperability requirements with respect to the operational scenarios were performed. They included remote control of one CPS by another CPS, remote sensor control (Electro-Optic/InfraRed, EO/IR) and remote effector control; this capability can be applied to extend the protection area or to protect distant infrastructural assets. The required interoperability functionality was demonstrated successfully. Even though the focus of the FICAPS project was on camp protection, the solution found is also appropriate for other force protection and ISR (Intelligence, Surveillance, Reconnaissance) tasks, not only due to its flexibility but also due to the chosen interfacing.

  1. Balancing Contention and Synchronization on the Intel Paragon

    NASA Technical Reports Server (NTRS)

    Bokhari, Shahid H.; Nicol, David M.

    1996-01-01

    The Intel Paragon is a mesh-connected distributed memory parallel computer. It uses an oblivious and deterministic message routing algorithm: this permits us to develop highly optimized schedules for frequently needed communication patterns. The complete exchange is one such pattern. Several approaches are available for carrying it out on the mesh. We study an algorithm developed by Scott. This algorithm assumes that a communication link can carry one message at a time and that a node can only transmit one message at a time. It requires global synchronization to enforce a schedule of transmissions. Unfortunately, global synchronization has substantial overhead on the Paragon. At the same time, the powerful interconnection mechanism of this machine permits 2 or 3 messages to share a communication link with minor overhead. It can also overlap multiple message transmissions from the same node to some extent. We develop a generalization of Scott's algorithm that executes complete exchange with a prescribed contention. Schedules that incur greater contention require fewer synchronization steps. This permits us to trade off contention against synchronization overhead. We describe the performance of this algorithm and compare it with Scott's original algorithm as well as with a naive algorithm that does not take the interconnection structure into account. The bounded-contention algorithm is always better than Scott's algorithm and outperforms the naive algorithm for all but the smallest message sizes. The naive algorithm fails to work on meshes larger than 12 x 12. These results show that due consideration of the processor interconnect and machine performance parameters is necessary to obtain peak performance from the Paragon and its successor mesh machines.
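    The complete-exchange pattern itself can be sketched independently of the mesh details. The schedule below is the standard linear-permutation baseline, not Scott's mesh-aware schedule or the authors' bounded-contention generalization; it is shown only to make concrete the constraint that each node sends and receives at most one message per synchronized step:

    ```python
    def complete_exchange_schedule(n):
        """Baseline synchronized schedule for an n-node complete exchange
        (all-to-all personalized communication): in step t (1..n-1), node i
        sends its block to node (i + t) % n and receives from (i - t) % n.
        Every ordered pair of distinct nodes communicates exactly once, and
        each node appears exactly once as a sender and once as a receiver
        per step, mirroring the one-message-per-node constraint."""
        steps = []
        for t in range(1, n):
            steps.append([(i, (i + t) % n) for i in range(n)])
        return steps

    sched = complete_exchange_schedule(4)  # 3 steps of 4 disjoint transfers each
    ```

    Over the n-1 steps every ordered pair of distinct nodes communicates exactly once, which is precisely the complete exchange; Scott's algorithm refines which permutation runs in each step so that mesh links are not oversubscribed.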

  2. Measures for interoperability of phenotypic data: minimum information requirements and formatting.

    PubMed

    Ćwiek-Kupczyńska, Hanna; Altmann, Thomas; Arend, Daniel; Arnaud, Elizabeth; Chen, Dijun; Cornut, Guillaume; Fiorani, Fabio; Frohmberg, Wojciech; Junker, Astrid; Klukas, Christian; Lange, Matthias; Mazurek, Cezary; Nafissi, Anahita; Neveu, Pascal; van Oeveren, Jan; Pommier, Cyril; Poorter, Hendrik; Rocca-Serra, Philippe; Sansone, Susanna-Assunta; Scholz, Uwe; van Schriek, Marco; Seren, Ümit; Usadel, Björn; Weise, Stephan; Kersey, Paul; Krajewski, Paweł

    2016-01-01

    Plant phenotypic data harbours a wealth of information which, when accurately analysed and linked to other data types, brings to light knowledge about the mechanisms of life. As phenotyping is a field of research comprising manifold, diverse and time-consuming experiments, the findings can be fostered by reusing and combining existing datasets. Their correct interpretation, and thus replicability, comparability and interoperability, is possible provided that the collected observations are equipped with an adequate set of metadata. So far there have been no common standards governing phenotypic data description, which has hampered data exchange and reuse. In this paper we propose guidelines for proper handling of the information about plant phenotyping experiments, in terms of both the recommended content of the description and its formatting. We provide a document called "Minimum Information About a Plant Phenotyping Experiment", which specifies what information about each experiment should be given, and a Phenotyping Configuration for the ISA-Tab format, which allows this information to be organised practically within a dataset. We provide examples of ISA-Tab-formatted phenotypic data, and a general description of a few systems where the recommendations have been implemented. Acceptance of the rules described in this paper by the plant phenotyping community will help to achieve findable, accessible, interoperable and reusable data.

  3. The Osseus platform: a prototype for advanced web-based distributed simulation

    NASA Astrophysics Data System (ADS)

    Franceschini, Derrick; Riecken, Mark

    2016-05-01

    Recent technological advances in web-based distributed computing and database technology have made possible a deeper and more transparent integration of some modeling and simulation applications. Despite these advances towards true integration of capabilities, disparate systems, architectures, and protocols will remain in the inventory for some time to come. These disparities present interoperability challenges for distributed modeling and simulation, whether the application is training, experimentation, or analysis. Traditional approaches call for building gateways to bridge between disparate protocols and retaining interoperability specialists. Challenges in reconciling data models also persist. These challenges and their traditional mitigation approaches directly contribute to higher costs, schedule delays, and frustration for the end users. Osseus is a prototype software platform originally funded as a research project by the Defense Modeling & Simulation Coordination Office (DMSCO) to examine interoperability alternatives using modern, web-based technology and taking inspiration from the commercial sector. Osseus provides tools and services that let nonexpert users connect simulations, reducing the time and skill set needed to successfully connect disparate systems. The Osseus platform presents a web services interface to allow simulation applications to exchange data efficiently using modern techniques over Local or Wide Area Networks. Further, it provides Service Oriented Architecture capabilities such that finer-granularity components, such as individual models, can contribute to a simulation with minimal effort.

  4. Coalition Warrior Interoperability Demonstration 2011 Trial 2.32 - Managing Military Civilian Messaging (M2CM) Summary Report

    DTIC Science & Technology

    2011-08-01

    Recommendation: Smartphone/Android technology and the creation of "Apps" for emergency management tools should be considered essential for the... Monmouth University's Android application was not handling COGs effectively. The programmers (doctoral students) were on constant standby

  5. Design and Evaluation of the MINTACS SeeTrack Exchange (MINSTE) Concept Demonstrator

    DTIC Science & Technology

    2009-04-01

    software products. URL - http://www.esri.com/ The Technical Cooperation Program (TTCP) is an international organisation that collaborates in defence... off-the-shelf (COTS) products. This provides a basis for implementing interoperability across application, vendor and organisation boundaries. XML... Network a suite of data analysis tools, such as ArcGIS products: "…represents a great opportunity for the bringing together of a COP

  6. Health Care IT Collaboration in Massachusetts: The Experience of Creating Regional Connectivity

    PubMed Central

    Halamka, John; Aranow, Meg; Ascenzo, Carl; Bates, David; Debor, Greg; Glaser, John; Goroll, Allan; Stowe, Jim; Tripathi, Micky; Vineyard, Gordon

    2005-01-01

    The state of Massachusetts has significant early experience in planning for and implementing interoperability networks for exchange of clinical and financial data. Members of our evolving data-sharing organizations gained valuable experience that is of potential benefit to others regarding the governance, policies, and technologies underpinning regional health information organizations. We describe the history, roles, and evolution of these organizations and their plans for and success with pilot projects. PMID:16049225

  7. Differing Strategies to Meet Information‐Sharing Needs: Publicly Supported Community Health Information Exchanges Versus Health Systems’ Enterprise Health Information Exchanges

    PubMed Central

    KASH, BITA A.

    2016-01-01

    Policy Points: Community health information exchanges have the characteristics of a public good, and they support population health initiatives at the state and national levels. However, current policy equally incentivizes health systems to create their own information exchanges covering more narrowly defined populations. Noninteroperable electronic health records and vendors’ expensive custom interfaces are hindering health information exchanges. Moreover, vendors are imposing the costs of interoperability on health systems and community health information exchanges. Health systems are creating networks of targeted physicians and facilities by funding connections to their own enterprise health information exchanges. These private networks may change referral patterns and foster more integration with outpatient providers. Context The United States has invested billions of dollars to encourage the adoption of and implement the information technologies necessary for health information exchange (HIE), enabling providers to efficiently and effectively share patient information with other providers. Health care providers now have multiple options for obtaining and sharing patient information. Community HIEs facilitate information sharing for a broad group of providers within a region. Enterprise HIEs are operated by health systems and share information among affiliated hospitals and providers. We sought to identify why hospitals and health systems choose either to participate in community HIEs or to establish enterprise HIEs. Methods We conducted semistructured interviews with 40 policymakers, community and enterprise HIE leaders, and health care executives from 19 different organizations. Our qualitative analysis used a general inductive and comparative approach to identify factors influencing participation in, and the success of, each approach to HIE. 
Findings Enterprise HIEs support health systems' strategic goals through the control of an information technology network consisting of desired trading partners. Community HIEs support obtaining patient information from the broadest set of providers, but with more dispersed benefits to all participants, the community, and patients. Although not an either/or decision, community and enterprise HIEs compete for finite organizational resources like time, skilled staff, and money. Both approaches face challenges due to vendor costs and less‐than‐interoperable technology. Conclusions Both community and enterprise HIEs support aggregating clinical data and following patients across settings. Although they can be complementary, community and enterprise HIEs nonetheless compete for providers’ attention and organizational resources. Health policymakers might try to encourage the type of widespread information exchange pursued by community HIEs, but the business case for enterprise HIEs clearly is stronger. The sustainability of a community HIE, potentially a public good, may necessitate ongoing public funding and supportive regulation. PMID:26994710

  8. Differing Strategies to Meet Information-Sharing Needs: Publicly Supported Community Health Information Exchanges Versus Health Systems' Enterprise Health Information Exchanges.

    PubMed

    Vest, Joshua R; Kash, Bita A

    2016-03-01

    Community health information exchanges have the characteristics of a public good, and they support population health initiatives at the state and national levels. However, current policy equally incentivizes health systems to create their own information exchanges covering more narrowly defined populations. Noninteroperable electronic health records and vendors' expensive custom interfaces are hindering health information exchanges. Moreover, vendors are imposing the costs of interoperability on health systems and community health information exchanges. Health systems are creating networks of targeted physicians and facilities by funding connections to their own enterprise health information exchanges. These private networks may change referral patterns and foster more integration with outpatient providers. The United States has invested billions of dollars to encourage the adoption of and implement the information technologies necessary for health information exchange (HIE), enabling providers to efficiently and effectively share patient information with other providers. Health care providers now have multiple options for obtaining and sharing patient information. Community HIEs facilitate information sharing for a broad group of providers within a region. Enterprise HIEs are operated by health systems and share information among affiliated hospitals and providers. We sought to identify why hospitals and health systems choose either to participate in community HIEs or to establish enterprise HIEs. We conducted semistructured interviews with 40 policymakers, community and enterprise HIE leaders, and health care executives from 19 different organizations. Our qualitative analysis used a general inductive and comparative approach to identify factors influencing participation in, and the success of, each approach to HIE. 
Enterprise HIEs support health systems' strategic goals through the control of an information technology network consisting of desired trading partners. Community HIEs support obtaining patient information from the broadest set of providers, but with more dispersed benefits to all participants, the community, and patients. Although not an either/or decision, community and enterprise HIEs compete for finite organizational resources like time, skilled staff, and money. Both approaches face challenges due to vendor costs and less-than-interoperable technology. Both community and enterprise HIEs support aggregating clinical data and following patients across settings. Although they can be complementary, community and enterprise HIEs nonetheless compete for providers' attention and organizational resources. Health policymakers might try to encourage the type of widespread information exchange pursued by community HIEs, but the business case for enterprise HIEs clearly is stronger. The sustainability of a community HIE, potentially a public good, may necessitate ongoing public funding and supportive regulation. © 2016 Milbank Memorial Fund.

  9. Audience Design through Social Interaction during Group Discussion

    PubMed Central

    Rogers, Shane L.; Fay, Nicolas; Maybery, Murray

    2013-01-01

    This paper contrasts two accounts of audience design during multiparty communication: audience design as a strategic individual-level message adjustment or as a non-strategic interaction-level message adjustment. Using a non-interactive communication task, Experiment 1 showed that people distinguish between messages designed for oneself and messages designed for another person; consistent with strategic message design, messages designed for another person/s were longer (number of words) than those designed for oneself. However, audience size did not affect message length (messages designed for different-sized audiences were similar in length). Using an interactive communication task, Experiment 2 showed that as group size increased so too did communicative effort (number of words exchanged between interlocutors). Consistent with a non-strategic account, as group members were added more social interaction was necessary to coordinate the group's collective situation model. Experiment 3 validates and extends the production measures used in Experiments 1 and 2 using a comprehension task. Taken together, our results indicate that audience design arises as a non-strategic outcome of social interaction during group discussion. PMID:23437343

  10. 77 FR 37724 - Self-Regulatory Organizations; Chicago Board Options Exchange, Incorporated; Notice of Filing of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-22

    ... (``Qualifying Trading Permit Holders''), the Exchange may determine on a class-by-class basis to permit SAL responses by all CBOE Market-Makers and Qualifying Trading Permit Holders. The proposed rule change allows... auction messages and eliminates the concept of Qualifying Trading Permit Holders under this provision...

  11. 78 FR 9098 - Self-Regulatory Organizations; Miami International Securities Exchange LLC; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-07

    ... executed contract for the MIAX Clearing Trade Drop (``CTD''), a messaging interface that will provide real... proposes to establish a new Port Fee for the MIAX CTD. CTD provides Exchange Members, their clearing firms... are routed to a CTD connection containing certain information. The information includes, among other...

  12. Communication performance analysis and comparison of two patterns for data exchange between nodes in WorldFIP fieldbus network.

    PubMed

    Liang, Geng; Wang, Hong; Li, Wen; Li, Dazhong

    2010-10-01

    Data exchange patterns between nodes in a WorldFIP fieldbus network are quite important and meaningful in improving the communication performance of the WorldFIP network. Based on the basic communication ways supported in the WorldFIP protocol, we propose two patterns for implementation of data exchange between peer nodes over a WorldFIP network. Effects on communication performance of the WorldFIP network in terms of some network parameters, such as the number of bytes in user's data and turn-around time, in both the proposed patterns, are analyzed at length when different network speeds are applied. Such effects with the patterns of periodic message transmission using acknowledged and non-acknowledged messages are also studied. Communication performance in both the proposed patterns is analyzed and compared, and practical applications of the research are presented. Through the study, it can be seen that different data exchange patterns make a great difference in communication efficiency under different network parameters, which is quite useful and helpful in the practical design of distributed systems based on WorldFIP networks. Copyright © 2010 ISA. Published by Elsevier Ltd. All rights reserved.

  13. Endlessly Circulating Messages in IEEE 1588-2008 Systems

    DTIC Science & Technology

    2014-05-13

    Broman, David; Derler, Patricia; Desai, Ankush; Eidson, John; Seshia, Sanjit A. Once the timing topology is established by the BMC (best master clock) algorithm described in section 13.5 of the standard, each clock synchronizes to its master by exchanging timing messages.
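    The master-slave timing exchange mentioned in this record follows the standard IEEE 1588 delay request-response mechanism: one Sync/Delay_Req round trip yields four timestamps from which the slave computes its offset from the master. A minimal sketch of that arithmetic, with illustrative timestamps:

```python
def ptp_offset_and_delay(t1, t2, t3, t4):
    """IEEE 1588 delay request-response arithmetic.

    t1: master sends Sync         (master clock)
    t2: slave receives Sync       (slave clock)
    t3: slave sends Delay_Req     (slave clock)
    t4: master receives Delay_Req (master clock)
    Assumes a symmetric network path; times in seconds.
    """
    offset = ((t2 - t1) - (t4 - t3)) / 2   # slave clock minus master clock
    delay = ((t2 - t1) + (t4 - t3)) / 2    # one-way path delay
    return offset, delay

# Example: slave running 5 ms ahead of the master over a 2 ms symmetric path.
offset, delay = ptp_offset_and_delay(100.000, 100.007, 100.010, 100.007)
```

    The slave then corrects its clock by `offset`; the endless-circulation problem studied in the paper concerns the management messages that build the timing topology, not this arithmetic itself.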

  14. Intimate Strangers and Estranged Intimates: An Investigation of the Impact of Instant Messaging and Short Message Service on the Size and Strength of Social Networks in Kuwait

    ERIC Educational Resources Information Center

    Al-Sanaa, Bashaiar

    2009-01-01

    Information and communication technologies (ICT) have revolutionized how people experience spatial proximity, reality, and connectivity. These technologies provide inexpensive access to anything and anyone in the world. They also replicate face-to-face interaction in cyber-space and allow for participation in numerous modes of social exchange. …

  15. A Message Exchange Protocol in Command and Control Systems Integration, using the JC3IEDM

    DTIC Science & Technology

    2014-06-01

    Presented at the 19th International Command and Control Research and Technology Symposium (ICCRTS), "C2 Agility: Lessons Learned from Research and Operations." The paper presents approaches to integration, compares their technologies, points out their advantages, proposes requirements, and provides the design of a protocol.

  16. A Qualitative Study of Client-Clinician Text Exchanges in a Mobile Health Intervention for Individuals With Psychotic Disorders and Substance Use.

    PubMed

    Aschbrenner, Kelly A; Naslund, John A; Gill, Lydia E; Bartels, Stephen J; Ben-Zeev, Dror

    2016-01-01

    Mobile health (mHealth) approaches have the potential to transform prevention, wellness, and illness management for people with dual diagnosis consisting of co-occurring mental illness and substance use disorders by providing timely and cost-effective interventions in clients' natural environments. However, little is known about how clients interact with mHealth interventions to manage their illness. This qualitative study explored the content of mobile phone text messages between clients with dual diagnosis and a clinician who engaged them in daily assessment and intervention text exchanges. Seventeen participants with psychotic disorders and substance use were enrolled in a 12-week single-arm trial of an mHealth intervention focusing on illness management. The clinician (i.e., mobile interventionist) sent daily text messages to participants' privately owned mobile phones to assess their medication adherence and clinical status. The clinician provided other illness management and wellness suggestions flexibly, in response to participants' needs and preferences. In this qualitative study we conducted a thematic analysis of the client-clinician text exchanges that occurred over the course of the intervention. Seven major content themes in client-clinician text message exchanges were identified: mental health symptoms; mental health coping strategies; mental health treatment and management; lifestyle behaviors; social relationships and leisure activities; motivation and personal goal setting; and independent living. Participants were interested in discussing strategies for coping with mental health symptoms (e.g., cognitive restructuring, social support) and health behavior change (e.g., increased physical activity, dietary changes). Our findings suggest that client-centered text messaging has the potential to be an important component of illness management for people with dual diagnosis. 
This approach can offer coping strategies tailored to clients' needs and preferences in real time, when help is needed.

  17. Unified messaging solution for biosurveillance and disease surveillance.

    PubMed

    Abellera, John P; Srinivasan, Arunkumar; Danos, C Scott; McNabb, Scott; Rhodes, Barry

    2007-10-11

    Biosurveillance and disease surveillance systems serve different purposes. However, the richness and quality of an existing data stream and infrastructure used in biosurveillance may prove beneficial for any state-based electronic disease surveillance system, especially if an electronic laboratory data feed does not exist between a hospital and the state-based system. The use of an Enterprise Application Integration (EAI) engine, such as the BioSense Integrator, will be necessary to map heterogeneous messages into standard representations, then validate and route them [1] to a disparate system. This poster illustrates the use of an existing BioSense Integrator to create a unified message supporting the exchange of electronic lab messages necessary for reportable disease notification. The data messaging infrastructure is evaluated and presented, along with a cost and benefit analysis between the hospital and state-based system.

  18. How Online Peer-to-Peer Conversation Shapes the Effects of a Message About Healthy Sleep.

    PubMed

    Robbins, Rebecca; Niederdeppe, Jeff

    2017-02-01

    Conversation about health messages and campaigns is common, and message-related conversations are increasingly recognized as a consequential factor in shaping message effects. The evidence base is limited, however, about the conditions under which conversation may help or hinder health communication efforts. In this study, college students (N = 301) first watched a short sleep video and were randomly assigned to either talk with a partner in an online chat conversation or proceed directly to a short survey. Unknown to participants, the chat partner was a confederate coached to say positive things about sleep and the message ('positive' chat condition), negative things ('negative' chat condition), or unrelated things ('natural' chat condition). All respondents completed a short survey on beliefs about sleep, reactions to the message, and intentions to get adequate sleep. Respondents had greater intentions to engage in healthy sleep when they engaged in positive conversation following message exposure than when they engaged in negative conversation after the message (p < 0.001). Positive emotion experienced in response to the message and positive chat perceptions were significant predictors (p < 0.05) of intentions to achieve healthy sleep. Health message designers may benefit from understanding how messages are exchanged in peer-to-peer conversation to better predict and explain their effects.

  19. How online peer-to-peer conversation shapes the effects of a message about healthy sleep

    PubMed Central

    Robbins, Rebecca; Niederdeppe, Jeff

    2016-01-01

    Conversation about health messages and campaigns is common, and message-related conversations are increasingly recognized as a consequential factor in shaping message effects. The evidence base is limited, however, about the conditions under which conversation may help or hinder health communication efforts. In this study, college students (N = 301) first watched a short sleep video and were randomly assigned to either talk with a partner in an online chat conversation or proceed directly to a short survey. Unknown to participants, the chat partner was a confederate coached to say positive things about sleep and the message (‘positive’ chat condition), negative things (‘negative’ chat condition), or unrelated things (‘natural’ chat condition). All respondents completed a short survey on beliefs about sleep, reactions to the message, and intentions to get adequate sleep. Respondents had greater intentions to engage in healthy sleep when they engaged in positive conversation following message exposure than when they engaged in negative conversation after the message (p < .001). Positive emotion experienced in response to the message and positive chat perceptions were significant predictors (p < .05) of intentions to achieve healthy sleep. Health message designers may benefit from understanding how messages are exchanged in peer-to-peer conversation to better predict and explain their effects. PMID:27492421

  20. STAR Online Meta-Data Collection Framework: Integration with the Pre-existing Controls Infrastructure

    NASA Astrophysics Data System (ADS)

    Arkhipkin, D.; Lauret, J.

    2017-10-01

    One of the integration goals of the STAR experiment's modular Messaging Interface and Reliable Architecture (MIRA) framework is to provide seamless and automatic connections with the existing control systems. After an initial proof of concept and operation of the MIRA system as a parallel data collection system for online use and real-time monitoring, the STAR Software and Computing group is now working on the integration of the Experimental Physics and Industrial Control System (EPICS) with MIRA's interfaces. The goals of this integration are to allow functional interoperability and, later on, to replace the existing legacy Detector Control System components at the service level. In this report, we describe the evolutionary integration process and, as an example, discuss the EPICS Alarm Handler conversion. We review the complete upgrade procedure, starting with the integration of EPICS-originated alarm-signal propagation into MIRA, followed by the replacement of the existing operator interface based on the Motif Editor and Display Manager (MEDM) with a modern, portable, web-based Alarm Handler interface. To achieve this aim, we have built an EPICS-to-MQTT [8] bridging service and recreated the functionality of the original Alarm Handler using low-latency web messaging technologies. The integration of EPICS alarm handling into our messaging framework allowed STAR to improve the DCS alarm awareness of existing STAR DAQ and RTS services, which use MIRA as a primary source of experiment control information.
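    The bridging idea can be sketched as a pure mapping from an EPICS alarm record to an MQTT topic and JSON payload. The topic scheme and field names below are illustrative assumptions, not the actual STAR/MIRA conventions:

```python
import json

def epics_alarm_to_mqtt(pv_name, severity, status, value):
    """Map an EPICS process-variable alarm onto an MQTT topic/payload pair.

    Hypothetical topic layout: colons in the PV name become topic levels,
    so subscribers can use MQTT wildcards (e.g. mira/alarms/TPC/#).
    """
    topic = "mira/alarms/" + pv_name.replace(":", "/")
    payload = json.dumps({"pv": pv_name, "severity": severity,
                          "status": status, "value": value})
    return topic, payload

topic, payload = epics_alarm_to_mqtt("TPC:gas:pressure", "MAJOR", "HIHI", 2.31)
# A real bridge would now hand (topic, payload) to an MQTT client's publish().
```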

  1. Key exchange using biometric identity based encryption for sharing encrypted data in cloud environment

    NASA Astrophysics Data System (ADS)

    Hassan, Waleed K.; Al-Assam, Hisham

    2017-05-01

    The main problem associated with using symmetric/asymmetric keys is how to securely store and exchange the keys between the parties over open networks, particularly in open environments such as cloud computing. Public Key Infrastructure (PKI) has provided a practical solution for session key exchange for many web services. The key limitation of the PKI solution is not only the need for a trusted third party (e.g. a certificate authority) but also the absence of a link between the data owner and the encryption keys. The latter is arguably more important where access to data needs to be linked with the identity of the owner. Currently available key exchange protocols depend on using trusted couriers or secure channels, which can be subject to man-in-the-middle and various other attacks. This paper proposes a new protocol for Key Exchange using Biometric Identity Based Encryption (KE-BIBE) that enables parties to securely exchange cryptographic keys even when an adversary is monitoring the communication channel between them. The proposed protocol combines biometrics with IBE in order to provide a secure way to access symmetric keys based on the identity of the users in an insecure environment. In the KE-BIBE protocol, the message is first encrypted by the data owner using a traditional symmetric key before migrating it to cloud storage. The symmetric key is then encrypted, based on Fuzzy Identity-Based Encryption, using the public biometrics of the users the data owner has selected to decrypt the message. Only the selected users will be able to decrypt the message by providing a fresh sample of their biometric data. The paper argues that the proposed solution eliminates the need for a key distribution centre in traditional cryptography. It also gives the data owner fine-grained control over sharing encrypted data by determining who can access it.
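    The envelope pattern the abstract describes (encrypt the message under a fresh symmetric key, then protect that key per user) can be sketched as follows. This is a toy illustration only: a hash-based stream cipher stands in for real symmetric encryption, and a secret derived from an enrolled template stands in for fuzzy identity-based encryption over biometrics.

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy stream cipher (NOT secure): SHA-256 in counter mode as keystream.
    XOR makes encryption and decryption the same operation."""
    out = bytearray()
    for i in range(0, len(data), 32):
        block = hashlib.sha256(key + i.to_bytes(8, "big")).digest()
        out.extend(b ^ k for b, k in zip(data[i:i + 32], block))
    return bytes(out)

# Data owner: encrypt the message under a fresh symmetric key...
message = b"record destined for cloud storage"
data_key = secrets.token_bytes(32)
ciphertext = keystream_xor(data_key, message)

# ...then wrap the key for each selected user. In KE-BIBE this wrapping
# would be fuzzy IBE keyed by the user's public biometrics; here a secret
# derived from an enrolled biometric template stands in.
user_secret = hashlib.sha256(b"enrolled-biometric-template").digest()
wrapped_key = keystream_xor(user_secret, data_key)

# Authorized user: unwrap the key (in the real protocol, by presenting a
# fresh biometric sample), then decrypt the message.
recovered_key = keystream_xor(user_secret, wrapped_key)
plaintext = keystream_xor(recovered_key, ciphertext)
```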

  2. SmartVeh: Secure and Efficient Message Access Control and Authentication for Vehicular Cloud Computing.

    PubMed

    Huang, Qinlong; Yang, Yixian; Shi, Yuxiang

    2018-02-24

    With the growing number of vehicles and popularity of various services in vehicular cloud computing (VCC), message exchanging among vehicles under traffic conditions and in emergency situations is one of the most pressing demands, and has attracted significant attention. However, it is an important challenge to authenticate the legitimate sources of broadcast messages and achieve fine-grained message access control. In this work, we propose SmartVeh, a secure and efficient message access control and authentication scheme in VCC. A hierarchical, attribute-based encryption technique is utilized to achieve fine-grained and flexible message sharing, which ensures that vehicles whose persistent or dynamic attributes satisfy the access policies can access the broadcast message with equipped on-board units (OBUs). Message authentication is enforced by integrating an attribute-based signature, which achieves message authentication and maintains the anonymity of the vehicles. In order to reduce the computations of the OBUs in the vehicles, we outsource the heavy computations of encryption, decryption and signing to a cloud server and road-side units. The theoretical analysis and simulation results reveal that our secure and efficient scheme is suitable for VCC.
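    The notion of a vehicle's attributes satisfying an access policy can be illustrated independently of the cryptography. The sketch below only evaluates a policy tree against an on-board unit's attribute set; in the actual attribute-based encryption scheme the policy is embedded in the ciphertext and satisfaction determines decryptability. Attribute names here are made up for illustration.

```python
def satisfies(policy, attrs):
    """Evaluate a policy tree -- ('and'/'or', [terms]) over attribute
    strings -- against the set of attributes held by a vehicle's OBU."""
    op, terms = policy
    results = [satisfies(t, attrs) if isinstance(t, tuple) else (t in attrs)
               for t in terms]
    return all(results) if op == "and" else any(results)

# Example policy: a broadcast readable by northern-region emergency vehicles.
policy = ("and", ["region:north",
                  ("or", ["role:ambulance", "role:police"])])

can_decrypt = satisfies(policy, {"region:north", "role:police"})
cannot_decrypt = satisfies(policy, {"region:south", "role:police"})
```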

  3. SmartVeh: Secure and Efficient Message Access Control and Authentication for Vehicular Cloud Computing

    PubMed Central

    Yang, Yixian; Shi, Yuxiang

    2018-01-01

    With the growing number of vehicles and popularity of various services in vehicular cloud computing (VCC), message exchanging among vehicles under traffic conditions and in emergency situations is one of the most pressing demands, and has attracted significant attention. However, it is an important challenge to authenticate the legitimate sources of broadcast messages and achieve fine-grained message access control. In this work, we propose SmartVeh, a secure and efficient message access control and authentication scheme in VCC. A hierarchical, attribute-based encryption technique is utilized to achieve fine-grained and flexible message sharing, which ensures that vehicles whose persistent or dynamic attributes satisfy the access policies can access the broadcast message with equipped on-board units (OBUs). Message authentication is enforced by integrating an attribute-based signature, which achieves message authentication and maintains the anonymity of the vehicles. In order to reduce the computations of the OBUs in the vehicles, we outsource the heavy computations of encryption, decryption and signing to a cloud server and road-side units. The theoretical analysis and simulation results reveal that our secure and efficient scheme is suitable for VCC. PMID:29495269

  4. Organization model for Mobile Wireless Sensor Networks inspired in Artificial Bee Colony

    NASA Astrophysics Data System (ADS)

    Freire Roberto, Guilherme; Castilho Maschi, Luis Fernando; Pigatto, Daniel Fernando; Jaquie Castelo Branco, Kalinka Regina Lucas; Alves Neves, Leandro; Montez, Carlos; Sandro Roschildt Pinto, Alex

    2015-01-01

    The purpose of this study is to find a self-organizing model for MWSNs based on bee colonies in order to reduce the number of messages transmitted among nodes, and thus reduce overall energy consumption while maintaining the efficiency of message delivery. The results reported in this article come from simulations carried out with the SINALGO software, which demonstrate the effectiveness of the proposed approach. The BeeAODV (Bee Ad-Hoc On Demand Distance Vector) protocol proposed in this paper considerably reduces message exchanges compared to AODV (Ad-Hoc On Demand Distance Vector).

  5. Potential effects of the introduction of the discrete address beacon system data link on air/ground information transfer problems

    NASA Technical Reports Server (NTRS)

    Grayson, R. L.

    1981-01-01

    This study of Aviation Safety Reporting System reports suggests that benefits should accrue from implementation of the discrete address beacon system data link. The proposed enhanced terminal information system service is expected to provide better terminal information than present systems by improving currency and accuracy. In the exchange of air traffic control messages, discrete addressing ensures that only the intended recipient receives and acts on a specific message. Visual displays and printed copies of messages should mitigate many of the reported problems associated with voice communications. The problems that remain unaffected include errors in addressing the intended recipient and messages whose content is wrong but which are otherwise correct as to format and reasonableness.

  6. Remote Asynchronous Message Service Gateway

    NASA Technical Reports Server (NTRS)

    Wang, Shin-Ywan; Burleigh, Scott C.

    2011-01-01

    The Remote Asynchronous Message Service (RAMS) gateway is a special-purpose AMS application node that enables exchange of AMS messages between nodes residing in different AMS "continua," notionally in different geographical locations. JPL's implementation of RAMS gateway functionality is integrated with the ION (Interplanetary Overlay Network) implementation of the DTN (Delay-Tolerant Networking) bundle protocol, and with JPL's implementation of AMS itself. RAMS protocol data units are encapsulated in ION bundles and are forwarded to the neighboring RAMS gateways identified in the source gateway's AMS management information base. Each RAMS gateway has interfaces in two communication environments: the AMS message space it serves, and the RAMS network - the grid or tree of mutually aware RAMS gateways - that enables AMS messages produced in one message space to be forwarded to other message spaces of the same venture. Each gateway opens persistent, private RAMS network communication channels to the RAMS gateways of other message spaces for the same venture, in other continua. The interconnected RAMS gateways use these communication channels to forward message petition assertions and cancellations among themselves. Each RAMS gateway subscribes locally to all subjects that are of interest in any of the linked message spaces. On receiving its copy of a message on any of these subjects, the RAMS gateway node uses the RAMS network to forward the message to every other RAMS gateway whose message space contains at least one node that has subscribed to messages on that subject. On receiving a message via the RAMS network from some other RAMS gateway, the RAMS gateway node forwards the message to all subscribers in its own message space.
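    The forwarding rule described above (forward a message only to gateways whose message space has at least one subscriber on that subject) can be sketched in-process; direct method calls stand in for the DTN bundles that carry RAMS traffic between continua, and names are illustrative:

```python
from collections import defaultdict

class RamsGateway:
    """Simplified sketch of a RAMS gateway's subscription-based forwarding."""
    def __init__(self, name):
        self.name = name
        self.local_subs = defaultdict(set)   # subject -> local subscriber nodes
        self.peers = []                      # other gateways in the venture
        self.delivered = []                  # (node, subject, message) log

    def subscribe(self, node, subject):
        self.local_subs[subject].add(node)

    def publish_local(self, subject, message):
        # A local node produced a message: deliver to local subscribers,
        # then forward only to peers whose message space has a subscriber.
        self._deliver(subject, message)
        for peer in self.peers:
            if subject in peer.local_subs:
                peer.receive_remote(subject, message)

    def receive_remote(self, subject, message):
        # Message arrived over the RAMS network: deliver locally.
        self._deliver(subject, message)

    def _deliver(self, subject, message):
        for node in self.local_subs.get(subject, ()):
            self.delivered.append((node, subject, message))

# Two continua: a subscriber in B receives a message published in A.
a, b = RamsGateway("A"), RamsGateway("B")
a.peers, b.peers = [b], [a]
b.subscribe("nodeB1", "telemetry")
a.publish_local("telemetry", "t=42")
```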

  7. Towards interoperable and reproducible QSAR analyses: Exchange of datasets.

    PubMed

    Spjuth, Ola; Willighagen, Egon L; Guha, Rajarshi; Eklund, Martin; Wikberg, Jarl Es

    2010-06-30

    QSAR is a widely used method to relate chemical structures to responses or properties based on experimental observations. Much effort has been made to evaluate and validate the statistical modeling in QSAR, but these analyses treat the dataset as fixed. An overlooked but highly important issue is the validation of the setup of the dataset, which comprises addition of chemical structures as well as selection of descriptors and software implementations prior to calculations. This process is hampered by the lack of standards and exchange formats in the field, making it virtually impossible to reproduce and validate analyses and drastically constraining collaboration and re-use of data. We present a step towards standardizing QSAR analyses by defining interoperable and reproducible QSAR datasets, consisting of an open XML format (QSAR-ML) which builds on an open and extensible descriptor ontology. The ontology provides an extensible way of uniquely defining descriptors for use in QSAR experiments, and the exchange format supports multiple versioned implementations of these descriptors. Hence, a dataset described by QSAR-ML makes its setup completely reproducible. We also provide a reference implementation as a set of plugins for Bioclipse which simplifies setup of QSAR datasets, and allows for exporting in QSAR-ML as well as old-fashioned CSV formats. The implementation facilitates addition of new descriptor implementations from locally installed software and remote Web services; the latter is demonstrated with REST and XMPP Web services. Standardized QSAR datasets open up new ways to store, query, and exchange data for subsequent analyses. QSAR-ML supports completely reproducible creation of datasets, solving the problems of defining which software components were used and their versions, and the descriptor ontology eliminates confusion regarding descriptors by defining them crisply. This makes it easy to join, extend, and combine datasets and hence to work collectively, and also allows for analyzing the effect descriptors have on the statistical model's performance. The presented Bioclipse plugins equip scientists with graphical tools that make QSAR-ML easily accessible to the community.
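    A dataset along these lines can be sketched with a few XML elements: a descriptor pinned to an ontology identifier plus a versioned software implementation, a structure, and a computed value. The element and attribute names below are illustrative, not the actual QSAR-ML schema:

```python
import xml.etree.ElementTree as ET

dataset = ET.Element("qsarDataset")

# A descriptor uniquely defined by an ontology id plus a versioned
# implementation -- this pinning is what makes the setup reproducible.
desc = ET.SubElement(dataset, "descriptor",
                     id="http://example.org/descriptors#XLogP",  # hypothetical URI
                     implementation="CDK", version="1.3.5")

ET.SubElement(dataset, "structure", id="mol1", inchi="InChI=1S/CH4/h1H4")

value = ET.SubElement(dataset, "value",
                      structure="mol1", descriptor=desc.get("id"))
value.text = "0.64"

xml_bytes = ET.tostring(dataset)
```

    Re-running the setup only requires resolving the descriptor id to the named implementation at the named version, which is the reproducibility property the abstract emphasizes.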

  8. Towards interoperable and reproducible QSAR analyses: Exchange of datasets

    PubMed Central

    2010-01-01

    Background QSAR is a widely used method to relate chemical structures to responses or properties based on experimental observations. Much effort has been made to evaluate and validate the statistical modeling in QSAR, but these analyses treat the dataset as fixed. An overlooked but highly important issue is the validation of the setup of the dataset, which comprises addition of chemical structures as well as selection of descriptors and software implementations prior to calculations. This process is hampered by the lack of standards and exchange formats in the field, making it virtually impossible to reproduce and validate analyses and drastically constraining collaboration and re-use of data. Results We present a step towards standardizing QSAR analyses by defining interoperable and reproducible QSAR datasets, consisting of an open XML format (QSAR-ML) which builds on an open and extensible descriptor ontology. The ontology provides an extensible way of uniquely defining descriptors for use in QSAR experiments, and the exchange format supports multiple versioned implementations of these descriptors. Hence, a dataset described by QSAR-ML makes its setup completely reproducible. We also provide a reference implementation as a set of plugins for Bioclipse which simplifies setup of QSAR datasets, and allows for exporting in QSAR-ML as well as old-fashioned CSV formats. The implementation facilitates addition of new descriptor implementations from locally installed software and remote Web services; the latter is demonstrated with REST and XMPP Web services. Conclusions Standardized QSAR datasets open up new ways to store, query, and exchange data for subsequent analyses. QSAR-ML supports completely reproducible creation of datasets, solving the problems of defining which software components were used and their versions, and the descriptor ontology eliminates confusion regarding descriptors by defining them crisply. This makes it easy to join, extend, and combine datasets and hence to work collectively, and also allows for analyzing the effect descriptors have on the statistical model's performance. The presented Bioclipse plugins equip scientists with graphical tools that make QSAR-ML easily accessible to the community. PMID:20591161

  9. Data transfer using complete bipartite graph

    NASA Astrophysics Data System (ADS)

    Chandrasekaran, V. M.; Praba, B.; Manimaran, A.; Kailash, G.

    2017-11-01

    Information exchange rate is a measure of the amount of information sent between two points in a network in a given time period, and it is a highly significant consideration today. There are many ways of passing messages in present-day systems, some of them through encryption and decryption using a complete bipartite graph. In this paper, we recommend a method for communicating messages through encryption based on a complete bipartite graph.
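    As a toy illustration of messaging over a complete bipartite graph (not necessarily the paper's exact scheme), the 30 edges of K_{6,5} can serve as a shared codebook for a 30-symbol alphabet; sender and receiver need only agree on the two vertex sets and the edge ordering:

```python
from itertools import product
import string

# Label the edges of K_{6,5} in a fixed order. A complete bipartite graph
# K_{m,n} has m*n edges, so K_{6,5} gives exactly 30 codebook entries.
left = ["u%d" % i for i in range(6)]
right = ["v%d" % j for j in range(5)]
edges = list(product(left, right))

# 26 letters plus four punctuation symbols: one symbol per edge.
alphabet = string.ascii_lowercase + " ._,"
encode = dict(zip(alphabet, edges))
decode = dict(zip(edges, alphabet))

# The "ciphertext" is a sequence of edges; decoding inverts the mapping.
cipher = [encode[c] for c in "hello world"]
plain = "".join(decode[e] for e in cipher)
```

    A real scheme would also randomize or key the edge ordering, since a fixed public ordering is only an encoding, not encryption.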

  10. mCare: using secure mobile technology to support soldier reintegration and rehabilitation.

    PubMed

    Poropatich, Ronald K; Pavliscsak, Holly H; Tong, James C; Little, Jeanette R; McVeigh, Francis L

    2014-06-01

    The U.S. Army Medical Department conducted a pilot mobile health project to determine the requirements for coordination of care for "Wounded Warriors" using mobile messaging. The primary objective was to determine if a secure mobile health (mhealth) intervention provided to geographically dispersed patients would improve contact rates and positively impact the military healthcare system. Over 21 months, volunteers enrolled in a Health Insurance Portability and Accountability Act-compliant, secure mobile messaging initiative called mCare. The study included males and females, 18-61 years old, with a minimum of 60 days of outpatient recovery. Volunteers were required to have a compatible phone. The mhealth intervention included appointment reminders, health and wellness tips, announcements, and other information relevant to this population, exchanged between care teams and patients. Provider respondents reported that 85% would refer patients to mCare, and 56% noted improvement in appointment attendance (n=90). Patient responses also revealed high acceptability of mCare and refined the frequency and delivery times (n=114). The pilot project resulted in over 84,000 outbound messages and improved contact rates by 176%. The mCare pilot project demonstrated the feasibility and administrative effectiveness of a scalable mhealth application using secure mobile messaging and information exchanges, including personalized patient education.

  11. Development of an HL7 interface engine, based on tree structure and streaming algorithm, for large-size messages which include image data.

    PubMed

    Um, Ki Sung; Kwak, Yun Sik; Cho, Hune; Kim, Il Kon

    2005-11-01

    A basic assumption of the Health Level Seven (HL7) protocol is 'no limitation on message length'. However, most existing commercial HL7 interface engines do limit message length because they use the string-array method, which runs in main memory during HL7 message parsing. Specifically, messages with image and multimedia data create a long string array and thus cause critical, fatal failures in the computer system. Consequently, HL7 messages cannot handle the image and multimedia data necessary in modern medical records. This study aims to solve this problem with a 'streaming algorithm' method. This new method for HL7 message parsing applies a character-stream object that processes data character by character between main memory and the hard disk, so that the processing load on main memory is alleviated. The main functions of this new engine are generating, parsing, validating, browsing, sending, and receiving HL7 messages. The engine can also parse and generate XML-formatted HL7 messages. This new HL7 engine successfully exchanged HL7 messages containing 10-megabyte images and discharge summary information between two university hospitals.
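    The streaming idea can be sketched as a segment iterator that reads the message in small chunks instead of materializing one huge string, so a multi-megabyte OBX image segment never forces the entire message into a single array at once. This is a simplified illustration, not the engine's actual implementation:

```python
import io

def iter_segments(stream, chunk_size=1024):
    """Yield HL7 v2 segments (as field lists) one at a time.

    Segments are terminated by carriage returns; at any moment only the
    read buffer and the current unfinished segment are held in memory.
    """
    buf = ""
    while True:
        chunk = stream.read(chunk_size)
        if not chunk:
            break
        buf += chunk
        while "\r" in buf:
            segment, buf = buf.split("\r", 1)
            if segment:
                yield segment.split("|")   # split segment into fields
    if buf:
        yield buf.split("|")

# A small ORU message; in practice the OBX field could carry megabytes.
msg = ("MSH|^~\\&|LAB|HOSP|||200511011200||ORU^R01|123|P|2.3\r"
       "PID|1||12345\r"
       "OBX|1|ED|IMAGE||<base64 image bytes>\r")
segments = list(iter_segments(io.StringIO(msg), chunk_size=16))
```

    In the real engine the stream would come from a socket or file rather than an in-memory string, which is where the memory savings actually matter.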

  12. 77 FR 65754 - Self-Regulatory Organizations; C2 Options Exchange, Incorporated; Order Approving Proposed Rule...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-30

    ... Organizations; C2 Options Exchange, Incorporated; Order Approving Proposed Rule Change Relating to the Complex...,\\2\\ a proposed rule change to modify C2 Rule 6.13(c), ``Process for Complex Order RFR Auction,'' to... at the start of a Complex Order Auction (``COA''); and (ii) require responses to an RFR message...

  13. Using carbon emissions, oxygen consumption, and energy retention estimates to calculate dietary energy partitioning and estimate forage intake by beef steers

    USDA-ARS?s Scientific Manuscript database

    Take home Message: Estimating ME intake by grazing cattle seems possible using respiration gas exchange estimates. Introduction: We hypothesized that carbon dioxide, methane, and oxygen exchange estimates in breath clouds could be used as biomarkers to ultimately estimate dry matter intake in grazi...

  14. Molecular digital pathology: progress and potential of exchanging molecular data.

    PubMed

    Roy, Somak; Pfeifer, John D; LaFramboise, William A; Pantanowitz, Liron

    2016-09-01

    Many of the demands to perform next generation sequencing (NGS) in the clinical laboratory can be resolved using the principles of telepathology. Molecular telepathology can allow facilities to outsource all or a portion of their NGS operation such as cloud computing, bioinformatics pipelines, variant data management, and knowledge curation. Clinical pathology laboratories can electronically share diverse types of molecular data with reference laboratories, technology service providers, and/or regulatory agencies. Exchange of electronic molecular data allows laboratories to perform validation of rare diseases using foreign data, check the accuracy of their test results against benchmarks, and leverage in silico proficiency testing. This review covers the emerging subject of molecular telepathology, describes clinical use cases for the appropriate exchange of molecular data, and highlights key issues such as data integrity, interoperable formats for massive genomic datasets, security, malpractice and emerging regulations involved with this novel practice.

  15. Transatlantic Current. Number 7. October 2012. Building Future Transatlantic Interoperability Around a Robust NATO Response Force

    DTIC Science & Technology

    2012-10-01

    In short, multinational skills have reached an all-time high, though there is more road ahead than already traveled. However, this accrued wealth of...training centers such as the bilateral U.S.-Romanian Joint Task Force-East at Kogalniceanu Airbase, Romania. Stand up a U.S. Corps Forward Element in...which facilitates practical cooperation downstream, either within NATO or in any coalition operations. Increase U.S. and Allied exchange students.

  16. Managing Complex Interoperability Solutions using Model-Driven Architecture

    DTIC Science & Technology

    2011-06-01

    such as Oracle or MySQL. Each data model for a specific RDBMS is a distinct PSM. Or the system may want to exchange information with other C2...reduced number of transformations, e.g., from an RDBMS physical schema to the corresponding SQL script needed to instantiate the tables in a relational...tance of models. In engineering, a model serves several purposes: 1. It presents an abstract view of a complex system or of a complex information

  17. Electronic Health Records: VA and DOD Need to Support Cost and Schedule Claims, Develop Interoperability Plans, and Improve Collaboration

    DTIC Science & Technology

    2014-02-01

    Page 13 GAO-14-302 Electronic Health Records known as the Captain James A. Lovell Federal Health Care Center ( FHCC ). The FHCC is unique in...Healthcare Management System Modernization DOD Department of Defense FHCC Federal Health Care Center FHIE Federal Health Information Exchange GCPR... FHCC and, in accordance with the fiscal year 2010 NDAA, defined the relationship between the two departments for operating the new, integrated

  18. Environmental Public Health Tracking: Health and Environment Linked for Information Exchange-Atlanta (HELIX-Atlanta): A Cooperative Program Between CDC and NASA for Development of an Environmental Public Health Tracking Network in the Atlanta Metropolitan Area

    NASA Technical Reports Server (NTRS)

    Quattrochi, Dale A.; Niskar, Amanda Sue

    2005-01-01

    The Centers for Disease Control and Prevention (CDC) is coordinating HELIX-Atlanta to provide information regarding the five-county Metropolitan Atlanta Area (Clayton, Cobb, DeKalb, Fulton, and Gwinnett) via a network of integrated environmental monitoring and public health data systems so that all sectors can take action to prevent and control environmentally related health effects. The HELIX-Atlanta Network is a tool to access interoperable information systems with optional information technology linkage functionality driven by scientific rationale. HELIX-Atlanta is a collaborative effort with local, state, federal, and academic partners, including the NASA Marshall Space Flight Center. The HELIX-Atlanta Partners identified the following initial focus areas: childhood lead poisoning, short-latency cancers, developmental disabilities, birth defects, vital records, respiratory health, age of housing, remote sensing data, and environmental monitoring. The HELIX-Atlanta Partners identified and evaluated information systems containing information on the above focus areas. The information system evaluations resulted in recommendations for what resources would be needed to interoperate selected information systems in compliance with the CDC Public Health Information Network (PHIN). This presentation will discuss the collaborative process of building a network that links health and environment data for information exchange, including NASA remote sensing data, for use in HELIX-Atlanta.

  19. Hydrographic processing considerations in the “Big Data” age: An overview of technology trends in ocean and coastal surveys

    NASA Astrophysics Data System (ADS)

    Holland, M.; Hoggarth, A.; Nicholson, J.

    2016-04-01

    The quantity of information generated by survey sensors for ocean and coastal zone mapping has reached the “Big Data” age. This is influenced by the number of survey sensors available to conduct a survey, high data resolution, commercial availability, as well as an increased use of autonomous platforms. The number of users of sophisticated survey information is also growing with the increase in data volume. This is leading to a greater demand and broader use of the processed results, which includes marine archeology, disaster response, and many other applications. Data processing and exchange techniques are evolving to ensure this increased accuracy in acquired data meets the user demand, and leads to an improved understanding of the ocean environment. This includes the use of automated processing, models that maintain the best possible representation of varying resolution data to reduce duplication, as well as data plug-ins and interoperability standards. Through the adoption of interoperable standards, data can be exchanged between stakeholders and used many times in any GIS to support an even wider range of activities. The growing importance of Marine Spatial Data Infrastructure (MSDI) is also contributing to the increased access of marine information to support sustainable use of ocean and coastal environments. This paper offers an industry perspective on trends in hydrographic surveying and processing, and the increased use of marine spatial data.

  20. Computational toxicology using the OpenTox application programming interface and Bioclipse

    PubMed Central

    2011-01-01

    Background Toxicity is a complex phenomenon involving the potential adverse effect on a range of biological functions. Predicting toxicity involves using a combination of experimental data (endpoints) and computational methods to generate a set of predictive models. Such models rely strongly on being able to integrate information from many sources. The required integration of biological and chemical information sources requires, however, a common language to express our knowledge ontologically, and interoperating services to build reliable predictive toxicology applications. Findings This article describes progress in extending the integrative bio- and cheminformatics platform Bioclipse to interoperate with OpenTox, a semantic web framework which supports open data exchange and toxicology model building. The Bioclipse workbench environment enables functionality from OpenTox web services and easy access to OpenTox resources for evaluating toxicity properties of query molecules. Relevant cases and interfaces based on ten neurotoxins are described to demonstrate the capabilities provided to the user. The integration takes advantage of semantic web technologies, thereby providing an open and simplifying communication standard. Additionally, the use of ontologies ensures proper interoperation and reliable integration of toxicity information from both experimental and computational sources. Conclusions A novel computational toxicity assessment platform was generated from integration of two open science platforms related to toxicology: Bioclipse, that combines a rich scriptable and graphical workbench environment for integration of diverse sets of information sources, and OpenTox, a platform for interoperable toxicology data and computational services. The combination provides improved reliability and operability for handling large data sets by the use of the Open Standards from the OpenTox Application Programming Interface. 
This enables simultaneous access to a variety of distributed predictive toxicology databases, and algorithm and model resources, taking advantage of the Bioclipse workbench handling the technical layers. PMID:22075173

  1. Data interoperability software solution for emergency reaction in the European Union

    NASA Astrophysics Data System (ADS)

    Casado, R.; Rubiera, E.; Sacristan, M.; Schütte, F.; Peters, R.

    2015-07-01

    Emergency management becomes more challenging in international crisis episodes because of cultural, semantic and linguistic differences between all stakeholders, especially first responders. Misunderstandings between first responders make decision making slower and more difficult. However, the spread and development of networks and IT-based emergency management systems (EMSs) have improved emergency responses, which have become more coordinated. Despite improvements made in recent years, EMSs have still not solved the problems related to cultural, semantic and linguistic differences, which are the real cause of slower decision making. In addition, from a technical perspective, the consolidation of current EMSs and the different formats used to exchange information pose another problem to be solved by any solution proposed for information interoperability between heterogeneous EMSs in different contexts. To overcome these problems, we present a software solution based on semantic and mediation technologies. EMERGency ELements (EMERGEL) (Fundacion CTIC and AntwortING Ingenieurbüro PartG, 2013), a common and modular ontology shared by all the stakeholders, has been defined. It offers the best solution to gather all stakeholders' knowledge in a unique and flexible data model, taking into account different countries' cultural and linguistic issues. To deal with the diversity of data protocols and formats, we have designed a service-oriented architecture for data interoperability (named DISASTER: Data Interoperability Solution At STakeholders Emergency Reaction), providing a flexible, extensible solution to the mediation issues. Web services have been adopted as the specific technology to implement this paradigm, as the one with the most significant academic and industrial visibility and attraction.
Contributions of this work have been validated through the design and development of a cross-border realistic prototype scenario, actively involving both emergency managers and emergency-first responders: the Netherlands-Germany border fire.
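    The mediation pattern this record describes, lifting each system's local vocabulary to a shared ontology term and then grounding it in the target system, can be sketched in a few lines. All incident codes and labels below are invented for the example; they are not taken from EMERGEL.

```python
# Illustrative sketch of ontology-based mediation between two emergency
# management systems (in the spirit of a shared EMERGEL-like model).

# Each national system uses its own incident vocabulary.
NL_TO_COMMON = {"brand": "common:Fire", "overstroming": "common:Flood"}
DE_FROM_COMMON = {"common:Fire": "Brand", "common:Flood": "Hochwasser"}

def mediate(incident_code, to_common, from_common):
    """Translate a source-system code via the shared ontology concept."""
    concept = to_common[incident_code]   # lift to the common model
    return from_common[concept]          # ground in the target model

print(mediate("brand", NL_TO_COMMON, DE_FROM_COMMON))  # prints Brand
```

    The key design point is that each system maps only to and from the common model, so adding an n-th system needs one mapping pair rather than n-1 pairwise translators.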

  2. The Development of a Graphical User Interface Engine for the Convenient Use of the HL7 Version 2.x Interface Engine

    PubMed Central

    Kim, Hwa Sun; Cho, Hune

    2011-01-01

    Objectives The Health Level Seven Interface Engine (HL7 IE), developed by Kyungpook National University, has been employed in health information systems; however, users without a background in programming have reported difficulties in using it. Therefore, we developed a graphical user interface (GUI) engine to make the use of the HL7 IE more convenient. Methods The GUI engine was directly connected with the HL7 IE to handle the HL7 version 2.x messages. Furthermore, the information exchange rules (called the mapping data), represented by a conceptual graph in the GUI engine, were transformed into program objects that were made available to the HL7 IE; the mapping data were stored as binary files for reuse. The usefulness of the GUI engine was examined through information exchange tests between an HL7 version 2.x message and a health information database system. Results Users could easily create HL7 version 2.x messages by creating a conceptual graph through the GUI engine without requiring assistance from programmers. In addition, time could be saved when creating new information exchange rules by reusing the stored mapping data. Conclusions The GUI engine was not able to incorporate information types (e.g., extensible markup language, XML) other than the HL7 version 2.x messages and the database, because it was designed exclusively for the HL7 IE protocol. However, in future work, by including additional parsers to manage XML-based information such as Continuity of Care Documents (CCD) and Continuity of Care Records (CCR), we plan to ensure that the GUI engine will be more widely accessible for the health field. PMID:22259723

  3. The Development of a Graphical User Interface Engine for the Convenient Use of the HL7 Version 2.x Interface Engine.

    PubMed

    Kim, Hwa Sun; Cho, Hune; Lee, In Keun

    2011-12-01

    The Health Level Seven Interface Engine (HL7 IE), developed by Kyungpook National University, has been employed in health information systems; however, users without a background in programming have reported difficulties in using it. Therefore, we developed a graphical user interface (GUI) engine to make the use of the HL7 IE more convenient. The GUI engine was directly connected with the HL7 IE to handle the HL7 version 2.x messages. Furthermore, the information exchange rules (called the mapping data), represented by a conceptual graph in the GUI engine, were transformed into program objects that were made available to the HL7 IE; the mapping data were stored as binary files for reuse. The usefulness of the GUI engine was examined through information exchange tests between an HL7 version 2.x message and a health information database system. Users could easily create HL7 version 2.x messages by creating a conceptual graph through the GUI engine without requiring assistance from programmers. In addition, time could be saved when creating new information exchange rules by reusing the stored mapping data. The GUI engine was not able to incorporate information types (e.g., extensible markup language, XML) other than the HL7 version 2.x messages and the database, because it was designed exclusively for the HL7 IE protocol. However, in future work, by including additional parsers to manage XML-based information such as Continuity of Care Documents (CCD) and Continuity of Care Records (CCR), we plan to ensure that the GUI engine will be more widely accessible for the health field.
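    For readers unfamiliar with the wire format these engines handle, an HL7 version 2.x message is a set of pipe-delimited segments separated by carriage returns. The sketch below hand-rolls a minimal ADT message; every field value is invented for illustration, and a real interface engine would additionally manage escaping, optional fields and acknowledgements.

```python
# Minimal sketch of composing an HL7 v2.x message from pipe-delimited
# segments. MSH carries the encoding characters and message metadata;
# PID carries (placeholder) patient identification.

def hl7_segment(name, *fields):
    """Join a segment name and its fields with the '|' separator."""
    return name + "|" + "|".join(fields)

msg = "\r".join([
    hl7_segment("MSH", "^~\\&", "SendingApp", "SendingFac",
                "ReceivingApp", "ReceivingFac", "20110101120000", "",
                "ADT^A01", "MSG00001", "P", "2.5"),
    hl7_segment("PID", "1", "", "123456^^^Hospital^MR", "",
                "Doe^John"),
])

print(msg.split("\r")[0][:9])  # prints MSH|^~\&|
```

    The GUI engine in this record hides exactly this kind of positional field bookkeeping behind a conceptual graph.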

  4. An Interoperable System toward Cardiac Risk Stratification from ECG Monitoring

    PubMed Central

    Mora-Jiménez, Inmaculada; Ramos-López, Javier; Quintanilla Fernández, Teresa; García-García, Antonio; Díez-Mazuela, Daniel; García-Alberola, Arcadi

    2018-01-01

    Many indices have been proposed for cardiovascular risk stratification from electrocardiogram signal processing, though they still have limited use in clinical practice. We created a system integrating the clinical definition of cardiac risk subdomains from ECGs and the use of diverse signal processing techniques. Three subdomains were defined from the joint analysis of the technical and clinical viewpoints. One subdomain was devoted to demographic and clinical data. The other two subdomains were intended to obtain widely defined risk indices from ECG monitoring: a simple-domain (heart rate turbulence (HRT)), and a complex-domain (heart rate variability (HRV)). Data provided by the three subdomains allowed for the generation of alerts with different intensity and nature, as well as for grouping and scrutinizing patients according to the established processing and risk-thresholding criteria. The implemented system was tested by connecting data from real-world in-hospital electronic health records and ECG monitoring by considering standards for syntactic (HL7 messages) and semantic interoperability (archetypes based on CEN/ISO EN13606 and SNOMED-CT). The system was able to provide risk indices and to generate alerts in the health records to support decision-making. Overall, the system allows for the agile interaction of research and clinical practice in the Holter-ECG-based cardiac risk domain. PMID:29494497
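    As a concrete example of the "complex-domain" HRV indices such a system computes, the sketch below evaluates two standard time-domain measures (SDNN and RMSSD) from a series of RR intervals. The interval values are made up for illustration; the record does not specify which indices the authors implemented.

```python
# Two standard time-domain HRV indices from RR intervals (milliseconds).
import statistics

def sdnn(rr):
    """Standard deviation of all RR intervals (population SD)."""
    return statistics.pstdev(rr)

def rmssd(rr):
    """Root mean square of successive RR-interval differences."""
    diffs = [b - a for a, b in zip(rr, rr[1:])]
    return (sum(d * d for d in diffs) / len(diffs)) ** 0.5

rr = [812, 800, 790, 805, 820, 815]  # illustrative beat-to-beat intervals
print(round(sdnn(rr), 1), round(rmssd(rr), 1))  # prints 10.0 12.0
```

    A risk-thresholding layer of the kind described above would then compare such indices against clinically established cut-offs to raise alerts.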

  5. DISTANT EARLY WARNING SYSTEM for Tsunamis - A wide-area and multi-hazard approach

    NASA Astrophysics Data System (ADS)

    Hammitzsch, Martin; Lendholt, Matthias; Wächter, Joachim

    2010-05-01

    The DEWS (Distant Early Warning System) [1] project, funded under the 6th Framework Programme of the European Union, has the objective to create a new generation of interoperable early warning systems based on an open sensor platform. This platform integrates OGC [2] SWE [3] compliant sensor systems for the rapid detection of hazardous events, like earthquakes, sea level anomalies, ocean floor occurrences, and ground displacements in the case of tsunami early warning. Based on the upstream information flow DEWS focuses on the improvement of downstream capacities of warning centres especially by improving information logistics for effective and targeted warning message aggregation for a multilingual environment. Multiple telecommunication channels will be used for the dissemination of warning messages. Wherever possible, existing standards have been integrated. The Command and Control User Interface (CCUI), a rich client application based on Eclipse RCP (Rich Client Platform) [4] and the open source GIS uDig [5], integrates various OGC services. Using WMS (Web Map Service) [6] and WFS (Web Feature Service) [7] spatial data are utilized to depict the situation picture and to integrate a simulation system via WPS (Web Processing Service) [8] to identify affected areas. Warning messages are compiled and transmitted in the OASIS [9] CAP (Common Alerting Protocol) [10] standard together with addressing information defined via EDXL-DE (Emergency Data Exchange Language - Distribution Element) [11]. Internal interfaces are realized with SOAP [12] web services. Based on results of GITEWS [13] - in particular the GITEWS Tsunami Service Bus [14] - the DEWS approach provides an implementation for tsunami early warning systems but other geological paradigms are going to follow, e.g. volcanic eruptions or landslides. Therefore in future also multi-hazard functionality is conceivable. 
    The specific software architecture of DEWS makes it possible to dock varying sensors to the system and to extend the CCUI with hazard-specific functionality. The presentation covers the DEWS project, the system architecture and the CCUI in conjunction with details of information logistics. The DEWS Wide Area Centre, which connects national centres to allow international communication and warning exchange, is also presented. REFERENCES: [1] DEWS, www.dews-online.org [2] OGC, www.opengeospatial.org [3] SWE, www.opengeospatial.org/projects/groups/sensorweb [4] Eclipse RCP, www.eclipse.org/home/categories/rcp.php [5] uDig, udig.refractions.net [6] WMS, www.opengeospatial.org/standards/wms [7] WFS, www.opengeospatial.org/standards/wfs [8] WPS, www.opengeospatial.org/standards/wps [9] OASIS, www.oasis-open.org [10] CAP, www.oasis-open.org/specs/#capv1.1 [11] EDXL-DE, www.oasis-open.org/specs/#edxlde-v1.0 [12] SOAP, www.w3.org/TR/soap [13] GITEWS (German Indonesian Tsunami Early Warning System) is a project of the German Federal Government to aid the reconstruction of the tsunami-prone Indian Ocean region, www.gitews.org [14] The Tsunami Service Bus is the GITEWS sensor system integration platform offering standardised services for the detection and monitoring of tsunamis
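    To make the CAP dissemination step concrete, the sketch below assembles the skeleton of a CAP 1.1 alert document of the kind a warning centre transmits. All identifiers, timestamps and values are placeholders, not DEWS output.

```python
# Minimal sketch of a CAP 1.1 alert document (placeholder values).
import xml.etree.ElementTree as ET

NS = "urn:oasis:names:tc:emergency:cap:1.1"
ET.register_namespace("", NS)

alert = ET.Element(f"{{{NS}}}alert")
for tag, text in [("identifier", "DEWS-2010-0001"),
                  ("sender", "warning-centre@example.org"),
                  ("sent", "2010-05-03T12:00:00+00:00"),
                  ("status", "Exercise"),
                  ("msgType", "Alert"),
                  ("scope", "Public")]:
    ET.SubElement(alert, f"{{{NS}}}{tag}").text = text

# One <info> block describing the hazard.
info = ET.SubElement(alert, f"{{{NS}}}info")
ET.SubElement(info, f"{{{NS}}}category").text = "Geo"
ET.SubElement(info, f"{{{NS}}}event").text = "Tsunami"
ET.SubElement(info, f"{{{NS}}}urgency").text = "Immediate"

print(ET.tostring(alert, encoding="unicode")[:60])
```

    In DEWS, such a payload is wrapped in an EDXL-DE distribution element that carries the addressing information for the chosen dissemination channels.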

  6. OTACT: ONU Turning with Adaptive Cycle Times in Long-Reach PONs

    NASA Astrophysics Data System (ADS)

    Zare, Sajjad; Ghaffarpour Rahbar, Akbar

    2015-01-01

    With the expansion of PONs into Long-Reach PON (LR-PON) networks, high propagation delay degrades the efficiency of centralized bandwidth allocation algorithms. These algorithms rely on bandwidth negotiation messages frequently exchanged between the optical line terminal (OLT) in the central office and the optical network units (ONUs) near the users, and these messages become seriously delayed when the network is extended. To address this problem, decentralized algorithms have been proposed in which the negotiation messages are instead exchanged between the Remote Node (RN)/Local Exchange (LX) and the ONUs. The network still suffers a relatively high delay, however, since the distances between the RN/LX and the ONUs are relatively large, and control messages must travel twice between the ONUs and the RN/LX in order to go from one ONU to another. In this paper, we propose a novel framework, called ONU Turning with Adaptive Cycle Times (OTACT), that uses Power Line Communication (PLC) to connect two adjacent ONUs. Because of the high population density in urban areas, ONUs are close to each other, so the efficiency of the proposed method is high. We investigate the performance of the proposed scheme against other decentralized schemes under worst-case conditions. Simulation results show that the average upstream packet delay can be decreased under the proposed scheme.
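    The delay problem motivating OTACT is simple propagation arithmetic: every centralized grant/request exchange costs one ONU-OLT round trip over fibre. The distances and fibre index below are illustrative assumptions.

```python
# Back-of-the-envelope: round-trip signalling delay over fibre, showing
# why extending a PON from 20 km to 100 km hurts centralized DBA.

C = 3e8        # speed of light in vacuum, m/s
N_FIBRE = 1.5  # approximate refractive index of optical fibre

def round_trip_ms(distance_km):
    """Round-trip propagation time over the given fibre span."""
    one_way_s = distance_km * 1e3 * N_FIBRE / C
    return 2 * one_way_s * 1e3  # milliseconds

print(round(round_trip_ms(20), 2))   # prints 0.2  (classic 20 km PON)
print(round(round_trip_ms(100), 2))  # prints 1.0  (100 km long-reach PON)
```

    A five-fold reach extension thus multiplies every negotiation cycle's signalling delay five-fold, which is the overhead that decentralized and PLC-assisted schemes try to avoid.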

  7. The influence of reading motives on the responses after reading blogs.

    PubMed

    Huang, Li-Shia; Chou, Yu-Jen; Lin, Che-Hung

    2008-06-01

    As the number of blogs increases dramatically, these online forums have become important media people use to share feelings and information. Previous research of blogs focuses on writers (i.e., bloggers), but the influence of blogs also requires investigations from readers' perspectives. This study therefore explores motives for reading blogs and discusses their effects on the responses after reading blogs. According to a factor analysis of 204 respondents in Taiwan, motives for reading blogs consist of affective exchange, information search, entertainment, and getting on the bandwagon. A regression analysis suggests the effects of these motives on three major responses--opinion acceptance, interaction intentions, and word-of-mouth (WOM) intentions--reflect the influence of blogs. Specifically, readers who focus on affective exchanges believe blog messages, interact with bloggers, and spread messages to others. Information search and entertainment motives positively affect opinion acceptance; blog readers who focus on information and those who read for fun both view blogs as trustworthy sources. Getting on the bandwagon also positively influences interaction and WOM intentions; these readers interact with bloggers and transmit messages to others.

  8. The research and realization of multi-platform real-time message-oriented middleware in large-scale air traffic control system

    NASA Astrophysics Data System (ADS)

    Liang, Haijun; Ren, Jialong; Song, Tao

    2017-05-01

    To meet the operating requirements of an air traffic control system, a multi-platform real-time message-oriented middleware was studied and implemented. It is composed of the CDCC and the CDCS: the former provides the application process interface, while the latter realizes data synchronization of the CDCC and data exchange. The MQM, an important part of the middleware, provides message queue management and encrypts and compresses data during transmission. Practical system application verifies that the middleware can simplify the development of air traffic control systems, enhance their stability, improve system function, and make them convenient to maintain and reuse.

  9. Layered virus protection for the operations and administrative messaging system

    NASA Technical Reports Server (NTRS)

    Cortez, R. H.

    2002-01-01

    NASA's Deep Space Network (DSN) is critical in supporting a wide variety of operating and planned unmanned flight projects. For day-to-day operations it relies on email communication between the three Deep Space Communication Complexes (Canberra, Goldstone, Madrid) and NASA's Jet Propulsion Laboratory. The Operations & Administrative Messaging system, based on the Microsoft Windows NT and Exchange platform, provides the infrastructure that is required for reliable, mission-critical messaging. The reliability of this system, however, is threatened by the proliferation of email viruses that continue to spread at alarming rates. A layered approach to email security has been implemented across the DSN to protect against this threat.

  10. Instant messaging at the hospital: supporting articulation work?

    PubMed

    Iversen, Tobias Buschmann; Melby, Line; Toussaint, Pieter

    2013-09-01

    Clinical work is increasingly fragmented and requires extensive articulation and coordination. Computer systems may support such work. In this study, we investigate how instant messaging functions as a tool for supporting articulation work at the hospital. This paper aims to describe the characteristics of instant messaging communication in terms of number and length of messages, distribution over time, and the number of participants included in conversations. We also aim to determine what kind of articulation work is supported by analysing message content. Analysis of one month's worth of instant messages sent through the perioperative coordination and communication system at a Danish hospital. Instant messaging was found to be used extensively for articulation work, mostly through short, simple conversational exchanges. It is used particularly often for communication concerning the patient, specifically, the coordination and logistics of patient care. Instant messaging is used by all actors involved in the perioperative domain. Articulation work and clinical work are hard to separate in a real clinical setting. Predefined messages and strict workflow design do not suffice when supporting communication in the context of collaborative clinical work. Flexibility is of vital importance, and this needs to be reflected in the design of supportive communication systems. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
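    The descriptive statistics this study reports (message counts, message lengths, participants per conversation) are straightforward to compute from a message log. The log records below are invented for illustration.

```python
# Sketch of the study's descriptive statistics over an (invented) log of
# instant messages: count, mean length, participants per conversation.
from collections import defaultdict

log = [  # (conversation_id, sender, text)
    ("c1", "nurse", "Pt ready in OR 3?"),
    ("c1", "surgeon", "5 min"),
    ("c2", "porter", "Bed needed on ward 2"),
    ("c1", "nurse", "ok"),
]

conversations = defaultdict(list)
for conv, sender, text in log:
    conversations[conv].append((sender, text))

n_messages = len(log)
avg_len = sum(len(text) for _, _, text in log) / n_messages
participants = {c: len({s for s, _ in msgs})
                for c, msgs in conversations.items()}

print(n_messages, round(avg_len, 1), participants["c1"])  # prints 4 11.0 2
```

    Even this toy log shows the pattern the paper found: short conversational exchanges, most of them about patient logistics.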

  11. Processing biological literature with customizable Web services supporting interoperable formats.

    PubMed

    Rak, Rafal; Batista-Navarro, Riza Theresa; Carter, Jacob; Rowley, Andrew; Ananiadou, Sophia

    2014-01-01

    Web services have become a popular means of interconnecting solutions for processing a body of scientific literature. This has fuelled research on high-level data exchange formats suitable for a given domain and ensuring the interoperability of Web services. In this article, we focus on the biological domain and consider four interoperability formats, BioC, BioNLP, XMI and RDF, that represent domain-specific and generic representations and include well-established as well as emerging specifications. We use the formats in the context of customizable Web services created in our Web-based, text-mining workbench Argo that features an ever-growing library of elementary analytics and capabilities to build and deploy Web services straight from a convenient graphical user interface. We demonstrate a 2-fold customization of Web services: by building task-specific processing pipelines from a repository of available analytics, and by configuring services to accept and produce a combination of input and output data interchange formats. We provide qualitative evaluation of the formats as well as quantitative evaluation of automatic analytics. The latter was carried out as part of our participation in the fourth edition of the BioCreative challenge. Our analytics built into Web services for recognizing biochemical concepts in BioC collections achieved the highest combined scores out of 10 participating teams. Database URL: http://argo.nactem.ac.uk. © The Author(s) 2014. Published by Oxford University Press.

  12. Processing biological literature with customizable Web services supporting interoperable formats

    PubMed Central

    Rak, Rafal; Batista-Navarro, Riza Theresa; Carter, Jacob; Rowley, Andrew; Ananiadou, Sophia

    2014-01-01

    Web services have become a popular means of interconnecting solutions for processing a body of scientific literature. This has fuelled research on high-level data exchange formats suitable for a given domain and ensuring the interoperability of Web services. In this article, we focus on the biological domain and consider four interoperability formats, BioC, BioNLP, XMI and RDF, that represent domain-specific and generic representations and include well-established as well as emerging specifications. We use the formats in the context of customizable Web services created in our Web-based, text-mining workbench Argo that features an ever-growing library of elementary analytics and capabilities to build and deploy Web services straight from a convenient graphical user interface. We demonstrate a 2-fold customization of Web services: by building task-specific processing pipelines from a repository of available analytics, and by configuring services to accept and produce a combination of input and output data interchange formats. We provide qualitative evaluation of the formats as well as quantitative evaluation of automatic analytics. The latter was carried out as part of our participation in the fourth edition of the BioCreative challenge. Our analytics built into Web services for recognizing biochemical concepts in BioC collections achieved the highest combined scores out of 10 participating teams. Database URL: http://argo.nactem.ac.uk. PMID:25006225

  13. Flight tests show potential benefits of data link as primary communication medium

    NASA Technical Reports Server (NTRS)

    Scanlon, Charles H.; Knox, Charles E.

    1991-01-01

    Message exchange for air traffic control (ATC) purposes via data link offers the potential benefits of increasing airspace system safety and efficiency. This is accomplished by reducing communication errors and relieving the overloaded ATC radio frequencies, which hamper efficient message exchanges during peak traffic periods in many busy terminal areas. However, the many uses and advantages of data link create additional questions concerning the interface among the human users, the cockpit systems, and the ground systems. A flight test was conducted in the NASA Langley B-737 airplane to contrast flight operations using current voice communications with the use of data link for transmitting both strategic and tactical ATC clearances during a typical commercial airline flight from takeoff to landing. Commercial airplane pilots were used as test subjects.

  14. Assessing Quality of Data Standards: Framework and Illustration Using XBRL GAAP Taxonomy

    NASA Astrophysics Data System (ADS)

    Zhu, Hongwei; Wu, Harris

    The primary purpose of data standards or metadata schemas is to improve the interoperability of data created by multiple standard users. Given the high cost of developing data standards, it is desirable to assess the quality of data standards. We develop a set of metrics and a framework for assessing data standard quality. The metrics include completeness and relevancy. Standard quality can also be indirectly measured by assessing interoperability of data instances. We evaluate the framework using data from the financial sector: the XBRL (eXtensible Business Reporting Language) GAAP (Generally Accepted Accounting Principles) taxonomy and US Securities and Exchange Commission (SEC) filings produced using the taxonomy by approximately 500 companies. The results show that the framework is useful and effective. Our analysis also reveals quality issues of the GAAP taxonomy and provides useful feedback to taxonomy users. The SEC has mandated that all publicly listed companies must submit their filings using XBRL. Our findings are timely and have practical implications that will ultimately help improve the quality of financial data.
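    One way to read the completeness and relevancy metrics is as set overlaps between the taxonomy's elements and the elements filers actually use; the sketch below computes both under that reading, which is our interpretation rather than the paper's exact definitions. The element sets are toy stand-ins for the GAAP taxonomy and SEC filings.

```python
# Toy illustration of standard-quality metrics as set overlaps.

taxonomy = {"Assets", "Liabilities", "Revenues", "NetIncome", "RareItem"}
used_in_filings = {"Assets", "Liabilities", "Revenues", "NetIncome",
                   "CustomExtension"}

overlap = taxonomy & used_in_filings

# Completeness: how much of what filers report the standard covers.
completeness = len(overlap) / len(used_in_filings)

# Relevancy: how much of the standard filers actually use.
relevancy = len(overlap) / len(taxonomy)

print(round(completeness, 2), round(relevancy, 2))  # prints 0.8 0.8
```

    In this toy case, "CustomExtension" signals a completeness gap (filers needed an element the taxonomy lacks) while "RareItem" signals a relevancy gap (a taxonomy element nobody uses).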

  15. Virtualization of open-source secure web services to support data exchange in a pediatric critical care research network.

    PubMed

    Frey, Lewis J; Sward, Katherine A; Newth, Christopher J L; Khemani, Robinder G; Cryer, Martin E; Thelen, Julie L; Enriquez, Rene; Shaoyu, Su; Pollack, Murray M; Harrison, Rick E; Meert, Kathleen L; Berg, Robert A; Wessel, David L; Shanley, Thomas P; Dalton, Heidi; Carcillo, Joseph; Jenkins, Tammara L; Dean, J Michael

    2015-11-01

    To examine the feasibility of deploying a virtual web service for sharing data within a research network, and to evaluate the impact on data consistency and quality. Virtual machines (VMs) encapsulated an open-source, semantically and syntactically interoperable secure web service infrastructure along with a shadow database. The VMs were deployed to 8 Collaborative Pediatric Critical Care Research Network Clinical Centers. Virtual web services could be deployed in hours. The interoperability of the web services reduced format misalignment from 56% to 1% and demonstrated that 99% of the data consistently transferred using the data dictionary and 1% needed human curation. Use of virtualized open-source secure web service technology could enable direct electronic abstraction of data from hospital databases for research purposes. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
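    The format-misalignment reduction described above rests on validating each submitted value against a shared data dictionary before transfer. The sketch below shows one minimal form of such a check; the dictionary entries and records are invented for illustration.

```python
# Minimal sketch of data-dictionary validation: flag fields whose values
# do not match the type the shared dictionary declares.

DATA_DICTIONARY = {
    "age_months": int,
    "weight_kg": float,
    "diagnosis_code": str,
}

def misaligned_fields(record):
    """Return the field names whose values fail the dictionary type."""
    return [k for k, v in record.items()
            if k in DATA_DICTIONARY and not isinstance(v, DATA_DICTIONARY[k])]

good = {"age_months": 30, "weight_kg": 12.5, "diagnosis_code": "J96.0"}
bad = {"age_months": "thirty", "weight_kg": 12.5}

print(misaligned_fields(good), misaligned_fields(bad))
```

    Running such checks inside each site's virtual machine means only records that already conform to the shared dictionary leave the hospital, which is what pushes misalignment toward the reported 1%.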

  16. Relevance of health level 7 clinical document architecture and integrating the healthcare enterprise cross-enterprise document sharing profile for managing chronic wounds in a telemedicine context.

    PubMed

    Finet, Philippe; Gibaud, Bernard; Dameron, Olivier; Le Bouquin Jeannès, Régine

    2016-03-01

    The number of patients with complications associated with chronic diseases increases with the ageing population. In particular, complex chronic wounds raise the re-admission rate in hospitals. In this context, the implementation of a telemedicine application in Basse-Normandie, France, contributes to reduce hospital stays and transport. This application requires a new collaboration among general practitioners, private duty nurses and the hospital staff. However, the main constraint mentioned by the users of this system is the lack of interoperability between the information system of this application and various partners' information systems. To improve medical data exchanges, the authors propose a new implementation based on the introduction of interoperable clinical documents and a digital document repository for managing the sharing of the documents between the telemedicine application users. They then show that this technical solution is suitable for any telemedicine application and any document sharing system in a healthcare facility or network.
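    To illustrate what "interoperable clinical documents" means concretely here, the sketch below builds the shell of an HL7 CDA document of the kind an XDS repository shares. All identifiers and codes are placeholders invented for the example.

```python
# Minimal sketch of an HL7 CDA document shell (placeholder identifiers).
import xml.etree.ElementTree as ET

NS = "urn:hl7-org:v3"
ET.register_namespace("", NS)

doc = ET.Element(f"{{{NS}}}ClinicalDocument")
ET.SubElement(doc, f"{{{NS}}}id",
              root="2.16.840.1.113883.19", extension="wound-note-001")
ET.SubElement(doc, f"{{{NS}}}code",
              code="0000-0", codeSystem="2.16.840.1.113883.6.1")
ET.SubElement(doc, f"{{{NS}}}title").text = "Chronic wound follow-up"
ET.SubElement(doc, f"{{{NS}}}effectiveTime", value="20160301")

print(ET.tostring(doc, encoding="unicode")[:40])
```

    The value of this structure for the telemedicine application is that every partner system can parse the same header elements (id, code, title, effectiveTime) regardless of who authored the note.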

  17. Direct2Experts: a pilot national network to demonstrate interoperability among research-networking platforms

    PubMed Central

    Barnett, William; Conlon, Mike; Eichmann, David; Kibbe, Warren; Falk-Krzesinski, Holly; Halaas, Michael; Johnson, Layne; Meeks, Eric; Mitchell, Donald; Schleyer, Titus; Stallings, Sarah; Warden, Michael; Kahlon, Maninder

    2011-01-01

    Research-networking tools use data-mining and social networking to enable expertise discovery, matchmaking and collaboration, which are important facets of team science and translational research. Several commercial and academic platforms have been built, and many institutions have deployed these products to help their investigators find local collaborators. Recent studies, though, have shown the growing importance of multiuniversity teams in science. Unfortunately, the lack of a standard data-exchange model and resistance of universities to share information about their faculty have presented barriers to forming an institutionally supported national network. This case report describes an initiative, which, in only 6 months, achieved interoperability among seven major research-networking products at 28 universities by taking an approach that focused on addressing institutional concerns and encouraging their participation. With this necessary groundwork in place, the second phase of this effort can begin, which will expand the network's functionality and focus on the end users. PMID:22037890

  18. Emergence of a Common Modeling Architecture for Earth System Science (Invited)

    NASA Astrophysics Data System (ADS)

    Deluca, C.

    2010-12-01

    Common modeling architecture can be viewed as a natural outcome of common modeling infrastructure. The development of model utility and coupling packages (ESMF, MCT, OpenMI, etc.) over the last decade represents the realization of a community vision for common model infrastructure. The adoption of these packages has led to increased technical communication among modeling centers and newly coupled modeling systems. However, adoption has also exposed aspects of interoperability that must be addressed before easy exchange of model components among different groups can be achieved. These aspects include common physical architecture (how a model is divided into components) and model metadata and usage conventions. The National Unified Operational Prediction Capability (NUOPC), an operational weather prediction consortium, is collaborating with weather and climate researchers to define a common model architecture that encompasses these advanced aspects of interoperability and looks to future needs. The nature and structure of the emergent common modeling architecture will be discussed along with its implications for future model development.

  19. Towards Standardized Patient Data Exchange: Integrating a FHIR Based API for the Open Medical Record System.

    PubMed

    Kasthurirathne, Suranga N; Mamlin, Burke; Grieve, Grahame; Biondich, Paul

    2015-01-01

    Interoperability is essential to address limitations caused by the ad hoc implementation of clinical information systems and the distributed nature of modern medical care. The HL7 V2 and V3 standards have played a significant role in ensuring interoperability for healthcare. FHIR is a next generation standard created to address fundamental limitations in HL7 V2 and V3. FHIR is particularly relevant to OpenMRS, an Open Source Medical Record System widely used across emerging economies. FHIR has the potential to allow OpenMRS to move away from a bespoke, application specific API to a standards based API. We describe efforts to design and implement a FHIR based API for the OpenMRS platform. Lessons learned from this effort were used to define long term plans to transition from the legacy OpenMRS API to a FHIR based API that greatly reduces the learning curve for developers and helps enhance adherence to standards.
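
    The kind of standards-based payload FHIR enables can be sketched as follows. This is an illustrative snippet, not the OpenMRS module's actual code; the field names follow the published FHIR R4 Observation structure, while the identifiers and values are invented:

```python
import json

def make_observation(patient_id: str, loinc_code: str, display: str,
                     value: float, unit: str) -> dict:
    """Build a minimal FHIR Observation resource as a plain dict.

    Field names follow the published FHIR R4 Observation structure;
    the patient id, code, and value below are illustrative only.
    """
    return {
        "resourceType": "Observation",
        "status": "final",
        "code": {
            "coding": [{"system": "http://loinc.org",
                        "code": loinc_code,
                        "display": display}]
        },
        "subject": {"reference": f"Patient/{patient_id}"},
        "valueQuantity": {"value": value, "unit": unit,
                          "system": "http://unitsofmeasure.org"},
    }

# Example: a body-temperature reading for a hypothetical patient id.
obs = make_observation("12345", "8310-5", "Body temperature", 37.2, "Cel")
print(json.dumps(obs, indent=2))
```

    Because the resource is plain JSON with a fixed vocabulary, any FHIR-aware consumer can interpret it without knowledge of the producing system's internal API, which is the point of the transition described above.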

  20. The Units Ontology: a tool for integrating units of measurement in science

    PubMed Central

    Gkoutos, Georgios V.; Schofield, Paul N.; Hoehndorf, Robert

    2012-01-01

    Units are basic scientific tools that render meaning to numerical data. Their standardization and formalization underpin the reporting, exchange, processing, reproducibility and integration of quantitative measurements. Ontologies facilitate the integration of data and knowledge, allowing interoperability and semantic information processing between diverse biomedical resources and domains. Here, we present the Units Ontology (UO), an ontology currently used in many scientific resources for the standardized description of units of measurement. PMID:23060432
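
    A minimal sketch of why a standardized unit vocabulary matters: once two datasets agree on shared unit identifiers, conversion to a common base unit can be automated. The registry keys below are loosely modeled on Units Ontology labels and are illustrative only:

```python
# Registry mapping unit labels to (base unit, conversion factor).
# Keys are loosely modeled on Units Ontology labels (illustrative).
UNIT_TO_BASE = {
    "gram":       ("gram", 1.0),
    "milligram":  ("gram", 1e-3),
    "microgram":  ("gram", 1e-6),
    "liter":      ("liter", 1.0),
    "milliliter": ("liter", 1e-3),
}

def to_base(value: float, unit: str) -> tuple:
    """Convert a measurement to its base unit."""
    base, factor = UNIT_TO_BASE[unit]
    return value * factor, base

# 250 milligrams and 0.25 gram now compare as the same quantity.
print(to_base(250, "milligram"))
```

    Without such a shared vocabulary, the integration step would need pairwise, per-source conversion logic, which is exactly the heterogeneity the ontology removes.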

  1. PharmML in Action: an Interoperable Language for Modeling and Simulation

    PubMed Central

    Bizzotto, R; Smith, G; Yvon, F; Kristensen, NR; Swat, MJ

    2017-01-01

    PharmML is an XML-based exchange format created with a focus on nonlinear mixed-effect (NLME) models used in pharmacometrics, but providing a very general framework that also allows describing mathematical and statistical models such as single-subject or nonlinear and multivariate regression models. This tutorial provides an overview of the structure of this language, brief suggestions on how to work with it, and use cases demonstrating its power and flexibility. PMID:28575551

  2. Semantic Integration for Marine Science Interoperability Using Web Technologies

    NASA Astrophysics Data System (ADS)

    Rueda, C.; Bermudez, L.; Graybeal, J.; Isenor, A. W.

    2008-12-01

    The Marine Metadata Interoperability Project, MMI (http://marinemetadata.org) promotes the exchange, integration, and use of marine data through enhanced data publishing, discovery, documentation, and accessibility. A key effort is the definition of an Architectural Framework and Operational Concept for Semantic Interoperability (http://marinemetadata.org/sfc), which is complemented by the development of tools that realize critical use cases in semantic interoperability. In this presentation, we describe a set of such Semantic Web tools that support important interoperability tasks, ranging from the creation of controlled vocabularies and the mapping of terms across multiple ontologies, to the online registration, storage, and search services needed to work with the ontologies (http://mmisw.org). This set of services uses Web standards and technologies, including the Resource Description Framework (RDF), the Web Ontology Language (OWL), Web services, and toolkits for Rich Internet Application development. We will describe the following components: MMI Ontology Registry: The MMI Ontology Registry and Repository provides registry and storage services for ontologies. Entries in the registry are associated with projects defined by the registered users. Sophisticated search functions, for example over metadata items and vocabulary terms, are also provided. Client applications can submit search requests using the W3C SPARQL Query Language for RDF. Voc2RDF: This component converts an ASCII comma-delimited set of terms and definitions into an RDF file. Voc2RDF facilitates the creation of controlled vocabularies by using a simple form-based user interface. Created vocabularies and their descriptive metadata can be submitted to the MMI Ontology Registry for versioning and community access. VINE: The Vocabulary Integration Environment component allows the user to map vocabulary terms across multiple ontologies. 
Various relationships can be established, for example exactMatch, narrowerThan, and subClassOf. VINE can compute inferred mappings based on the given associations. Attributes about each mapping, like comments and a confidence level, can also be included. VINE also supports registering and storing resulting mapping files in the Ontology Registry. The presentation will describe the application of semantic technologies in general, and our planned applications in particular, to solve data management problems in the marine and environmental sciences.
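
    The Voc2RDF step described above, turning a comma-delimited term list into RDF, can be sketched in a few lines. This simplified illustration emits N-Triples with standard rdfs:label and skos:definition predicates; the URI layout is invented, not MMI's actual scheme:

```python
import csv, io

def voc2ntriples(csv_text: str, base_uri: str) -> str:
    """Convert 'term,definition' rows into N-Triples lines.

    A simplified nod to the Voc2RDF idea: each term becomes a URI
    under base_uri with an rdfs:label and a skos:definition.
    (The URI layout here is illustrative.)
    """
    label = "http://www.w3.org/2000/01/rdf-schema#label"
    definition = "http://www.w3.org/2004/02/skos/core#definition"
    triples = []
    for term, defn in csv.reader(io.StringIO(csv_text)):
        uri = base_uri + term.strip().replace(" ", "_")
        triples.append(f'<{uri}> <{label}> "{term.strip()}" .')
        triples.append(f'<{uri}> <{definition}> "{defn.strip()}" .')
    return "\n".join(triples)

vocab = ("salinity,Amount of dissolved salt in water\n"
         "chlorophyll,Green photosynthetic pigment")
print(voc2ntriples(vocab, "http://example.org/vocab/"))
```

    Once in RDF, the vocabulary can be registered, versioned, and queried with SPARQL alongside other ontologies, which is the workflow the registry supports.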

  3. Empowerment of patients in online discussions about medicine use.

    PubMed

    van Berkel, Jasper J; Lambooij, Mattijs S; Hegger, Ingrid

    2015-04-08

    Patient empowerment is crucial in the successful self-management of people with chronic diseases. In this study, we investigated whether discussions about medicine use taking place on online message boards contribute to patient empowerment and could subsequently result in the more effective use of medicines. We discuss the extent to which patient empowerment processes occur in discussions on online message boards, focusing on patients with three disorders with different characteristics: diabetes, Amyotrophic Lateral Sclerosis (ALS) and Attention Deficit / Hyperactivity Disorder (ADHD). Because information is an important factor in both patient empowerment and self-management, we also evaluate the quality of the information being exchanged. We used a deductive thematic analysis method based on pre-existing categories. We gathered and analysed 5532 posts related to the conditions ADHD, ALS and diabetes from seven message boards (three for ADHD, three for diabetes, and one for ALS). We coded the posts for empowerment processes and the quality of the information exchanged. We identified patient empowerment processes in posts related to all three disorders. There is some variation in the frequency of these processes, but they show a similar order in the results: patients used the online message boards to exchange information, share personal experiences and for empathy or support. The type of information shared in these processes could contribute to the patient's self-efficacy when it comes to medicine use. The exchanged information was either correct or largely harmless. We also observed a tendency whereby participants correct previously posted incorrect information, and refer people to a healthcare professional following a request for medical advice, e.g. concerning the choice of medicines or dosage. Our findings show that patient empowerment processes occur in posts related to all three disorders. 
The type of information shared in these processes can contribute to the patient's self-efficacy when it comes to medicine use. The tendency to refer people to a healthcare professional shows that patients still reserve an important role for healthcare professionals in the care process, despite the development towards more self-management.

  4. An Early Model for Value and Sustainability in Health Information Exchanges: Qualitative Study

    PubMed Central

    2018-01-01

    Background The primary value relative to health information exchange has been seen in terms of cost savings relative to laboratory and radiology testing, emergency department expenditures, and admissions. However, models are needed to statistically quantify value and sustainability and better understand the dependent and mediating factors that contribute to value and sustainability. Objective The purpose of this study was to provide a basis for early model development for health information exchange value and sustainability. Methods A qualitative study was conducted with 21 interviews of eHealth Exchange participants across 10 organizations. Using a grounded theory approach and 3.0 as a relative frequency threshold, 5 main categories and 16 subcategories emerged. Results This study identifies 3 core current perceived value factors and 5 potential perceived value factors—how interviewees predict health information exchanges may evolve as there are more participants. These value factors were used as the foundation for early model development for sustainability of health information exchange. Conclusions Using the value factors from the interviews, the study provides the basis for early model development for health information exchange value and sustainability. This basis includes factors from the research: fostering consumer engagement; establishing a provider directory; quantifying use, cost, and clinical outcomes; ensuring data integrity through patient matching; and increasing awareness, usefulness, interoperability, and sustainability of eHealth Exchange. PMID:29712623

  5. An Early Model for Value and Sustainability in Health Information Exchanges: Qualitative Study.

    PubMed

    Feldman, Sue S

    2018-04-30

    The primary value relative to health information exchange has been seen in terms of cost savings relative to laboratory and radiology testing, emergency department expenditures, and admissions. However, models are needed to statistically quantify value and sustainability and better understand the dependent and mediating factors that contribute to value and sustainability. The purpose of this study was to provide a basis for early model development for health information exchange value and sustainability. A qualitative study was conducted with 21 interviews of eHealth Exchange participants across 10 organizations. Using a grounded theory approach and 3.0 as a relative frequency threshold, 5 main categories and 16 subcategories emerged. This study identifies 3 core current perceived value factors and 5 potential perceived value factors-how interviewees predict health information exchanges may evolve as there are more participants. These value factors were used as the foundation for early model development for sustainability of health information exchange. Using the value factors from the interviews, the study provides the basis for early model development for health information exchange value and sustainability. This basis includes factors from the research: fostering consumer engagement; establishing a provider directory; quantifying use, cost, and clinical outcomes; ensuring data integrity through patient matching; and increasing awareness, usefulness, interoperability, and sustainability of eHealth Exchange. ©Sue S Feldman. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 30.04.2018.

  6. Electronic signatures for long-lasting storage purposes in electronic archives.

    PubMed

    Pharow, Peter; Blobel, Bernd

    2005-03-01

    Communication and co-operation in healthcare and welfare require a certain set of trusted third party (TTP) services describing both the status and relation of communicating principals as well as their corresponding keys and attributes. Additional TTP services are needed to provide trustworthy information about dynamic aspects of communication and co-operation such as the time and location of processes, workflow relations, and system behaviour. Legal and ethical requirements demand securely stored patient information and well-defined access rights. Among others, electronic signatures based on asymmetric cryptography are important means for securing the integrity of a message or file as well as for accountability purposes, including non-repudiation of both origin and receipt. Electronic signatures, along with certified time stamps or time signatures, are especially important for electronic archives in general, electronic health records (EHR) in particular, and for typical long-lasting storage purposes. Apart from technical storage problems (e.g. lifetime of the storage devices, interoperability of retrieval and presentation software), this paper identifies mechanisms such as re-signing and re-stamping of data items, files, messages, sets of archived items or documents, archive structures, and even whole archives.
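
    The re-stamping idea can be illustrated with a hash chain: each renewal covers the document and all earlier evidence, so old stamps remain verifiable after the renewal. In this sketch SHA-256 stands in for a real asymmetric signature primitive, and the chaining scheme is illustrative, not the paper's concrete mechanism:

```python
import hashlib, json

def stamp(payload, prev_stamp, ts):
    """Append one link to a re-stamping chain.

    SHA-256 stands in for an asymmetric signature primitive here;
    the point is the chaining: each renewal covers the document AND
    all earlier evidence, so old stamps stay verifiable after renewal.
    """
    prev = json.dumps(prev_stamp, sort_keys=True).encode() if prev_stamp else b""
    digest = hashlib.sha256(payload + prev + repr(ts).encode()).hexdigest()
    return {"ts": ts, "digest": digest, "prev": prev_stamp}

def verify(payload, chain):
    """Re-check every link of the chain down to the first stamp."""
    while chain:
        if stamp(payload, chain["prev"], chain["ts"])["digest"] != chain["digest"]:
            return False
        chain = chain["prev"]
    return True

record = b"EHR entry: admission note, 2005-03-01"
s1 = stamp(record, None, 1000.0)   # initial signature plus time stamp
s2 = stamp(record, s1, 2000.0)     # later re-stamp covering s1
print(verify(record, s2))          # True
```

    In a real archive the renewal would use a fresh signature algorithm and a certified timestamping authority; the chain structure is what lets integrity evidence outlive any single algorithm's validity period.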

  7. Feasibility of 30-day hospital readmission prediction modeling based on health information exchange data.

    PubMed

    Swain, Matthew J; Kharrazi, Hadi

    2015-12-01

    Unplanned 30-day hospital readmissions account for roughly $17 billion in annual Medicare spending. Many factors contribute to unplanned hospital readmissions, and multiple models have been developed over the years to predict them. Most researchers have used insurance claims or administrative data to train and operationalize their Readmission Risk Prediction Models (RRPMs). Some RRPM developers have also used electronic health records data; however, using health information exchange data has been uncommon among such predictive models, although it can be beneficial in its ability to provide real-time alerts to providers at the point of care. We conducted a semi-systematic review of readmission predictive factors published prior to March 2013. Then, we extracted and merged all significant variables listed in those articles for RRPMs. Finally, we matched these variables with common HL7 messages transmitted by a sample of health information exchange organizations (HIO). The semi-systematic review resulted in identification of 32 articles and 297 predictive variables. The mapping of these variables to common HL7 segments resulted in 89.2% total coverage, with the DG1 (diagnosis) segment having the highest coverage of 39.4%. The PID (patient identification) and OBX (observation results) segments cover 13.9% and 9.1% of the variables, respectively. Evaluating the same coverage in three sample HIOs showed data incompleteness. HIOs can utilize HL7 messages to develop unique RRPMs for their stakeholders; however, the data completeness of exchanged messages should meet certain thresholds. If data quality standards are met by stakeholders, HIOs would be able to provide real-time RRPMs that not only predict intra-hospital readmissions but also inter-hospital cases. An RRPM derived using data exchanged through an HIO may prove to be a useful method to prevent unplanned hospital readmissions. 
In order for the RRPM derived from HIO data to be effective, hospitals must actively exchange clinical information through the HIO and develop actionable methods that integrate into the workflow of providers to ensure that patients at high-risk for readmission receive the care they need. Copyright © 2015. Published by Elsevier Ireland Ltd.
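
    The variable-to-segment mapping described above relies on the pipe-delimited structure of HL7 v2 messages. Extracting candidate predictors from PID, DG1, and OBX segments can be sketched as follows; the field positions follow the HL7 v2 convention, but the message content is fabricated for illustration:

```python
def parse_hl7(message: str) -> dict:
    """Group an HL7 v2 message's segments by their three-letter ID.

    HL7 v2 separates segments with carriage returns and fields with '|'.
    The sample message below is fabricated for illustration.
    """
    segments = {}
    for line in message.strip().split("\r"):
        fields = line.split("|")
        segments.setdefault(fields[0], []).append(fields)
    return segments

msg = ("MSH|^~\\&|SENDING_APP|HOSPITAL|RRPM|HIO|202303011200||ADT^A01|0001|P|2.5\r"
       "PID|1||123456||DOE^JANE||19520401|F\r"
       "DG1|1||I50.9^Heart failure, unspecified^ICD10\r"
       "OBX|1|NM|2160-0^Creatinine^LN||1.8|mg/dL")

seg = parse_hl7(msg)
birth_date = seg["PID"][0][7]   # PID-7: date of birth
diagnosis  = seg["DG1"][0][3]   # DG1-3: coded diagnosis
lab_value  = seg["OBX"][0][5]   # OBX-5: observation value
print(birth_date, diagnosis, lab_value)
```

    A production RRPM pipeline would additionally validate segment completeness, which is exactly the data-quality threshold the study flags as a precondition.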

  8. Cooperative runtime monitoring

    NASA Astrophysics Data System (ADS)

    Hallé, Sylvain

    2013-11-01

    Requirements on message-based interactions can be formalised as an interface contract that specifies constraints on the sequence of possible messages that can be exchanged by multiple parties. At runtime, each peer can monitor incoming messages and check that the contract is correctly being followed by their respective senders. We introduce cooperative runtime monitoring, where a recipient 'delegates' its monitoring task to the sender, which is required to provide evidence that the message it sends complies with the contract. In turn, this evidence can be quickly checked by the recipient, which then has a guarantee of the sender's compliance with the contract without performing the monitoring computation itself. A particular application of this concept is shown on web services, where service providers can monitor and enforce contract compliance of third-party clients at a small cost on the server side, without having to certify or digitally sign those clients.
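
    The delegation idea can be sketched with a contract expressed as a finite automaton. The sender attaches its claimed post-state to each message as evidence, and the recipient verifies just one transition instead of replaying the whole trace. The contract, states, and messages below are invented for illustration:

```python
# Toy contract as a finite automaton: "login before query; logout
# ends the session". States and message names are invented.
CONTRACT = {
    ("idle", "login"): "active",
    ("active", "query"): "active",
    ("active", "logout"): "idle",
}

def check_evidence(known_state, message, claimed_state):
    """Recipient-side O(1) check of the sender's evidence: does the
    claimed post-state match the contract's single allowed transition?"""
    return CONTRACT.get((known_state, message)) == claimed_state

# Sender asserts: "after 'login' I am in state 'active'".
print(check_evidence("idle", "login", "active"))   # True: evidence accepted
print(check_evidence("idle", "query", "active"))   # False: contract violated
```

    The asymmetry is the point: the sender does the (potentially expensive) monitoring computation, while the recipient's check is constant-time per message.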

  9. Sharing clinical decisions for multimorbidity case management using social network and open-source tools.

    PubMed

    Martínez-García, Alicia; Moreno-Conde, Alberto; Jódar-Sánchez, Francisco; Leal, Sandra; Parra, Carlos

    2013-12-01

    Social networks applied through Web 2.0 tools have gained importance in the health domain because they improve communication and coordination among health professionals. This is highly relevant for the care of multimorbidity patients, since a large number of health professionals are in charge of each patient's care and clinical consensus is required in their decisions. Our objective is to develop a tool for collaborative work among health professionals for multimorbidity patient care. We describe the architecture for incorporating decision support functionalities into a social network tool to enable the adoption of shared decisions among health professionals from different care levels. As part of the first stage of the project, this paper describes the results obtained in a pilot study of the acceptance and use of the social network component in our healthcare setting. At Virgen del Rocío University Hospital we have designed and developed the Shared Care Platform (SCP) to provide support in the continuity of care for multimorbidity patients. The SCP has two consecutively developed components: a social network component, called the Clinical Wall, and a Clinical Decision Support (CDS) system. The Clinical Wall contains a record where health professionals are able to debate and define shared decisions. We conducted a pilot study to assess the use and acceptance of the SCP by healthcare professionals through a questionnaire based on the Technology Acceptance Model. In March 2012 we released and deployed the SCP, but only with the social network component. The pilot project lasted 6 months in the hospital and 2 primary care centers. From March to September 2012 we created 16 records in the Clinical Wall, all with a high priority. A total of 10 professionals took part in the exchange of messages: 3 internists and 7 general practitioners generated 33 messages. 
    12 of the 16 records (75%) were answered by the destination health professionals. The professionals valued all the items in the questionnaire positively. As part of the SCP, open-source tools for CDS will be incorporated to provide recommendations for medication and problem interactions, as well as to calculate indexes or scales from validated questionnaires. They will receive the patient summary information provided by the regional Electronic Health Record system through a web service, with the information defined according to the virtual Medical Record specification. The Clinical Wall has been developed to allow communication and coordination between the healthcare professionals involved in multimorbidity patient care. Agreed decisions concerned coordination for appointment changes, patient conditions, diagnostic tests, and prescription changes and renewals. The application of interoperability standards and open-source software can bridge the gap between knowledge and clinical practice, while enabling interoperability and scalability. Open source combined with the social network encourages adoption and facilitates collaboration. Although the results obtained for the use indicators are not yet as high as expected, based on the promising results of the SCP acceptance questionnaire we expect that the new CDS tools will increase use by the health professionals. Copyright © 2013 Elsevier Inc. All rights reserved.

  10. Communicating Concepts about Altruism in Interstellar Messages

    NASA Astrophysics Data System (ADS)

    Vakoch, Douglas A.

    2002-01-01

    This project identifies key principles of altruism that can be translated into interstellar messages for communication with extraterrestrial intelligence. The message contents will focus specifically on the evolution of altruism, drawing on recent insights in evolutionary biology, with particular emphasis on sociobiological accounts of kin selection and reciprocal altruism. This focus on altruism for message contents has several advantages. First, the subject can be translated into interstellar messages both via an existing formal interstellar language and via pictorial messages. For example, aspects of reciprocal altruism can be described through mathematical modeling, such as game theoretic approaches, which in turn can be described readily in the interstellar language Lincos. Second, concentrating on altruism as a message content may facilitate communications with extraterrestrial intelligence. Some scientists have argued that humans may be expected to communicate something about their moral status and development in an exchange with extraterrestrials. One of the most salient ways that terrestrial and extraterrestrial civilizations might be expected to evaluate one another is in terms of ethical motivations. Indeed, current search strategies assume some measure of altruism on the part of transmitting civilizations; with no guarantee of a response, the other civilization would be providing information to us with no direct payoff. Thus, concepts about altruism provide an appropriate content for interstellar messages, because the concepts themselves might be understood by extraterrestrial civilizations.
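
    The game-theoretic modeling of reciprocal altruism mentioned above can be made concrete with the standard iterated prisoner's dilemma. This sketch uses the conventional textbook payoffs (T=5, R=3, P=1, S=0) and shows why reciprocity is self-sustaining between two tit-for-tat players:

```python
# Iterated prisoner's dilemma with the conventional payoff matrix:
# a standard model of reciprocal altruism.
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def tit_for_tat(opponent_history):
    """Cooperate first, then mirror the opponent's last move."""
    return opponent_history[-1] if opponent_history else "C"

def always_defect(opponent_history):
    return "D"

def play(strategy_a, strategy_b, rounds=10):
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        a, b = strategy_a(hist_b), strategy_b(hist_a)
        pa, pb = PAYOFF[(a, b)]
        hist_a.append(a); hist_b.append(b)
        score_a += pa; score_b += pb
    return score_a, score_b

print(play(tit_for_tat, tit_for_tat))    # -> (30, 30): sustained cooperation
print(play(tit_for_tat, always_defect))  # -> (9, 14): exploited once, then mutual defection
```

    Models of this kind are readily expressible in a formal system such as Lincos, which is why the text singles out game theory as translatable message content.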

  11. A system for analysis and classification of voice communications

    NASA Technical Reports Server (NTRS)

    Older, H. J.; Jenney, L. L.; Garland, L.

    1973-01-01

    A method for analysis and classification of verbal communications typically associated with manned space missions or simulations was developed. The study was carried out in two phases. Phase 1 was devoted to identification of crew tasks and activities which require voice communication for accomplishment or reporting. Phase 2 entailed development of a message classification system and a preliminary test of its feasibility. The classification system permits voice communications to be analyzed to three progressively more specific levels of detail and to be described in terms of message content, purpose, and the participants in the information exchange. A coding technique was devised to allow messages to be recorded by an eight-digit number.
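
    The record does not reproduce the coding scheme itself, but the idea of packing a message's classification into a fixed-width eight-digit number can be sketched as follows. The field layout here is invented for illustration and is not the study's actual scheme:

```python
def encode(content, purpose, sender, receiver):
    """Pack four two-digit classification fields into one 8-digit code.

    Hypothetical layout CCPPSSRR (content, purpose, sender role,
    receiver role), each field 0-99. Not the study's real scheme.
    """
    for f in (content, purpose, sender, receiver):
        if not 0 <= f <= 99:
            raise ValueError("each field must fit in two digits")
    return content * 10**6 + purpose * 10**4 + sender * 10**2 + receiver

def decode(code):
    return code // 10**6, code // 10**4 % 100, code // 100 % 100, code % 100

code = encode(12, 3, 7, 45)   # illustrative field values
print(f"{code:08d}")          # -> 12030745
print(decode(code))           # -> (12, 3, 7, 45)
```

    A fixed-width code like this makes manual transcription and later tabulation easy, which matches the study's goal of recording messages for analysis at several levels of detail.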

  12. Integration of IEEE 1451 and HL7 exchanging information for patients' sensor data.

    PubMed

    Kim, Wooshik; Lim, Suyoung; Ahn, Jinsoo; Nah, Jiyoung; Kim, Namhyun

    2010-12-01

    HL7 (Health Level 7) is a standard developed for exchanging incompatible healthcare information generated by programs or devices among heterogeneous medical information systems. At present, HL7 is growing into a global standard. However, the HL7 standard does not support effective methods for treating data from various medical sensors, especially from mobile sensors. As ubiquitous systems grow, HL7 must communicate with various medical transducers. In the sensor field, IEEE 1451 is a group of standards for controlling transducers and for communicating data from/to various transducers. In this paper, we present the possibility of interoperability between the two standards, i.e., HL7 and IEEE 1451. We then present a method to integrate them and show the preliminary results of this approach.
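
    The bridge the paper aims at, carrying an IEEE 1451-style transducer reading into an HL7 v2 result message, can be sketched like this. The segment grammar follows ORU^R01 conventions, but the sensor IDs, codes, and values are invented and do not come from either specification:

```python
def sensor_to_oru(sensor_id, quantity, loinc, value, unit, ts):
    """Wrap one transducer reading in a minimal HL7 v2 ORU^R01 message.

    The arguments mimic what an IEEE 1451 TEDS-described sensor would
    report; all IDs and codes here are illustrative.
    """
    msh = f"MSH|^~\\&|{sensor_id}|WARD|EHR|HOSPITAL|{ts}||ORU^R01|1|P|2.5"
    obr = f"OBR|1|||{loinc}^{quantity}^LN|||{ts}"
    obx = f"OBX|1|NM|{loinc}^{quantity}^LN||{value}|{unit}||||F"
    return "\r".join([msh, obr, obx])

msg = sensor_to_oru("TEMP-SENSOR-07", "Body temperature", "8310-5",
                    37.4, "Cel", "201012011200")
print(msg.replace("\r", "\n"))
```

    The substantive integration work lies in mapping TEDS metadata (units, calibration, sensor identity) onto the coded fields shown here, which is the gap between the two standards the paper addresses.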

  13. Integrating hospital information systems in healthcare institutions: a mediation architecture.

    PubMed

    El Azami, Ikram; Cherkaoui Malki, Mohammed Ouçamah; Tahon, Christian

    2012-10-01

    Many studies have examined the integration of information systems into healthcare institutions, leading to several standards in the healthcare domain (CORBAmed: Common Object Request Broker Architecture in Medicine; HL7: Health Level Seven International; DICOM: Digital Imaging and Communications in Medicine; and IHE: Integrating the Healthcare Enterprise). Due to the existence of a wide diversity of heterogeneous systems, three essential factors are necessary to fully integrate a system: data, functions and workflow. However, most previous studies have dealt with only one or two of these factors, which makes the system integration unsatisfactory. In this paper, we propose a flexible, scalable architecture for Hospital Information Systems (HIS). Our main purpose is to provide a practical solution to ensure HIS interoperability so that healthcare institutions can communicate without being obliged to change their local information systems and without altering the tasks of the healthcare professionals. Our architecture is a mediation architecture with 3 levels: 1) a database level, 2) a middleware level and 3) a user interface level. The mediation is based on two central components: the Mediator and the Adapter. Using the XML format allows us to establish a structured, secured exchange of healthcare data. The notion of a medical ontology is introduced to resolve semantic conflicts and to unify the language used for the exchange. Our mediation architecture provides an effective, promising model that promotes the integration of hospital information systems that are autonomous, heterogeneous, semantically interoperable and platform-independent.
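
    The Adapter's role, rewriting a local system's record into the mediator's common XML format, can be sketched with the standard library. All element and field names below are invented for illustration; a real deployment would target an agreed schema plus the medical ontology mentioned above:

```python
import xml.etree.ElementTree as ET

# A local HIS record with French element names (invented example).
LOCAL_RECORD = """<patientRec>
  <nom>DUPONT</nom>
  <dateNaiss>1950-06-01</dateNaiss>
  <diag code="I50.9">Insuffisance cardiaque</diag>
</patientRec>"""

# Mapping from local tags to the mediator's canonical vocabulary.
FIELD_MAP = {"nom": "familyName", "dateNaiss": "birthDate", "diag": "diagnosis"}

def adapt(local_xml):
    """Adapter sketch: rename local tags to the canonical vocabulary,
    preserving text content and attributes."""
    local = ET.fromstring(local_xml)
    canonical = ET.Element("Patient")
    for child in local:
        out = ET.SubElement(canonical, FIELD_MAP[child.tag], child.attrib)
        out.text = child.text
    return ET.tostring(canonical, encoding="unicode")

print(adapt(LOCAL_RECORD))
```

    Because only the Adapter knows the local schema, each institution keeps its own information system untouched, which is the design goal stated above.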

  14. An uncertainty-based distributed fault detection mechanism for wireless sensor networks.

    PubMed

    Yang, Yang; Gao, Zhipeng; Zhou, Hang; Qiu, Xuesong

    2014-04-25

    Exchanging too many messages for fault detection not only degrades the network's quality of service but also places a heavy burden on the limited energy of sensors. Therefore, we propose an uncertainty-based distributed fault detection mechanism that uses the aided judgment of neighbors for wireless sensor networks. The algorithm considers the serious influence of sensing measurement loss and therefore uses Markov decision processes to fill in missing data. Most importantly, fault misjudgments caused by uncertainty conditions are the main drawback of traditional distributed fault detection mechanisms. We draw on evidence fusion rules based on information entropy theory and the degree-of-disagreement function to increase the accuracy of fault detection. Simulation results demonstrate that our algorithm can effectively reduce the communication energy overhead due to message exchanges and provide a higher detection accuracy ratio.
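
    The neighbor-aided test at the heart of such schemes can be sketched simply: a node compares its reading against its neighbors' and flags itself as a fault suspect when the disagreement is too one-sided. The distance measure and thresholds below are illustrative, not the paper's exact fusion rules:

```python
def is_fault_suspect(own_reading, neighbor_readings, tol=2.0, ratio=0.5):
    """Neighbor-aided fault check for one sensor node.

    A neighbor 'disagrees' when its reading differs by more than tol;
    the node is a fault suspect when more than `ratio` of its neighbors
    disagree. Threshold values are illustrative.
    """
    disagree = sum(abs(own_reading - r) > tol for r in neighbor_readings)
    return disagree / len(neighbor_readings) > ratio

# Neighbors cluster near 25 degrees; a reading of 40 stands out.
print(is_fault_suspect(40.0, [24.8, 25.1, 25.3, 24.9]))  # -> True
print(is_fault_suspect(25.0, [24.8, 25.1, 25.3, 24.9]))  # -> False
```

    The paper's contribution layers entropy-weighted evidence fusion and Markov-based imputation of missing readings on top of this basic majority test to handle uncertainty.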

  15. Challenge: How Effective is Routing for Wireless Networking

    DTIC Science & Technology

    2017-03-03

    sage (called a "hello") to all of its neighbors. If a series of hello messages are exchanged between two users, a link is considered to exist between...these schemes. A brief description of ETX is as follows. For a given window of time, the number of hello packets that a user receives from a neighbor is...counted. A cost is then assigned to the link based on how many hello messages were heard; a link that has fewer hellos successfully transmitted across
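
    The ETX computation the snippet describes (count hello packets heard in a window, then cost the link) is conventionally defined as the expected number of transmissions, 1/(df·dr), over the forward and reverse delivery ratios. A sketch under that standard definition, with illustrative window counts:

```python
def etx(hellos_heard, hellos_sent, reverse_heard, reverse_sent):
    """Expected Transmission Count for one link.

    Delivery ratios are estimated from hello packets counted over a
    time window, as the snippet describes; ETX = 1 / (df * dr) is the
    standard definition. The counts below are illustrative.
    """
    df = hellos_heard / hellos_sent      # forward delivery ratio
    dr = reverse_heard / reverse_sent    # reverse delivery ratio
    return float("inf") if df * dr == 0 else 1.0 / (df * dr)

# A link that heard 8 of 10 hellos each way costs more than a clean one.
print(etx(10, 10, 10, 10))   # -> 1.0 (perfect link)
print(etx(8, 10, 8, 10))     # roughly 1.56
```

    Routing then prefers paths whose summed ETX is smallest, so lossy links (fewer hellos heard) are naturally penalized.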

  16. How Effective is Routing for Wireless Networking

    DTIC Science & Technology

    2016-03-05

    Routing (LAR) [31]. The basic mechanism of how link-based routing schemes operate is as follows: a user broadcasts a control message (called a "hello")...to all of its neighbors. If a series of hello messages are exchanged between two users, a link is considered to exist between them. Routes are then be...description of ETX is as follows. For a given window of time, the number of hello packets that a user receives from a neighbor is counted. A cost is then

  17. Challenge: How Effective is Routing for Wireless Networking

    DTIC Science & Technology

    2015-09-07

    sage (called a "hello") to all of its neighbors. If a series of hello messages are exchanged between two users, a link is considered to exist between...these schemes. A brief description of ETX is as follows. For a given window of time, the number of hello packets that a user receives from a neighbor is...counted. A cost is then assigned to the link based on how many hello messages were heard; a link that has fewer hellos successfully transmitted across

  18. Recommended Methodology for Inter-Service/Agency Automated Message Processing Exchange (I-S/A AMPE). Cost and Schedule Analysis of Security Alternatives.

    DTIC Science & Technology

    1982-02-23

    segregate the computer and storage from the outside world 2. Administrative security to control access to secure computer facilities 3. Network security to...Classification Alternative A-8 NETWORK KG GENSER DSSCS AMPE TERMINALS TP No. 022-4668-A Figure A-2. Dedicated Switching Architecture Alternative A-9...communications protocol with the network and GENSER message transmission to the I-S/A AMPE processor. 7. DSSCS TPU - Handles communications protocol with

  19. Modelling and approaching pragmatic interoperability of distributed geoscience data

    NASA Astrophysics Data System (ADS)

    Ma, Xiaogang

    2010-05-01

    Interoperability of geodata, which is essential for sharing information and discovering insights within a cyberinfrastructure, is receiving increasing attention. A key requirement of interoperability in the context of geodata sharing is that data provided by local sources can be accessed, decoded, understood and appropriately used by external users. Researchers commonly distinguish four levels of data interoperability: system, syntax, schematics and semantics, which relate respectively to the platform, encoding, structure and meaning of geodata. Ontology-driven approaches to the schematic and semantic interoperability of geodata have been studied extensively in the last decade. Ontologies come in different types (e.g. top-level ontologies, domain ontologies and application ontologies) and display forms (e.g. glossaries, thesauri, conceptual schemas and logical theories). Many geodata providers maintain their own local application ontologies in order to drive standardization in local databases. However, semantic heterogeneities often exist between these local ontologies, even when they derive from equivalent disciplines. In parallel, common ontologies are being studied in different geoscience disciplines (e.g., NADM, SWEET, etc.) as a standardization procedure to coordinate diverse local ontologies. Semantic mediation, e.g. mapping between local ontologies, or mapping local ontologies to common ontologies, has been studied as an effective way of achieving semantic interoperability between local ontologies, thus reconciling semantic heterogeneities in multi-source geodata. Nevertheless, confusion still exists in the research field of semantic interoperability. One problem is caused by eliminating elements of local pragmatic contexts in semantic mediation. 
Compared with the context-independent nature of a common domain ontology, local application ontologies are closely related to elements (e.g., people, time, location, intention, procedure, consequence, etc.) of local pragmatic contexts and are thus context-dependent. Eliminating these elements will inevitably lead to information loss in semantic mediation between local ontologies. Correspondingly, the understanding and effect of exchanged data in a new context may differ from that in its original context. Another problem is the dilemma of how to find a balance between flexibility and standardization of local ontologies, because ontologies are not fixed but continuously evolving. It is commonly realized that we cannot use a unified ontology to replace all local ontologies, because they are context-dependent and need flexibility. However, without coordination of standards, freely developed local ontologies and databases will require enormous mediation work between them. Finding a balance between standardization and flexibility for evolving ontologies, in a practical sense, requires negotiations (i.e. conversations, agreements and collaborations) between different local pragmatic contexts. The purpose of this work is to set up a computer-friendly model representing local pragmatic contexts (i.e. geodata sources), and to propose a practical semantic negotiation procedure for approaching pragmatic interoperability between local pragmatic contexts. Information agents, objective facts and subjective dimensions are reviewed as elements of a conceptual model for representing pragmatic contexts. The author uses them to outline a practical semantic negotiation procedure for approaching pragmatic interoperability of distributed geodata.
The proposed conceptual model and semantic negotiation procedure were encoded with Description Logic, and then applied to analyze and manipulate semantic negotiations between different local ontologies within the National Mineral Resources Assessment (NMRA) project of China, which involves multi-source and multi-subject geodata sharing.

  20. A Content Analysis of E-mail Communication between Patients and Their Providers: Patients Get the Message

    PubMed Central

    White, Casey B.; Moyer, Cheryl A.; Stern, David T.; Katz, Steven J.

    2004-01-01

    Objective: E-mail use in the clinical setting has been slow to diffuse for several reasons, including providers' concerns about patients' inappropriate and inefficient use of the technology. This study examined the content of a random sample of patient–physician e-mail messages to determine the validity of those concerns. Design: A qualitative analysis of patient–physician e-mail messages was performed. Measurements: A total of 3,007 patient–physician e-mail messages were collected over 11 months as part of a randomized, controlled trial of a triage-based e-mail system in two primary care centers (including 98 physicians); 10% of messages were randomly selected for review. Messages were coded across such domains as message type, number of requests per e-mail, inclusion of sensitive content, necessity of a physician response, and message tone. Results: The majority (82.8%) of messages addressed a single issue. The most common message types included information updates to the physicians (41.4%), prescription renewals (24.2%), health questions (13.2%), questions about test results (10.9%), referrals (8.8%), “other” (including thank yous, apologies) (8.8%), appointments (5.4%), requests for non-health-related information (4.8%), and billing questions (0.3%). Overall, messages were concise, formal, and medically relevant. Very few (5.1%) included sensitive content, and none included urgent messages. Less than half (43.2%) required a physician response. Conclusion: A triage-based e-mail system promoted e-mail exchanges appropriate for primary care. Most patients adhered to guidelines aimed at focusing content, limiting the number of requests per message, and avoiding urgent requests or highly sensitive content. Thus, physicians' concerns about the content of patients' e-mails may be unwarranted. PMID:15064295

  1. Supporting BPMN choreography with system integration artefacts for enterprise process collaboration

    NASA Astrophysics Data System (ADS)

    Nie, Hongchao; Lu, Xudong; Duan, Huilong

    2014-07-01

    Business Process Model and Notation (BPMN) choreography modelling depicts externally visible message exchanges between collaborating processes of enterprise information systems. Implementation of choreography relies on designing system integration solutions to realise message exchanges between independently developed systems. Enterprise integration patterns (EIPs) are widely accepted artefacts to design integration solutions. If the choreography model represents coordination requirements between processes with behaviour mismatches, the integration designer needs to analyse the routing requirements and address these requirements by manually designing EIP message routers. As collaboration scales and complexity increases, manual design becomes inefficient. Thus, the research problem of this paper is to explore a method to automatically identify routing requirements from BPMN choreography model and to accordingly design routing in the integration solution. To achieve this goal, recurring behaviour mismatch scenarios are analysed as patterns, and corresponding solutions are proposed as EIP routers. Using this method, a choreography model can be analysed by computer to identify occurrences of mismatch patterns, leading to corresponding router selection. A case study demonstrates that the proposed method enables computer-assisted integration design to implement choreography. A further experiment reveals that the method is effective to improve the design quality and reduce time cost.
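The message-router idea at the heart of the method can be sketched in miniature. The snippet below is a generic EIP content-based router, not the paper's pattern catalogue or its mismatch analysis; the message types and channel names are invented for illustration:

```python
# Hypothetical sketch of an EIP content-based router that resolves a
# behaviour mismatch by steering each message to the channel its
# consumer expects. Message types and channels are invented.

from dataclasses import dataclass

@dataclass
class Message:
    msg_type: str
    payload: dict

class ContentBasedRouter:
    """Routes each message to the channel registered for its type."""
    def __init__(self):
        self.routes = {}                  # msg_type -> list modelling a channel

    def register(self, msg_type, channel):
        self.routes[msg_type] = channel

    def route(self, message):
        channel = self.routes.get(message.msg_type)
        if channel is None:
            raise ValueError(f"no route for {message.msg_type!r}")
        channel.append(message)

order_channel, invoice_channel = [], []
router = ContentBasedRouter()
router.register("Order", order_channel)
router.register("Invoice", invoice_channel)
router.route(Message("Order", {"id": 1}))
router.route(Message("Invoice", {"id": 2}))
```

In the paper's setting, a router like this would be selected automatically when a mismatch pattern is detected in the choreography model rather than wired up by hand.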

  2. Asynchronous Communication of TLNS3DMB Boundary Exchange

    NASA Technical Reports Server (NTRS)

    Hammond, Dana P.

    1997-01-01

This paper describes the recognition of implicit serialization due to coarse-grain, synchronous communication and demonstrates the conversion to asynchronous communication for the exchange of boundary condition information in the Thin-Layer Navier-Stokes 3-Dimensional Multi-Block (TLNS3DMB) code. The implementation details of using asynchronous communication are provided, including buffer allocation, message identification, and barrier control. The IBM SP2 was used for the tests presented.
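The pattern described above can be illustrated outside the original code. The Python sketch below is an analogy, not the paper's MPI implementation: each "block" posts a non-blocking send of its boundary data, overlaps interior work, then waits on the tagged receive, with a barrier closing the exchange phase:

```python
# Illustrative sketch (not TLNS3DMB): two blocks exchange boundary
# data asynchronously. Tags identify which boundary a message carries
# (message identification); a barrier marks the end of the exchange
# phase (barrier control).

import threading, queue

inbox = {0: queue.Queue(), 1: queue.Queue()}   # one buffer per block
barrier = threading.Barrier(2)
result = {}

def isend(dest, tag, data):
    inbox[dest].put((tag, data))               # returns immediately

def irecv_wait(me, tag):
    while True:
        t, data = inbox[me].get()
        if t == tag:
            return data
        inbox[me].put((t, data))               # not ours, requeue

def block(me, other, boundary):
    isend(other, tag=me, data=boundary)        # post send, don't block
    interior = [v * 2 for v in boundary]       # overlap: interior work
    ghost = irecv_wait(me, tag=other)          # now wait for neighbour
    result[me] = (interior, ghost)
    barrier.wait()                             # all exchanges complete

t0 = threading.Thread(target=block, args=(0, 1, [1, 2]))
t1 = threading.Thread(target=block, args=(1, 0, [3, 4]))
t0.start(); t1.start(); t0.join(); t1.join()
```

The key gain, as in the paper, is that the interior computation proceeds while the boundary messages are in flight instead of serializing on a synchronous exchange.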

  3. Development of Extended Content Standards for Biodiversity Data

    NASA Astrophysics Data System (ADS)

    Hugo, Wim; Schmidt, Jochen; Saarenmaa, Hannu

    2013-04-01

Interoperability in the field of Biodiversity observation has been strongly driven by the development of a number of global initiatives (GEO, GBIF, OGC, TDWG, GenBank, …) and its supporting standards (OGC-WxS, OGC-SOS, Darwin Core (DwC), NetCDF, …). To a large extent, these initiatives have focused on discoverability and standardization of syntactic and schematic interoperability. Semantic interoperability is more complex, requiring development of domain-dependent conceptual data models, and extension of these models with appropriate ontologies (typically manifested as controlled vocabularies). Biodiversity content has been standardized partly, for example through Darwin Core for occurrence data and associated taxonomy, and through GenBank for genetic data, but other contexts of biodiversity observation have lagged behind - making it difficult to achieve semantic interoperability between distributed data sources. With this in mind, WG8 of GEO BON (charged with data and systems interoperability) has started a work programme to address a number of concerns, one of which is the gap in content standards required to make Biodiversity data truly interoperable. The paper reports on the framework developed by WG8 for the classification of Biodiversity observation data into 'families' of use cases and its supporting data schema, where gaps, if any, in the availability of content standards have been identified, and how these are to be addressed by way of an abstract data model and the development of associated content standards. It is proposed that a minimum set of standards (1) will be required to address the scope of Biodiversity content, aligned with levels and dimensions of observation, and based on the 'Essential Biodiversity Variables' (2) being developed by GEO BON.
The content standards are envisaged as loosely separated from the syntactic and schematic standards used for the base data exchange: typically, services would offer an existing data standard (DwC, WFS, SOS, NetCDF), with a use-case dependent 'payload' embedded into the data stream. This enables the re-use of the abstract schema, and sometimes the implementation specification (for example XML, JSON, or NetCDF conventions) across services. An explicit aim will be to make the XML implementation specification re-usable as a DwC and a GML (SOS and WFS) extension. (1) Olga Lyashevska, Keith D. Farnsworth, How many dimensions of biodiversity do we need?, Ecological Indicators, Volume 18, July 2012, Pages 485-492, ISSN 1470-160X, 10.1016/j.ecolind.2011.12.016. (2) GEO BON: Workshop on Essential Biodiversity Variables (27-29 February 2012, Frascati, Italy). (http://www.earthobservations.org/geobon_docs_20120227.shtml)
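The "payload embedded in a standard record" idea can be made concrete with a small JSON example. The `dwc:*` terms below are real Darwin Core terms, but the payload field name and its contents are hypothetical, invented only to show the separation between base standard and use-case content:

```python
# Sketch of a Darwin Core occurrence record carrying a use-case
# dependent payload. The "payload" key and its fields are invented;
# only the dwc:* terms are actual Darwin Core terms.

import json

record = {
    "dwc:scientificName": "Puma concolor",
    "dwc:decimalLatitude": -23.5,
    "dwc:decimalLongitude": -46.6,
    "dwc:eventDate": "2012-07-01",
    # use-case dependent content rides inside the standard record
    "payload": {"useCase": "abundance", "individualCount": 4},
}

wire = json.dumps(record)          # what a service would transmit
parsed = json.loads(wire)
count = parsed["payload"]["individualCount"]
```

A consumer that only understands Darwin Core can still process the base terms; a use-case-aware consumer additionally unpacks the payload.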

  4. Improving newborn screening laboratory test ordering and result reporting using health information exchange

    PubMed Central

van Dyck, Peter C; Rinaldo, Piero; McDonald, Clement; Howell, R Rodney; Zuckerman, Alan; Downing, Gregory

    2010-01-01

    Capture, coding and communication of newborn screening (NBS) information represent a challenge for public health laboratories, health departments, hospitals, and ambulatory care practices. An increasing number of conditions targeted for screening and the complexity of interpretation contribute to a growing need for integrated information-management strategies. This makes NBS an important test of tools and architecture for electronic health information exchange (HIE) in this convergence of individual patient care and population health activities. For this reason, the American Health Information Community undertook three tasks described in this paper. First, a newborn screening use case was established to facilitate standards harmonization for common terminology and interoperability specifications guiding HIE. Second, newborn screening coding and terminology were developed for integration into electronic HIE activities. Finally, clarification of privacy, security, and clinical laboratory regulatory requirements governing information exchange was provided, serving as a framework to establish pathways for improving screening program timeliness, effectiveness, and efficiency of quality patient care services. PMID:20064796

  5. A Proposed Information Architecture for Telehealth System Interoperability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Warren, S.; Craft, R.L.; Parks, R.C.

    1999-04-07

Telemedicine technology is rapidly evolving. Whereas early telemedicine consultations relied primarily on video conferencing, consultations today may utilize video conferencing, medical peripherals, store-and-forward capabilities, electronic patient record management software, and/or a host of other emerging technologies. These remote care systems rely increasingly on distributed, collaborative information technology during the care delivery process, in its many forms. While these leading-edge systems are bellwethers for highly advanced telemedicine, the remote care market today is still immature. Most telemedicine systems are custom-designed and do not interoperate with other commercial offerings. Users are limited to a set of functionality that a single vendor provides and must often pay high prices to obtain this functionality, since vendors in this marketplace must deliver entire systems in order to compete. Besides increasing corporate research and development costs, this inhibits the ability of the user to make intelligent purchasing decisions regarding best-of-breed technologies. We propose a secure, object-oriented information architecture for telemedicine systems that promotes plug-and-play interaction between system components through standardized interfaces, communication protocols, messaging formats, and data definitions. In this architecture, each component functions as a black box, and components plug together in a lego-like fashion to achieve the desired device or system functionality. The architecture will support various ongoing standards work in the medical device arena.
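The "black box behind a standardized interface" idea can be sketched with an abstract base class. The interface and component names below are hypothetical, meant only to show why a consumer that codes to the interface can accept any conforming vendor component:

```python
# Sketch of plug-and-play components: each component implements one
# standardized interface, so implementations are interchangeable.
# VideoSource and the vendor classes are invented for illustration.

from abc import ABC, abstractmethod

class VideoSource(ABC):
    """Standardized interface every video component must implement."""
    @abstractmethod
    def next_frame(self) -> bytes: ...

class VendorACamera(VideoSource):
    def next_frame(self) -> bytes:
        return b"frame-from-vendor-a"

class VendorBCamera(VideoSource):
    def next_frame(self) -> bytes:
        return b"frame-from-vendor-b"

def consult_stream(source: VideoSource) -> bytes:
    # The consumer sees only the interface, never the vendor class.
    return source.next_frame()
```

Swapping `VendorACamera` for `VendorBCamera` requires no change to `consult_stream`, which is the purchasing flexibility the architecture aims for.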

  6. Implementing Internet of Things in a military command and control environment

    NASA Astrophysics Data System (ADS)

    Raglin, Adrienne; Metu, Somiya; Russell, Stephen; Budulas, Peter

    2017-05-01

    While the term Internet of Things (IoT) has been coined relatively recently, it has deep roots in multiple other areas of research including cyber-physical systems, pervasive and ubiquitous computing, embedded systems, mobile ad-hoc networks, wireless sensor networks, cellular networks, wearable computing, cloud computing, big data analytics, and intelligent agents. As the Internet of Things, these technologies have created a landscape of diverse heterogeneous capabilities and protocols that will require adaptive controls to effect linkages and changes that are useful to end users. In the context of military applications, it will be necessary to integrate disparate IoT devices into a common platform that necessarily must interoperate with proprietary military protocols, data structures, and systems. In this environment, IoT devices and data will not be homogeneous and provenance-controlled (i.e. single vendor/source/supplier owned). This paper presents a discussion of the challenges of integrating varied IoT devices and related software in a military environment. A review of contemporary commercial IoT protocols is given and as a practical example, a middleware implementation is proffered that provides transparent interoperability through a proactive message dissemination system. The implementation is described as a framework through which military applications can integrate and utilize commercial IoT in conjunction with existing military sensor networks and command and control (C2) systems.
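The paper describes its middleware only at a high level; the sketch below illustrates the general shape of a proactive publish/subscribe dissemination layer that could bridge heterogeneous IoT feeds and C2 consumers. The class and topic names are invented, not from the paper:

```python
# Minimal topic-based dissemination bus. Subscribers register a
# callback per topic; publishing pushes the payload to every
# subscriber of that topic. Names and topics are hypothetical.

class Dissemination:
    def __init__(self):
        self.subs = {}                     # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self.subs.setdefault(topic, []).append(callback)

    def publish(self, topic, payload):
        for cb in self.subs.get(topic, []):
            cb(payload)                    # push to every subscriber

received = []
bus = Dissemination()
bus.subscribe("sensor/acoustic", received.append)
bus.publish("sensor/acoustic", {"db": 74})
bus.publish("sensor/thermal", {"c": 31})   # no subscriber, dropped
```

A layer like this decouples the commercial IoT protocols on the publishing side from the military consumers on the subscribing side, which is the transparency the paper's framework aims to provide.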

  7. New Secure E-mail System Based on Bio-Chaos Key Generation and Modified AES Algorithm

    NASA Astrophysics Data System (ADS)

    Hoomod, Haider K.; Radi, A. M.

    2018-05-01

E-mail messages are exchanged between the sender's mailbox and the recipient's mailbox over open systems and insecure networks. These messages may be vulnerable to eavesdropping, which poses a real threat to privacy and data integrity from unauthorized persons. E-mail security comprises the following properties: confidentiality, authentication, and message integrity. A safe encryption algorithm is needed to encrypt e-mail messages, such as the Advanced Encryption Standard (AES) or the Data Encryption Standard (DES), combined with biometric recognition and a chaotic system. The proposed e-mail security system uses a modified AES algorithm with a secret bio-chaos key that consists of a biometric component (fingerprint) and chaotic systems (Lu and Lorenz). This modification makes the proposed system more sensitive and random. The execution time for both encryption and decryption of the proposed system is much less than that of the original AES, in addition to being compatible with all mail servers.
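To make the chaos-derived-key idea concrete: the sketch below is emphatically not the paper's scheme (no fingerprint input, no Lu/Lorenz systems, no modified AES). It uses a simple logistic map as a stand-in chaotic source and a plain XOR keystream, only to show how a deterministic chaotic iteration seeded by a secret can produce a reproducible key stream:

```python
# Illustration only: derive a key stream from a chaotic map. The
# logistic map stands in for the paper's Lu/Lorenz systems, the seed
# stands in for a fingerprint-derived value, and XOR stands in for
# the modified AES.

def chaotic_keystream(seed: float, n: int, r: float = 3.99) -> bytes:
    """Iterate the logistic map x -> r*x*(1-x), emit one byte per step."""
    x, out = seed, bytearray()
    for _ in range(n):
        x = r * x * (1.0 - x)
        out.append(int(x * 256) % 256)
    return bytes(out)

def xor_cipher(data: bytes, key: bytes) -> bytes:
    return bytes(d ^ k for d, k in zip(data, key))

seed = 0.613             # stand-in for a biometric-derived secret
msg = b"confidential e-mail body"
ks = chaotic_keystream(seed, len(msg))
ct = xor_cipher(msg, ks)                             # encrypt
pt = xor_cipher(ct, chaotic_keystream(seed, len(ct)))  # decrypt
```

Because the map is deterministic, sender and recipient regenerate the same keystream from the shared seed; the sensitivity of chaotic maps to the seed is what the paper exploits for key randomness.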

  8. Electronic health information quality challenges and interventions to improve public health surveillance data and practice.

    PubMed

    Dixon, Brian E; Siegel, Jason A; Oemig, Tanya V; Grannis, Shaun J

    2013-01-01

    We examined completeness, an attribute of data quality, in the context of electronic laboratory reporting (ELR) of notifiable disease information to public health agencies. We extracted more than seven million ELR messages from multiple clinical information systems in two states. We calculated and compared the completeness of various data fields within the messages that were identified to be important to public health reporting processes. We compared unaltered, original messages from source systems with similar messages from another state as well as messages enriched by a health information exchange (HIE). Our analysis focused on calculating completeness (i.e., the number of nonmissing values) for fields deemed important for inclusion in notifiable disease case reports. The completeness of data fields for laboratory transactions varied across clinical information systems and jurisdictions. Fields identifying the patient and test results were usually complete (97%-100%). Fields containing patient demographics, patient contact information, and provider contact information were suboptimal (6%-89%). Transactions enhanced by the HIE were found to be more complete (increases ranged from 2% to 25%) than the original messages. ELR data from clinical information systems can be of suboptimal quality. Public health monitoring of data sources and augmentation of ELR message content using HIE services can improve data quality.
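The completeness measure used in the study (the share of non-missing values per field) is straightforward to express in code. The field names and the missing-value markers below are illustrative, not taken from the study:

```python
# Sketch of the completeness metric: for each field, the fraction of
# messages carrying a non-missing value. Field names are invented.

def completeness(messages, fields):
    """Return {field: fraction of messages with a non-empty value}."""
    totals = {f: 0 for f in fields}
    for m in messages:
        for f in fields:
            if m.get(f) not in (None, "", "UNK"):
                totals[f] += 1
    n = len(messages)
    return {f: totals[f] / n for f in fields}

msgs = [
    {"patient_name": "A", "result": "positive", "phone": ""},
    {"patient_name": "B", "result": "negative", "phone": "555-0100"},
]
scores = completeness(msgs, ["patient_name", "result", "phone"])
```

Run over the original and the HIE-enriched message sets, a function like this yields exactly the kind of field-by-field comparison the study reports.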

  9. DICOM static and dynamic representation through unified modeling language

    NASA Astrophysics Data System (ADS)

    Martinez-Martinez, Alfonso; Jimenez-Alaniz, Juan R.; Gonzalez-Marquez, A.; Chavez-Avelar, N.

    2004-04-01

The DICOM standard, as all standards, specifies in a generic way the management of digital medical images and their related information in network and storage media environments. However, understanding the specifications for a particular implementation is not trivial. Thus, this work is about understanding and modelling parts of the DICOM standard using object-oriented methodologies, as part of software development processes. This has offered different static and dynamic views, in accordance with the standard specifications, and the resultant models have been represented through the Unified Modelling Language (UML). The modelled parts are related to the network conformance claim: Network Communication Support for Message Exchange, Message Exchange, Information Object Definitions, Service Class Specifications, Data Structures and Encoding, and Data Dictionary. The resultant models have given a better understanding of the DICOM parts and have opened the possibility of creating a software library to develop DICOM-conformant PACS applications.

  10. An Uncertainty-Based Distributed Fault Detection Mechanism for Wireless Sensor Networks

    PubMed Central

    Yang, Yang; Gao, Zhipeng; Zhou, Hang; Qiu, Xuesong

    2014-01-01

Exchanging too many messages for fault detection will cause not only a degradation of the network quality of service, but also a huge burden on the limited energy of sensors. Therefore, we propose an uncertainty-based distributed fault detection mechanism for wireless sensor networks that relies on the aided judgment of neighbors. The algorithm considers the serious influence of sensing measurement loss and therefore uses Markov decision processes to fill in missing data. Most importantly, fault misjudgments caused by uncertainty conditions are the main drawback of traditional distributed fault detection mechanisms. We draw on evidence fusion rules based on information entropy theory and the degree-of-disagreement function to increase the accuracy of fault detection. Simulation results demonstrate that our algorithm can effectively reduce the communication energy overhead due to message exchanges and provide a higher detection accuracy ratio. PMID:24776937
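The entropy ingredient of the idea can be sketched in a few lines. This is an illustration of one component only, not the paper's full fusion rule: when neighbours' verdicts disagree strongly, their entropy is high and the fused verdict falls back to the node's own reading. The threshold value is invented:

```python
# Hedged sketch: Shannon entropy of neighbour verdicts as an
# uncertainty gate for fault diagnosis. Not the paper's algorithm;
# the 0.9-bit cutoff is an arbitrary illustrative threshold.

import math

def verdict_entropy(verdicts):
    """Shannon entropy (bits) of a list of 0/1 neighbour verdicts."""
    n = len(verdicts)
    p1 = sum(verdicts) / n
    h = 0.0
    for p in (p1, 1.0 - p1):
        if p > 0.0:
            h -= p * math.log2(p)
    return h

def diagnose(own_faulty, verdicts, entropy_cut=0.9):
    """Accept the majority verdict only when neighbours mostly agree."""
    if verdict_entropy(verdicts) > entropy_cut:
        return own_faulty                  # too much disagreement
    return sum(verdicts) > len(verdicts) / 2
```

Gating on entropy is what reduces the misjudgments that plain majority voting makes under uncertain, lossy measurements.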

  11. LSST camera control system

    NASA Astrophysics Data System (ADS)

    Marshall, Stuart; Thaler, Jon; Schalk, Terry; Huffer, Michael

    2006-06-01

    The LSST Camera Control System (CCS) will manage the activities of the various camera subsystems and coordinate those activities with the LSST Observatory Control System (OCS). The CCS comprises a set of modules (nominally implemented in software) which are each responsible for managing one camera subsystem. Generally, a control module will be a long lived "server" process running on an embedded computer in the subsystem. Multiple control modules may run on a single computer or a module may be implemented in "firmware" on a subsystem. In any case control modules must exchange messages and status data with a master control module (MCM). The main features of this approach are: (1) control is distributed to the local subsystem level; (2) the systems follow a "Master/Slave" strategy; (3) coordination will be achieved by the exchange of messages through the interfaces between the CCS and its subsystems. The interface between the camera data acquisition system and its downstream clients is also presented.

  12. Infrastructure for Planetary Sciences: Universal planetary database development project

    NASA Astrophysics Data System (ADS)

    Kasaba, Yasumasa; Capria, M. T.; Crichton, D.; Zender, J.; Beebe, R.

The International Planetary Data Alliance (IPDA), formally formed under COSPAR (formal start at COSPAR 2008 in Montreal), is a joint international effort to enable global access and exchange of high quality planetary science data, and to establish archive standards that make it easier to share the data across international boundaries. In 2008-2009, thanks to the many players from several agencies and institutions, we achieved fruitful results in 6 projects: (1) Interoperable Planetary Data Access Protocol (PDAP) implementations [led by J. Salgado@ESA], (2) Small bodies interoperability [led by I. Shinohara@JAXA and N. Hirata@U. Aizu], (3) PDAP assessment [led by Y. Yamamoto@JAXA], (4) Architecture and standards definition [led by D. Crichton@NASA], (5) Information model and data dictionary [led by S. Hughes@NASA], and (6) Venus Express interoperability [led by N. Chanover@NMSU]. 'IPDA 2009-2010' is especially important because the NASA/PDS system reformation is now being reviewed as it develops for application at the international level. IPDA is the gate for the establishment of the future infrastructure. We are running 8 projects: (1) IPDA Assessment of PDS4 Data Standards [led by S. Hughes (NASA/JPL)], (2) IPDA Archive Guide [led by M.T. Capria (IASF/INAF) and D. Heather (ESA/PSA)], (3) IPDA Standards Identification [led by E. Rye (NASA/PDS) and G. Krishna (ISRO)], (4) Ancillary Data Standards [led by C. Acton (NASA/JPL)], (5) IPDA Registries Definition [led by D. Crichton (NASA/JPL)], (6) PDAP Specification [led by J. Salgado (ESA/PSA) and Y. Yamamoto (JAXA)], (7) Interoperability Assessment [led by R. Beebe (NMSU) and D. Heather (ESA/PSA)], and (8) PDAP Geographic Information System (GIS) extension [led by N. Hirata (Univ. Aizu) and T. Hare (USGS: thare@usgs.gov)]. This paper presents our achievements and plans as summarized at the IPDA 5th Steering Committee meeting at DLR in July 2010. We are now at the gate for the establishment of the infrastructure.

  13. Interoperability Context-Setting Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Widergren, Steven E.; Hardin, Dave; Ambrosio, Ron

    2007-01-31

As the deployment of automation technology advances, it touches upon many areas of our corporate and personal lives. A trend is emerging where systems are growing to the extent that integration is taking place with other systems to provide even greater capabilities more efficiently and effectively. GridWise™ provides a vision for this type of integration as it applies to the electric system. Imagine a time in the not too distant future when homeowners can offer the management of their electricity demand to participate in a more efficient and environmentally friendly operation of the electric power grid. They will do this using technology that acts on their behalf in response to information from other components of the electric system. This technology will recognize their preferences for parameters such as comfort and the price of energy to form responses that optimize the local need to a signal that satisfies a higher-level need in the grid. For example, consider a particularly hot day with air stagnation in an area with a significant dependence on wind generation. To manage the forecasted peak electricity demand, the bulk system operator issues a critical peak price warning. Their automation systems alert electric service providers who distribute electricity from the wholesale electricity system to consumers. In response, the electric service providers use their automation systems to inform consumers of impending price increases for electricity. This information is passed to an energy management system at the premises, which acts on the homeowner's behalf, to adjust the electricity usage of the onsite equipment (which might include generation from such sources as a fuel cell). The objective of such a system is to honor the agreement with the electricity service provider and reduce the homeowner's bill while keeping the occupants as comfortable as possible.
This will include actions such as moving the thermostat on the heating, ventilation, and air-conditioning (HVAC) unit up several degrees. The resulting load reduction becomes part of an aggregated response from the electricity service provider to the bulk system operator who is now in a better position to manage total system load with available generation. Looking across the electric system, from generating plants, to transmission substations, to the distribution system, to factories, office parks, and buildings, automation is growing, and the opportunities for unleashing new value propositions are exciting. How can we facilitate this change and do so in a way that ensures the reliability of electric resources for the wellbeing of our economy and security? The GridWise Architecture Council (GWAC) mission is to enable interoperability among the many entities that interact with the electric power system. A good definition of interoperability is, "The capability of two or more networks, systems, devices, applications, or components to exchange information between them and to use the information so exchanged." As a step in the direction of enabling interoperability, the GWAC proposes a context-setting framework to organize concepts and terminology so that interoperability issues can be identified and debated, improvements to address issues articulated, and actions prioritized and coordinated across the electric power community.
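The premises energy manager described in this scenario can be sketched as a toy price-response rule. The price thresholds, setpoint offsets, and comfort limit below are invented for illustration; a real system would negotiate these from the homeowner's preferences and the provider's signals:

```python
# Toy demand-response rule: raise the cooling setpoint as the price
# signal rises, never past the homeowner's comfort limit. All
# thresholds and offsets are hypothetical.

def adjust_setpoint(base_f, price_per_kwh, comfort_limit_f):
    """Return the new cooling setpoint (deg F) for a given price."""
    if price_per_kwh < 0.15:
        offset = 0          # normal price: no change
    elif price_per_kwh < 0.30:
        offset = 2          # elevated price: modest adjustment
    else:
        offset = 4          # critical peak: maximum adjustment
    return min(base_f + offset, comfort_limit_f)
```

Aggregated over many homes, the load shed by rules like this is what the service provider reports back to the bulk system operator.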

  14. Documenting Models for Interoperability and Reusability ...

    EPA Pesticide Factsheets

Many modeling frameworks compartmentalize science via individual models that link sets of small components to create larger modeling workflows. Developing integrated watershed models increasingly requires coupling multidisciplinary, independent models, as well as collaboration between scientific communities, since component-based modeling can integrate models from different disciplines. Integrated Environmental Modeling (IEM) systems focus on transferring information between components by capturing a conceptual site model; establishing local metadata standards for input/output of models and databases; managing data flow between models and throughout the system; facilitating quality control of data exchanges (e.g., checking units, unit conversions, transfers between software languages); warning and error handling; and coordinating sensitivity/uncertainty analyses. Although many computational software systems facilitate communication between, and execution of, components, there are no common approaches, protocols, or standards for turn-key linkages between software systems and models, especially if modifying components is not the intent. Using a standard ontology, this paper reviews how models can be described for discovery, understanding, evaluation, access, and implementation to facilitate interoperability and reusability. In the proceedings of the International Environmental Modelling and Software Society (iEMSs), 8th International Congress on Environmental Modelling and Software.

  15. Measuring interoperable EHR adoption and maturity: a Canadian example.

    PubMed

    Gheorghiu, Bobby; Hagens, Simon

    2016-01-25

An interoperable electronic health record is a secure consolidated record of an individual's health history and care, designed to facilitate authorized information sharing across the care continuum. Each Canadian province and territory has implemented such a system and for all, measuring adoption is essential to understanding progress and optimizing use in order to realize intended benefits. About 250,000 health professionals (approximately half of Canada's anticipated potential physician, nurse, pharmacist, and administrative users) indicated that they electronically access data, such as those found in provincial/territorial lab or drug information systems, in 2015. Trends suggest further growth as maturity of use increases. There is strong interest in health information exchange through the iEHR in Canada, and continued growth in adoption is expected. Central to managing the evolution of digital health is access to robust data about who is using solutions, how they are used, where and when. Stakeholders such as government, program leads, and health system administrators must critically assess progress and achievement of benefits, to inform future strategic and operational decisions.

  16. The 2006 NESCent Phyloinformatics Hackathon: A Field Report

    PubMed Central

    Lapp, Hilmar; Bala, Sendu; Balhoff, James P.; Bouck, Amy; Goto, Naohisa; Holder, Mark; Holland, Richard; Holloway, Alisha; Katayama, Toshiaki; Lewis, Paul O.; Mackey, Aaron J.; Osborne, Brian I.; Piel, William H.; Kosakovsky Pond, Sergei L.; Poon, Art F.Y.; Qiu, Wei-Gang; Stajich, Jason E.; Stoltzfus, Arlin; Thierer, Tobias; Vilella, Albert J.; Vos, Rutger A.; Zmasek, Christian M.; Zwickl, Derrick J.; Vision, Todd J.

    2007-01-01

    In December, 2006, a group of 26 software developers from some of the most widely used life science programming toolkits and phylogenetic software projects converged on Durham, North Carolina, for a Phyloinformatics Hackathon, an intense five-day collaborative software coding event sponsored by the National Evolutionary Synthesis Center (NESCent). The goal was to help researchers to integrate multiple phylogenetic software tools into automated workflows. Participants addressed deficiencies in interoperability between programs by implementing “glue code” and improving support for phylogenetic data exchange standards (particularly NEXUS) across the toolkits. The work was guided by use-cases compiled in advance by both developers and users, and the code was documented as it was developed. The resulting software is freely available for both users and developers through incorporation into the distributions of several widely-used open-source toolkits. We explain the motivation for the hackathon, how it was organized, and discuss some of the outcomes and lessons learned. We conclude that hackathons are an effective mode of solving problems in software interoperability and usability, and are underutilized in scientific software development.

  17. PIML: the Pathogen Information Markup Language.

    PubMed

    He, Yongqun; Vines, Richard R; Wattam, Alice R; Abramochkin, Georgiy V; Dickerman, Allan W; Eckart, J Dana; Sobral, Bruno W S

    2005-01-01

A vast amount of information about human, animal and plant pathogens has been acquired, stored and displayed in varied formats through different resources, both electronically and otherwise. However, there is no community standard format for organizing this information or agreement on machine-readable format(s) for data exchange, thereby hampering interoperation efforts across information systems harboring such infectious disease data. The Pathogen Information Markup Language (PIML) is a free, open, XML-based format for representing pathogen information. XSLT-based visual presentations of valid PIML documents were developed and can be accessed through the PathInfo website or as part of the interoperable web services federation known as ToolBus/PathPort. Currently, detailed PIML documents are available for 21 pathogens deemed of high priority with regard to public health and national biological defense. A dynamic query system allows simple queries as well as comparisons among these pathogens. Continuing efforts are being made to include other groups' support of PIML and to develop more PIML documents. All the PIML-related information is accessible from http://www.vbi.vt.edu/pathport/pathinfo/
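The actual PIML schema is defined by the project; the element names below are hypothetical, meant only to show how an XML pathogen record of this general shape can be built and queried with standard tooling:

```python
# Build and re-parse a PIML-like XML record. The element names
# (pathogen, name, host, symptoms) are invented, not the real schema.

import xml.etree.ElementTree as ET

doc = ET.Element("pathogen")
ET.SubElement(doc, "name").text = "Bacillus anthracis"
ET.SubElement(doc, "host").text = "human"
symptoms = ET.SubElement(doc, "symptoms")
ET.SubElement(symptoms, "symptom").text = "fever"

xml_text = ET.tostring(doc, encoding="unicode")  # serialize for exchange
parsed = ET.fromstring(xml_text)                 # what a consumer does
name = parsed.findtext("name")
```

An agreed schema of this kind is what lets independent systems exchange pathogen records and run the cross-pathogen queries the abstract describes.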

  18. A pilot study of distributed knowledge management and clinical decision support in the cloud.

    PubMed

    Dixon, Brian E; Simonaitis, Linas; Goldberg, Howard S; Paterno, Marilyn D; Schaeffer, Molly; Hongsermeier, Tonya; Wright, Adam; Middleton, Blackford

    2013-09-01

    Implement and perform pilot testing of web-based clinical decision support services using a novel framework for creating and managing clinical knowledge in a distributed fashion using the cloud. The pilot sought to (1) develop and test connectivity to an external clinical decision support (CDS) service, (2) assess the exchange of data to and knowledge from the external CDS service, and (3) capture lessons to guide expansion to more practice sites and users. The Clinical Decision Support Consortium created a repository of shared CDS knowledge for managing hypertension, diabetes, and coronary artery disease in a community cloud hosted by Partners HealthCare. A limited data set for primary care patients at a separate health system was securely transmitted to a CDS rules engine hosted in the cloud. Preventive care reminders triggered by the limited data set were returned to clinician end users for review and display. During a pilot study, we (1) monitored connectivity and system performance, (2) studied the exchange of data and decision support reminders between the two health systems, and (3) captured lessons. During the six-month pilot study, there were 1339 patient encounters in which information was successfully exchanged. Preventive care reminders were displayed during 57% of patient visits, most often reminding physicians to monitor blood pressure for hypertensive patients (29%) and order eye exams for patients with diabetes (28%). Lessons learned were grouped into five themes: performance, governance, semantic interoperability, ongoing adjustments, and usability. Remote, asynchronous cloud-based decision support performed reasonably well, although issues concerning governance, semantic interoperability, and usability remain key challenges for successful adoption and use of cloud-based CDS that will require collaboration between biomedical informatics and computer science disciplines.
Decision support in the cloud is feasible and may be a reasonable path toward achieving better support of clinical decision-making across the widest range of health care providers. Published by Elsevier B.V.

  19. Tool and data interoperability in the SSE system

    NASA Technical Reports Server (NTRS)

    Shotton, Chuck

    1988-01-01

    Information is given in viewgraph form on tool and data interoperability in the Software Support Environment (SSE). Information is given on industry problems, SSE system interoperability issues, SSE solutions to tool and data interoperability, and attainment of heterogeneous tool/data interoperability.

  20. UHF (Ultra High Frequency) Military Satellite Communications Ground Equipment Interoperability.

    DTIC Science & Technology

    1986-10-06

    crisis management requires interoperability between various services. These short-term crises often arise from unforeseen circumstances in which...Scheduler Qualcomm has prepared an interoperability study for the JTC3A (Reference 15) as a TA/CE for USCINCLANT ROC 5-84 requirements. It has defined a...interoperability is fundamental. A number of operational crises have occurred where interoperable communications or the lack of interoperable

  1. 47 CFR 97.111 - Authorized transmissions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... ITU that it objects to such communications. The FCC will issue public notices of current arrangements for international communications. (2) Transmissions necessary to meet essential communication needs... exchange messages with a United States government station, necessary to providing communications in RACES...

  2. 47 CFR 97.111 - Authorized transmissions.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... ITU that it objects to such communications. The FCC will issue public notices of current arrangements for international communications. (2) Transmissions necessary to meet essential communication needs... exchange messages with a United States government station, necessary to providing communications in RACES...

  3. 47 CFR 97.111 - Authorized transmissions.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... ITU that it objects to such communications. The FCC will issue public notices of current arrangements for international communications. (2) Transmissions necessary to meet essential communication needs... exchange messages with a United States government station, necessary to providing communications in RACES...

  4. 47 CFR 97.111 - Authorized transmissions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... ITU that it objects to such communications. The FCC will issue public notices of current arrangements for international communications. (2) Transmissions necessary to meet essential communication needs... exchange messages with a United States government station, necessary to providing communications in RACES...

  5. 47 CFR 97.111 - Authorized transmissions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... ITU that it objects to such communications. The FCC will issue public notices of current arrangements for international communications. (2) Transmissions necessary to meet essential communication needs... exchange messages with a United States government station, necessary to providing communications in RACES...

  6. 36 CFR 1222.28 - What are the series level recordkeeping requirements?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... RECORDS ADMINISTRATION RECORDS MANAGEMENT CREATION AND MAINTENANCE OF FEDERAL RECORDS Agency Recordkeeping... systems adequately document agency policies, transactions, and activities, each program must develop... phone calls, meetings, instant messages, and electronic mail exchanges that include substantive...

  7. 36 CFR 1222.28 - What are the series level recordkeeping requirements?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... RECORDS ADMINISTRATION RECORDS MANAGEMENT CREATION AND MAINTENANCE OF FEDERAL RECORDS Agency Recordkeeping... systems adequately document agency policies, transactions, and activities, each program must develop... phone calls, meetings, instant messages, and electronic mail exchanges that include substantive...

  8. 36 CFR 1222.28 - What are the series level recordkeeping requirements?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... RECORDS ADMINISTRATION RECORDS MANAGEMENT CREATION AND MAINTENANCE OF FEDERAL RECORDS Agency Recordkeeping... systems adequately document agency policies, transactions, and activities, each program must develop... phone calls, meetings, instant messages, and electronic mail exchanges that include substantive...

  9. PowerON: the use of instant message counseling and the Internet to facilitate HIV/STD education and prevention.

    PubMed

    Moskowitz, David A; Melton, Dan; Owczarzak, Jill

    2009-10-01

    In recent years, Internet-based or online counseling has emerged as an effective way to assess psychological disorders and discuss destructive behaviors with individuals or groups of individuals. This study explores the application of online counseling to HIV/STD risk-taking behavior among men who have sex with men (MSM). PowerON, an organization that provides sexual health information to MSM exclusively online, used instant message technology to counsel MSM in real time through computer-mediated means. A sample of 279 transcripts of instant message exchanges between PowerON counselors and Gay.com users were recorded and qualitatively analyzed. Approximately 43% of the instant message sessions discussed information about HIV/STD testing. Risk-taking behaviors were addressed in 39% of the sessions. Information about HIV/STDs and general counseling were given in 23% and 18% of the counseling sessions, respectively. The data showed these instant message sessions to be a potentially feasible forum for HIV/STD counseling. Information ordinarily disseminated at health clinics could be successfully distributed through the Internet to MSM. 2009 Elsevier Ireland Ltd.

  10. PowerON: The use of instant message counseling and the Internet to facilitate HIV/STD education and prevention

    PubMed Central

    Moskowitz, David A.; Melton, Dan; Owczarzak, Jill

    2015-01-01

    Objective In recent years, Internet-based or online counseling has emerged as an effective way to assess psychological disorders and discuss destructive behaviors with individuals or groups of individuals. This study explores the application of online counseling to HIV/STD risk-taking behavior among men who have sex with men (MSM). Methods PowerON, an organization that provides sexual health information to MSM exclusively online, used instant message technology to counsel MSM in real time through computer-mediated means. A sample of 279 transcripts of instant message exchanges between PowerON counselors and Gay.com users were recorded and qualitatively analyzed. Results Approximately 43% of the instant message sessions discussed information about HIV/STD testing. Risk-taking behaviors were addressed in 39% of the sessions. Information about HIV/STDs and general counseling were given in 23% and 18% of the counseling sessions respectively. Conclusion The data showed these instant message sessions to be a potentially feasible forum for HIV/STD counseling. Practice Implications Information ordinarily disseminated at health clinics could be successfully distributed through the Internet to MSM. PMID:19217742

  11. Characterizing Deficiencies of Path-Based Routing for Wireless Multi-Hop Networks

    DTIC Science & Technology

    2017-05-01

    called a “hello”) to all of its neighbors. If a series of hello messages is exchanged between two users, a link is considered to exist between them... A brief description of ETX is as follows. For a given window of time, the number of hello packets that a user receives from a neighbor is counted. A... cost is then assigned to the link based on how many hello messages were heard; a link that has fewer hellos successfully transmitted across it will be
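
    The hello-counting scheme described in this record can be illustrated with the standard ETX (expected transmission count) definition; this is a minimal sketch following the common formulation, not code from the report, and the variable names are illustrative.

```python
def etx(df: float, dr: float) -> float:
    """Expected Transmission Count for a link.

    df: fraction of hello probes delivered in the forward direction,
    dr: fraction delivered in the reverse direction, both measured
    over a sliding window of periodic hello broadcasts.
    """
    if df == 0.0 or dr == 0.0:
        return float("inf")   # no hellos heard: link unusable
    return 1.0 / (df * dr)    # expected transmissions per delivered packet

# A link that delivered 9 of 10 hellos in each direction costs about
# 1.23 expected transmissions; a lossier link costs more.
lossy_cost = etx(0.9, 0.9)
perfect_cost = etx(1.0, 1.0)
```

    A routing protocol would then prefer the path whose links have the smallest summed ETX cost.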

  12. The covert channel over HTTP protocol

    NASA Astrophysics Data System (ADS)

    Graniszewski, Waldemar; Krupski, Jacek; Szczypiorski, Krzysztof

    2016-09-01

    The paper presents a new steganographic method: a covert channel created in the HTTP protocol header, specifically the trailer field. HTTP is one of the most frequently used protocols on the Internet, and the popularity of Web servers, together with the volume of network traffic to and from them, is one of the prerequisites for undetectable message exchange. To study this information hiding technique, an application was written in JavaScript using the Node.js framework. The results of an experiment in which a message was sent over the covert channel are also presented.
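
    The mechanism can be sketched in a few lines: in HTTP/1.1 chunked transfer coding, trailer fields may follow the final zero-length chunk, so a covert message can ride in a custom trailer field. This is a minimal illustration of the idea, not the authors' Node.js code, and the "X-Trace" field name is hypothetical.

```python
def encode_chunked_with_trailer(body: bytes, covert: str) -> bytes:
    """Build a chunked HTTP message body whose trailer hides a message."""
    chunk = b"%X\r\n%s\r\n" % (len(body), body)       # one data chunk
    last = b"0\r\n"                                   # zero-length final chunk
    trailer = b"X-Trace: " + covert.encode() + b"\r\n\r\n"
    return chunk + last + trailer                     # trailer after last chunk

def decode_trailer(stream: bytes) -> str:
    """Recover the hidden message from the trailer section."""
    # Everything after the zero-length chunk is the trailer section.
    _, _, tail = stream.partition(b"\r\n0\r\n")
    for line in tail.split(b"\r\n"):
        if line.lower().startswith(b"x-trace:"):
            return line.split(b":", 1)[1].strip().decode()
    return ""

wire = encode_chunked_with_trailer(b"<html></html>", "meet at dawn")
hidden = decode_trailer(wire)
```

    To an observer, the wire traffic looks like an ordinary chunked response with an innocuous trailer field; the ordinary body content is untouched.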

  13. Virtual file system on NoSQL for processing high volumes of HL7 messages.

    PubMed

    Kimura, Eizen; Ishihara, Ken

    2015-01-01

    The Standardized Structured Medical Information Exchange (SS-MIX) is intended to be the standard repository for HL7 messages that depend on a local file system. However, its scalability is limited. We implemented a virtual file system using NoSQL to incorporate modern computing technology into SS-MIX and allow the system to integrate local patient IDs from different healthcare systems into a universal system. We discuss its implementation using the database MongoDB and describe its performance in a case study.

  14. Towards technical interoperability in telemedicine.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Craft, Richard Layne, II

    2004-05-01

    For telemedicine to realize the vision of anywhere, anytime access to care, the question of how to create a fully interoperable technical infrastructure must be addressed. After briefly discussing how 'technical interoperability' compares with other types of interoperability being addressed in the telemedicine community today, this paper describes reasons for pursuing technical interoperability, presents a proposed framework for realizing technical interoperability, identifies key issues that will need to be addressed if technical interoperability is to be achieved, and suggests a course of action that the telemedicine community might follow to accomplish this goal.

  15. The interoperability force in the ERP field

    NASA Astrophysics Data System (ADS)

    Boza, Andrés; Cuenca, Llanos; Poler, Raúl; Michaelides, Zenon

    2015-04-01

    Enterprise resource planning (ERP) systems participate in interoperability projects and this participation sometimes leads to new proposals for the ERP field. The aim of this paper is to identify the role that interoperability plays in the evolution of ERP systems. To go about this, ERP systems have been first identified within interoperability frameworks. Second, the initiatives in the ERP field driven by interoperability requirements have been identified from two perspectives: technological and business. The ERP field is evolving from classical ERP as information system integrators to a new generation of fully interoperable ERP. Interoperability is changing the way of running business, and ERP systems are changing to adapt to the current stream of interoperability.

  16. Geo3DML: A standard-based exchange format for 3D geological models

    NASA Astrophysics Data System (ADS)

    Wang, Zhangang; Qu, Honggang; Wu, Zixing; Wang, Xianghong

    2018-01-01

    A geological model (geomodel) in three-dimensional (3D) space is a digital representation of the Earth's subsurface, recognized by geologists and stored in resultant geological data (geodata). The increasing demand for data management and interoperable applications of geomodels can be addressed by developing standard-based exchange formats for the representation of not only a single geological object, but also holistic geomodels. However, current standards such as GeoSciML cannot incorporate all the geomodel-related information. This paper presents Geo3DML for the exchange of 3D geomodels based on the existing Open Geospatial Consortium (OGC) standards. Geo3DML is based on a unified and formal representation of structural models, attribute models and hierarchical structures of interpreted resultant geodata in different dimensional views, including drills, cross-sections/geomaps and 3D models, which is compatible with the conceptual model of GeoSciML. Geo3DML aims to encode all geomodel-related information integrally in one framework, including the semantic and geometric information of geoobjects and their relationships, as well as visual information. At present, Geo3DML and some supporting tools have been released as a data-exchange standard by the China Geological Survey (CGS).

  17. Knowledge transfer and exchange processes for environmental health issues in Canadian Aboriginal communities.

    PubMed

    Jack, Susan M; Brooks, Sandy; Furgal, Chris M; Dobbins, Maureen

    2010-02-01

    Within Canadian Aboriginal communities, the process for utilizing environmental health research evidence in the development of policies and programs is not well understood. This fundamental qualitative descriptive study explored the perceptions of 28 environmental health researchers, senior external decision-makers and decision-makers working within Aboriginal communities about factors influencing knowledge transfer and exchange, beliefs about research evidence and Traditional Knowledge and the preferred communication channels for disseminating and receiving evidence. The results indicate that collaborative relationships between researchers and decision-makers, initiated early and maintained throughout a research project, promote both the efficient conduct of a study and increase the likelihood of knowledge transfer and exchange. Participants identified that empirical research findings and Traditional Knowledge are different and distinct types of evidence that should be equally valued and used where possible to provide a holistic understanding of environmental issues and support decisions in Aboriginal communities. To facilitate the dissemination of research findings within Aboriginal communities, participants described the elements required for successfully crafting key messages, locating and using credible messengers to deliver the messages, strategies for using cultural brokers and identifying the communication channels commonly used to disseminate and receive this type of information.

  18. Knowledge Transfer and Exchange Processes for Environmental Health Issues in Canadian Aboriginal Communities

    PubMed Central

    Jack, Susan M.; Brooks, Sandy; Furgal, Chris M.; Dobbins, Maureen

    2010-01-01

    Within Canadian Aboriginal communities, the process for utilizing environmental health research evidence in the development of policies and programs is not well understood. This fundamental qualitative descriptive study explored the perceptions of 28 environmental health researchers, senior external decision-makers and decision-makers working within Aboriginal communities about factors influencing knowledge transfer and exchange, beliefs about research evidence and Traditional Knowledge and the preferred communication channels for disseminating and receiving evidence. The results indicate that collaborative relationships between researchers and decision-makers, initiated early and maintained throughout a research project, promote both the efficient conduct of a study and increase the likelihood of knowledge transfer and exchange. Participants identified that empirical research findings and Traditional Knowledge are different and distinct types of evidence that should be equally valued and used where possible to provide a holistic understanding of environmental issues and support decisions in Aboriginal communities. To facilitate the dissemination of research findings within Aboriginal communities, participants described the elements required for successfully crafting key messages, locating and using credible messengers to deliver the messages, strategies for using cultural brokers and identifying the communication channels commonly used to disseminate and receive this type of information. PMID:20616996

  19. Application of Coalition Battle Management Language (C-BML) and C-BML Services to Live, Virtual, and Constructive (LVC) Simulation Environments

    DTIC Science & Technology

    2011-12-01

    Task Based Approach to Planning.” Paper 08F-SIW-033. In Proceedings of the Fall Simulation Interoperability Workshop. Simulation Interoperability... Paper 06F-SIW-003. In Proceedings of the Fall Simulation Interoperability Workshop. Simulation Interoperability Standards Organi... MSDL).” Paper 10S-SIW-003. In Proceedings of the Spring Simulation Interoperability Workshop. Simulation Interoperability Standards Organization

  20. Maturity model for enterprise interoperability

    NASA Astrophysics Data System (ADS)

    Guédria, Wided; Naudet, Yannick; Chen, David

    2015-01-01

    Historically, progress occurs when entities communicate, share information and together create something that no one individually could do alone. Moving beyond people to machines and systems, interoperability is becoming a key factor of success in all domains. In particular, interoperability has become a challenge for enterprises, to exploit market opportunities, to meet their own objectives of cooperation or simply to survive in a growing competitive world where the networked enterprise is becoming a standard. Within this context, many research works have been conducted over the past few years and enterprise interoperability has become an important area of research, ensuring the competitiveness and growth of European enterprises. Among others, enterprises have to control their interoperability strategy and enhance their ability to interoperate. This is the purpose of the interoperability assessment. Assessing interoperability maturity allows a company to know its strengths and weaknesses in terms of interoperability with its current and potential partners, and to prioritise actions for improvement. The objective of this paper is to define a maturity model for enterprise interoperability that takes into account existing maturity models while extending the coverage of the interoperability domain. The assessment methodology is also presented. Both are demonstrated with a real case study.

  1. Communication security in open health care networks.

    PubMed

    Blobel, B; Pharow, P; Engel, K; Spiegel, V; Krohn, R

    1999-01-01

    To fulfil the shared care paradigm, health care networks providing open systems interoperability are needed. Such communicating and co-operating health information systems, dealing with sensitive personal medical information across organisational, regional, national or even international boundaries, require appropriate security solutions. Within the European MEDSEC project, an open approach for secure EDI such as HL7, EDIFACT, XDT or XML has been developed, based on the generic security model. The approach covers both securing the message itself in an insecure network and transporting the unprotected information via secure channels (SSL, TLS, etc.). Regarding EDI, an open and widely usable security solution has been specified and practically implemented for the examples of secure mailing and secure file transfer (FTP), by wrapping the sensitive information expressed in the corresponding protocols. The results are currently being prepared for standardisation.

  2. 7 CFR 2.89 - Chief Information Officer.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... continue, modify, or terminate an information technology program or project. (3) Provide advice and other... computer-based systems for message exchange, scheduling, computer conferencing, televideo technologies, and... removal or replacement of information technology project managers, when, in the opinion of the Chief...

  3. 7 CFR 2.24 - Assistant Secretary for Administration.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... determining whether to continue, modify, or terminate an information technology program or project. (iii... technology to improve productivity in the Department. (P) Plan, develop, install, and operate computer-based systems for message exchange, scheduling, computer conferencing, televideo technologies, and other...

  4. 7 CFR 2.89 - Chief Information Officer.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... continue, modify, or terminate an information technology program or project. (3) Provide advice and other... computer-based systems for message exchange, scheduling, computer conferencing, televideo technologies, and... removal or replacement of information technology project managers, when, in the opinion of the Chief...

  5. Asynchronous Data Retrieval from an Object-Oriented Database

    NASA Astrophysics Data System (ADS)

    Gilbert, Jonathan P.; Bic, Lubomir

    We present an object-oriented semantic database model which, similar to other object-oriented systems, combines the virtues of four concepts: the functional data model, a property inheritance hierarchy, abstract data types and message-driven computation. The main emphasis is on the last of these four concepts. We describe generic procedures that permit queries to be processed in a purely message-driven manner. A database is represented as a network of nodes and directed arcs, in which each node is a logical processing element, capable of communicating with other nodes by exchanging messages. This eliminates the need for shared memory and for centralized control during query processing. Hence, the model is suitable for implementation on a multiprocessor computer architecture, consisting of large numbers of loosely coupled processing elements.
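
    The message-driven scheme this record describes can be sketched as follows: each node is an independent processing element with a private mailbox, and nodes cooperate only by passing messages along directed arcs, with no shared memory and no central controller. This is a minimal illustrative sketch under those assumptions; the class and arc names are hypothetical, not from the paper.

```python
from collections import deque

class Node:
    """A logical processing element in the database network."""
    def __init__(self, name, value=None):
        self.name = name
        self.value = value
        self.arcs = []          # outgoing (label, target_node) arcs
        self.mailbox = deque()  # messages waiting to be processed

    def connect(self, label, target):
        self.arcs.append((label, target))

    def deliver(self, path):
        self.mailbox.append(path)

    def step(self, results):
        """Process at most one queued message."""
        if not self.mailbox:
            return
        path = self.mailbox.popleft()
        if not path:                      # query path exhausted here:
            results.append((self.name, self.value))  # report local value
            return
        for label, target in self.arcs:   # forward along matching arcs
            if label == path[0]:
                target.deliver(path[1:])

def run_query(nodes, root, path):
    """Inject a query at the root and step all nodes until quiescence."""
    results = []
    root.deliver(list(path))
    while any(n.mailbox for n in nodes):
        for n in nodes:
            n.step(results)
    return results

# Tiny network: alice --works_in--> sales --located_in--> london
alice = Node("alice")
sales = Node("sales")
london = Node("london", "London, UK")
alice.connect("works_in", sales)
sales.connect("located_in", london)
answer = run_query([alice, sales, london], alice, ["works_in", "located_in"])
# answer: [('london', 'London, UK')]
```

    Because each `step` touches only one node's local state, the nodes could in principle be mapped onto loosely coupled processors, as the abstract suggests.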

  6. Comparison of a semi-automatic annotation tool and a natural language processing application for the generation of clinical statement entries.

    PubMed

    Lin, Ching-Heng; Wu, Nai-Yuan; Lai, Wei-Shao; Liou, Der-Ming

    2015-01-01

    Electronic medical records with encoded entries should enhance the semantic interoperability of document exchange. However, it remains a challenge to encode the narrative concept and to transform the coded concepts into a standard entry-level document. This study aimed to use a novel approach for the generation of entry-level interoperable clinical documents. Using HL7 clinical document architecture (CDA) as the example, we developed three pipelines to generate entry-level CDA documents. The first approach was a semi-automatic annotation pipeline (SAAP), the second was a natural language processing (NLP) pipeline, and the third merged the above two pipelines. We randomly selected 50 test documents from the i2b2 corpora to evaluate the performance of the three pipelines. The 50 randomly selected test documents contained 9365 words, including 588 Observation terms and 123 Procedure terms. For the Observation terms, the merged pipeline had a significantly higher F-measure than the NLP pipeline (0.89 vs 0.80, p<0.0001), but a similar F-measure to that of the SAAP (0.89 vs 0.87). For the Procedure terms, the F-measure was not significantly different among the three pipelines. The combination of a semi-automatic annotation approach and the NLP application seems to be a solution for generating entry-level interoperable clinical documents. © The Author 2014. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com
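
    For reference, the F-measure used to compare the pipelines is the standard balanced F-measure (F1), the harmonic mean of precision and recall; the figures below are illustrative, not taken from the study.

```python
def f_measure(precision: float, recall: float) -> float:
    """Balanced F-measure (F1): harmonic mean of precision and recall."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# e.g. a pipeline with precision 0.92 and recall 0.86 scores about 0.89
score = round(f_measure(0.92, 0.86), 2)
```
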

  7. Enhanced semantic interoperability by profiling health informatics standards.

    PubMed

    López, Diego M; Blobel, Bernd

    2009-01-01

    Several standards applied to the healthcare domain support semantic interoperability. These standards are far from being completely adopted in health information system development, however. The objective of this paper is to provide a method and suggest the necessary tooling for reusing standard health information models, by that way supporting the development of semantically interoperable systems and components. The approach is based on the definition of UML Profiles. UML profiling is a formal modeling mechanism to specialize reference meta-models in such a way that it is possible to adapt those meta-models to specific platforms or domains. A health information model can be considered as such a meta-model. The first step of the introduced method identifies the standard health information models and tasks in the software development process in which healthcare information models can be reused. Then, the selected information model is formalized as a UML Profile. That Profile is finally applied to system models, annotating them with the semantics of the information model. The approach is supported on Eclipse-based UML modeling tools. The method is integrated into a comprehensive framework for health information systems development, and the feasibility of the approach is demonstrated in the analysis, design, and implementation of a public health surveillance system, reusing HL7 RIM and DIMs specifications. The paper describes a method and the necessary tooling for reusing standard healthcare information models. UML offers several advantages such as tooling support, graphical notation, exchangeability, extensibility, semi-automatic code generation, etc. The approach presented is also applicable for harmonizing different standard specifications.

  8. Designing a Distributed Space Systems Simulation in Accordance with the Simulation Interoperability Standards Organization (SISO)

    NASA Technical Reports Server (NTRS)

    Cowen, Benjamin

    2011-01-01

    Simulations are essential for engineering design. These virtual realities provide characteristic data to scientists and engineers in order to understand the details and complications of the desired mission. A standard development simulation package known as Trick is used to develop source code to model a component (a federate, in HLA terms). The runtime executive is integrated into an HLA-based distributed simulation. TrickHLA is used to extend a Trick simulation for federation execution, develop source code for communication between federates, and foster data input and output. The project incorporates international cooperation along with team collaboration. Interactions among federates occur throughout the simulation, thereby relying on simulation interoperability. Participants communicated throughout the semester to work out how to implement this data exchange. The NASA intern team is designing a Lunar Rover federate and a Lunar Shuttle federate. The Lunar Rover federate supports transportation across the lunar surface and is essential for fostering interactions with other federates on the lunar surface (Lunar Shuttle, Lunar Base Supply Depot and Mobile ISRU Plant) as well as transporting materials to the desired locations. The Lunar Shuttle federate transports materials to and from lunar orbit. Materials that it takes to the supply depot include fuel and cargo necessary to continue moon-base operations. This project analyzes modeling and simulation technologies as well as simulation interoperability. Each team from participating universities will work on and engineer their own federate(s) to participate in the SISO Spring 2011 Workshop SIW Smackdown in Boston, Massachusetts. This paper will focus on the Lunar Rover federate.

  9. NCC Simulation Model: Simulating the operations of the network control center, phase 2

    NASA Technical Reports Server (NTRS)

    Benjamin, Norman M.; Paul, Arthur S.; Gill, Tepper L.

    1992-01-01

    The simulation of the network control center (NCC) is in the second phase of development. This phase seeks to further develop the work performed in phase one. Phase one concentrated on the computer systems and interconnecting network. The focus of phase two will be the implementation of the network message dialogues and the resources controlled by the NCC. These resources are requested, initiated, monitored and analyzed via network messages. In the NCC, network messages are presented in the form of packets that are routed across the network. These packets are generated, encoded, decoded and processed by the network host processors that generate and service the message traffic on the network that connects these hosts. As a result, the message traffic is used to characterize the work done by the NCC and the connected network. Phase one of the model development represented the NCC as a network of bi-directional single-server queues and message-generating sources. The generators represented the external segment processors. The server-based queues represented the host processors. The NCC model consists of the internal and external processors which generate message traffic on the network that links these hosts. To fully realize the objective of phase two, it is necessary to identify and model the processes in each internal processor. These processes live in the operating system of the internal host computers and handle tasks such as high-speed message exchange, ISN and NFE interfacing, event monitoring, network monitoring, and message logging. Interprocess communication is achieved through the operating system facilities. The overall performance of the host is determined by its ability to service messages generated by both internal and external processors.

  10. A step-by-step methodology for enterprise interoperability projects

    NASA Astrophysics Data System (ADS)

    Chalmeta, Ricardo; Pazos, Verónica

    2015-05-01

    Enterprise interoperability is one of the key factors for enhancing enterprise competitiveness. Achieving enterprise interoperability is an extremely complex process which involves different technological, human and organisational elements. In this paper we present a framework to support enterprise interoperability. The framework has been developed taking into account the three domains of interoperability: Enterprise Modelling, Architecture and Platform, and Ontologies. The main novelty of the framework in comparison to existing ones is that it includes a step-by-step methodology that explains how to carry out an enterprise interoperability project taking into account different interoperability views, such as business, process, human resources, technology, knowledge and semantics.

  11. In Pursuit of Theoretical Ground in Behavior Change Support Systems: Analysis of Peer-to-Peer Communication in a Health-Related Online Community.

    PubMed

    Myneni, Sahiti; Cobb, Nathan; Cohen, Trevor

    2016-02-02

    Research studies involving health-related online communities have focused on examining network structure to understand mechanisms underlying behavior change. Content analysis of the messages exchanged in these communities has been limited to the "social support" perspective. However, existing behavior change theories suggest that message content plays a prominent role reflecting several sociocognitive factors that affect an individual's efforts to make a lifestyle change. An understanding of these factors is imperative to identify and harness the mechanisms of behavior change in the Health 2.0 era. The objective of this work is two-fold: (1) to harness digital communication data to capture essential meaning of communication and factors affecting a desired behavior change, and (2) to understand the applicability of existing behavior change theories to characterize peer-to-peer communication in online platforms. In this paper, we describe grounded theory-based qualitative analysis of digital communication in QuitNet, an online community promoting smoking cessation. A database of 16,492 de-identified public messages from 1456 users from March 1-April 30, 2007, was used in our study. We analyzed 795 messages using grounded theory techniques to ensure thematic saturation. This analysis enabled identification of key concepts contained in the messages exchanged by QuitNet members, allowing us to understand the sociobehavioral intricacies underlying an individual's efforts to cease smoking in a group setting. We further ascertained the relevance of the identified themes to theoretical constructs in existing behavior change theories (eg, Health Belief Model) and theoretically linked techniques of behavior change taxonomy. We identified 43 different concepts, which were then grouped under 12 themes based on analysis of 795 messages. Examples of concepts include "sleepiness," "pledge," "patch," "spouse," and "slip." 
Examples of themes include "traditions," "social support," "obstacles," "relapse," and "cravings." Results indicate that themes consisting of member-generated strategies such as "virtual bonfires" and "pledges" were related to the highest number of theoretical constructs from the existing behavior change theories. In addition, results indicate that the member-generated communication content supports sociocognitive constructs from more than one behavior change model, unlike the majority of the existing theory-driven interventions. With the onset of mobile phones and ubiquitous Internet connectivity, online social network data reflect the intricacies of human health behavior as experienced by health consumers in real time. This study offers methodological insights for qualitative investigations that examine the various kinds of behavioral constructs prevalent in the messages exchanged among users of online communities. Theoretically, this study establishes the manifestation of existing behavior change theories in QuitNet-like online health communities. Pragmatically, it sets the stage for real-time, data-driven sociobehavioral interventions promoting healthy lifestyle modifications by allowing us to understand the emergent user needs to sustain a desired behavior change.

  12. A change of course: The importance to DoD of international standards for electronic commerce

    NASA Astrophysics Data System (ADS)

    Payne, Judith E.

    1991-12-01

    The U.S. Department of Defense (DoD) is committed to using electronic commerce in the future with the over 300,000 vendors interested in doing business with DoD. Electronic commerce will move DoD from a paper-based world to one based on electronic transactions enabled by the exchange of formatted, electronic messages referred to as electronic data interchange (EDI). With electronic commerce, DoD plans to reduce costs, increase effectiveness, and make it easier for vendors to deal with DoD. Benefits from electronic commerce are enhanced when many businesses use the same standards for EDI messages themselves and their transmission. The fewer standards used, the less time and resources must be spent translating messages and agreeing on how to use different standards. To enhance benefits and smooth the transition to electronic commerce for itself and its vendors, DoD has chosen to use the widely accepted American National Standards Institute (ANSI) X12 standards for EDI messages, coupled with international standards for delivering messages and organizing addresses. In the past 18 months, EDI standards sponsored by a United Nations body and serving the same purpose as ANSI X12 message standards have begun to gain wider acceptance internationally.

  13. System requirements and benefits of a terminal information system for the Kansas City railroads

    DOT National Transportation Integrated Search

    1976-08-31

    The Kansas City Terminal Railway Company proposed that the Federal Railroad Administration assist in funding the implementation of a Terminal Information and Message Exchange system (TIME), designed to enhance the operations of the twelve railroads i...

  14. Microcomputer Data Management in an Introductory Physics Laboratory.

    ERIC Educational Resources Information Center

    Chonacky, Norman

    1982-01-01

    Discusses the use of a microcomputer/mini-floppy disk system by physics students to store and analyze experimental data and exchange messages with the lab instructor. Also discusses other uses, in particular those fostering critical thinking and hypothetico-deductive reasoning. (Author/SK)

  15. Self-Regulated Learning in Virtual Communities

    ERIC Educational Resources Information Center

    Delfino, Manuela; Dettori, Giuliana; Persico, Donatella

    2008-01-01

    This paper investigates self-regulated learning (SRL) in a virtual learning community of adults interacting through asynchronous textual communication. The investigation method chosen is interaction analysis, a qualitative/quantitative approach allowing a systematic study of the contents of the messages exchanged within online communities. The…

  16. 36 CFR § 1222.28 - What are the series level recordkeeping requirements?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... AND RECORDS ADMINISTRATION RECORDS MANAGEMENT CREATION AND MAINTENANCE OF FEDERAL RECORDS Agency... series and systems adequately document agency policies, transactions, and activities, each program must... documentation of phone calls, meetings, instant messages, and electronic mail exchanges that include substantive...

  17. PharmML in Action: an Interoperable Language for Modeling and Simulation.

    PubMed

    Bizzotto, R; Comets, E; Smith, G; Yvon, F; Kristensen, N R; Swat, M J

    2017-10-01

    PharmML is an XML-based exchange format created with a focus on nonlinear mixed-effect (NLME) models used in pharmacometrics, but providing a very general framework that also allows describing mathematical and statistical models such as single-subject or nonlinear and multivariate regression models. This tutorial provides an overview of the structure of this language, brief suggestions on how to work with it, and use cases demonstrating its power and flexibility. © 2017 The Authors CPT: Pharmacometrics & Systems Pharmacology published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.

  18. The Study on Collaborative Manufacturing Platform Based on Agent

    NASA Astrophysics Data System (ADS)

    Zhang, Xiao-yan; Qu, Zheng-geng

    To address the trend toward knowledge-intensive collaborative manufacturing development, we describe a multi-agent architecture supporting a knowledge-based collaborative manufacturing development platform. By virtue of the wrapper services and communication capabilities that agents provide, the proposed architecture facilitates the organization and collaboration of multi-disciplinary individuals and tools. By effectively supporting the formal representation, capture, retrieval and reuse of manufacturing knowledge, the generalized knowledge repository, based on an ontology library, enables engineers to exchange information meaningfully and pass knowledge across boundaries. Intelligent agent technology increases the efficiency and interoperability of traditional KBE systems and provides comprehensive design environments for engineers.

  19. An ontology-based framework for bioinformatics workflows.

    PubMed

    Digiampietri, Luciano A; Perez-Alcazar, Jose de J; Medeiros, Claudia Bauzer

    2007-01-01

    The proliferation of bioinformatics activities brings new challenges - how to understand and organise these resources, how to exchange and reuse successful experimental procedures, and to provide interoperability among data and tools. This paper describes an effort toward these directions. It is based on combining research on ontology management, AI and scientific workflows to design, reuse and annotate bioinformatics experiments. The resulting framework supports automatic or interactive composition of tasks based on AI planning techniques and takes advantage of ontologies to support the specification and annotation of bioinformatics workflows. We validate our proposal with a prototype running on real data.

  20. Current Efforts in European Projects to Facilitate the Sharing of Scientific Observation Data

    NASA Astrophysics Data System (ADS)

    Bredel, Henning; Rieke, Matthes; Maso, Joan; Jirka, Simon; Stasch, Christoph

    2017-04-01

    This presentation is intended to provide an overview of currently ongoing efforts in European projects to facilitate and promote the interoperable sharing of scientific observation data. This will be illustrated through two examples: a prototypical portal developed in the ConnectinGEO project for matching available (in-situ) data sources to the needs of users and a joint activity of several research projects to harmonise the usage of the OGC Sensor Web Enablement standards for providing access to marine observation data. ENEON is an activity initiated by the European ConnectinGEO project to coordinate in-situ Earth observation networks with the aim to harmonise the access to observations, improve discoverability, and identify/close gaps in European earth observation data resources. In this context, ENEON commons has been developed as a supporting Web portal for facilitating discovery, access, re-use and creation of knowledge about observations, networks, and related activities (e.g. projects). The portal is based on developments resulting from the European WaterInnEU project and has been extended to cover the requirements for handling knowledge about in-situ earth observation networks. A first prototype of the portal was completed in January 2017; it offers functionality for interactive discussion, information exchange and querying information about data delivered by different observation networks. Within this presentation, we will introduce this prototype and initiate a discussion about potential future work directions. The second example concerns the harmonisation of data exchange in the marine domain. There are many organisations that operate ocean observatories or data archives. In recent years, the application of the OGC Sensor Web Enablement (SWE) technology has become more and more popular to increase the interoperability between marine observation networks. 
However, as the SWE standards were intentionally designed in a domain-independent manner, there is still a significant degree of freedom in how the same information can be handled within the SWE framework. Thus, further domain-specific agreements are necessary to describe more precisely how SWE standards shall be applied in specific contexts. Within this presentation we will report the current status of the marine SWE profiles initiative, which has the aim of developing guidance and recommendations for the application of SWE standards for ocean observation data. This initiative, which is supported by projects such as NeXOS, FixO3, ODIP 2, BRIDGES and SeaDataCloud, has already led to first results, which will be introduced in the proposed presentation. In summary, we will introduce two building blocks showing how earth observation networks can be coordinated: intelligent portal solutions that ensure better discoverability, and dedicated domain profiles of Sensor Web standards that ensure a common, interoperable exchange of the collected data.

  1. OOI CyberInfrastructure - Next Generation Oceanographic Research

    NASA Astrophysics Data System (ADS)

    Farcas, C.; Fox, P.; Arrott, M.; Farcas, E.; Klacansky, I.; Krueger, I.; Meisinger, M.; Orcutt, J.

    2008-12-01

    Software has become a key enabling technology for scientific discovery, observation, modeling, and exploitation of natural phenomena. New value emerges from the integration of individual subsystems into networked federations of capabilities exposed to the scientific community. Such data-intensive interoperability networks are crucial for future scientific collaborative research, as they open up new ways of fusing data from different sources and across various domains, and analysis on wide geographic areas. The recently established NSF OOI program, through its CyberInfrastructure component addresses this challenge by providing broad access from sensor networks for data acquisition up to computational grids for massive computations and binding infrastructure facilitating policy management and governance of the emerging system-of-scientific-systems. We provide insight into the integration core of this effort, namely, a hierarchic service-oriented architecture for a robust, performant, and maintainable implementation. We first discuss the relationship between data management and CI crosscutting concerns such as identity management, policy and governance, which define the organizational contexts for data access and usage. Next, we detail critical services including data ingestion, transformation, preservation, inventory, and presentation. To address interoperability issues between data represented in various formats we employ a semantic framework derived from the Earth System Grid technology, a canonical representation for scientific data based on DAP/OPeNDAP, and related data publishers such as ERDDAP. Finally, we briefly present the underlying transport based on a messaging infrastructure over the AMQP protocol, and the preservation based on a distributed file system through SDSC iRODS.

  2. Interoperability in digital electrocardiography: harmonization of ISO/IEEE x73-PHD and SCP-ECG.

    PubMed

    Trigo, Jesús D; Chiarugi, Franco; Alesanco, Alvaro; Martínez-Espronceda, Miguel; Serrano, Luis; Chronaki, Catherine E; Escayola, Javier; Martínez, Ignacio; García, José

    2010-11-01

    The ISO/IEEE 11073 (x73) family of standards is a reference frame for medical device interoperability. A draft for an ECG device specialization (ISO/IEEE 11073-10406-d02) has already been presented to the Personal Health Device (PHD) Working Group, and the Standard Communications Protocol for Computer-Assisted ElectroCardioGraphy (SCP-ECG) Standard for short-term diagnostic ECGs (EN1064:2005+A1:2007) has recently been approved as part of the x73 family (ISO 11073-91064:2009). These factors suggest the coordinated use of these two standards in foreseeable telecardiology environments, and hence the need to harmonize them. Such harmonization is the subject of this paper. Thus, a mapping of the mandatory attributes defined in the second draft of the ISO/IEEE 11073-10406-d02 and the minimum SCP-ECG fields is presented, and various other capabilities of the SCP-ECG Standard (such as the messaging part) are also analyzed from an x73-PHD point of view. As a result, this paper addresses and analyzes the implications of some inconsistencies in the coordinated use of these two standards. Finally, a proof-of-concept implementation of the draft x73-PHD ECG device specialization is presented, along with the conversion from x73-PHD to SCP-ECG. This paper, therefore, provides recommendations for future implementations of telecardiology systems that are compliant with both x73-PHD and SCP-ECG.

  3. Performance measurement integrated information framework in e-Manufacturing

    NASA Astrophysics Data System (ADS)

    Teran, Hilaida; Hernandez, Juan Carlos; Vizán, Antonio; Ríos, José

    2014-11-01

    The implementation of Internet technologies has led to e-Manufacturing technologies becoming more widely used and to the development of tools for compiling, transforming and synchronising manufacturing data through the Web. In this context, a potential area for development is the extension of virtual manufacturing to performance measurement (PM) processes, a critical area for decision making and implementing improvement actions in manufacturing. This paper proposes a PM information framework to integrate decision support systems in e-Manufacturing. Specifically, the proposed framework offers a homogeneous PM information exchange model that can be applied to decision support in an e-Manufacturing environment. Its application improves the interoperability needed in decision-making data processing tasks. It comprises three sub-systems: a data model, a PM information platform and a PM-Web services architecture. A practical example of data exchange for measurement processes in the area of equipment maintenance is shown to demonstrate the utility of the model.

  4. Improving EHR Capabilities to Facilitate Stage 3 Meaningful Use Care Coordination Criteria.

    PubMed

    Cross, Dori A; Cohen, Genna R; Nong, Paige; Day, Anya-Victoria; Vibbert, Danielle; Naraharisetti, Ramya; Adler-Milstein, Julia

    Primary care practices have been limited in their ability to leverage electronic health records (EHRs) and health information exchange (HIE) to improve care coordination, but will soon be incentivized to do so under proposed Stage 3 meaningful use criteria. We use mixed methods to understand how primary care practices manage, share and reconcile electronic patient information across care settings, and identify innovations in EHR design to support enhanced care coordination. Opportunities identified by practices focused on availability and usability of features that facilitate (1) generation of customized summary of care records, (2) team-based care approaches, and (3) management of the increased volume of electronic information generated and exchanged during care transitions. More broadly, vendors and policymakers need to continue to work together to improve interoperability as the key to effective care coordination. If these EHR innovations were widespread, the value of meeting the proposed Stage 3 care coordination criteria would be substantially enhanced.

  5. A reference architecture for integrated EHR in Colombia.

    PubMed

    de la Cruz, Edgar; Lopez, Diego M; Uribe, Gustavo; Gonzalez, Carolina; Blobel, Bernd

    2011-01-01

    The implementation of national EHR infrastructures has to start with a detailed definition of the overall structure and behavior of the EHR system (system architecture). Architectures have to be open, scalable, flexible, user accepted and user friendly, trustworthy, and based on standards including terminologies and ontologies. The GCM provides an architectural framework created with the purpose of analyzing any kind of system, including EHR system architectures. The objective of this paper is to propose a reference architecture for the implementation of an integrated EHR in Colombia, based on the current state of system architectural models and EHR standards. The proposed EHR architecture defines a set of services (elements) and their interfaces to support the exchange of clinical documents, offering an open, scalable, flexible and semantically interoperable infrastructure. The architecture was tested in a pilot tele-consultation project in Colombia, where dental EHRs are exchanged.

  6. MEETING REPORT: OMG Technical Committee Meeting in Orlando, FL, sees significant enhancement to CORBA

    NASA Astrophysics Data System (ADS)

    1998-06-01

    The Object Management Group (OMG) Platform Technology Committee (PTC) ratified its support for a new asynchronous messaging service for CORBA at OMG's recent Technical Committee Meeting in Orlando, FL. The meeting, held from 8 - 12 June, saw the PTC send the Messaging Service out for a final vote among the OMG membership. The Messaging Service, which will integrate Message Oriented Middleware (MOM) with CORBA, will give CORBA a true asynchronous messaging capability - something of great interest to users and developers. Formal adoption of the specification will most likely occur by the end of the year. The Messaging Service, when adopted, will be the world's first standard for Message Oriented Middleware and will give CORBA a true asynchronous messaging capability. Asynchronous messaging allows developers to build simpler, richer client environments. With asynchronous messaging there is less need for multi-threaded clients because the Asynchronous Method Invocation is non-blocking, meaning the client thread can continue work while the application waits for a reply. David Curtis, Director of Platform Technology for OMG, said: "This messaging service is one of the more valuable additions to CORBA. It enhances CORBA's existing asynchronous messaging capabilities, which is a feature of many popular message oriented middleware products. This service will allow better integration between ORBs and MOM products. This enhanced messaging capability will only make CORBA more valuable for builders of distributed object systems." The Messaging Service is one of sixteen technologies currently being worked on by the PTC. Additionally, seventeen Revision Task Forces (RTFs) are working on keeping OMG specifications up to date. The purpose of these Revision Task Forces is to take input from the implementors of OMG specifications and clarify or make necessary changes based on the implementor's input. 
The RTFs also ensure that the specifications remain up to date with changes in the OMA and with industry advances in general. Thirty-eight technology processes are ongoing in the Domain Technology Committee (DTC). These range over a wide variety of industries, including healthcare, telecommunications, life sciences, manufacturing, business objects, electronic commerce, finance, transportation, utilities, and distributed simulation. These processes aim to enhance CORBA's value and provide interoperability for specific vertical industries. At the Orlando meeting, the Domain Technology Committee issued the following requests to industry: Telecom Wireless Access Request For Information (RFI); Statistics RFI; Clinical Image Access Service Request For Proposal (RFP); Distributed Simulation Request For Comment (RFC). The newly-formed Statistics group at OMG plans to standardize interfaces for Statistical Services in CORBA, and their RFI, to which any person or company can respond, asks for input and guidance as they start this work, which will impact the broad spectrum of industries and processes that use statistics. The Clinical Image Access Service will standardize access to important medical images including digital x-rays, MRI scans, and other formats. The Distributed Simulation RFC, when complete, will establish the Distributed Simulation High-Level Architecture of the US Defense Military Simulation Office as an OMG standard. For the next 90 days any person or company, not only OMG members, may submit their comments on the submission. The OMG looks forward to its next meeting, to be held in Helsinki, Finland, on 27 - 31 July and hosted by Nokia. OMG encourages anyone considering OMG membership to attend the meeting as a guest. For more information on attending call +1-508-820-4300 or e-mail info@omg.org. 
Note: descriptions for all RFPs, RFIs and RFCs in progress are available for viewing on the OMG Website at http://www.omg.org/schedule.htm, or contact OMG for a copy of the "Work in Progress" document. For more information on the OMG Technology Process please call Jeurgen Boldt, OMG Process Manager, at +1-508-820-4300 or email jeurgen@omg.org.
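    The non-blocking behavior described for Asynchronous Method Invocation can be illustrated outside CORBA with a future-based call, where the client thread keeps working while the reply is pending (the remote call here is a local stand-in, not a CORBA API):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def remote_call(x):
    """Stand-in for a remote object invocation with network latency."""
    time.sleep(0.05)
    return x * 2

with ThreadPoolExecutor(max_workers=1) as pool:
    # Asynchronous invocation: submit() returns a future immediately,
    # so the single client thread is free to continue its own work...
    future = pool.submit(remote_call, 21)
    other_work = sum(range(10))
    # ...and collects the reply only when it actually needs it.
    reply = future.result()
```

    This is why asynchronous messaging reduces the need for multi-threaded clients: one thread can interleave local work with pending remote replies.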

  7. Evidence-Based Support for the Characteristics of Tsunami Warning Messages for Local, Regional and Distant Sources

    NASA Astrophysics Data System (ADS)

    Gregg, C. E.; Johnston, D. M.; Sorensen, J. H.; Vogt Sorensen, B.; Whitmore, P.

    2014-12-01

    Many studies since 2004 have documented the dissemination and receipt of risk information for local to distant tsunamis and factors influencing people's responses. A few earlier tsunami studies and numerous studies of other hazards provide additional support for developing effective tsunami messages. This study explores evidence-based approaches to developing such messages for the Pacific and National Tsunami Warning Centers in the US. It extends a message metric developed for the NWS Tsunami Program. People at risk to tsunamis receive information from multiple sources through multiple channels. Sources are official and informal, and include environmental and social cues. Traditionally, official tsunami messages followed a linear dissemination path through relatively few channels from warning center to emergency management to public and media. However, the digital age has brought about a fundamental change in the dissemination and receipt of official and informal communications. Information is now disseminated in very non-linear paths and all end-user groups may receive the same message simultaneously. Research has demonstrated a range of factors that influence rapid response to an initial real or perceived threat. Immediate response is less common than one involving delayed protective actions where people first engage in "milling behavior" to exchange information and confirm the warning before taking protective action. The most important message factors to achieve rapid response focus on the content and style of the message and the frequency of dissemination. Previously we developed a tsunami message metric consisting of 21 factors divided into message content and style and receiver characteristics. Initially, each factor was equally weighted to identify gaps, but here we extend the work by weighting specific factors. This utilizes recent research that identifies the most important determinants of protective action. 
We then discuss the prioritization of message information in the context of potentially limited space in evolving tsunami messages issued by the warning centers.

  8. Self-Organized Link State Aware Routing for Multiple Mobile Agents in Wireless Network

    NASA Astrophysics Data System (ADS)

    Oda, Akihiro; Nishi, Hiroaki

    Recently, the importance of data sharing structures in autonomous distributed networks has been increasing. A wireless sensor network is used for managing distributed data. This type of distributed network requires effective information exchange methods for data sharing. To reduce the traffic of broadcast messages, reduction of the amount of redundant information is indispensable. In order to reduce packet loss in mobile ad-hoc networks, QoS-sensitive routing algorithms have been discussed frequently. The topology of a wireless network is likely to change frequently according to the movement of mobile nodes, radio disturbance, or fading due to continuous changes in the environment. Therefore, a packet routing algorithm should guarantee QoS by using some quality indicators of the wireless network. In this paper, a novel information exchange algorithm using a hash function and a Boolean operation is proposed. This algorithm achieves efficient information exchanges by reducing the overhead of broadcast messages, and it can guarantee QoS in a wireless network environment. It can be applied to a routing algorithm in a mobile ad-hoc network. In the proposed routing algorithm, a routing table is constructed by using the received signal strength indicator (RSSI), and the neighborhood information is periodically broadcast depending on this table. The proposed hash-based routing entry management using an extended MAC address can eliminate the overhead of message flooding. An analysis of hash-value collisions contributes to determining the minimally required length of the hash values. Based on the verification of a mathematical theory, an optimum hash function for determining the length of hash values can be given. Simulations are carried out to evaluate the effectiveness of the proposed algorithm and to validate the theory in a general wireless network routing algorithm.
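    One way to picture a hash-and-Boolean exchange of neighborhood information is a Bloom-filter-like bit vector: each periodic broadcast carries a fixed-size summary instead of a full address list. This is an illustrative sketch under that assumption, not the paper's algorithm; the vector size M, the hash count K, and the use of SHA-256 are arbitrary choices here:

```python
import hashlib

M = 256        # summary size in bits: trades message size vs. collisions
K = 3          # hash positions set per node id

def bit_positions(node_id):
    """Yield K hash positions for a node id (SHA-256 is an arbitrary
    choice; the paper derives the required hash length analytically)."""
    for k in range(K):
        digest = hashlib.sha256(f"{k}:{node_id}".encode()).digest()
        yield int.from_bytes(digest[:4], "big") % M

def summarize(neighbor_ids):
    """Fold a neighbor set into one M-bit integer with Boolean OR, so a
    broadcast carries a fixed-size summary rather than an address list."""
    bits = 0
    for node_id in neighbor_ids:
        for pos in bit_positions(node_id):
            bits |= 1 << pos
    return bits

def maybe_member(bits, node_id):
    """Membership test: no false negatives, and a false-positive rate
    governed by the choice of M and K."""
    return all(bits >> pos & 1 for pos in bit_positions(node_id))
```

    The collision analysis mentioned in the abstract corresponds to choosing M and K so that false positives stay below an acceptable rate for the network size.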

  9. Using E-Mail across Computer Networks.

    ERIC Educational Resources Information Center

    Hazari, Sunil

    1990-01-01

    Discusses the use of telecommunications technology to exchange electronic mail, files, and messages across different computer networks. Networks highlighted include ARPA Internet; BITNET; USENET; FidoNet; MCI Mail; and CompuServe. Examples of the successful use of networks in higher education are given. (Six references) (LRW)

  10. Medical Device Plug-and-Play Interoperability Standards and Technology Leadership

    DTIC Science & Technology

    2017-10-01

    Award Number: W81XWH-09-1-0705. Title: "Medical Device Plug-and-Play Interoperability Standards and Technology Leadership." Principal Investigator: ... Reporting period: ...Sept 2016 – 20 Sept 2017. ...efficiency through interoperable medical technologies. We played a leadership role on interoperability safety standards (AAMI, AAMI/UL Joint

  11. Interactive Supercomputing’s Star-P Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edelman, Alan; Husbands, Parry; Leibman, Steve

    2006-09-19

    The thesis of this extended abstract is simple. High productivity comes from high level infrastructures. To measure this, we introduce a methodology that goes beyond the tradition of timing software in serial and tuned parallel modes. We perform a classroom productivity study involving 29 students who have written a homework exercise in a low level language (MPI message passing) and a high level language (Star-P with MATLAB client). Our conclusions indicate what perhaps should be of little surprise: (1) the high level language is always far easier on the students than the low level language. (2) The early versions of the high level language perform inadequately compared to the tuned low level language, but later versions substantially catch up. Asymptotically, the analogy must hold that message passing is to high level language parallel programming as assembler is to high level environments such as MATLAB, Mathematica, Maple, or even Python. We follow the Kepner method that correctly realizes that traditional speedup numbers without some discussion of the human cost of reaching these numbers can fail to reflect the true human productivity cost of high performance computing. Traditional data compares low level message passing with serial computation. With the benefit of a high level language system in place, in our case Star-P running with MATLAB client, and with the benefit of a large data pool: 29 students, each running the same code ten times on three evolutions of the same platform, we can methodically demonstrate the productivity gains. To date we are not aware of any high level system as extensive and interoperable as Star-P, nor are we aware of an experiment of this kind performed with this volume of data.

  12. CINTEX: International Interoperability Extensions to EOSDIS

    NASA Technical Reports Server (NTRS)

    Graves, Sara J.

    1997-01-01

    A large part of the research under this cooperative agreement involved working with representatives of the DLR, NASDA, EDC, and NOAA-SAA data centers to propose a set of enhancements and additions to the EOSDIS Version 0 Information Management System (V0 IMS) Client/Server Message Protocol. Helen Conover of ITSL led this effort to provide for an additional geographic search specification (WRS Path/Row), data set- and data center-specific search criteria, search by granule ID, specification of data granule subsetting requests, data set-based ordering, and the addition of URLs to result messages. The V0 IMS Server Cookbook is an evolving document, providing resources and information to data centers setting up a V0 IMS Server. Under this Cooperative Agreement, Helen Conover revised, reorganized, and expanded this document, and converted it to HTML. Ms. Conover has also worked extensively with the IRE RAS data center, CPSSI, in Russia. She served as the primary IMS contact for IRE-CPSSI and as IRE-CPSSI's liaison to other members of IMS and Web Gateway (WG) development teams. Her documentation of IMS problems in the IRE environment (Sun servers and low network bandwidth) led to a general restructuring of the V0 IMS Client message polling system, to the benefit of all IMS participants. In addition to the IMS server software and documentation, which are generally available to CINTEX sites, Ms. Conover also provided database design documentation and consulting, order tracking software, and hands-on testing and debug assistance to IRE. In the final pre-operational phase of IRE-CPSSI development, she also supplied information on configuration management, including ideas and processes in place at the Global Hydrology Resource Center (GHRC), an EOSDIS data center operated by ITSL.

  13. The Digital Divide and Patient Portals: Internet Access Explained Differences in Patient Portal Use for Secure Messaging by Age, Race, and Income.

    PubMed

    Graetz, Ilana; Gordon, Nancy; Fung, Vick; Hamity, Courtnee; Reed, Mary E

    2016-08-01

    Online access to health records and the ability to exchange secure messages with physicians can improve patient engagement and outcomes; however, the digital divide could limit access to web-based portals among disadvantaged groups. To understand whether sociodemographic differences in patient portal use for secure messaging can be explained by differences in internet access and care preferences. Cross-sectional survey to examine the association between patient sociodemographic characteristics and internet access and care preferences; then, the association between sociodemographic characteristics and secure message use with and without adjusting for internet access and care preference. One thousand forty-one patients with chronic conditions in a large integrated health care delivery system (76% response rate). Internet access, portal use for secure messaging, preference for in-person or online care, and sociodemographic and health characteristics. Internet access and preference mediated some of the differences in secure message use by age, race, and income. For example, using own computer to access the internet explained 52% of the association between race and secure message use and 60% of the association between income and use (Sobel-Goodman mediation test, P<0.001 for both). Education and sex-related differences in portal use remained statistically significant when controlling for internet access and preference. As the availability and use of patient portals increase, it is important to understand which patients have limited access and the barriers they may face. Improving internet access and making portals available across multiple platforms, including mobile, may reduce some disparities in secure message use.
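    The mediation result above (internet access explaining about half of the race and income associations, per the Sobel-Goodman test) follows a standard recipe. A minimal sketch, with illustrative coefficients that are not taken from the study: `a` is the predictor-to-mediator effect, `b` the mediator-to-outcome effect controlling for the predictor, and `c'` the remaining direct effect.

```python
import math

def sobel_z(a, se_a, b, se_b):
    """Sobel test statistic for the indirect (mediated) effect a*b.

    a: effect of predictor on mediator; b: effect of mediator on outcome
    (controlling for the predictor); se_a, se_b: their standard errors.
    """
    return (a * b) / math.sqrt(b**2 * se_a**2 + a**2 * se_b**2)

def proportion_mediated(a, b, c_prime):
    """Share of the total effect carried by the mediator: ab / (ab + c')."""
    indirect = a * b
    return indirect / (indirect + c_prime)

# Hypothetical coefficients for illustration only (not the study's estimates).
z = sobel_z(a=0.40, se_a=0.05, b=0.30, se_b=0.04)
share = proportion_mediated(a=0.40, b=0.30, c_prime=0.11)
print(round(z, 2), round(share, 2))  # a z above 1.96 indicates p < 0.05
```

With these made-up numbers the mediator carries about 52% of the total effect, mirroring the kind of figure the abstract reports.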

  14. HuPSON: the human physiology simulation ontology.

    PubMed

    Gündel, Michaela; Younesi, Erfan; Malhotra, Ashutosh; Wang, Jiali; Li, Hui; Zhang, Bijun; de Bono, Bernard; Mevissen, Heinz-Theodor; Hofmann-Apitius, Martin

    2013-11-22

    Large biomedical simulation initiatives, such as the Virtual Physiological Human (VPH), are substantially dependent on controlled vocabularies to facilitate the exchange of information, of data and of models. Hindering these initiatives is a lack of a comprehensive ontology that covers the essential concepts of the simulation domain. We propose a first version of a newly constructed ontology, HuPSON, as a basis for shared semantics and interoperability of simulations, of models, of algorithms and of other resources in this domain. The ontology is based on the Basic Formal Ontology, and adheres to the MIREOT principles; the constructed ontology has been evaluated via structural features, competency questions and use case scenarios. The ontology is freely available at: http://www.scai.fraunhofer.de/en/business-research-areas/bioinformatics/downloads.html (owl files) and http://bishop.scai.fraunhofer.de/scaiview/ (browser). HuPSON provides a framework for a) annotating simulation experiments, b) retrieving relevant information that is required for modelling, c) enabling interoperability of algorithmic approaches used in biomedical simulation, d) comparing simulation results and e) linking knowledge-based approaches to simulation-based approaches. It is meant to foster a more rapid uptake of semantic technologies in the modelling and simulation domain, with particular focus on the VPH domain.

  15. Quality requirements for EHR archetypes.

    PubMed

    Kalra, Dipak; Tapuria, Archana; Austin, Tony; De Moor, Georges

    2012-01-01

    The realisation of semantic interoperability, in which any EHR data may be communicated between heterogeneous systems and fully understood by computers as well as people on receipt, is a challenging goal. Despite the use of standardised generic models for the EHR and standard terminology systems, too much optionality and variability exists in how particular clinical entries may be represented. Clinical archetypes provide a means of defining how generic models should be shaped and bound to terminology for specific kinds of clinical data. However, these will only contribute to semantic interoperability if libraries of archetypes can be built up consistently. This requires the establishment of design principles, editorial and governance policies, and further research to develop ways for archetype authors to structure clinical data and to use terminology consistently. Drawing on several years of work within communities of practice developing archetypes and implementing systems from them, this paper presents quality requirements for the development of archetypes. Clinical engagement on a wide scale is also needed to help grow libraries of good quality archetypes that can be certified. Vendor and eHealth programme engagement is needed to validate such archetypes and achieve safe, meaningful exchange of EHR data between systems.

  16. Interoperable Data Sharing for Diverse Scientific Disciplines

    NASA Astrophysics Data System (ADS)

    Hughes, John S.; Crichton, Daniel; Martinez, Santa; Law, Emily; Hardman, Sean

    2016-04-01

    For diverse scientific disciplines to interoperate they must be able to exchange information based on a shared understanding. To capture this shared understanding, we have developed a knowledge representation framework using ontologies and ISO level archive and metadata registry reference models. This framework provides multi-level governance, evolves independent of implementation technologies, and promotes agile development, namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. The knowledge representation framework is populated through knowledge acquisition from discipline experts. It is also extended to meet specific discipline requirements. The result is a formalized and rigorous knowledge base that addresses data representation, integrity, provenance, context, quantity, and their relationships within the community. The contents of the knowledge base are translated and written to files in appropriate formats to configure system software and services, provide user documentation, validate ingested data, and support data analytics. This presentation will provide an overview of the framework, present the Planetary Data System's PDS4 as a use case that has been adopted by the international planetary science community, describe how the framework is being applied to other disciplines, and share some important lessons learned.

  17. A Working Framework for Enabling International Science Data System Interoperability

    NASA Astrophysics Data System (ADS)

    Hughes, J. Steven; Hardman, Sean; Crichton, Daniel J.; Martinez, Santa; Law, Emily; Gordon, Mitchell K.

    2016-07-01

    For diverse scientific disciplines to interoperate they must be able to exchange information based on a shared understanding. To capture this shared understanding, we have developed a knowledge representation framework that leverages ISO level reference models for metadata registries and digital archives. This framework provides multi-level governance, evolves independent of the implementation technologies, and promotes agile development, namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. The knowledge representation is captured in an ontology through a process of knowledge acquisition. Discipline experts in the role of stewards at the common, discipline, and project levels work to design and populate the ontology model. The result is a formal and consistent knowledge base that provides requirements for data representation, integrity, provenance, context, identification, and relationship. The contents of the knowledge base are translated and written to files in suitable formats to configure system software and services, provide user documentation, validate input, and support data analytics. This presentation will provide an overview of the framework, present a use case that has been adopted by an entire science discipline at the international level, and share some important lessons learned.
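    One concrete way a knowledge base can "configure system software" and "validate input", as described above, is to export its attribute requirements into a machine-readable table that a validator consumes. A minimal sketch, assuming a hypothetical exported requirements table; the attribute names and rules are illustrative, not the PDS4 information model itself:

```python
# REQUIREMENTS stands in for content exported from the ontology/knowledge base.
REQUIREMENTS = {
    "product_id":   {"required": True,  "type": str},
    "start_time":   {"required": True,  "type": str},
    "target_name":  {"required": False, "type": str},
    "record_count": {"required": True,  "type": int},
}

def validate(label: dict) -> list:
    """Return a list of violations of the exported requirements."""
    errors = []
    for attr, rule in REQUIREMENTS.items():
        if attr not in label:
            if rule["required"]:
                errors.append(f"missing required attribute: {attr}")
        elif not isinstance(label[attr], rule["type"]):
            errors.append(f"wrong type for {attr}")
    return errors

print(validate({"product_id": "P1", "start_time": "2016-07-01",
                "record_count": "ten"}))  # flags the mistyped record_count
```

Because the rules live in data rather than code, re-exporting the knowledge base updates the validator without touching the software, which is the maintainability point the abstract makes.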

  18. Implementing PAT with Standards

    NASA Astrophysics Data System (ADS)

    Chandramohan, Laakshmana Sabari; Doolla, Suryanarayana; Khaparde, S. A.

    2016-02-01

    Perform Achieve Trade (PAT) is a market-based incentive mechanism to promote energy efficiency. The purpose of this work is to address the challenges inherent to inconsistent representation of business processes, and interoperability issues in PAT like cap-and-trade mechanisms especially when scaled. Studies by various agencies have highlighted that as the mechanism evolves including more industrial sectors and industries in its ambit, implementation will become more challenging. This paper analyses the major needs of PAT (namely tracking, monitoring, auditing & verifying energy-saving reports, and providing technical support & guidance to stakeholders); and how the aforesaid reasons affect them. Though current technologies can handle these challenges to an extent, standardization activities for implementation have been scanty for PAT and this work attempts to evolve them. The inconsistent modification of business processes, rules, and procedures across stakeholders, and interoperability among heterogeneous systems are addressed. This paper proposes the adoption of specifically two standards into PAT, namely Business Process Model and Notation for maintaining consistency in business process modelling, and Common Information Model (IEC 61970, 61968, 62325 combined) for information exchange. Detailed architecture and organization of these adoptions are reported. The work can be used by PAT implementing agencies, stakeholders, and standardization bodies.

  19. Communication in acute ambulatory care.

    PubMed

    Dean, Marleah; Oetzel, John; Sklar, David P

    2014-12-01

    Effective communication has been linked to better health outcomes, higher patient satisfaction, and treatment adherence. Communication in ambulatory care contexts is even more crucial, as providers typically do not know patients' medical histories or have established relationships, conversations are time constrained, interruptions are frequent, and the seriousness of patients' medical conditions may create additional tension during interactions. Yet, health communication often unduly emphasizes information exchange: the transmission and receipt of messages leading to a mutual understanding of a patient's condition, needs, and treatments. This approach does not take into account the importance of rapport building and contextual issues, and may ultimately limit the amount of information exchanged. The authors share the perspective of communication scientists to enrich the current approach to medical communication in ambulatory health care contexts, broadening the understanding of medical communication beyond information exchange to a more holistic, multilayered viewpoint, which includes rapport and contextual issues. The authors propose a socio-ecological model for understanding communication in acute ambulatory care. This model recognizes the relationship of individuals to their environment and emphasizes the importance of individual and contextual factors that influence patient-provider interactions. Its key elements include message exchange and individual, organizational, societal, and cultural factors. Using this model, and following the authors' recommendations, providers and medical educators can treat communication as a holistic process shaped by multiple layers. This is a step toward being able to negotiate conflicting demands, resolve tensions, and create encounters that lead to positive health outcomes.

  20. The MMI Semantic Framework: Rosetta Stones for Earth Sciences

    NASA Astrophysics Data System (ADS)

    Rueda, C.; Bermudez, L. E.; Graybeal, J.; Alexander, P.

    2009-12-01

    Semantic interoperability, the exchange of meaning among computer systems, is needed to successfully share data in Ocean Science and across all Earth sciences. The best approach toward semantic interoperability requires a designed framework, and operationally tested tools and infrastructure within that framework. Currently available technologies make a scientific semantic framework feasible, but its development requires sustainable architectural vision and development processes. This presentation outlines the MMI Semantic Framework, including recent progress on it and its client applications. The MMI Semantic Framework consists of tools, infrastructure, and operational and community procedures and best practices, to meet short-term and long-term semantic interoperability goals. The design and prioritization of the semantic framework capabilities are based on real-world scenarios in Earth observation systems. We describe some key use cases, as well as the associated requirements for building the overall infrastructure, which is realized through the MMI Ontology Registry and Repository. This system includes support for community creation and sharing of semantic content, ontology registration, version management, and seamless integration of user-friendly tools and application programming interfaces. The presentation describes the architectural components for semantic mediation, registry and repository for vocabularies, ontology, and term mappings. We show how the technologies and approaches in the framework can address community needs for managing and exchanging semantic information. We will demonstrate how different types of users and client applications exploit the tools and services for data aggregation, visualization, archiving, and integration. Specific examples from OOSTethys (http://www.oostethys.org) and the Ocean Observatories Initiative Cyberinfrastructure (http://www.oceanobservatories.org) will be cited. Finally, we show how semantic augmentation of web services standards could be performed using framework tools.

  1. Enabling international adoption of LOINC through translation

    PubMed Central

    Vreeman, Daniel J.; Chiaravalloti, Maria Teresa; Hook, John; McDonald, Clement J.

    2012-01-01

    Interoperable health information exchange depends on adoption of terminology standards, but international use of such standards can be challenging because of language differences between local concept names and the standard terminology. To address this important barrier, we describe the evolution of an efficient process for constructing translations of LOINC term names, the foreign language functions in RELMA, and the current state of translations in LOINC. We also present the development of the Italian translation to illustrate how translation is enabling adoption in international contexts. We built a tool that finds the unique list of LOINC Parts that make up a given set of LOINC terms. This list enables translation of smaller pieces like the core component “hepatitis c virus” separately from all the suffixes that could appear with it, such as “Ab.IgG”, “DNA”, and “RNA”. We built another tool that generates a translation of a full LOINC name from all of these atomic pieces. As of version 2.36 (June 2011), LOINC terms have been translated into 9 languages in 15 linguistic variants other than its native English. The five largest linguistic variants have all used the Part-based translation mechanism. However, even with efficient tools and processes, translation of standard terminology is a complex undertaking. Two of the prominent linguistic challenges that translators have faced include: the approach to handling acronyms and abbreviations, and the differences in linguistic syntax (e.g. word order) between languages. LOINC’s open and customizable approach has enabled many different groups to create translations that met their needs and matched their resources. Distributing the standard and its many language translations at no cost worldwide accelerates LOINC adoption globally, and is an important enabler of interoperable health information exchange. PMID:22285984
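    The Part-based mechanism described above can be sketched briefly: translate each atomic Part once, then assemble full term names from the translated pieces, so a component like “hepatitis c virus” is translated independently of every suffix it appears with. The part strings and Italian renderings below are illustrative, not official LOINC content:

```python
# Hypothetical English-part -> Italian-part table (one entry per atomic Part).
PART_TRANSLATIONS = {
    "Hepatitis C virus": "Virus epatite C",
    "Ab.IgG": "Ab.IgG",          # abbreviations are often kept as-is
    "RNA": "RNA",
    "Ser/Plas": "Siero/Plasma",
}

def translate_term(parts, table=PART_TRANSLATIONS):
    """Assemble a translated full name from independently translated Parts."""
    # Untranslated parts fall back to the English string.
    return ":".join(table.get(p, p) for p in parts)

print(translate_term(["Hepatitis C virus", "Ab.IgG", "Ser/Plas"]))
# -> Virus epatite C:Ab.IgG:Siero/Plasma
```

The payoff is that one translated Part serves every term that reuses it, which is why the unique-Parts list makes the translation workload tractable.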

  2. National Evaluation Program CapWIN: the capital wireless integrated net phase III final report.

    DOT National Transportation Integrated Search

    2008-04-01

    The Capital Area Wireless Integrated Net (CapWIN) is comprised of first responder agencies in the Washington, DC metropolitan area. Through the use of the CapWIN application, responders are able to: 1. Exchange messages with other users at roadside l...

  3. The Department of Veterans Affairs' (VA) implementation of the Virtual Lifetime Electronic Record (VLER): findings and lessons learned from Health Information Exchange at 12 sites.

    PubMed

    Byrne, Colene M; Mercincavage, Lauren M; Bouhaddou, Omar; Bennett, Jamie R; Pan, Eric C; Botts, Nathan E; Olinger, Lois M; Hunolt, Elaine; Banty, Karl H; Cromwell, Tim

    2014-08-01

    We describe the Department of Veterans Affairs' (VA) Virtual Lifetime Health Electronic Record (VLER) pilot phase in 12 communities to exchange health information with private sector health care organizations and the Department of Defense (DoD), key findings, lessons, and implications for advancing Health Information Exchanges (HIE), nationally. A mixed methods approach was used to monitor and evaluate the status of VLER Health Exchange pilot phase implementation from December 2009 through October 2012. Selected accomplishments, contributions, challenges, and early lessons that are relevant to the growth of nationwide HIE are discussed. Veteran patient and provider acceptance, trust, and perceived value of VLER Health Exchange are found to be high, and usage by providers is steadily growing. Challenges and opportunities to improve provider use are identified, such as better data quality and integration with workflow. Key findings and lessons for advancing HIE are identified. VLER Health Exchange has made great strides in advancing HIE nationally by addressing important technical and policy issues that have impeded scalability, and by increasing trust and confidence in the value and accuracy of HIE among users. VLER Health Exchange has advanced HIE interoperability standards and patient consent policies nationally. Policy, programmatic, technology, and health Information Technology (IT) standards implications to advance HIE for improved delivery and coordination of health care are discussed. The pilot phase success led to VA-wide deployment of this data sharing capability in 2013. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  4. Motion Imagery and Robotics Application (MIRA)

    NASA Technical Reports Server (NTRS)

    Martinez, Lindolfo; Rich, Thomas

    2011-01-01

    Objectives include: I. Prototype a camera service leveraging the CCSDS Integrated protocol stack (MIRA/SM&C/AMS/DTN): a) CCSDS MIRA Service (New). b) Spacecraft Monitor and Control (SM&C). c) Asynchronous Messaging Service (AMS). d) Delay/Disruption Tolerant Networking (DTN). II. Additional MIRA Objectives: a) Demo of Camera Control through ISS using CCSDS protocol stack (Berlin, May 2011). b) Verify that the CCSDS standards stack can provide end-to-end space camera services across ground and space environments. c) Test interoperability of various CCSDS protocol standards. d) Identify overlaps in the design and implementations of the CCSDS protocol standards. e) Identify software incompatibilities in the CCSDS stack interfaces. f) Provide redlines to the SM&C, AMS, and DTN working groups. g) Enable the CCSDS MIRA service for potential use in ISS Kibo camera commanding. h) Assist in long-term evolution of this entire group of CCSDS standards to TRL 6 or greater.

  5. Evaluation plan for space station network interface units

    NASA Technical Reports Server (NTRS)

    Weaver, Alfred C.

    1990-01-01

    Outlined here is a procedure for evaluating network interface units (NIUs) produced for the Space Station program. The procedures should be equally applicable to the data management system (DMS) testbed NIUs produced by Honeywell and IBM. The evaluation procedures are divided into four areas. Performance measurement tools are hardware and software that must be developed in order to evaluate NIU performance. Performance tests are a series of tests, each of which documents some specific characteristic of NIU and/or network performance. In general, these performance tests quantify the speed, capacity, latency, and reliability of message transmission under a wide variety of conditions. Functionality tests are a series of tests and code inspections that demonstrate the functionality of the particular subset of ISO protocols which have been implemented in a given NIU. Conformance tests are a series of tests which would expose whether or not selected features within the ISO protocols are present and interoperable.

  6. Theory of quantized systems: formal basis for DEVS/HLA distributed simulation environment

    NASA Astrophysics Data System (ADS)

    Zeigler, Bernard P.; Lee, J. S.

    1998-08-01

    In the context of a DARPA ASTT project, we are developing an HLA-compliant distributed simulation environment based on the DEVS formalism. This environment will provide a user-friendly, high-level tool-set for developing interoperable discrete and continuous simulation models. One application is the study of contract-based predictive filtering. This paper presents a new approach to predictive filtering based on a process called 'quantization' to reduce state update transmission. Quantization, which generates state updates only at quantum level crossings, abstracts a sender model into a DEVS representation. This affords an alternative, efficient approach to embedding continuous models within distributed discrete event simulations. Applications of quantization to message traffic reduction are discussed. The theory has been validated by DEVSJAVA simulations of test cases. It will be subject to further test in actual distributed simulations using the DEVS/HLA modeling and simulation environment.
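    The quantization idea above, generating a state update only at quantum level crossings, can be sketched in a few lines. The signal values and quantum size below are illustrative; a real sender would apply this to its continuous state trajectory:

```python
def quantized_updates(samples, quantum):
    """Yield (index, level) each time the signal enters a new quantum level.

    Between crossings, no message is emitted: receivers keep the last level.
    """
    last_level = None
    for i, x in enumerate(samples):
        level = int(x // quantum)
        if level != last_level:          # a quantum boundary was crossed
            last_level = level
            yield (i, level)

signal = [0.0, 0.2, 0.4, 0.9, 1.1, 1.4, 1.3, 0.8, 0.2]
msgs = list(quantized_updates(signal, quantum=1.0))
print(msgs)  # -> [(0, 0), (4, 1), (7, 0)]: 3 messages instead of 9 samples
```

The message-traffic reduction the paper discusses falls out directly: update frequency is governed by how often the state crosses a boundary, not by the sampling rate.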

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shipman, Galen M.

    These are the slides for a presentation on programming models in HPC, at the Los Alamos National Laboratory's Parallel Computing Summer School. The following topics are covered: Flynn's Taxonomy of computer architectures; single instruction single data; single instruction multiple data; multiple instruction multiple data; address space organization; definition of Trinity (Intel Xeon-Phi is a MIMD architecture); single program multiple data; multiple program multiple data; ExMatEx workflow overview; definition of a programming model, programming languages, runtime systems; programming model and environments; MPI (Message Passing Interface); OpenMP; Kokkos (Performance Portable Thread-Parallel Programming Model); Kokkos abstractions, patterns, policies, and spaces; RAJA, a systematic approach to node-level portability and tuning; overview of the Legion Programming Model; mapping tasks and data to hardware resources; interoperability: supporting task-level models; Legion S3D execution and performance details; workflow, integration of external resources into the programming model.
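    The SPMD (single program, multiple data) model named in the topic list can be illustrated without MPI: every worker runs the same function, and its rank selects the data slice it owns, followed by a reduction. A minimal sketch in which a thread pool stands in for MPI ranks; the names and partitioning scheme are illustrative:

```python
from concurrent.futures import ThreadPoolExecutor

DATA = list(range(100))
NRANKS = 4

def local_sum(rank):
    # Same program on every rank; the rank picks this worker's strided slice.
    return sum(DATA[rank::NRANKS])

def run_spmd():
    with ThreadPoolExecutor(max_workers=NRANKS) as pool:
        partials = list(pool.map(local_sum, range(NRANKS)))
    return sum(partials)   # the "reduction" step (MPI_Reduce in real MPI)

print(run_spmd())  # -> 4950, identical to the serial sum(DATA)
```

In MPI the rank would come from the communicator (e.g. `MPI_Comm_rank`) and the reduction from a collective, but the decomposition logic is the same.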

  8. Collective Emotions Online and Their Influence on Community Life

    PubMed Central

    Chmiel, Anna; Sienkiewicz, Julian; Thelwall, Mike; Paltoglou, Georgios; Buckley, Kevan; Kappas, Arvid; Hołyst, Janusz A.

    2011-01-01

    Background E-communities, social groups interacting online, have recently become an object of interdisciplinary research. As with face-to-face meetings, Internet exchanges may not only include factual information but also emotional information – how participants feel about the subject discussed or other group members. Emotions in turn are known to be important in affecting interaction partners in offline communication in many ways. Could emotions in Internet exchanges affect others and systematically influence quantitative and qualitative aspects of the trajectory of e-communities? The development of automatic sentiment analysis has made large scale emotion detection and analysis possible using text messages collected from the web. However, it is not clear if emotions in e-communities primarily derive from individual group members' personalities or if they result from intra-group interactions, and whether they influence group activities. Methodology/Principal Findings Here, for the first time, we show the collective character of affective phenomena on a large scale as observed in four million posts downloaded from Blogs, Digg and BBC forums. To test whether the emotions of a community member may influence the emotions of others, posts were grouped into clusters of messages with similar emotional valences. The frequency of long clusters was much higher than it would be if emotions occurred at random. Distributions for cluster lengths can be explained by preferential processes because conditional probabilities for consecutive messages grow as a power law with cluster length. For BBC forum threads, average discussion lengths were higher for larger values of absolute average emotional valence in the first ten comments and the average amount of emotion in messages fell during discussions. 
Conclusions/Significance Overall, our results prove that collective emotional states can be created and modulated via Internet communication and that emotional expressiveness is the fuel that sustains some e-communities. PMID:21818302
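    The clustering analysis described above reduces to grouping consecutive posts by the sign of their emotional valence and measuring run lengths. A minimal sketch; the valence sequence is made up, where a real input would come from sentiment-classified posts:

```python
from itertools import groupby

def cluster_lengths(valences):
    """Lengths of maximal runs of posts sharing the same valence sign."""
    sign = lambda v: (v > 0) - (v < 0)   # maps valence to -1, 0, or +1
    return [len(list(run)) for _, run in groupby(valences, key=sign)]

posts = [0.8, 0.5, 0.9, -0.3, -0.7, 0.2, 0.4, 0.1, -0.6]
print(cluster_lengths(posts))  # -> [3, 2, 3, 1]
```

Comparing the observed distribution of these run lengths against the distribution under shuffled (random) valences is what lets the study conclude that long same-valence clusters are more frequent than chance predicts.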

  9. Supply Chain Interoperability Measurement

    DTIC Science & Technology

    2015-06-19

    Supply Chain Interoperability Measurement DISSERTATION June 2015 Christos E. Chalyvidis, Major, Hellenic Air...ENS-DS-15-J-001 SUPPLY CHAIN INTEROPERABILITY MEASUREMENT DISSERTATION Presented to the Faculty Department of Operational Sciences...INTEROPERABILITY MEASUREMENT Christos E. Chalyvidis, BS, MSc. Major, Hellenic Air Force Committee Membership: Dr. A.W. Johnson Chair

  10. Interoperability and information discovery

    USGS Publications Warehouse

    Christian, E.

    2001-01-01

    In the context of information systems, there is interoperability when the distinctions between separate information systems are not a barrier to accomplishing a task that spans those systems. Interoperability so defined implies that there are commonalities among the systems involved and that one can exploit such commonalities to achieve interoperability. The challenge of a particular interoperability task is to identify relevant commonalities among the systems involved and to devise mechanisms that exploit those commonalities. The present paper focuses on the particular interoperability task of information discovery. The Global Information Locator Service (GILS) is described as a policy, standards, and technology framework for addressing interoperable information discovery on a global and long-term basis. While there are many mechanisms for people to discover and use all manner of data and information resources, GILS initiatives exploit certain key commonalities that seem to be sufficient to realize useful information discovery interoperability at a global, long-term scale. This paper describes ten of the specific commonalities that are key to GILS initiatives. It presents some of the practical implications for organizations in various roles: content provider, system engineer, intermediary, and searcher. The paper also provides examples of interoperable information discovery as deployed using GILS in four types of information communities: bibliographic, geographic, environmental, and government.

  11. 80 FR 46010 - Promoting Semantic Interoperability of Laboratory Data; Public Workshop; Request for Comments

    Federal Register 2010, 2011, 2012, 2013, 2014

    2015-08-03

    ...] Promoting Semantic Interoperability of Laboratory Data; Public Workshop; Request for Comments AGENCY: Food... workshop entitled ``FDA/CDC/NLM Workshop on Promoting Semantic Interoperability of Laboratory Data.'' The... to promoting the semantic interoperability of laboratory data between in vitro diagnostic devices and...

  12. HL7 and DICOM based integration of radiology departments with healthcare enterprise information systems.

    PubMed

    Blazona, Bojan; Koncar, Miroslav

    2007-12-01

    Integration based on open standards, in order to achieve communication and information interoperability, is one of the key aspects of modern health care information systems. However, this requirement represents one of the major challenges for Information and Communication Technology (ICT) solutions, as systems today use diverse technologies, proprietary protocols and communication standards which are often not interoperable. Among the main producers of clinical information in healthcare settings are Radiology Information Systems (RIS), which communicate using the widely adopted DICOM (Digital Imaging and COmmunications in Medicine) standard, but in very few cases can efficiently integrate information of interest with other systems. In this context we identified the HL7 standard as the world's leading medical ICT standard that is envisioned to provide the umbrella for medical data semantic interoperability, which amongst other things represents the cornerstone of Croatia's National Integrated Healthcare Information System (IHCIS). The aim was to explore the ability to integrate and exchange RIS originated data with Hospital Information Systems based on HL7's CDA (Clinical Document Architecture) standard. We explored the ability of HL7 CDA specifications and methodology to address the need for RIS integration into HL7-based healthcare information systems. We introduced the use of WADO service interconnection to IHCIS and finally CDA rendering in widely used web browsers. The outcome of our pilot work proves our original assumption of the HL7 standard being able to adopt radiology data into integrated healthcare systems. Uniform DICOM-to-CDA translation scripts and business processes within IHCIS are desirable and cost-effective, given the use of supporting IHCIS services aligned to SOA.
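    The DICOM-to-CDA translation step described above amounts to mapping DICOM attributes into a structured XML document. A minimal sketch: the element names and the attribute subset are illustrative placeholders, not the full HL7 CDA schema or a complete DICOM data set:

```python
import xml.etree.ElementTree as ET

def dicom_to_cda(dicom: dict) -> str:
    """Copy a few DICOM attributes into a skeletal CDA-like XML document."""
    doc = ET.Element("ClinicalDocument")
    ET.SubElement(doc, "title").text = "Radiology Report"
    patient = ET.SubElement(doc, "recordTarget")
    ET.SubElement(patient, "id").text = dicom["PatientID"]
    ET.SubElement(patient, "name").text = dicom["PatientName"]
    body = ET.SubElement(doc, "component")
    ET.SubElement(body, "text").text = dicom["StudyDescription"]
    return ET.tostring(doc, encoding="unicode")

cda = dicom_to_cda({"PatientID": "12345", "PatientName": "Doe^Jane",
                    "StudyDescription": "Chest CT"})
print(cda)
```

A production translation would instead be driven by the CDA templates and vocabulary bindings the IHCIS mandates, but the structural idea, one uniform script from DICOM attributes to CDA sections, is the one the abstract argues for.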

  13. Comparison of H.323 and SIP for IP telephony signaling

    NASA Astrophysics Data System (ADS)

    Dalgic, Ismail; Fang, Hanlin

    1999-11-01

    Two standards currently compete for dominance in IP telephony signaling: the H.323 protocol suite from the ITU-T, and the Session Initiation Protocol (SIP) from the IETF. Both signaling protocols provide mechanisms for call establishment and teardown, call control and supplementary services, and capability exchange. We investigate and compare these two protocols in terms of functionality, quality of service (QoS), scalability, flexibility, interoperability, and ease of implementation. For a fair comparison, we consider similar scenarios for both protocols. In particular, we focus on scenarios that involve a gatekeeper for H.323 and a proxy/redirect server for SIP, because medium-to-large IP telephony systems are not manageable without a gatekeeper or proxy server. We consider all three versions of H.323. In terms of functionality and the services that can be supported, H.323 version 2 and SIP are very similar. However, supplementary services in H.323 are more rigorously defined, so fewer interoperability issues are expected among its implementations. Furthermore, H.323 has taken more steps to ensure compatibility among its different versions and to interoperate with the PSTN. The two protocols are comparable in their QoS support [similar call setup delays, no support for resource reservation or class of service (CoS) setting], but H.323 version 3 will allow signaling of the requested CoS. SIP's primary advantages are (1) flexibility to add new features, and (2) relative ease of implementation and debugging. Finally, we note that H.323 and SIP are learning from each other, and the differences between them are diminishing with each new version.
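    One reason SIP is regarded as easy to implement and debug is that its messages are plain text. A minimal sketch of composing a SIP INVITE request follows; the addresses and Call-ID are invented placeholders, and a real user agent would add Via, Contact, and SDP content per RFC 3261.

    ```python
    # Sketch: compose a minimal SIP INVITE request as plain text, the way a
    # user agent would before handing it to the transport layer.
    def build_sip_invite(caller, callee, call_id, cseq=1):
        lines = [
            f"INVITE sip:{callee} SIP/2.0",   # request line: method, URI, version
            f"From: <sip:{caller}>",
            f"To: <sip:{callee}>",
            f"Call-ID: {call_id}",            # unique identifier for this call
            f"CSeq: {cseq} INVITE",           # sequence number + method
            "Content-Length: 0",
        ]
        # SIP uses CRLF line endings and a blank line before the (empty) body.
        return "\r\n".join(lines) + "\r\n\r\n"

    invite = build_sip_invite("alice@example.com", "bob@example.net", "a84b4c76")
    print(invite)
    ```

    H.323, by contrast, encodes its signaling in binary ASN.1 PER, which is compact but harder to inspect by eye, one concrete trade-off behind the comparison above.
    
    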

  14. Data interoperability software solution for emergency reaction in the Europe Union

    NASA Astrophysics Data System (ADS)

    Casado, R.; Rubiera, E.; Sacristan, M.; Schütte, F.; Peters, R.

    2014-09-01

    Emergency management becomes more challenging in international crisis episodes because of cultural, semantic and linguistic differences between stakeholders, especially first responders. Misunderstandings between first responders make decision-making slower and more difficult. However, the spread and development of networks and IT-based Emergency Management Systems (EMS) have improved emergency responses, making them more coordinated. Despite improvements made in recent years, EMS have still not solved the problems related to cultural, semantic and linguistic differences that are the real cause of slower decision-making. In addition, from a technical perspective, the consolidation of current EMS and the different formats used to exchange information pose another problem to be solved by any solution proposed for information interoperability between heterogeneous EMS operating in different contexts. To overcome these problems we present a software solution based on semantic and mediation technologies. EMERGency ELements (EMERGEL) (Fundacion CTIC and AntwortING Ingenieurbüro PartG 2013), a common and modular ontology shared by all stakeholders, has been defined. It gathers all stakeholders' knowledge in a unique and flexible data model, taking into account the cultural and linguistic issues of different countries. To deal with the diversity of data protocols and formats, we have designed a Service Oriented Architecture for Data Interoperability (named DISASTER), providing a flexible and extensible solution to the mediation issues. Web Services were adopted as the technology to implement this paradigm, as they have the most significant academic and industrial visibility and attraction. The contributions of this work have been validated through the design and development of a realistic cross-border prototype scenario, actively involving both emergency managers and emergency first responders: a fire on the Netherlands-Germany border.
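    The mediation idea behind an ontology like EMERGEL can be shown with a toy sketch: local terms used by different national EMS systems are mapped onto shared concepts, and translation goes through the common model rather than pairwise between languages. All concept names and term mappings below are invented for illustration.

    ```python
    # Toy mediation via a shared ontology: each concept carries the local
    # labels used by different national EMS systems (mappings are invented).
    SHARED = {
        "fire_engine": {"nl": "brandweerwagen", "de": "Löschfahrzeug"},
        "ambulance": {"nl": "ziekenwagen", "de": "Rettungswagen"},
    }

    def to_concept(term, lang):
        """Resolve a local term to its shared ontology concept."""
        for concept, labels in SHARED.items():
            if labels.get(lang) == term:
                return concept
        raise KeyError(term)

    def translate(term, src_lang, dst_lang):
        # Translation always passes through the shared concept, so adding a
        # new language needs only one mapping per concept, not one per pair.
        return SHARED[to_concept(term, src_lang)][dst_lang]

    print(translate("brandweerwagen", "nl", "de"))
    ```

    The hub-and-spoke design is the point: n languages require n mappings per concept instead of n(n-1) pairwise dictionaries.
    
    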

  15. Investigating the capabilities of semantic enrichment of 3D CityEngine data

    NASA Astrophysics Data System (ADS)

    Solou, Dimitra; Dimopoulou, Efi

    2016-08-01

    In recent years the development of technology and the lifting of several technical limitations have brought the third dimension to the fore. The complexity of urban environments and the strong need for land administration intensify the need for a three-dimensional cadastral system. Despite progress in the field of geographic information systems and 3D modeling techniques, there is no fully digital 3D cadastre. Existing geographic information systems and the different methods of three-dimensional modeling allow for better management, visualization and dissemination of information. Nevertheless, these opportunities cannot be fully exploited because of deficiencies in standardization and interoperability between these systems. Within this context, CityGML was developed as an international standard of the Open Geospatial Consortium (OGC) for the representation and exchange of 3D city models. CityGML defines geometry and topology for city modeling, also focusing on the semantic aspects of 3D city information. The scope of CityGML is to reach a common terminology, also addressing the imperative need for interoperability and data integration, given the number of available geographic information systems and modeling techniques. The aim of this paper is to develop an application for managing the semantic information of a model generated by procedural modeling. The model was initially implemented in ESRI's CityEngine software and then imported into the ArcGIS environment. The final goal was the semantic enrichment of the original model and its conversion to CityGML format. Semantic information management and interoperability proved feasible using the ESRI 3DCities Project tools, since their database structure supports adding semantic information to the CityEngine model and converting it automatically to CityGML for advanced analysis and visualization in different application areas.

  16. 75 FR 63462 - Smart Grid Interoperability Standards; Notice of Docket Designation for Smart Grid...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-15

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. RM11-2-000] Smart Grid Interoperability Standards; Notice of Docket Designation for Smart Grid Interoperability Standards October 7, 2010... directs the development of a framework to achieve interoperability of smart grid devices and systems...

  17. 81 FR 68435 - Workshop on Promoting Semantic Interoperability of Laboratory Data; Public Workshop; Request for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2016-10-04

    ...] Workshop on Promoting Semantic Interoperability of Laboratory Data; Public Workshop; Request for Comments... Semantic Interoperability of Laboratory Data.'' The purpose of this public workshop is to receive and... Semantic Interoperability of Laboratory Data.'' Received comments will be placed in the docket and, except...

  18. Fair and Square? An Examination of Classroom Justice and Relational Teaching Messages

    ERIC Educational Resources Information Center

    Young, Laura E.; Horan, Sean M.; Frisby, Brandi N.

    2013-01-01

    Students and instructors acknowledge the importance of the instructor-student relationship in the classroom. Despite the importance of the instructor-student interpersonal relationship, there can also be unexpected or undesirable outcomes associated with relational teaching. Using the theoretical framework of leader-member exchange, we explored…

  19. Toward a Semantic Forum for Active Collaborative Learning

    ERIC Educational Resources Information Center

    Li, Yanyan; Dong, Mingkai; Huang, Ronghuai

    2009-01-01

    Online discussion forums provide an open workspace allowing learners to share information, exchange ideas, address problems and discuss specific themes. But a substantial impediment to their promotion as an effective e-learning facility lies in the continuously increasing number of messages with discrete and incoherent structure, as well as the loosely-tied…

  20. Steganography and Cryptography Inspired Enhancement of Introductory Programming Courses

    ERIC Educational Resources Information Center

    Kortsarts, Yana; Kempner, Yulia

    2015-01-01

    Steganography is the art and science of concealing communication. The goal of steganography is to hide the very existence of an information exchange by embedding messages into inconspicuous digital media covers. Cryptography, or secret writing, is the study of methods of encryption and decryption and their use in communication protocols.…
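    The least-significant-bit (LSB) embedding commonly used in such introductory courses fits in a few lines: each bit of the secret message replaces the lowest bit of one cover byte (e.g., one pixel channel of an image), so the cover is visually unchanged.

    ```python
    # Minimal LSB steganography sketch: hide each message bit in the
    # least-significant bit of one cover byte.
    def embed(cover, message):
        # MSB-first bit stream of the message bytes.
        bits = [(b >> i) & 1 for b in message for i in range(7, -1, -1)]
        assert len(bits) <= len(cover), "cover too small"
        # Clear each cover byte's LSB and write the message bit into it.
        return bytes((c & 0xFE) | bit for c, bit in zip(cover, bits)) + cover[len(bits):]

    def extract(stego, n_bytes):
        bits = [b & 1 for b in stego[: n_bytes * 8]]
        return bytes(
            sum(bit << (7 - i) for i, bit in enumerate(bits[k : k + 8]))
            for k in range(0, n_bytes * 8, 8)
        )

    cover = bytes(range(200, 256)) * 2   # stand-in for raw pixel data
    stego = embed(cover, b"hi")
    print(extract(stego, 2))  # b'hi'
    ```

    Because only the lowest bit of each byte changes, every stego byte differs from its cover byte by at most 1, which is why the embedding is imperceptible in typical image data.
    
    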

  1. 77 FR 20671 - Self-Regulatory Organizations; NASDAQ OMX BX, Inc.; Notice of Filing and Immediate Effectiveness...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-05

    ... Primary Pegged Orders with an offset amount will never be displayed. The text of the proposed rule change.... Under the Exchange's current rule, Midpoint Pegged Orders are not displayed, while Primary and Market... messaging and ``quote flickering.''

  2. In Pursuit of Theoretical Ground in Behavior Change Support Systems: Analysis of Peer-to-Peer Communication in a Health-Related Online Community

    PubMed Central

    Cobb, Nathan; Cohen, Trevor

    2016-01-01

    Background Research studies involving health-related online communities have focused on examining network structure to understand mechanisms underlying behavior change. Content analysis of the messages exchanged in these communities has been limited to the “social support” perspective. However, existing behavior change theories suggest that message content plays a prominent role reflecting several sociocognitive factors that affect an individual’s efforts to make a lifestyle change. An understanding of these factors is imperative to identify and harness the mechanisms of behavior change in the Health 2.0 era. Objective The objective of this work is two-fold: (1) to harness digital communication data to capture essential meaning of communication and factors affecting a desired behavior change, and (2) to understand the applicability of existing behavior change theories to characterize peer-to-peer communication in online platforms. Methods In this paper, we describe grounded theory–based qualitative analysis of digital communication in QuitNet, an online community promoting smoking cessation. A database of 16,492 de-identified public messages from 1456 users from March 1-April 30, 2007, was used in our study. We analyzed 795 messages using grounded theory techniques to ensure thematic saturation. This analysis enabled identification of key concepts contained in the messages exchanged by QuitNet members, allowing us to understand the sociobehavioral intricacies underlying an individual’s efforts to cease smoking in a group setting. We further ascertained the relevance of the identified themes to theoretical constructs in existing behavior change theories (eg, Health Belief Model) and theoretically linked techniques of behavior change taxonomy. Results We identified 43 different concepts, which were then grouped under 12 themes based on analysis of 795 messages. 
Examples of concepts include “sleepiness,” “pledge,” “patch,” “spouse,” and “slip.” Examples of themes include “traditions,” “social support,” “obstacles,” “relapse,” and “cravings.” Results indicate that themes consisting of member-generated strategies such as “virtual bonfires” and “pledges” were related to the highest number of theoretical constructs from the existing behavior change theories. In addition, results indicate that the member-generated communication content supports sociocognitive constructs from more than one behavior change model, unlike the majority of the existing theory-driven interventions. Conclusions With the onset of mobile phones and ubiquitous Internet connectivity, online social network data reflect the intricacies of human health behavior as experienced by health consumers in real time. This study offers methodological insights for qualitative investigations that examine the various kinds of behavioral constructs prevalent in the messages exchanged among users of online communities. Theoretically, this study establishes the manifestation of existing behavior change theories in QuitNet-like online health communities. Pragmatically, it sets the stage for real-time, data-driven sociobehavioral interventions promoting healthy lifestyle modifications by allowing us to understand the emergent user needs to sustain a desired behavior change. PMID:26839162

  3. The timeless caring connection.

    PubMed

    Schwerin, Judi I

    2004-01-01

    Caring is the essence of nursing. However, one cannot assume that caring is enough, especially if the recipient has not perceived the message of caring. The nursing connection is defined as the successful communication of caring by a nurse, whereby the recipient of care can trust the message of caring and respond to it in its entirety. The caring connection has special significance today because of changing societal trends, the domination of chronic illness as the major factor in the population's morbidity and mortality, and the need for the public to practice prevention and/or management of chronic disease through lifestyle changes. The challenge for nursing is presented in the role of the nurse as a coach. Key strategies for establishing a caring connection in the nurse-client relationship are identified. An authentic caring connection is an empowering dialogue and exchange of the human spirit, nothing less than the sacred ground of nursing practice.

  4. RIPE integrity primitives, part 2 (RACE Integrity Primitives Evaluation)

    NASA Astrophysics Data System (ADS)

    Denboer, B.; Boly, J. P.; Bosselaers, A.; Brandt, J.; Chaum, D.; Damgaard, I.; Dichtl, M.; Fumy, W.; Vanderham, M.; Jansen, C. J. A.

    1993-04-01

    A manual intended for those seeking to secure information systems by applying modern cryptography is presented. It represents the successful attainment of the goals of RIPE (RACE (Research and development of Advanced Communications technology in Europe) Integrity Primitives Evaluation). The recommended portfolio of integrity primitives, which is the main product of the project, forms the heart of the manual. By integrity is meant the kinds of security that can be achieved through cryptography apart from keeping messages secret. Thus included are ways to ensure that stored or communicated data are not illicitly modified, that parties exchanging messages are actually present, and that 'signed' electronic messages can be recognized as authentic by anyone. Of particular concern to the project were the high-speed requirements of broadband communication. The project also aimed for completeness in its recommendations. As a result, the portfolio contains primitives, that is, building blocks, that can meet most of today's perceived needs for integrity.

  5. RIPE integrity primitives, part 1 (RACE Integrity Primitives Evaluation)

    NASA Astrophysics Data System (ADS)

    Denboer, B.; Boly, J. P.; Bosselaers, A.; Brandt, J.; Chaum, D.; Damgaard, I.; Dichtl, M.; Fumy, W.; Vanderham, M.; Jansen, C. J. A.

    1993-04-01

    A manual intended for those seeking to secure information systems by applying modern cryptography is presented. It represents the successful attainment of the goals of RIPE (RACE (Research and development of Advanced Communication technology in Europe) Integrity Primitives Evaluation). The recommended portfolio of integrity primitives, which is the main product of the project, forms the heart of the manual. By integrity is meant the kinds of security that can be achieved through cryptography apart from keeping messages secret. Thus included are ways to ensure that stored or communicated data are not illicitly modified, that parties exchanging messages are actually present, and that 'signed' electronic messages can be recognized as authentic by anyone. Of particular concern to the project were the high-speed requirements of broadband communication. The project also aimed for completeness in its recommendations. As a result, the portfolio contains primitives, that is, building blocks, that can meet most of today's perceived needs for integrity.

  6. Effect of mobile phone reminder messages on adherence of stent removal or exchange in patients with benign pancreaticobiliary diseases: a prospectively randomized, controlled study.

    PubMed

    Gu, Yong; Wang, Limei; Zhao, Lina; Liu, Zhiguo; Luo, Hui; Tao, Qin; Zhang, Rongchun; He, Shuixiang; Wang, Xiangping; Huang, Rui; Zhang, Linhui; Pan, Yanglin; Guo, Xuegang

    2016-08-26

    Plastic and covered metal stents need to be removed or exchanged within an appropriate time to avoid undesirable complications. However, it is not uncommon for patients not to follow the recommendation for further stent management after Endoscopic Retrograde Cholangiopancreatography (ERCP). The effect of a monthly short message service (SMS) intervention on stent removal/exchange adherence in patients after ERCP was unknown. A prospective, randomized controlled study was conducted. After receiving regular instructions, patients were randomly assigned to receive monthly SMS reminders for stent removal/exchange (SMS group) or not (control group). The primary outcome was stent removal/exchange adherence within the appropriate time (4 months for a plastic stent or 7 months for a covered stent). Multivariate analysis was performed to assess factors associated with stent removal/exchange adherence within the appropriate time. Intention-to-treat analysis was used. A total of 48 patients were randomized, 23 to the SMS group and 25 to the control group. Adherence to stent removal/exchange was reported in 78.2 % (18/23) of patients receiving the SMS intervention compared with 40 % (10/25) in the control group (RR 1.98, 95 % CI 1.16-3.31; p = 0.010). Among patients with plastic stent insertion, the median interval from stent implantation to stent removal/exchange was 90 days in the SMS group and 136 days in the control group (HR 0.36, 95 % CI 0.16-0.84, p = 0.018). No difference was found between the two groups regarding late-stage stent-related complications. The rate of recurrent abdominal pain tended to be lower in the SMS group, without a significant difference (8.7 vs 28 %, p = 0.144). Multivariate logistic regression analysis revealed that SMS reminding was the only factor associated with adherence to stent removal/exchange (OR 6.73, 95 % CI 1.64-27.54, p = 0.008).
This first effectiveness trial demonstrated that monthly SMS reminders can significantly increase patient adherence to stent removal/exchange after ERCP. The study was retrospectively registered on July 10, 2016 at ClinicalTrials.gov ( NCT02831127 ).
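    The crude risk ratio can be checked directly from the counts in the abstract (18/23 adherent with SMS vs 10/25 without). Note the crude ratio comes out near, but not exactly at, the reported RR of 1.98, so the published figure presumably reflects a slightly different estimator.

    ```python
    # Reproduce the crude (unadjusted) risk ratio from the reported counts.
    sms_adherent, sms_total = 18, 23
    ctrl_adherent, ctrl_total = 10, 25

    risk_sms = sms_adherent / sms_total      # ~0.78 adherence with SMS reminders
    risk_ctrl = ctrl_adherent / ctrl_total   # 0.40 adherence in the control group
    rr = risk_sms / risk_ctrl
    print(round(rr, 2))  # 1.96
    ```
    
    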

  7. Method for modeling social care processes for national information exchange.

    PubMed

    Miettinen, Aki; Mykkänen, Juha; Laaksonen, Maarit

    2012-01-01

    Finnish social services comprise 21 social welfare service commissions, including Adoption counselling, Income support, Child welfare, Services for immigrants, and Substance abuse care. This paper describes the method used for process modeling in the National Project for IT in Social Services in Finland (Tikesos). Process modeling in the project aimed to support common national target-state processes from the perspectives of a national electronic archive, increased interoperability between systems, and electronic client documents. The process steps and other aspects of the method are presented. The method was developed, used and refined during the three years of process modeling in the national project.

  8. On architecting and composing engineering information services to enable smart manufacturing

    PubMed Central

    Ivezic, Nenad; Srinivasan, Vijay

    2016-01-01

    Engineering information systems play an important role in the current era of digitization of manufacturing, which is a key component to enable smart manufacturing. Traditionally, these engineering information systems spanned the lifecycle of a product by providing interoperability of software subsystems through a combination of open and proprietary exchange of data. But research and development efforts are underway to replace this paradigm with engineering information services that can be composed dynamically to meet changing needs in the operation of smart manufacturing systems. This paper describes the opportunities and challenges in architecting such engineering information services and composing them to enable smarter manufacturing. PMID:27840595

  9. 76 FR 66040 - NIST Framework and Roadmap for Smart Grid Interoperability Standards, Release 2.0 (Draft...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-25

    ...-01] NIST Framework and Roadmap for Smart Grid Interoperability Standards, Release 2.0 (Draft... draft version of the NIST Framework and Roadmap for Smart Grid Interoperability Standards, Release 2.0... Roadmap for Smart Grid Interoperability Standards, Release 2.0 (Release 2.0) (Draft) for public review and...

  10. Enabling interoperability in planetary sciences and heliophysics: The case for an information model

    NASA Astrophysics Data System (ADS)

    Hughes, J. Steven; Crichton, Daniel J.; Raugh, Anne C.; Cecconi, Baptiste; Guinness, Edward A.; Isbell, Christopher E.; Mafi, Joseph N.; Gordon, Mitchell K.; Hardman, Sean H.; Joyner, Ronald S.

    2018-01-01

    The Planetary Data System has developed the PDS4 Information Model to enable interoperability across diverse science disciplines. The Information Model is based on an integration of International Organization for Standardization (ISO)-level standards for trusted digital archives, information model development, and metadata registries. Whereas controlled vocabularies provide a basic level of interoperability by supplying a common set of terms for communication between machines and humans, the Information Model improves interoperability by means of an ontology that provides semantic information, or additional related context, for the terms. The Information Model was defined by a team of computer scientists and science experts from each of the diverse disciplines in the Planetary Science community, including Atmospheres, Geosciences, Cartography and Imaging Sciences, Navigational and Ancillary Information, Planetary Plasma Interactions, Ring-Moon Systems, and Small Bodies. The model was designed to be extensible beyond the Planetary Science community; for example, there are overlaps between certain PDS disciplines and the Heliophysics and Astrophysics disciplines. "Interoperability" can apply to many aspects of both the developer and end-user experience, for example agency-to-agency, semantic-level, and application-level interoperability. We define these types of interoperability and focus on semantic-level interoperability, the type most directly enabled by an information model.
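    The contrast the abstract draws, a flat controlled vocabulary versus an ontology that adds semantic context, can be sketched with a toy example. The terms and relations below are invented for illustration and are not part of the actual PDS4 model.

    ```python
    # Toy contrast: a controlled vocabulary is agreement on terms only; an
    # ontology attaches relations that support semantic questions.
    vocabulary = {"moon", "ring_system", "planet"}   # shared names, nothing more

    ontology = {
        "moon": {"is_a": "natural_satellite", "orbits": "planet"},
        "ring_system": {"is_a": "particle_disk", "orbits": "planet"},
    }

    def shares_relation(a, b, relation):
        """A semantic question a flat vocabulary cannot answer."""
        return ontology[a].get(relation) == ontology[b].get(relation)

    # Both concepts orbit a planet: inferable from the ontology, invisible
    # to the bare vocabulary.
    print(shares_relation("moon", "ring_system", "orbits"))
    ```
    
    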

  11. A methodology based on openEHR archetypes and software agents for developing e-health applications reusing legacy systems.

    PubMed

    Cardoso de Moraes, João Luís; de Souza, Wanderley Lopes; Pires, Luís Ferreira; do Prado, Antonio Francisco

    2016-10-01

    In Pervasive Healthcare, novel information and communication technologies are applied to support the provision of health services anywhere, at any time and to anyone. Since health systems may offer their health records in different electronic formats, the openEHR Foundation prescribes the use of archetypes for describing clinical knowledge in order to achieve semantic interoperability between these systems. Software agents have been applied to simulate human skills in some healthcare procedures. This paper presents a methodology, based on the use of openEHR archetypes and agent technology, which aims to overcome the weaknesses typically found in legacy healthcare systems, thereby adding value to these systems. The methodology was applied in the design of an agent-based system used in a realistic healthcare scenario, in which a medical staff meeting to prepare a cardiac surgery was supported. We conducted experiments with this system in a distributed environment composed of three cardiology clinics and a center of cardiac surgery, all located in the city of Marília (São Paulo, Brazil). We evaluated the system according to the Technology Acceptance Model. The case study confirmed the acceptance of our agent-based system by healthcare professionals and patients, who reacted positively with respect to the usefulness of this system in particular and to task delegation to software agents in general. The case study also showed that both a software agent-based interface and a tools-based alternative must be provided to end users, allowing them to perform tasks themselves or to delegate them to other people. A Pervasive Healthcare model requires efficient and secure information exchange between healthcare providers. 
The proposed methodology allows designers to build communication systems for message exchange among heterogeneous healthcare systems, and to shift from systems that rely on the informal communication of actors to a more automated and less error-prone agent-based system. Our methodology preserves the significant investment of many years in legacy systems and allows developers to extend them with new features, providing proactive assistance to end users and increasing user mobility with appropriate support. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  12. Advancing the Strategic Messages Affecting Robot Trust Effect: The Dynamic of User- and Robot-Generated Content on Human-Robot Trust and Interaction Outcomes.

    PubMed

    Liang, Yuhua Jake; Lee, Seungcheol Austin

    2016-09-01

    Human-robot interaction (HRI) will soon transform the communication landscape such that people exchange messages with robots. However, successful HRI requires people to trust robots, and, in turn, that trust affects the interaction. Although prior research has examined the determinants of human-robot trust (HRT) during HRI, no research has examined the messages people receive before interacting with robots and their effect on HRT. We conceptualize these messages as SMART (Strategic Messages Affecting Robot Trust). Moreover, we posit that SMART can ultimately affect actual HRI outcomes (i.e., robot evaluations, robot credibility, participant mood) by harnessing the persuasive influence of user-generated content (UGC) on participatory Web sites. In Study 1, participants were assigned to one of two conditions (UGC/control) in an original experiment on HRT. Compared with the control (descriptive information only), results showed that UGC moderated the correlation between HRT and interaction outcomes in a positive direction (average Δr = +0.39) for robots as media and robots as tools. In Study 2, we explored the effect of robot-generated content but did not find similar moderation effects. These findings point to an important empirical potential to employ SMART in future robot deployments.

  13. 76 FR 51271 - Implementing a Nationwide, Broadband, Interoperable Public Safety Network in the 700 MHz Band

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-18

    ... Docket 07-100; FCC 11-6] Implementing a Nationwide, Broadband, Interoperable Public Safety Network in the... interoperable public safety broadband network. The establishment of a common air interface for 700 MHz public safety broadband networks will create a foundation for interoperability and provide a clear path for the...

  14. Political, policy and social barriers to health system interoperability: emerging opportunities of Web 2.0 and 3.0.

    PubMed

    Juzwishin, Donald W M

    2009-01-01

    Achieving effective health informatics interoperability in a fragmented and uncoordinated health system is by definition not possible. Interoperability requires the simultaneous integration of health care processes and information across different types and levels of care (systems thinking). The fundamental argument of this paper is that information system interoperability will remain an unfulfilled hope until health reforms effectively address the governance (accountability), structural and process barriers to interoperability of health care delivery. The ascendancy of Web 2.0 and 3.0, although still unproven, signals an opportunity to accelerate patients' access to health information and their health records. Policy suggestions for simultaneously advancing health system delivery and information system interoperability are posited.

  15. D-ATM, a working example of health care interoperability: From dirt path to gravel road.

    PubMed

    DeClaris, John-William

    2009-01-01

    For many years, there have been calls for interoperability within health care systems. The technology currently exists and is being used in business areas like banking and commerce, to name a few. Yet the question remains, why has interoperability not been achieved in health care? This paper examines issues encountered and success achieved with interoperability during the development of the Digital Access To Medication (D-ATM) project, sponsored by the Substance Abuse and Mental Health Services Administration (SAMHSA). D-ATM is the first government funded interoperable patient management system. The goal of this paper is to provide lessons learned and propose one possible road map for health care interoperability within private industry and how government can help.

  16. Health information exchange: national and international approaches.

    PubMed

    Vest, Joshua R

    2012-01-01

    Health information exchange (HIE), the process of electronically moving patient-level information between different organizations, is viewed as a solution to the fragmentation of data in health care. This review describes the current state of HIE in seven nations, as well as three international HIE efforts, with a particular focus on the relation of exchange efforts to national health care systems, common challenges, and the implications of cross-border information sharing. National and international efforts highlighted in English-language informatics journals, professional associations, and government reports are described. Fully functioning HIE is not yet a common phenomenon worldwide. However, multiple nations see the potential benefits of HIE, and that has led to national and international efforts of varying scope, scale, and purview. National efforts continue to work to overcome the challenges of interoperability, record linking, insufficient infrastructure, governance, and interorganizational relationships, but have created architectural strategies, oversight agencies, and incentives to foster exchange. The three international HIE efforts reviewed represent very different approaches to the same problem of ensuring the availability of health information across borders. The potential of HIE to address many cost and quality issues will keep it on many national agendas. In many instances, health care executives and leaders have opportunities to work within national programs to help shape local exchange governance and choose technology partners. Furthermore, HIE raises policy questions concerning the role of centralized planning, national identifiers, standards, and the types of information exchanged, each of which is a vital issue for individual health organizations and worthy of their attention.

  17. Interoperability of Information Systems Managed and Used by the Local Health Departments.

    PubMed

    Shah, Gulzar H; Leider, Jonathon P; Luo, Huabin; Kaur, Ravneet

    2016-01-01

    In the post-Affordable Care Act era marked by interorganizational collaborations and availability of large amounts of electronic data from other community partners, it is imperative to assess the interoperability of information systems used by the local health departments (LHDs). To describe the level of interoperability of LHD information systems and identify factors associated with lack of interoperability. This mixed-methods research uses data from the 2015 Informatics Capacity and Needs Assessment Survey, with a target population of all LHDs in the United States. A representative sample of 650 LHDs was drawn using a stratified random sampling design. A total of 324 completed responses were received (50% response rate). Qualitative data were used from a key informant interview study of LHD informatics staff from across the United States. Qualitative data were independently coded by 2 researchers and analyzed thematically. Survey data were cleaned, bivariate comparisons were conducted, and a multivariable logistic regression was run to characterize factors associated with interoperability. For 30% of LHDs, no systems were interoperable, and 38% of LHD respondents indicated some of the systems were interoperable. Significant determinants of interoperability included LHDs having leadership support (adjusted odds ratio [AOR] = 3.54), control of information technology budget allocation (AOR = 2.48), control of data systems (AOR = 2.31), having a strategic plan for information systems (AOR = 1.92), and existence of business process analysis and redesign (AOR = 1.49). Interoperability of all systems may be an informatics goal, but only a small proportion of LHDs reported having interoperable systems, pointing to a substantial need among LHDs nationwide.
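    A note on reading the adjusted odds ratios (AORs) reported above: odds ratios scale odds multiplicatively, not probabilities. The small helper below (illustrative only, not part of the study) converts a baseline probability into the probability implied by a given odds ratio:

```python
def apply_odds_ratio(p_base: float, aor: float) -> float:
    """Return the outcome probability after scaling baseline odds by `aor`.

    Odds ratios act on odds p/(1-p), so an AOR of 3.54 does NOT mean
    '3.54 times the probability'; the probability is recovered from
    the scaled odds.
    """
    odds = p_base / (1.0 - p_base)   # baseline odds
    new_odds = odds * aor            # odds ratio scales odds multiplicatively
    return new_odds / (1.0 + new_odds)

# Example: if interoperability were a 50/50 proposition at baseline,
# an AOR of 3.54 (leadership support) implies roughly a 78% probability.
p = apply_odds_ratio(0.5, 3.54)
```

The baseline probability here is hypothetical; the study reports only the ratios, not baseline rates.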

  18. Saying What You Mean without Being Mean

    ERIC Educational Resources Information Center

    Reilly, Marceta

    2016-01-01

    There's bad news and good news about feedback and teachers collaborating: Getting feedback on our performance is a great way to grow as educators--but feedback often backfires and doesn't produce change in the person getting the feedback. Reilly notes that there are two components to a feedback exchange: the content--the message the person…

  19. Academic Dishonesty: Behaviors, Sanctions, and Retention of Adjudicated College Students

    ERIC Educational Resources Information Center

    Olafson, Lori; Schraw, Gregory; Kehrwald, Nicholas

    2014-01-01

    Academic dishonesty, also known as academic misconduct, includes a variety of actions such as plagiarism, cheating on tests using text messaging or concealed notes, exchanging work with other students, buying essays from students or on the Internet, and having other students write examinations (Diekhoff, LaBeff, Shinohara, & Yasukawa, 1999;…

  20. A Cell Phone in the Classroom: A Friend or a Foe?

    ERIC Educational Resources Information Center

    Irina, Averianova

    2012-01-01

    Communication is getting increasingly mobile, with more than a third of the world's population using cellular phones. Recent statistics indicate that this proportion is much bigger among young people. Research has also registered significant predominance of short message exchange over other modes of interaction in youth culture, where e-mail is…

  1. 77 FR 56900 - Self-Regulatory Organizations; EDGA Exchange, Inc.; Notice of Filing and Immediate Effectiveness...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-14

    ... to manage their order and message flow consistently with their business models. In addition, the... pricing models that are designed to incentivize customers to increase liquidity, without any restriction... Reference Room, 100 F Street NE., Washington, DC 20549, on official business days between the hours of 10 a...

  2. The Beat of Boyle Street: Empowering Aboriginal Youth through Music Making

    ERIC Educational Resources Information Center

    Wang, Elaine L.

    2010-01-01

    An irrepressibly popular musical phenomenon, hip-hop is close to spoken word and focuses on lyrics with a message, reviving local traditions of song that tell histories, counsel listeners, and challenge participants to outdo one another in clever exchanges. A hip-hop music-making program in Edmonton, Canada, successfully reengages at-risk…

  3. A Stateful Multicast Access Control Mechanism for Future Metro-Area-Networks.

    ERIC Educational Resources Information Center

    Sun, Wei-qiang; Li, Jin-sheng; Hong, Pei-lin

    2003-01-01

    Multicasting is a necessity for a broadband metro-area-network; however security problems exist with current multicast protocols. A stateful multicast access control mechanism, based on MAPE, is proposed. The architecture of MAPE is discussed, as well as the states maintained and messages exchanged. The scheme is flexible and scalable. (Author/AEF)

  4. National electronic health record interoperability chronology.

    PubMed

    Hufnagel, Stephen P

    2009-05-01

    The federal initiative for electronic health record (EHR) interoperability began in 2000 and set the stage for the establishment of the 2004 Executive Order for EHR interoperability by 2014. This article discusses the chronology from the 2001 e-Government Consolidated Health Informatics (CHI) initiative through the current congressional mandates for an aligned, interoperable, and agile DoD AHLTA and VA VistA.

  5. Multiple Two-Way Time Message Exchange (TTME) Time Synchronization for Bridge Monitoring Wireless Sensor Networks

    PubMed Central

    Shi, Fanrong; Tuo, Xianguo; Yang, Simon X.; Li, Huailiang; Shi, Rui

    2017-01-01

    Wireless sensor networks (WSNs) have been widely used to collect valuable information in Structural Health Monitoring (SHM) of bridges, using various sensors, such as temperature, vibration and strain sensors. Since multiple sensors are distributed on the bridge, accurate time synchronization is very important for multi-sensor data fusion and information processing. Based on the shape of the bridge, a spanning tree is employed to build linear-topology WSNs and achieve time synchronization in this paper. Two-way time message exchange (TTME) and maximum likelihood estimation (MLE) are employed for clock offset estimation. Multiple TTMEs are proposed to obtain a subset of TTME observations. A timeout restriction and retry mechanism are employed to avoid the estimation errors that are caused by continuous clock offset and software latencies. The simulation results show that the proposed algorithm can avoid the estimation errors caused by clock drift and minimize the estimation error due to large random delay jitter. The proposed algorithm is an accurate and low-complexity time synchronization algorithm for bridge health monitoring. PMID:28471418
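    The per-exchange offset arithmetic of TTME, and the fact that under symmetric delay jitter the maximum likelihood estimate collapses to a sample mean over multiple exchanges, can be sketched as follows. This is a simplified simulation, not the authors' implementation; all parameter values are illustrative:

```python
import random

def ttme_offset_estimates(true_offset, n_exchanges, jitter=0.5,
                          prop_delay=2.0, seed=0):
    """Simulate n two-way time message exchanges (TTME) between a child
    node and its reference node, returning one offset estimate per round.

    Timestamps follow the usual TTME convention:
      T1: child sends request (child clock)
      T2: reference receives it (reference clock)
      T3: reference sends reply (reference clock)
      T4: child receives reply (child clock)
    Per-round estimate: ((T2 - T1) - (T4 - T3)) / 2.
    """
    rng = random.Random(seed)
    estimates = []
    t = 0.0
    for _ in range(n_exchanges):
        d_up = prop_delay + rng.uniform(0, jitter)    # uplink delay + jitter
        d_down = prop_delay + rng.uniform(0, jitter)  # downlink delay + jitter
        t1 = t
        t2 = t1 + d_up + true_offset     # reference clock runs `true_offset` ahead
        t3 = t2 + 0.1                    # small turnaround time at the reference
        t4 = t3 + d_down - true_offset   # back on the child clock
        estimates.append(((t2 - t1) - (t4 - t3)) / 2.0)
        t = t4 + 1.0
    return estimates

def mle_offset(estimates):
    """With symmetric zero-mean delay jitter, the MLE of the clock offset
    from repeated TTME rounds reduces to the sample mean."""
    return sum(estimates) / len(estimates)
```

Each single-round estimate equals the true offset plus (d_up - d_down)/2, so averaging many rounds cancels the symmetric jitter term; the paper's timeout-and-retry mechanism additionally discards rounds corrupted by software latencies, which this sketch omits.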

  6. Multiple Two-Way Time Message Exchange (TTME) Time Synchronization for Bridge Monitoring Wireless Sensor Networks.

    PubMed

    Shi, Fanrong; Tuo, Xianguo; Yang, Simon X; Li, Huailiang; Shi, Rui

    2017-05-04

    Wireless sensor networks (WSNs) have been widely used to collect valuable information in Structural Health Monitoring (SHM) of bridges, using various sensors, such as temperature, vibration and strain sensors. Since multiple sensors are distributed on the bridge, accurate time synchronization is very important for multi-sensor data fusion and information processing. Based on the shape of the bridge, a spanning tree is employed to build linear-topology WSNs and achieve time synchronization in this paper. Two-way time message exchange (TTME) and maximum likelihood estimation (MLE) are employed for clock offset estimation. Multiple TTMEs are proposed to obtain a subset of TTME observations. A timeout restriction and retry mechanism are employed to avoid the estimation errors that are caused by continuous clock offset and software latencies. The simulation results show that the proposed algorithm can avoid the estimation errors caused by clock drift and minimize the estimation error due to large random delay jitter. The proposed algorithm is an accurate and low-complexity time synchronization algorithm for bridge health monitoring.

  7. A large-alphabet three-party quantum key distribution protocol based on orbital and spin angular momenta hybrid entanglement

    NASA Astrophysics Data System (ADS)

    Lai, Hong; Luo, Mingxing; Zhang, Jun; Pieprzyk, Josef; Pan, Lei; Orgun, Mehmet A.

    2018-07-01

    The orthogonality of the orbital angular momentum (OAM) eigenstates enables a single photon to carry an arbitrary number of bits. Moreover, additional degrees of freedom (DOFs) of OAM can span a high-dimensional Hilbert space, which could greatly increase information capacity and security. Furthermore, the use of a spin angular momentum-OAM hybrid entangled state can increase the Shannon dimensionality, because photons can be hybrid entangled in multiple DOFs. Based on these observations, we develop a hybrid entanglement quantum key distribution (QKD) protocol to achieve three-party quantum key distribution without classical message exchanges. In our proposed protocol, a communicating party uses a spatial light modulator (SLM) and a specific phase hologram to modulate photons' OAM state. Similarly, the other communicating parties use their SLMs and fixed, different phase holograms to modulate the OAM entangled photon pairs, producing the shared key among the parties Alice, Bob and Charlie without classical message exchanges. More importantly, when the same operation is repeated for every party, our protocol can be extended to a multiple-party QKD protocol.

  8. Employing Semantic Technologies for the Orchestration of Government Services

    NASA Astrophysics Data System (ADS)

    Sabol, Tomáš; Furdík, Karol; Mach, Marián

    The main aim of eGovernment is to provide efficient, secure, inclusive services for citizens and businesses. The necessity to integrate services and information resources, to increase accessibility, and to reduce the administrative burden on citizens and enterprises: these are only a few reasons why the eGovernment paradigm has shifted from a supply-driven approach toward connected governance, emphasizing the concept of interoperability (Archmann and Nielsen 2008). On the EU level, interoperability is explicitly addressed as one of the four main challenges, for example in the i2010 strategy (i2010 2005). The Commission's Communication (Interoperability for Pan-European eGovernment Services 2006) strongly emphasizes the necessity of interoperable eGovernment services, based on standards, open specifications, and open interfaces. The Pan-European interoperability initiatives, such as the European Interoperability Framework (2004) and IDABC, as well as many projects supported by the European Commission within the IST Program and the Competitiveness and Innovation Program (CIP), illustrate the importance of interoperability on the EU level.

  9. Empowering open systems through cross-platform interoperability

    NASA Astrophysics Data System (ADS)

    Lyke, James C.

    2014-06-01

    Most of the motivations for open systems lie in the expectation of interoperability, sometimes referred to as "plug-and-play". Nothing in the notion of "open-ness", however, guarantees this outcome, which makes the increased interest in open architecture more perplexing. In this paper, we explore certain themes of open architecture. We introduce the concept of "windows of interoperability", which can be used to align disparate portions of architecture. Such "windows of interoperability", which concentrate on a reduced set of protocol and interface features, might achieve many of the broader purposes assigned as benefits in open architecture. Since it is possible to engineer proprietary systems that interoperate effectively, this nuanced definition of interoperability may in fact be a more important concept to understand and nurture for effective systems engineering and maintenance.

  10. Hanging by a thread: exploring the features of nonresponse in an online young adult cancer survivorship support community.

    PubMed

    Crook, Brittani; Glowacki, Elizabeth M; Love, Brad; Jones, Barbara L; Macpherson, Catherine Fiona; Johnson, Rebecca H

    2016-02-01

    Finding helpful information can be challenging for young adult (YA) cancer survivors; thus, it is critical to examine features of online posts that successfully solicit responses and assess how these differ from posts that do not solicit responses. Using posts from an online YA cancer support community, we analyzed initial posts that did and did not receive replies utilizing Linguistic Inquiry and Word Count (LIWC). Independent t tests revealed significant differences between the sets of posts regarding content, emotions, cognitive processes, pronoun use, and linguistic complexity. More specifically, posts with replies contained fewer words per sentence, had more first-person pronouns, had more expressions of negative emotions, and contained more present tense and past tense verbs. The findings of this study can help improve peer-exchanged support in online communities so that YA cancer survivors can more effectively receive digital support. This research also provides communication researchers, health educators, and care providers a lens for understanding the YA cancer survivorship experience. This research helps survivors be strategic in how they use online forums to seek advice and support. More complete understanding of what kinds of prompts produce responses allows those in need to craft messages in ways that are most likely to elicit support from fellow cancer survivors. These implications for message design extend beyond blogging and can be applicable for text message and email exchanges between cancer patients and their care providers.

  11. Key pillars of data interoperability in Earth Sciences - INSPIRE and beyond

    NASA Astrophysics Data System (ADS)

    Tomas, Robert; Lutz, Michael

    2013-04-01

    The well-known heterogeneity and fragmentation of the data models, formats and controlled vocabularies of environmental data limit potential data users' ability to utilise the wealth of environmental information available today across Europe. The main aim of INSPIRE1 is to improve this situation and give users the possibility to access, use and correctly interpret environmental data. Over the past years, a number of INSPIRE technical guidelines (TG) and implementing rules (IR) for interoperability have been developed, involving hundreds of domain experts from across Europe. The data interoperability specifications, which have been developed for all 34 INSPIRE spatial data themes2, are the central component of the TG and IR. Several of these themes are related to the earth sciences, e.g. geology (including hydrogeology, geophysics and geomorphology), mineral and energy resources, soil science, natural hazards, meteorology, oceanography, hydrology and land cover. The following main pillars for data interoperability and harmonisation have been identified during the development of the specifications. Conceptual data models describe the spatial objects and their properties and relationships for the different spatial data themes; to achieve cross-domain harmonisation, the data models for all themes are based on a common modelling framework (the INSPIRE Generic Conceptual Model3) and managed in a common UML repository. Harmonised vocabularies (or code lists) are to be used in data exchange in order to overcome interoperability issues caused by heterogeneous free-text and/or multi-lingual content; since a mapping to a harmonised vocabulary can be difficult, the INSPIRE data models typically allow the provision of more specific terms from local vocabularies in addition to the harmonised terms, utilising either the extensibility options or additional terminological attributes. Encoding: currently, specific XML profiles of the Geography Markup Language (GML) are promoted as the standard encoding; however, since the conceptual models are independent of concrete encodings, it is also possible to derive other encodings (e.g. based on RDF). Registers provide unique and persistent identifiers for a number of different types of information items (e.g. terms from a controlled vocabulary or units of measure) and allow their consistent management and versioning; by using these identifiers in data, references to specific information items can be made unique and unambiguous.
It is important that these interoperability solutions are not developed in isolation, for Europe only. This has been recognised from the beginning, and therefore international standards have been taken into account and widely referred to in INSPIRE. This mutual cooperation with international standardisation activities needs to be maintained or even extended. For example, where INSPIRE has gone beyond existing standards, the INSPIRE interoperability solutions should be introduced to the international standardisation initiatives. However, in some cases it is difficult to choose the appropriate international organisation or standardisation body (e.g. where several organisations overlap in scope) or to achieve international agreements that accept European specifics. Furthermore, the development of the INSPIRE specifications (to be legally adopted in 2013) is only the beginning of the effort to make environmental data interoperable. Their actual implementation by data providers across Europe, as well as the rapid development in the earth sciences (e.g. from new simulation models, scientific advances, etc.) and ICT technology, will lead to requests for changes. It is therefore crucial to ensure the long-term sustainable maintenance and further development of the proposed infrastructure. This task cannot be achieved by the INSPIRE coordination team of the European Commission alone; it is essential to closely involve relevant (where possible, umbrella) organisations in the earth sciences, who can provide the necessary domain knowledge and expert networks.
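    The role registers play, resolving persistent identifiers to versioned code-list terms, can be illustrated with a toy implementation. This is a sketch only: real INSPIRE registers are web services, and the URIs and terms below are invented:

```python
class CodeListRegister:
    """Toy register: persistent identifiers (URI strings) resolve to
    versioned code-list terms, so datasets can reference a term
    unambiguously even as its label evolves."""

    def __init__(self):
        self._items = {}  # uri -> list of (version, label) pairs

    def register(self, uri, version, label):
        """Add a new version of the term identified by `uri`."""
        self._items.setdefault(uri, []).append((version, label))

    def resolve(self, uri, version=None):
        """Return the label for `uri`; latest version unless one is given."""
        versions = self._items[uri]
        if version is None:
            return max(versions)[1]      # highest version number wins
        return dict(versions)[version]

# Invented example identifier, loosely modelled on code-list URI patterns:
reg = CodeListRegister()
reg.register("http://example.org/codelist/LithologyValue/granite", 1, "granite")
reg.register("http://example.org/codelist/LithologyValue/granite", 2, "granite (revised)")
```

Because data carries only the identifier, a revised label (version 2) never breaks references made against version 1.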

  12. Ontology-Based Architecture for Intelligent Transportation Systems Using a Traffic Sensor Network.

    PubMed

    Fernandez, Susel; Hadfi, Rafik; Ito, Takayuki; Marsa-Maestre, Ivan; Velasco, Juan R

    2016-08-15

    Intelligent transportation systems are a set of technological solutions used to improve the performance and safety of road transportation. A crucial element for the success of these systems is the exchange of information, not only between vehicles, but also among other components in the road infrastructure through different applications. One of the most important information sources in this kind of systems is sensors. Sensors can be within vehicles or as part of the infrastructure, such as bridges, roads or traffic signs. Sensors can provide information related to weather conditions and traffic situation, which is useful to improve the driving process. To facilitate the exchange of information between the different applications that use sensor data, a common framework of knowledge is needed to allow interoperability. In this paper an ontology-driven architecture to improve the driving environment through a traffic sensor network is proposed. The system performs different tasks automatically to increase driver safety and comfort using the information provided by the sensors.
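    The kind of shared-vocabulary querying an ontology enables can be illustrated with a minimal in-memory triple store. This is a sketch only, not the paper's architecture; the class and property names are invented:

```python
# Minimal in-memory triple store: once applications agree on a common
# vocabulary of subject-predicate-object statements, each can query the
# same sensor facts without bilateral format negotiations.
triples = set()

def add(s, p, o):
    triples.add((s, p, o))

def query(s=None, p=None, o=None):
    """Triple-pattern match; None acts as a wildcard."""
    return [(ts, tp, to) for (ts, tp, to) in triples
            if (s is None or ts == s)
            and (p is None or tp == p)
            and (o is None or to == o)]

# Illustrative facts (vocabulary invented for this sketch):
add("sensor42", "rdf:type", "ts:RoadSurfaceSensor")
add("sensor42", "ts:locatedOn", "road:A2")
add("sensor42", "ts:observes", "ts:Temperature")
```

A weather application could then ask `query(p="ts:observes")` while a maintenance application asks `query(s="sensor42")`, both against the same shared model; production systems would use an RDF store and SPARQL rather than this toy matcher.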

  13. Wrestling With a Paradox: Complexity in Interoperability Standards Making for Healthcare Information Systems

    NASA Astrophysics Data System (ADS)

    Pittaway, Jeff; Archer, Norm

    Medical interventions are often delayed or erroneous when information needed for diagnosing or prescribing is missing or unavailable. In support of increased information flows, the healthcare industry has invested substantially in standards intended to specify, routinize, and make uniform the type and format of medical information in clinical healthcare information systems such as Electronic Medical Record systems (EMRs). However, fewer than one in four Canadian physicians have adopted EMRs. Deeper analysis illustrates that physicians may perceive value in standardized EMRs when they need to exchange information in highly structured situations among like participants and like environments. However, standards present restrictive barriers to practitioners when they face equivocal situations, unforeseen contingencies, or exchange information across different environments. These barriers constitute a compelling explanation for at least part of the observed low EMR adoption rates. Our recommendations to improve the perceived value of standardized clinical information systems espouse re-conceptualizing the role of standards to embrace greater flexibility in some areas.

  14. Assessing the performance of LOINC® and RadLex for coverage of CT scans across three sites in a health information exchange.

    PubMed

    Beitia, Anton Oscar; Kuperman, Gilad; Delman, Bradley N; Shapiro, Jason S

    2013-01-01

    We evaluated the performance of LOINC® and RadLex standard terminologies for covering CT test names from three sites in a health information exchange (HIE) with the eventual goal of building an HIE-based clinical decision support system to alert providers of prior duplicate CTs. Given the goal, the most important parameter to assess was coverage for high frequency exams that were most likely to be repeated. We showed that both LOINC® and RadLex provided sufficient coverage for our use case through calculations of (a) high coverage of 90% and 94%, respectively for the subset of CTs accounting for 99% of exams performed and (b) high concept token coverage (total percentage of exams performed that map to terminologies) of 92% and 95%, respectively. With trends toward greater interoperability, this work may provide a framework for those wishing to map radiology site codes to a standard nomenclature for purposes of tracking resource utilization.
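    The volume-weighted "concept token coverage" figure described above can be computed in a few lines. The sketch below uses invented exam codes and volumes, not the study's data:

```python
def concept_token_coverage(exam_volumes, mapped_codes):
    """Percentage of exams *performed* whose local code maps into the
    target terminology (e.g. LOINC or RadLex).

    exam_volumes: dict of local exam code -> number of exams performed
    mapped_codes: set of local codes that have a terminology mapping
    Weighting by volume is what makes high-frequency (likely-repeated)
    exams dominate the coverage figure.
    """
    total = sum(exam_volumes.values())
    mapped = sum(v for code, v in exam_volumes.items() if code in mapped_codes)
    return 100.0 * mapped / total

# Hypothetical site: one rare exam type unmapped, common ones mapped.
volumes = {"CT_HEAD": 60, "CT_CHEST": 30, "CT_RARE_PROTOCOL": 10}
coverage = concept_token_coverage(volumes, {"CT_HEAD", "CT_CHEST"})
```

With these invented numbers the unmapped exam type drags coverage to 90%, even though two of three codes map, which is why the authors report volume-weighted rather than code-count coverage.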

  15. Ontology-Based Architecture for Intelligent Transportation Systems Using a Traffic Sensor Network

    PubMed Central

    Fernandez, Susel; Hadfi, Rafik; Ito, Takayuki; Marsa-Maestre, Ivan; Velasco, Juan R.

    2016-01-01

    Intelligent transportation systems are a set of technological solutions used to improve the performance and safety of road transportation. A crucial element for the success of these systems is the exchange of information, not only between vehicles, but also among other components in the road infrastructure through different applications. One of the most important information sources in this kind of systems is sensors. Sensors can be within vehicles or as part of the infrastructure, such as bridges, roads or traffic signs. Sensors can provide information related to weather conditions and traffic situation, which is useful to improve the driving process. To facilitate the exchange of information between the different applications that use sensor data, a common framework of knowledge is needed to allow interoperability. In this paper an ontology-driven architecture to improve the driving environment through a traffic sensor network is proposed. The system performs different tasks automatically to increase driver safety and comfort using the information provided by the sensors. PMID:27537878

  16. Data management and data enrichment for systems biology projects.

    PubMed

    Wittig, Ulrike; Rey, Maja; Weidemann, Andreas; Müller, Wolfgang

    2017-11-10

    Collecting, curating, interlinking, and sharing high quality data are central to de.NBI-SysBio, the systems biology data management service center within the de.NBI network (German Network for Bioinformatics Infrastructure). The work of the center is guided by the FAIR principles for scientific data management and stewardship. FAIR stands for the four foundational principles Findability, Accessibility, Interoperability, and Reusability, which were established to enhance the ability of machines to automatically find, access, exchange and use data. Within this overview paper we describe three tools (SABIO-RK, Excemplify, SEEK) that exemplify the contribution of de.NBI-SysBio services to FAIR data, models, and experimental methods storage and exchange. The interconnectivity of the tools and the data workflow within systems biology projects will be explained. For many years we have been the German partner in the FAIRDOM initiative (http://fair-dom.org) to establish a European data and model management service facility for systems biology. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  17. Interaction Patterns of Nurturant Support Exchanged in Online Health Social Networking

    PubMed Central

    Yang, Christopher C

    2012-01-01

    Background Expressing emotion in online support communities is an important aspect of enabling e-patients to connect with each other and expand their social resources. Indirectly it increases the amount of support for coping with health issues. Exploring the supportive interaction patterns in online health social networking would help us better understand how technology features impact user behavior in this context. Objective To build on previous research that identified different types of social support in online support communities by delving into patterns of supportive behavior across multiple computer-mediated communication formats. Each format combines different architectural elements, affecting the resulting social spaces. Our research question compared communication across different formats of text-based computer-mediated communication provided on the MedHelp.org health social networking environment. Methods We identified messages with nurturant support (emotional, esteem, and network) across three different computer-mediated communication formats (forums, journals, and notes) of an online support community for alcoholism using content analysis. Our sample consisted of 493 forum messages, 423 journal messages, and 1180 notes. Results Nurturant support types occurred frequently among messages offering support (forum comments: 276/412 messages, 67.0%; journal posts: 65/88 messages, 74%; journal comments: 275/335 messages, 82.1%; and notes: 1002/1180 messages, 84.92%), but less often among messages requesting support. Of all the nurturant supports, emotional (ie, encouragement) appeared most frequently, with network and esteem support appearing in patterns of varying combinations. Members of the Alcoholism Community appeared to adapt some traditional face-to-face forms of support to their needs in becoming sober, such as provision of encouragement, understanding, and empathy to one another. 
Conclusions The computer-mediated communication format may have the greatest influence on the supportive interactions because of characteristics such as audience reach and access. Other factors include perception of community versus personal space or purpose of communication. These results lead to a need for further research. PMID:22555303

  18. Interaction patterns of nurturant support exchanged in online health social networking.

    PubMed

    Chuang, Katherine Y; Yang, Christopher C

    2012-05-03

    Expressing emotion in online support communities is an important aspect of enabling e-patients to connect with each other and expand their social resources. Indirectly it increases the amount of support for coping with health issues. Exploring the supportive interaction patterns in online health social networking would help us better understand how technology features impact user behavior in this context. To build on previous research that identified different types of social support in online support communities by delving into patterns of supportive behavior across multiple computer-mediated communication formats. Each format combines different architectural elements, affecting the resulting social spaces. Our research question compared communication across different formats of text-based computer-mediated communication provided on the MedHelp.org health social networking environment. We identified messages with nurturant support (emotional, esteem, and network) across three different computer-mediated communication formats (forums, journals, and notes) of an online support community for alcoholism using content analysis. Our sample consisted of 493 forum messages, 423 journal messages, and 1180 notes. Nurturant support types occurred frequently among messages offering support (forum comments: 276/412 messages, 67.0%; journal posts: 65/88 messages, 74%; journal comments: 275/335 messages, 82.1%; and notes: 1002/1180 messages, 84.92%), but less often among messages requesting support. Of all the nurturant supports, emotional (ie, encouragement) appeared most frequently, with network and esteem support appearing in patterns of varying combinations. Members of the Alcoholism Community appeared to adapt some traditional face-to-face forms of support to their needs in becoming sober, such as provision of encouragement, understanding, and empathy to one another. 
The computer-mediated communication format may have the greatest influence on the supportive interactions because of characteristics such as audience reach and access. Other factors include perception of community versus personal space or purpose of communication. These results lead to a need for further research.

  19. Health in arts: are arts settings better than sports settings for promoting anti-smoking messages?

    PubMed

    Davies, Christina; Knuiman, Matthew; Pikora, Terri; Rosenberg, Michael

    2015-05-01

    Tobacco smoking is a leading cause of preventable mortality and morbidity. Since 1991, the Western Australian Health Promotion Foundation (Healthway) has sponsored the arts and sport in exchange for cigarette smoke-free events, smoke-free policies and the promotion of anti-smoking messages (e.g. Quit, Smoke Free or Smarter than Smoking). As health promoters often look for innovative and effective settings in which to promote health, and as the approach of sponsoring the arts to promote health to the general population is uncommon, the purpose of this study was to evaluate the effectiveness of 'health in arts' by measuring the cognitive impact (message awareness, comprehension, acceptance and intention) of promoting anti-smoking messages at arts events, and comparing findings to sports events, a more traditional health promotion setting. A secondary analysis of the 2004-2009 Healthway Sponsorship Monitor data was conducted. A total of 12 arts events (n = 592 respondents) and 9 sports events (n = 420 respondents) sponsored by Healthway to promote an anti-smoking message were evaluated. The study was cross-sectional in design. Participants were residents of Western Australia aged 15 years or above and attended events as part of an audience or as spectators. Descriptive and regression analyses were conducted. After adjustment for demographic variables, smoking status and clustering, arts events were found to be as effective in promoting anti-smoking message awareness, comprehension and acceptance, and twice as effective on intention to act (p = .03), compared with sports events. This study provides evidence of the effectiveness of arts sponsorship to promote health to the general population, that is, health in arts. Promoting an anti-smoking message in arts settings was as, or more, effective than in sports settings. Results suggest that the arts should be utilised to communicate and reinforce anti-smoking messages to the general population. 
The suitability of the arts to promote other types of health messages should be investigated further. © Royal Society for Public Health 2013.

  20. Managing Interoperability for GEOSS - A Report from the SIF

    NASA Astrophysics Data System (ADS)

    Khalsa, S. J.; Actur, D.; Nativi, S.; Browdy, S.; Eglitis, P.

    2009-04-01

    The Global Earth Observation System of Systems (GEOSS) is a coordinating and integrating framework for Earth observing and information systems, which are contributed on a voluntary basis by Members and Participating Organizations of the intergovernmental Group on Earth Observations (GEO). GEOSS exists to support informed decision making for the benefit of society, including the implementation of international environmental treaty obligations. GEO Members and Participating Organizations use the GEOSS Common Infrastructure (GCI) to register their Earth observation resources, thereby making them discoverable and consumable by both humans and client applications. Essential to meeting GEO user needs is a process for supporting interoperability of observing, processing, modeling and dissemination capabilities. The GEO Standards and Interoperability Forum (SIF) was created to develop, implement and oversee this process. The SIF supports GEO organizations contributing resources to the GEOSS by helping them understand and work with the GEOSS interoperability guidelines and encouraging them to register their "interoperability arrangements" (standards or other ad hoc arrangements for interoperability) in the GEOSS standards registry, which is part of the GCI. These registered interoperability arrangements support the actual services used to achieve interoperability of systems. By making information about these interoperability arrangements available to users of the GEOSS, the SIF enhances the understanding and utility of contributed resources. We describe the procedures that the SIF has enacted to carry out its work. To operate effectively, the SIF uses a workflow system and is establishing a set of regional teams and domain experts. 
In the near term, our work has focused on populating and reviewing the GEOSS Standards Registry, but we are also developing approaches to achieving progressive convergence on, and uptake of, an optimal set of interoperability arrangements for all of GEOSS.
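The register-then-discover workflow described in this abstract can be illustrated with a minimal in-memory model. This is a sketch only: the class names, fields and identifiers below are illustrative assumptions, not the actual GCI registry schema or API.

```python
from dataclasses import dataclass, field

@dataclass
class InteroperabilityArrangement:
    """One registry entry (field names are hypothetical, not the GCI schema)."""
    arrangement_id: str                    # registry-assigned identifier
    name: str                              # human-readable name of the arrangement
    kind: str                              # "standard" or "ad hoc arrangement"
    governing_body: str                    # organization maintaining the arrangement
    service_urls: list[str] = field(default_factory=list)  # services that implement it

class StandardsRegistry:
    """Minimal in-memory registry supporting registration and discovery."""
    def __init__(self) -> None:
        self._entries: dict[str, InteroperabilityArrangement] = {}

    def register(self, entry: InteroperabilityArrangement) -> None:
        # Registration makes the arrangement discoverable by clients.
        self._entries[entry.arrangement_id] = entry

    def find_by_kind(self, kind: str) -> list[InteroperabilityArrangement]:
        # Discovery: filter registered arrangements by their kind.
        return [e for e in self._entries.values() if e.kind == kind]

# Usage: a contributing organization registers an arrangement,
# and a client later discovers all formal standards.
registry = StandardsRegistry()
registry.register(InteroperabilityArrangement(
    arrangement_id="std-001",
    name="OGC Web Map Service 1.3.0",
    kind="standard",
    governing_body="OGC",
    service_urls=["https://example.org/wms"],
))
print([e.name for e in registry.find_by_kind("standard")])
```

The separation between the arrangement record and the services that implement it mirrors the abstract's point that registered arrangements "support the actual services used to achieve interoperability."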
