CCSDS SM and C Mission Operations Interoperability Prototype
NASA Technical Reports Server (NTRS)
Lucord, Steven A.
2010-01-01
This slide presentation reviews a prototype of the Spacecraft Monitor and Control (SM&C) Mission Operations services for interoperability among space agencies. This particular prototype partners with the German Space Agency (DLR) to test ideas for interagency coordination.
NASA Technical Reports Server (NTRS)
Lucord, Steve A.; Gully, Sylvain
2009-01-01
The purpose of the PROTOTYPE INTEROPERABILITY DOCUMENT is to document the design and interfaces for the service providers and consumers of a Mission Operations prototype between JSC-OTF and DLR-GSOC. The primary goal is to test the interoperability sections of the CCSDS Spacecraft Monitor & Control (SM&C) Mission Operations (MO) specifications between both control centers. An additional goal is to provide feedback to the Spacecraft Monitor and Control (SM&C) working group through the Review Item Disposition (RID) process. This prototype is considered a proof of concept and should increase the knowledge base of the CCSDS SM&C Mission Operations standards. No operational capabilities will be provided. The CCSDS Mission Operations (MO) initiative was previously called Spacecraft Monitor and Control (SM&C). The specifications have been renamed to better reflect the scope and overall objectives. The working group retains the name Spacecraft Monitor and Control working group and is under the Mission Operations and Information Management Services (MOIMS) Area of CCSDS. This document will refer to the specifications as SM&C Mission Operations, Mission Operations or just MO.
CCSDS Spacecraft Monitor and Control Mission Operations Interoperability Prototype
NASA Technical Reports Server (NTRS)
Lucord, Steve; Martinez, Lindolfo
2009-01-01
We are entering a new era in space exploration. Reduced operating budgets require innovative solutions to leverage existing systems to implement the capabilities of future missions. Custom solutions to fulfill mission objectives are no longer viable. Can NASA adopt international standards to reduce costs and increase interoperability with other space agencies? Can legacy systems be leveraged in a service oriented architecture (SOA) to further reduce operations costs? The Operations Technology Facility (OTF) at the Johnson Space Center (JSC) is collaborating with Deutsches Zentrum für Luft- und Raumfahrt (DLR) to answer these very questions. The Mission Operations and Information Management Services Area (MOIMS) Spacecraft Monitor and Control (SM&C) Working Group within the Consultative Committee for Space Data Systems (CCSDS) is developing the Mission Operations standards to address this problem space. The set of proposed standards presents a service oriented architecture to increase the level of interoperability among space agencies. The OTF and DLR are developing independent implementations of the standards as part of an interoperability prototype. This prototype will address three key components: validation of the SM&C Mission Operations protocol, exploration of the Object Management Group (OMG) Data Distribution Service (DDS), and the incorporation of legacy systems in a SOA. The OTF will implement the service providers described in the SM&C Mission Operations standards to create a portal for interaction with a spacecraft simulator. DLR will implement the service consumers to perform the monitor and control of the spacecraft. The specifications insulate the applications from the underlying transport layer. We will gain experience with a DDS transport layer as we delegate responsibility to the middleware and explore transport bridges to connect disparate middleware products. A SOA facilitates the reuse of software components.
The prototype will leverage the capabilities of existing legacy systems. Various custom applications and middleware solutions will be combined into one system providing the illusion of a set of homogeneous services. This paper will document our journey as we implement the interoperability prototype. The team consists of software engineers with experience on the current command, telemetry and messaging systems that support the International Space Station (ISS) and Space Shuttle programs. Emphasis will be on the objectives, results and potential cost saving benefits.
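The central design point of the abstract above is that the MO specifications insulate service providers and consumers from the underlying transport. A minimal sketch of that DDS-style topic decoupling, in plain Python (the class, topic name, and payload are hypothetical illustrations, not the actual SM&C or DDS API):

```python
from collections import defaultdict
from typing import Any, Callable

class MessageBus:
    """Minimal topic-based publish-subscribe broker, standing in for a
    DDS-style middleware layer. Illustrative only: a real DDS product
    adds typed topics, discovery, and quality-of-service policies."""

    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[Any], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[Any], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, payload: Any) -> None:
        # Deliver to every handler registered for this topic.
        for handler in self._subscribers[topic]:
            handler(payload)

# A provider publishes telemetry without knowing who consumes it, and a
# consumer subscribes without knowing the transport details.
bus = MessageBus()
received = []
bus.subscribe("sm&c.parameter", received.append)
bus.publish("sm&c.parameter", {"name": "BATT_V", "value": 28.1})
print(received)
```

Swapping the broker for real middleware (or a transport bridge between two middleware products, as the prototype explores) would leave the provider and consumer code unchanged.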
OTF CCSDS Mission Operations Prototype Parameter Service. Phase I: Exit Presentation
NASA Technical Reports Server (NTRS)
Reynolds, Walter F.; Lucord, Steven A.; Stevens, John E.
2009-01-01
This slide presentation reviews Phase I of the parameter service design of the CCSDS Mission Operations prototype. The project goals are to: (1) demonstrate the use of Mission Operations standards to implement the Parameter Service; (2) demonstrate interoperability between the Houston MCC and a CCSDS Mission Operations compliant mission operations center; and (3) utilize the Mission Operations Common Architecture. The parameter service design, interfaces, and structures are described.
OTF CCSDS Mission Operations Prototype. Directory and Action Service. Phase I: Exit Presentation
NASA Technical Reports Server (NTRS)
Reynolds, Walter F.; Lucord, Steven A.; Stevens, John E.
2009-01-01
This slide presentation describes the Phase I directory and action service prototype for the CCSDS system. The project goals are to: (1) demonstrate the use of Mission Operations standards to implement the Directory and Action Services; (2) investigate Mission Operations language neutrality; (3) investigate C3I XML interoperability concepts; and (4) integrate applicable open source technologies in a Service Oriented Architecture.
NASA Technical Reports Server (NTRS)
Hall, Laverne; Hung, Chaw-Kwei; Lin, Imin
2000-01-01
The purpose of this paper is to provide a description of NASA JPL Distributed Systems Technology (DST) Section's object-oriented component approach to open inter-operable systems software development and software reuse. It will address what is meant by the terminology object component software, give an overview of the component-based development approach and how it relates to infrastructure support of software architectures and promotes reuse, enumerate the benefits of this approach, and give examples of application prototypes demonstrating its usage and advantages. Utilization of the object-oriented component technology approach for system development and software reuse will apply to several areas within JPL, and possibly across other NASA Centers.
Securing Sensitive Flight and Engine Simulation Data Using Smart Card Technology
NASA Technical Reports Server (NTRS)
Blaser, Tammy M.
2003-01-01
NASA Glenn Research Center has developed a smart card prototype capable of encrypting and decrypting disk files required to run a distributed aerospace propulsion simulation. Triple Data Encryption Standard (3DES) encryption is used to secure the sensitive intellectual property on disk before, during, and after simulation execution. The prototype operates as a secure system and maintains its authorized state by safely storing and permanently retaining the encryption keys only on the smart card. The prototype is capable of authenticating a single smart card user and includes pre-simulation and post-simulation tools for analysis and training purposes. The prototype's design is highly generic and can be used to protect any sensitive disk files, with growth capability to run multiple simulations. The NASA computer engineer developed the prototype on an interoperable programming environment to enable porting to other Numerical Propulsion System Simulation (NPSS) capable operating system environments.
NASA Astrophysics Data System (ADS)
Crutcher, Richard I.; Jones, R. W.; Moore, Michael R.; Smith, S. F.; Tolley, Alan L.; Rochelle, Robert W.
1997-02-01
A prototype 'smart' repeater that provides interoperability capabilities for radio communication systems in multi-agency and multi-user scenarios is being developed by the Oak Ridge National Laboratory. The smart repeater functions as a deployable communications platform that can be dynamically reconfigured to cross-link the radios of participating federal, state, and local government agencies. This interconnection capability improves the coordination and execution of multi-agency operations, including coordinated law enforcement activities and general emergency or disaster response scenarios. The repeater provides multiple channels of operation in the 30-50, 118-136, 138-174, and 403-512 MHz land mobile communications and aircraft bands while providing the ability to cross-connect among multiple frequencies, bands, modulation types, and encryption formats. Additionally, two telephone interconnects provide links to the fixed and cellular telephone networks. The 800- and 900-MHz bands are not supported by the prototype, but the modular design of the system accommodates future retrofits to extend frequency capabilities with minimal impact to the system. Configuration of the repeater is through a portable personal computer with a Windows-based graphical interface control screen that provides dynamic reconfiguration of network interconnections and formats.
Hosseini, Masoud; Ahmadi, Maryam; Dixon, Brian E.
2014-01-01
Clinical decision support (CDS) systems can support vaccine forecasting and immunization reminders; however, immunization decision-making requires data from fragmented, independent systems. Interoperability and accurate data exchange between immunization information systems (IIS) are essential for utilizing immunization CDS systems. Service oriented architecture (SOA) and Health Level 7 (HL7) are dominant standards for web-based exchange of clinical information. We implemented a system based on SOA and HL7 v3 to support immunization CDS in Iran. We evaluated system performance by exchanging 1500 immunization records for roughly 400 infants between two IISs. System turnaround time is less than a minute for synchronous operation calls, and the retrieved immunization histories of infants were always identical across systems. CDS-generated reports conformed to immunization guidelines, and the calculations for next visit times were accurate. Interoperability is rare or nonexistent between IIS. Since inter-state data exchange is rare in the United States, this approach could be a good prototype for achieving interoperability of immunization information. PMID:25954452
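HL7 v3, used by the study above, is an XML-based messaging standard. A much-simplified sketch of serializing and parsing an immunization record with the Python standard library follows; the element names here are illustrative placeholders, not the real HL7 v3 schema, which is far richer (OIDs, typed acts, participations):

```python
import xml.etree.ElementTree as ET

def build_record(patient_id: str, vaccine: str, date: str) -> bytes:
    # Serialize one immunization event as XML (hypothetical element names).
    root = ET.Element("immunizationRecord")
    ET.SubElement(root, "patient", id=patient_id)
    admin = ET.SubElement(root, "administration")
    ET.SubElement(admin, "vaccineCode").text = vaccine
    ET.SubElement(admin, "effectiveTime").text = date
    return ET.tostring(root, encoding="utf-8")

def parse_record(xml_bytes: bytes) -> dict:
    # The receiving IIS reconstructs the same fields from the message.
    root = ET.fromstring(xml_bytes)
    return {
        "patient_id": root.find("patient").attrib["id"],
        "vaccine": root.find("administration/vaccineCode").text,
        "date": root.find("administration/effectiveTime").text,
    }

msg = build_record("INF-0042", "BCG", "2014-03-01")
print(parse_record(msg))
```

The round-trip property shown here (parse of build yields the original fields) is essentially what the study verified at scale: retrieved histories identical across systems.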
A Framework for Seamless Interoperation of Heterogeneous Distributed Software Components
2005-05-01
interoperability, b) distributed resource discovery, and c) validation of quality requirements. Principles and prototypical systems were created to demonstrate the successful completion of the research.
NASA Astrophysics Data System (ADS)
Schaap, D.
2015-12-01
Europe, the USA, and Australia are making significant progress in facilitating the discovery, access and long term stewardship of ocean and marine data through the development, implementation, population and operation of national, regional or international distributed ocean and marine observing and data management infrastructures such as SeaDataNet, EMODnet, IOOS, R2R, and IMOS. All of these developments are resulting in the development of standards and services implemented and used by their regional communities. The Ocean Data Interoperability Platform (ODIP) project is supported by the EU FP7 Research Infrastructures programme, the National Science Foundation (USA) and the Australian government, and was initiated on 1 October 2012. The project has recently been continued as ODIP 2 for another three years with EU HORIZON 2020 funding. ODIP includes all the major organisations engaged in ocean data management in the EU, US, and Australia. ODIP is also supported by the IOC-IODE, closely linking this activity with its Ocean Data Portal (ODP) and Ocean Data Standards Best Practices (ODSBP) projects. The ODIP platform aims to ease interoperability between the regional marine data management infrastructures. Therefore it facilitates an organised dialogue between the key infrastructure representatives by means of publishing best practice, organising a series of international workshops and fostering the development of common standards and interoperability solutions. These are evaluated and tested by means of prototype projects.
The presentation will give further background on the ODIP projects and the latest information on the progress of three prototype projects addressing: establishing interoperability between the regional EU, USA and Australia data discovery and access services (SeaDataNet CDI, US NODC, and IMOS MCP) and contributing to the global GEOSS and IODE-ODP portals; establishing interoperability between cruise summary reporting systems in Europe, the USA and Australia for routine harvesting of cruise data for delivery via the Partnership for Observation of Global Oceans (POGO) global portal; establishing common standards for a Sensor Observation Service (SOS) for selected sensors installed on vessels and in real-time monitoring systems using sensor web enablement (SWE)
NASA Astrophysics Data System (ADS)
Schaap, Dick M. A.; Glaves, Helen
2016-04-01
Europe, the USA, and Australia are making significant progress in facilitating the discovery, access and long term stewardship of ocean and marine data through the development, implementation, population and operation of national, regional or international distributed ocean and marine observing and data management infrastructures such as SeaDataNet, EMODnet, IOOS, R2R, and IMOS. All of these developments are resulting in the development of standards and services implemented and used by their regional communities. The Ocean Data Interoperability Platform (ODIP) project is supported by the EU FP7 Research Infrastructures programme, the National Science Foundation (USA) and the Australian government, and was initiated on 1 October 2012. The project has recently been continued as ODIP II for another three years with EU HORIZON 2020 funding. ODIP includes all the major organisations engaged in ocean data management in the EU, US, and Australia. ODIP is also supported by the IOC-IODE, closely linking this activity with its Ocean Data Portal (ODP) and Ocean Data Standards Best Practices (ODSBP) projects. The ODIP platform aims to ease interoperability between the regional marine data management infrastructures. Therefore it facilitates an organised dialogue between the key infrastructure representatives by means of publishing best practice, organising a series of international workshops and fostering the development of common standards and interoperability solutions. These are evaluated and tested by means of prototype projects. The presentation will give further background on the ODIP projects and the latest information on the progress of three prototype projects addressing: 1. establishing interoperability between the regional EU, USA and Australia data discovery and access services (SeaDataNet CDI, US NODC, and IMOS MCP) and contributing to the global GEOSS and IODE-ODP portals; 2.
establishing interoperability between cruise summary reporting systems in Europe, the USA and Australia for routine harvesting of cruise data for delivery via the Partnership for Observation of Global Oceans (POGO) global portal; 3. the establishment of common standards for a Sensor Observation Service (SOS) for selected sensors installed on vessels and in real-time monitoring systems using sensor web enablement (SWE)
Pyke, Christopher R; Madan, Isaac
2013-08-01
The real estate industry routinely uses specialized information systems for functions including design, construction, facilities management, brokerage, tax assessment, and utilities. These systems are mature and effective within vertically integrated market segments. However, new questions are reaching across these traditional information silos. For example, buyers may be interested in evaluating the design, energy efficiency characteristics, and operational performance of a commercial building. This requires the integration of information across multiple databases held by different institutions. Today, this type of data integration is difficult to automate and prone to errors due, in part, to the lack of generally accepted building and space identifiers. Moving forward, the real estate industry needs a new mechanism to assign identifiers for whole buildings and interior spaces for the purpose of interoperability, data exchange, and integration. This paper describes a systematic process to identify activities occurring at a building or within interior spaces to provide a foundation for exchange and interoperability. We demonstrate the application of the approach with a prototype Web application. This concept and demonstration illustrate the elements of a practical interoperability framework that can increase productivity, create new business opportunities, and reduce errors, waste, and redundancy. © 2013 New York Academy of Sciences.
The Health Service Bus: an architecture and case study in achieving interoperability in healthcare.
Ryan, Amanda; Eklund, Peter
2010-01-01
Interoperability in healthcare is a requirement for effective communication between entities, to ensure timely access to up-to-date patient information and medical knowledge, and thus facilitate consistent patient care. An interoperability framework called the Health Service Bus (HSB), based on the Enterprise Service Bus (ESB) middleware software architecture, is presented here as a solution to all three levels of interoperability as defined by the HL7 EHR Interoperability Work group in their definitive white paper "Coming to Terms". A prototype HSB system was implemented based on the Mule Open-Source ESB and is outlined and discussed, followed by a clinically-based example.
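The ESB pattern underlying the HSB can be reduced to a small sketch: the bus inspects a message's type and routes the body to whichever endpoint registered for it, so senders never need to know receivers. The message type and endpoint shown are hypothetical, not taken from the HSB paper:

```python
class ServiceBus:
    """Toy ESB-style router: endpoints register per message type,
    and the bus dispatches by inspecting a message header."""

    def __init__(self) -> None:
        self._routes = {}

    def register(self, message_type: str, endpoint) -> None:
        self._routes[message_type] = endpoint

    def dispatch(self, message: dict):
        # Route on the "type" header; a real ESB would also handle
        # transformation, queuing, and error channels.
        endpoint = self._routes[message["type"]]
        return endpoint(message["body"])

bus = ServiceBus()
bus.register("lab-result", lambda body: f"archived: {body}")
result = bus.dispatch({"type": "lab-result", "body": "HbA1c 6.1%"})
print(result)
```

Decoupling producers from consumers this way is what lets a bus like Mule sit between heterogeneous clinical systems without either side changing.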
Archive interoperability in the Virtual Observatory
NASA Astrophysics Data System (ADS)
Genova, Françoise
2003-02-01
The main goals of Virtual Observatory projects are to build interoperability between astronomical on-line services, observatory archives, databases and results published in journals, and to develop tools permitting the best scientific usage of the very large data sets stored in observatory archives and produced by large surveys. The different Virtual Observatory projects collaborate to define common exchange standards, which are the key for a truly International Virtual Observatory: for instance their first common milestone has been a standard allowing exchange of tabular data, called VOTable. The Interoperability Work Area of the European Astrophysical Virtual Observatory project aims at networking European archives, by building a prototype using the CDS VizieR and Aladin tools, and at defining basic rules to help archive providers in interoperability implementation. The prototype is accessible for scientific usage, to get user feedback (and science results!) at an early stage of the project. The ISO archive participates very actively in this endeavour, and more generally in information networking. The ongoing inclusion of the ISO log in SIMBAD will allow higher-level links for users.
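VOTable, the tabular exchange standard mentioned above, is XML with FIELD declarations followed by TABLEDATA rows. The reduced document below keeps only that tabular core (real VOTables add namespaces, datatypes, and resource metadata), and shows that a generic XML parser suffices to recover the table:

```python
import xml.etree.ElementTree as ET

# A much-reduced VOTable-like document: two FIELDs (column names)
# and two TR rows of TD cells. Values are illustrative coordinates.
doc = """<VOTABLE><RESOURCE><TABLE>
  <FIELD name="ra"/><FIELD name="dec"/>
  <DATA><TABLEDATA>
    <TR><TD>10.68</TD><TD>41.27</TD></TR>
    <TR><TD>83.82</TD><TD>-5.39</TD></TR>
  </TABLEDATA></DATA>
</TABLE></RESOURCE></VOTABLE>"""

root = ET.fromstring(doc)
# Column names come from FIELD attributes, cell values from TD text.
fields = [f.attrib["name"] for f in root.iter("FIELD")]
rows = [[float(td.text) for td in tr] for tr in root.iter("TR")]
print(fields, rows)
```

Because every archive emits the same structure, a client written once against the standard can consume tables from any participating service, which is exactly the interoperability the standard was designed to provide.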
NASA Astrophysics Data System (ADS)
Arney, David; Goldman, Julian M.; Whitehead, Susan F.; Lee, Insup
When an x-ray image is needed during surgery, clinicians may stop the anesthesia machine ventilator while the exposure is made. If the ventilator is not restarted promptly, the patient may experience severe complications. This paper explores the interconnection of a ventilator and a simulated x-ray machine into a prototype plug-and-play medical device system. This work assists ongoing standards efforts for interoperability frameworks in developing functional and non-functional requirements, and illustrates the potential patient safety benefits of interoperable medical device systems by implementing a solution to a clinical use case requiring interoperability.
Report on the Second Catalog Interoperability Workshop
NASA Technical Reports Server (NTRS)
Thieman, James R.; James, Mary E.
1988-01-01
The events, resolutions, and recommendations of the Second Catalog Interoperability Workshop, held at JPL in January 1988, are discussed. This workshop dealt with the issues of standardization and communication among directories, catalogs, and inventories in the earth and space science data management environment. The Directory Interchange Format, being constructed as a standard for the exchange of directory information among participating data systems, is discussed. Involvement in the interoperability effort by NASA, NOAA, USGS, and NSF is described, and plans for future interoperability are considered. The NASA Master Directory prototype is presented and critiqued, and options for additional capabilities debated.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cintuglu, Mehmet Hazar; Youssef, Tarek; Mohammed, Osama A.
This article presents the development and application of a real-time testbed for multiagent system interoperability. As utility independent private microgrids are installed constantly, standardized interoperability frameworks are required to define behavioral models of the individual agents for expandability and plug-and-play operation. In this paper, we propose a comprehensive hybrid agent framework combining the foundation for intelligent physical agents (FIPA), IEC 61850, and data distribution service (DDS) standards. The IEC 61850 logical node concept is extended using FIPA based agent communication language (ACL) with application specific attributes and deliberative behavior modeling capability. The DDS middleware is adopted to enable a real-time publisher-subscriber interoperability mechanism between platforms. The proposed multi-agent framework was validated in a laboratory based testbed involving developed intelligent electronic device (IED) prototypes and actual microgrid setups. Experimental results were demonstrated for both decentralized and distributed control approaches. Secondary and tertiary control levels of a microgrid were demonstrated for decentralized hierarchical control case study. A consensus-based economic dispatch case study was demonstrated as a distributed control example. Finally, it was shown that the developed agent platform is industrially applicable for actual smart grid field deployment.
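A FIPA ACL message carries a performative (the communicative act), sender, receiver, ontology, and content. The sketch below models such a message carrying an IEC 61850-flavoured payload; the agent names and the logical-node content are hypothetical examples, not values from the paper:

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class AclMessage:
    """Minimal FIPA-ACL-style message. Field names follow the FIPA ACL
    structure; a real implementation adds conversation ids, protocols,
    and a content language declaration."""
    performative: str   # communicative act, e.g. "inform" or "request"
    sender: str
    receiver: str
    ontology: str       # vocabulary the content is expressed in
    content: dict

    def encode(self) -> str:
        # JSON is used here for illustration; FIPA also defines
        # string and bit-efficient encodings.
        return json.dumps(asdict(self))

msg = AclMessage(
    performative="inform",
    sender="agent.feeder1",
    receiver="agent.microgrid-controller",
    ontology="iec61850",
    content={"logicalNode": "MMXU1", "totW": 12.5},  # measurement node, kW
)
decoded = json.loads(msg.encode())
print(decoded["performative"], decoded["content"])
```

Extending IEC 61850 logical nodes with ACL, as the paper proposes, amounts to wrapping node data like the `content` above in messages whose performatives let agents negotiate and deliberate rather than merely report.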
Interoperability prototype between hospitals and general practitioners in Switzerland.
Alves, Bruno; Müller, Henning; Schumacher, Michael; Godel, David; Abu Khaled, Omar
2010-01-01
Interoperability in data exchange has the potential to improve care processes and decrease costs of the health care system. Many countries have related eHealth initiatives in preparation or already implemented. In this area, Switzerland has yet to catch up. Its health system is fragmented because of the federated nature of the cantons, so it is more difficult to coordinate efforts between the existing healthcare actors. In the Medicoordination project a pragmatic approach was selected: integrating several partners in healthcare on a regional scale in French-speaking Switzerland. In parallel with the Swiss eHealth strategy currently being elaborated by the Swiss confederation, Medicoordination particularly targeted medium-sized hospitals and general practitioners to implement concrete scenarios of information exchange between hospitals and general practitioners with a high added value. In this paper we focus our attention on a prototype implementation of one chosen scenario: the discharge summary. Although simple in concept, exchanging discharge letters reveals small, hidden difficulties due to the multi-partner nature of the project. The added value of such a prototype is potentially high, and it is now important to show that interoperability can work in practice.
A Tale of Two Observing Systems: Interoperability in the World of Microsoft Windows
NASA Astrophysics Data System (ADS)
Babin, B. L.; Hu, L.
2008-12-01
Louisiana Universities Marine Consortium's (LUMCON) and Dauphin Island Sea Lab's (DISL) Environmental Monitoring Systems provide a unified coastal ocean observing system. These two systems are mirrored to maintain autonomy while offering an integrated data sharing environment. Both systems collect data via Campbell Scientific data loggers, store the data in Microsoft SQL servers, and disseminate the data in real time on the World Wide Web via Microsoft Internet Information Servers and Active Server Pages (ASP). The utilization of Microsoft Windows technologies presented many challenges to these observing systems as open source tools for interoperability grow, since the current open source tools often require the installation of additional software. In order to make data available through common standards formats, "home grown" software has been developed. One example of this is the development of software to generate XML files for transmission to the National Data Buoy Center (NDBC). OOSTethys partners develop, test and implement easy-to-use, open-source, OGC-compliant software, and have created a working prototype of networked, semantically interoperable, real-time data systems. Partnering with OOSTethys, we are developing a cookbook to implement OGC web services. The implementation will be written in ASP, will run in a Microsoft operating system environment, and will serve data via a Sensor Observation Service (SOS). This cookbook will give observing systems running Microsoft Windows the tools to easily participate in the Open Geospatial Consortium (OGC) Oceans Interoperability Experiment (OCEANS IE).
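The "home grown" XML generation the abstract describes amounts to serializing station observations into an agreed structure for transfer. A sketch of that step follows; the element names, station id, and variables are hypothetical illustrations, not the actual NDBC or SOS schema:

```python
import xml.etree.ElementTree as ET

def observation_xml(station: str, time_utc: str, values: dict) -> str:
    """Serialize one station observation as XML for transmission.
    Illustrative structure only; real SOS responses follow the
    OGC Observations & Measurements encodings."""
    root = ET.Element("observation", station=station, time=time_utc)
    for name, value in values.items():
        ET.SubElement(root, "measurement", name=name).text = str(value)
    return ET.tostring(root, encoding="unicode")

xml_doc = observation_xml(
    "LUMCON-1", "2008-12-01T12:00:00Z",
    {"water_temp_c": 18.4, "salinity_psu": 29.1},
)
print(xml_doc)
```

Because `xml.etree` ships with the standard library, this kind of generator runs on a plain Windows/ASP-adjacent Python install without the extra software dependencies the abstract identifies as a barrier.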
Orlova, Anna O; Dunnagan, Mark; Finitzo, Terese; Higgins, Michael; Watkins, Todd; Tien, Allen; Beales, Steven
2005-01-01
Information exchange, enabled by computable interoperability, is the key to many of the initiatives underway, including the development of Regional Health Information Exchanges, Regional Health Information Organizations, and the National Health Information Network. These initiatives must include public health as a full partner in the emerging transformation of our nation's healthcare system through the adoption and use of information technology. An electronic health record - public health (EHR-PH) system prototype was developed to demonstrate the feasibility of electronic data transfer from a health care provider, i.e. hospital or ambulatory care settings, to multiple customized public health systems, which include a Newborn Metabolic Screening Registry, a Newborn Hearing Screening Registry, an Immunization Registry and a Communicable Disease Registry, using HL7 messaging standards. Our EHR-PH system prototype can be considered a distributed EHR-based RHIE/RHIO model - a principal element for a potential technical architecture for an NHIN.
Testbeds for Assessing Critical Scenarios in Power Control Systems
NASA Astrophysics Data System (ADS)
Dondossola, Giovanna; Deconinck, Geert; Garrone, Fabrizio; Beitollahi, Hakem
The paper presents a set of control system scenarios implemented in two testbeds developed in the context of the European Project CRUTIAL - CRitical UTility InfrastructurAL Resilience. The selected scenarios refer to power control systems encompassing information and communication security of SCADA systems for grid teleoperation, impact of attacks on inter-operator communications in power emergency conditions, impact of intentional faults on the secondary and tertiary control in power grids with distributed generators. Two testbeds have been developed for assessing the effect of the attacks and prototyping resilient architectures.
Interoperable and standard e-Health solution over Bluetooth.
Martinez, I; Del Valle, P; Munoz, P; Trigo, J D; Escayola, J; Martínez-Espronceda, M; Muñoz, A; Serrano, L; Garcia, J
2010-01-01
The new paradigm of e-Health demands open sensors and middleware components that permit transparent integration and end-to-end interoperability of new personal health devices. The use of standards seems to be the internationally adopted way to solve these problems. This paper presents the implementation of an end-to-end standards-based e-Health solution. This includes the ISO/IEEE 11073 standard for the interoperability of the medical devices in the patient environment and the EN 13606 standard for the interoperable exchange of the Electronic Healthcare Record. The design strictly fulfills all the technical features of the most recent versions of both standards. The implemented prototype has been tested in a laboratory environment to demonstrate its feasibility for further transfer to the healthcare system.
ERIC Educational Resources Information Center
Arms, William Y.; Hillmann, Diane; Lagoze, Carl; Krafft, Dean; Marisa, Richard; Saylor, John; Terizzi, Carol; Van de Sompel, Herbert; Gill, Tony; Miller, Paul; Kenney, Anne R.; McGovern, Nancy Y.; Botticelli, Peter; Entlich, Richard; Payette, Sandra; Berthon, Hilary; Thomas, Susan; Webb, Colin; Nelson, Michael L.; Allen, B. Danette; Bennett, Nuala A.; Sandore, Beth; Pianfetti, Evangeline S.
2002-01-01
Discusses digital libraries, including interoperability, metadata, and international standards; Web resource preservation efforts at Cornell University; digital preservation at the National Library of Australia; object persistence and availability; collaboration among libraries, museums and elementary schools; Asian digital libraries; and a Web…
Advanced ASON prototyping research activities in China
NASA Astrophysics Data System (ADS)
Hu, WeiSheng; Jin, Yaohui; Guo, Wei; Su, Yikai; He, Hao; Sun, Weiqiang
2005-02-01
This paper provides an overview of prototyping research activities on automatically switched optical networks and transport networks (ASONs/ASTNs) in China. In recent years, China has recognized the importance and benefits of the emerging ASON/ASTN techniques. During 2001 and 2002, the national 863 Program of China started preliminary ASON research projects, with the main objectives of building preliminary ASON testbeds, developing control plane protocols, and testing their performance in the testbeds. During 2003 and 2004, the 863 Program started ASTN prototype equipment projects for more practical applications. In total, 12 ASTN equipment prototypes are being developed by three groups led by Chinese vendors: ZTE with Beijing University of Posts and Telecommunications (BUPT), Wuhan Research Institute of Posts and Telecommunication (WRI) with Shanghai Jiao Tong University (SJTU), and Huawei Inc. Meanwhile, as ASTN matures, some of China's carriers are participating in the OIF's World Interoperability Demonstration, carrying out ASTN tests, or deploying ASTN backbone networks. Several ASTN backbone networks now being tested or deployed will be operated by the carriers in 2005. The 863 Program will carry out an ASTN field trial in the Yangtze River Delta and finally deploy the 3TNET. 3TNET stands for Tbps transmission, Tbps switching, and Tbps routing, as well as a network integrating these techniques. A task force under the 863 Program is responsible for ASTN equipment specifications and interoperation agreements, technical coordination among all participants, scheduling of the whole project, and organization of the internetworking of all the equipment in laboratory and field trials.
Integrating Space Communication Network Capabilities via Web Portal Technologies
NASA Technical Reports Server (NTRS)
Johnston, Mark D.; Lee, Carlyn-Ann; Lau, Chi-Wung; Cheung, Kar-Ming; Levesque, Michael; Carruth, Butch; Coffman, Adam; Wallace, Mike
2014-01-01
We have developed a service portal prototype as part of an investigation into the feasibility of using Java portlet technology to provide integrated access to NASA communications network services. Portal servers are an attractive platform for this role due to the built-in collaboration applications they provide, combined with the possibility of developing custom inter-operating portlets to extend their functionality while preserving common presentation and behavior. This paper describes various options for integrating network services related to planning and scheduling, and results based on the use of a popular open-source portal framework. Plans are underway to develop an operational SCaN Service Portal, building on the experiences reported here.
Daskalakis, S; Mantas, J
2009-01-01
This study evaluates a service-oriented prototype implementation for healthcare interoperability. A prototype framework was developed to exploit service-oriented architecture (SOA) concepts for achieving healthcare interoperability and to move towards a virtual patient record (VPR) paradigm. The prototype implementation was evaluated for its hypothetical adoption. The evaluation strategy was based on the DeLone and McLean model of information systems (IS) success [1], as modeled by Iivari [2]. A set of SOA and VPR characteristics were empirically encapsulated within the dimensions of the IS success model, combined with measures from previous research. The data gathered were analyzed using partial least squares (PLS). The results highlighted that system quality is a partial predictor of system use but not of user satisfaction. On the contrary, information quality proved to be a significant predictor of user satisfaction and, in part, a strong predictor of system use. Moreover, system use did not prove to be a significant predictor of individual impact, and the bi-directional relation between use and user satisfaction was not confirmed. User satisfaction, however, was found to be a strong significant predictor of individual impact. Finally, individual impact proved to be a strong significant predictor of organizational impact. The empirical study attempted to obtain hypothetical, but still useful, beliefs and perceptions regarding the SOA prototype implementation. The deduced observations can form the basis for further investigation of the adaptability of SOA implementations with VPR characteristics in the healthcare domain.
Orlova, Anna O.; Dunnagan, Mark; Finitzo, Terese; Higgins, Michael; Watkins, Todd; Tien, Allen; Beales, Steven
2005-01-01
Information exchange, enabled by computable interoperability, is the key to many of the initiatives underway including the development of Regional Health Information Exchanges, Regional Health Information Organizations, and the National Health Information Network. These initiatives must include public health as a full partner in the emerging transformation of our nation’s healthcare system through the adoption and use of information technology. An electronic health record - public health (EHR-PH) system prototype was developed to demonstrate the feasibility of electronic data transfer from a health care provider, i.e. hospital or ambulatory care settings, to multiple customized public health systems which include a Newborn Metabolic Screening Registry, a Newborn Hearing Screening Registry, an Immunization Registry and a Communicable Disease Registry, using HL7 messaging standards. Our EHR-PH system prototype can be considered a distributed EHR-based RHIE/RHIO model - a principal element for a potential technical architecture for a NHIN. PMID:16779105
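The prototype above transfers registry data using HL7 messaging standards. As a hedged illustration of the underlying message shape only, the sketch below builds and parses a pipe-delimited HL7 v2 immunization message; every segment value, facility name, and identifier in it is invented for the example and is not taken from the EHR-PH prototype.

```python
# Illustrative HL7 v2 immunization-update (VXU) message. HL7 v2 separates
# segments with carriage returns and fields with '|'. All field values
# below are invented for this sketch, not the prototype's actual messages.

hl7_message = "\r".join([
    "MSH|^~\\&|EHR|HOSPITAL|IMM_REG|STATE|202401011200||VXU^V04|MSG0001|P|2.5",
    "PID|1||12345^^^HOSPITAL||DOE^JANE||20231215|F",
    "RXA|0|1|20240101||08^HepB^CVX|0.5|mL",
])

def segments(message):
    """Split an HL7 v2 message into (segment_id, field_list) pairs."""
    parsed = []
    for raw in message.split("\r"):
        fields = raw.split("|")
        parsed.append((fields[0], fields[1:]))
    return parsed

for seg_id, fields in segments(hl7_message):
    print(seg_id, len(fields))
```

A real interface engine would additionally validate required fields per segment and generate an ACK response, but the delimiter structure shown here is the core of HL7 v2 message exchange.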
Connecting the clinical IT infrastructure to a service-oriented architecture of medical devices.
Andersen, Björn; Kasparick, Martin; Ulrich, Hannes; Franke, Stefan; Schlamelcher, Jan; Rockstroh, Max; Ingenerf, Josef
2018-02-23
The new medical device communication protocol known as IEEE 11073 SDC is well suited for the integration of (surgical) point-of-care devices, as are the established Health Level Seven (HL7) V2 and Digital Imaging and Communications in Medicine (DICOM) standards for the communication of systems in the clinical IT infrastructure (CITI). An integrated operating room (OR) and other integrated clinical environments, however, need interoperability between both domains to fully unfold their potential for improving the quality of care as well as clinical workflows. This work thus presents concepts for the propagation of clinical and administrative data to medical devices, physiologic measurements and device parameters to clinical IT systems, as well as image and multimedia content in both directions. Prototypical implementations of the derived components have proven to integrate well with systems of networked medical devices and with the CITI, effectively connecting these heterogeneous domains. Our qualitative evaluation indicates that the interoperability concepts are suitable for integration into clinical workflows and are expected to benefit patients and clinicians alike. The upcoming HL7 Fast Healthcare Interoperability Resources (FHIR) communication standard will likely change the domain of clinical IT significantly. A straightforward mapping to its resource model thus ensures the tenability of these concepts despite a foreseeable change in demand and requirements.
Software Integration in Multi-scale Simulations: the PUPIL System
NASA Astrophysics Data System (ADS)
Torras, J.; Deumens, E.; Trickey, S. B.
2006-10-01
The state of the art for computational tools in both computational chemistry and computational materials physics includes many algorithms and functionalities which are implemented again and again. Several projects aim to reduce, eliminate, or avoid this problem. Most such efforts seem to be focused within a particular specialty, either quantum chemistry or materials physics. Multi-scale simulations, by their very nature, however, cannot respect that specialization. In simulation of fracture, for example, the energy gradients that drive the molecular dynamics (MD) come from a quantum mechanical treatment that most often derives from quantum chemistry. That “QM” region is linked to a surrounding “CM” region in which potentials yield the forces. The approach therefore requires the integration, or at least inter-operation, of quantum chemistry and materials physics algorithms. The same problem occurs in “QM/MM” simulations in computational biology. The challenge grows if pattern recognition or other analysis codes of some kind must be used as well. The most common mode of inter-operation is user intervention: codes are modified as needed and data files are managed “by hand” by the user (interactively and via shell scripts). User intervention is, however, inefficient by nature, difficult to transfer to the community, and prone to error. Some progress (e.g. Sethna’s work at Cornell [C.R. Myers et al., Mat. Res. Soc. Symp. Proc., 538 (1999) 509; C.-S. Chen et al., Poster presented at the Material Research Society Meeting (2000)]) has been made on using Python scripts to achieve a more efficient level of interoperation. In this communication we present an alternative approach to merging current working packages without the necessity of major recoding and with only a relatively light wrapper interface.
The scheme supports communication among the different components required for a given multi-scale calculation and access to the functionalities of those components for the potential user. A general main program allows the management of every package, with a special communication protocol between their interfaces following directives introduced by the user and stored in an XML-structured file. The initial prototype of the PUPIL (Program for User Packages Interfacing and Linking) system was implemented in Java as a fast, easy prototyping object-oriented (OO) language. To test it, we applied this prototype to a previously studied problem, the fracture of a silica nanorod, joining two different packages to perform a QM/MD calculation. The results show the potential of this software system for different kinds of simulations and its simplicity of maintenance.
Multi-disciplinary interoperability challenges (Ian McHarg Medal Lecture)
NASA Astrophysics Data System (ADS)
Annoni, Alessandro
2013-04-01
Global sustainability research requires multi-disciplinary efforts to address the key research challenges and increase our understanding of the complex relationships between environment and society. For this reason, dependence on the interoperability of ICT systems is growing rapidly; but despite notable technological improvements, operational interoperable solutions are still lacking in practice. Among the causes is the absence of a generally accepted definition of "interoperability" in all its broader aspects. In fact, interoperability is just a concept, and the more popular definitions do not address all the challenges of realizing operational interoperable solutions. The problem becomes even more complex when multi-disciplinary interoperability is required, because in that case solutions for the interoperability of different interoperable solutions must be envisaged. In this lecture the following definition will be used: "interoperability is the ability to exchange information and to use it". The main challenges of multi-disciplinary interoperability will be presented, and a set of proposed approaches and solutions briefly introduced.
SMART on FHIR: a standards-based, interoperable apps platform for electronic health records
Kreda, David A; Mandl, Kenneth D; Kohane, Isaac S; Ramoni, Rachel B
2016-01-01
Objective In early 2010, Harvard Medical School and Boston Children’s Hospital began an interoperability project with the distinctive goal of developing a platform to enable medical applications to be written once and run unmodified across different healthcare IT systems. The project was called Substitutable Medical Applications and Reusable Technologies (SMART). Methods We adopted contemporary web standards for application programming interface transport, authorization, and user interface, and standard medical terminologies for coded data. In our initial design, we created our own openly licensed clinical data models to enforce consistency and simplicity. During the second half of 2013, we updated SMART to take advantage of the clinical data models and the application programming interface described in a new, openly licensed Health Level Seven draft standard called Fast Healthcare Interoperability Resources (FHIR). Signaling our adoption of the emerging FHIR standard, we called the new platform SMART on FHIR. Results We introduced the SMART on FHIR platform with a demonstration that included several commercial healthcare IT vendors and app developers showcasing prototypes at the Health Information Management Systems Society conference in February 2014. This established the feasibility of SMART on FHIR, while highlighting the need for commonly accepted pragmatic constraints on the base FHIR specification. Conclusion In this paper, we describe the creation of SMART on FHIR, relate the experience of the vendors and developers who built SMART on FHIR prototypes, and discuss some challenges in going from early industry prototyping to industry-wide production use. PMID:26911829
NASA Astrophysics Data System (ADS)
Barnett, Barry S.; Bovik, Alan C.
1995-04-01
This paper presents a real-time full-motion video conferencing system based on the Visual Pattern Image Sequence Coding (VPISC) software codec. The prototype system hardware comprises two personal computers, two camcorders, two frame grabbers, and an Ethernet connection. The prototype system software has a simple structure. It runs under the Disk Operating System and includes a user interface, a video I/O interface, an event-driven network interface, and a free-running or frame-synchronous video codec that also acts as the controller for the video and network interfaces. Two video coders have been tested in this system. Simple implementations of Visual Pattern Image Coding and VPISC have both proven to support full-motion video conferencing with good visual quality. Future work will concentrate on expanding this prototype to support the motion-compensated version of VPISC, as well as encompassing point-to-point modem I/O and multiple network protocols. The application will be ported to multiple hardware platforms and operating systems. The motivation for developing this prototype system is to demonstrate the practicality of software-based real-time video codecs. Furthermore, software video codecs are not only cheaper but also more flexible system solutions, because they enable different computer platforms to exchange encoded video information without requiring on-board protocol-compatible video codec hardware. Software-based solutions enable true low-cost video conferencing that fits the `open systems' model of interoperability that is so important for building portable hardware and software applications.
[Comprehensive system integration and networking in operating rooms].
Feußner, H; Ostler, D; Kohn, N; Vogel, T; Wilhelm, D; Koller, S; Kranzfelder, M
2016-12-01
A comprehensive surveillance and control system integrating all devices and functions is a precondition for realizing the operating room of the future. Multiple proprietary integrated operating room systems with a central user interface are currently available; however, they cover only a relatively small part of all functionalities. Internationally, there are at least three initiatives promoting comprehensive system integration and networking in the operating room: the Japanese smart cyber operating theater (SCOT), the American medical device plug-and-play interoperability program (MDPnP) and the German secure and dynamic networking in operating room and hospital (OR.NET) project supported by the Federal Ministry of Education and Research. Within the framework of the internationally advanced OR.NET project, prototype solution approaches were realized which make comprehensive data retrieval feasible in the short to medium term. Active and even autonomous control of the medical devices by the surveillance and control system (closed loop) is expected only in the long run, due to strict regulatory barriers.
Moving Beyond the 10,000 Ways That Don't Work
NASA Astrophysics Data System (ADS)
Bermudez, L. E.; Arctur, D. K.; Rueda, C.
2009-12-01
From his research developing light bulb filaments, Thomas Edison provides us with a good lesson for advancing any venture. He said, "I have not failed, I've just found 10,000 ways that won't work." Advancing data and access interoperability is one of those ventures that is difficult to achieve because of the differences among the participating communities. Even within the marine domain, different communities exist, and with them different technologies (formats and protocols) for publishing data and its descriptions, and different vocabularies for naming things (e.g. parameters, sensor types). Simplifying this heterogeneity of technologies is accomplished not only by adopting standards, but by creating profiles and advancing tools that use those standards. In some cases, standards are advanced by building from existing tools. But what is the best strategy? Edison offers a hint. Prototypes and test beds are essential to achieving interoperability among geospatial communities. The Open Geospatial Consortium (OGC) calls them interoperability experiments. The World Wide Web Consortium (W3C) calls them incubator projects. Prototypes help test and refine specifications. The Marine Metadata Interoperability (MMI) Initiative, which is advancing marine data integration and re-use by promoting community solutions, understood this strategy and started an interoperability demonstration with the SURA Coastal Ocean Observing and Prediction (SCOOP) program. This interoperability demonstration evolved into the OGC Ocean Science Interoperability Experiment (Oceans IE). The Oceans IE brings together the ocean-observing community to advance the interoperability of ocean observing systems by using OGC standards. The Oceans IE Phase I investigated the use of the OGC Web Feature Service (WFS) and OGC Sensor Observation Service (SOS) standards for representing and exchanging point data records from fixed in-situ marine platforms.
The Oceans IE Phase I produced an engineering best practices report, advanced reference implementations, and submitted various change requests that are now being considered by the OGC SOS working group. Building on Phase I, and with a focus on semantically-enabled services, Oceans IE Phase II will continue the use and improvement of OGC specifications in the marine community. We will present the lessons learned and in particular the strategy of experimenting with technologies to advance standards to publish data in marine communities, which could also help advance interoperability in other geospatial communities. We will also discuss the growing collaborations among ocean-observing standards organizations that will bring about the institutional acceptance needed for these technologies and practices to gain traction globally.
Watershed and Economic Data InterOperability (WEDO) is a system of information technologies designed to publish watershed modeling studies for reuse. WEDO facilitates three aspects of interoperability: discovery, evaluation and integration of data. This increased level of interop...
UHF (Ultra High Frequency) Military Satellite Communications Ground Equipment Interoperability.
1986-10-06
...crisis management requires interoperability between various services. These short-term crises often arise from unforeseen circumstances... Qualcomm has prepared an interoperability study for the JTC3A (Reference 15) as a TA/CE for USCINCLANT ROC 5-84 requirements... interoperability is fundamental. A number of operational crises have occurred where interoperable communications were lacking...
Field evaluation of a prototype paper-based point-of-care fingerstick transaminase test.
Pollock, Nira R; McGray, Sarah; Colby, Donn J; Noubary, Farzad; Nguyen, Huyen; Nguyen, The Anh; Khormaee, Sariah; Jain, Sidhartha; Hawkins, Kenneth; Kumar, Shailendra; Rolland, Jason P; Beattie, Patrick D; Chau, Nguyen V; Quang, Vo M; Barfield, Cori; Tietje, Kathy; Steele, Matt; Weigl, Bernhard H
2013-01-01
Monitoring for drug-induced liver injury (DILI) via serial transaminase measurements in patients on potentially hepatotoxic medications (e.g., for HIV and tuberculosis) is routine in resource-rich nations, but often unavailable in resource-limited settings. Towards enabling universal access to affordable point-of-care (POC) screening for DILI, we have performed the first field evaluation of a paper-based, microfluidic fingerstick test for rapid, semi-quantitative, visual measurement of blood alanine aminotransferase (ALT). Our objectives were to assess operational feasibility, inter-operator variability, lot variability, device failure rate, and accuracy, to inform device modification for further field testing. The paper-based ALT test was performed at POC on fingerstick samples from 600 outpatients receiving HIV treatment in Vietnam. Results, read independently by two clinic nurses, were compared with gold-standard automated (Roche Cobas) results from venipuncture samples obtained in parallel. Two device lots were used sequentially. We demonstrated high inter-operator agreement, with 96.3% (95% C.I., 94.3-97.7%) agreement in placing visual results into clinically-defined "bins" (<3x, 3-5x, and >5x upper limit of normal), >90% agreement in validity determination, and intraclass correlation coefficient of 0.89 (95% C.I., 0.87-0.91). Lot variability was observed in % invalids due to hemolysis (21.1% for Lot 1, 1.6% for Lot 2) and correlated with lots of incorporated plasma separation membranes. Invalid rates <1% were observed for all other device controls. Overall bin placement accuracy for the two readers was 84% (84.3%/83.6%). Our findings of extremely high inter-operator agreement for visual reading-obtained in a target clinical environment, as performed by local practitioners-indicate that the device operation and reading process is feasible and reproducible. 
Bin placement accuracy and lot-to-lot variability data identified specific targets for device optimization and material quality control. This is the first field study performed with a patterned paper-based microfluidic device and opens the door to development of similar assays for other important analytes.
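The inter-operator bin agreement reported above can be illustrated with a short sketch. This is a hedged example: the upper limit of normal (assumed here as 40 U/L), the bin labels, and the paired readings are invented for illustration and are not the study's data or device outputs.

```python
ULN = 40.0  # assumed ALT upper limit of normal (U/L); illustrative only

def alt_bin(alt_u_per_l):
    """Place an ALT result into the clinically defined bins described above."""
    if alt_u_per_l < 3 * ULN:
        return "<3x ULN"
    if alt_u_per_l <= 5 * ULN:
        return "3-5x ULN"
    return ">5x ULN"

def percent_bin_agreement(reader_a, reader_b):
    """Percentage of paired readings the two readers place in the same bin."""
    matches = sum(alt_bin(a) == alt_bin(b) for a, b in zip(reader_a, reader_b))
    return 100.0 * matches / len(reader_a)

# Invented paired visual readings (U/L) from two hypothetical readers:
reader_a = [35, 90, 130, 210, 400, 95]
reader_b = [40, 85, 150, 190, 410, 125]
print(round(percent_bin_agreement(reader_a, reader_b), 1))  # prints 66.7
```

In the actual study this percentage was computed over 600 fingerstick results and compared against automated venipuncture measurements; the sketch only shows the binning logic itself.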
Implementing Interoperability in the Seafood Industry: Learning from Experiences in Other Sectors.
Bhatt, Tejas; Gooch, Martin; Dent, Benjamin; Sylvia, Gilbert
2017-08-01
Interoperability of communication and information technologies within and between businesses operating along supply chains is being pursued and implemented in numerous industries worldwide to increase the efficiency and effectiveness of operations. The desire for greater interoperability is also driven by the need to reduce business risk through more informed management decisions. Interoperability is achieved by the development of a technology architecture that guides the design and implementation of communication systems existing within individual businesses and between businesses comprising the supply chain. Technology architectures are developed through a purposeful dialogue about why the architecture is required, the benefits and opportunities that the architecture offers the industry, and how the architecture will translate into practical results. An assessment of how the finance, travel, and health industries and a sector of the food industry-fresh produce-have implemented interoperability was conducted to identify lessons learned that can aid the development of interoperability in the seafood industry. The findings include identification of the need for strong, effective governance during the establishment and operation of an interoperability initiative to ensure the existence of common protocols and standards. The resulting insights were distilled into a series of principles for enabling syntactic and semantic interoperability in any industry, which we summarize in this article. Categorized as "structural," "operational," and "integrative," the principles describe requirements and solutions that are pivotal to enabling businesses to create and capture value from full chain interoperability. The principles are also fundamental to allowing governments and advocacy groups to use traceability for public good. © 2017 Institute of Food Technologists®.
Supply Chain Interoperability Measurement
2015-06-19
Supply Chain Interoperability Measurement. Dissertation, June 2015. Christos E. Chalyvidis, Major, Hellenic Air Force. Report ENS-DS-15-J-001. Presented to the Faculty, Department of Operational Sciences. Committee chair: Dr. A.W. Johnson.
The Osseus platform: a prototype for advanced web-based distributed simulation
NASA Astrophysics Data System (ADS)
Franceschini, Derrick; Riecken, Mark
2016-05-01
Recent technological advances in web-based distributed computing and database technology have made possible a deeper and more transparent integration of some modeling and simulation applications. Despite these advances towards true integration of capabilities, disparate systems, architectures, and protocols will remain in the inventory for some time to come. These disparities present interoperability challenges for distributed modeling and simulation whether the application is training, experimentation, or analysis. Traditional approaches call for building gateways to bridge between disparate protocols and retaining interoperability specialists. Challenges in reconciling data models also persist. These challenges and their traditional mitigation approaches directly contribute to higher costs, schedule delays, and frustration for the end users. Osseus is a prototype software platform originally funded as a research project by the Defense Modeling & Simulation Coordination Office (DMSCO) to examine interoperability alternatives using modern, web-based technology and taking inspiration from the commercial sector. Osseus provides tools and services for nonexpert users to connect simulations, targeting the time and skillset needed to successfully connect disparate systems. The Osseus platform presents a web services interface to allow simulation applications to exchange data using modern techniques efficiently over Local or Wide Area Networks. Further, it provides Service Oriented Architecture capabilities such that finer granularity components such as individual models can contribute to simulation with minimal effort.
NASA Astrophysics Data System (ADS)
Glaves, Helen; Schaap, Dick
2016-04-01
The increasingly ocean basin level approach to marine research has led to a corresponding rise in the demand for large quantities of high quality interoperable data. This requirement for easily discoverable and readily available marine data is currently being addressed by initiatives such as SeaDataNet in Europe, Rolling Deck to Repository (R2R) in the USA and the Australian Ocean Data Network (AODN) with each having implemented an e-infrastructure to facilitate the discovery and re-use of standardised multidisciplinary marine datasets available from a network of distributed repositories, data centres etc. within their own region. However, these regional data systems have been developed in response to the specific requirements of their users and in line with the priorities of the funding agency. They have also been created independently of the marine data infrastructures in other regions often using different standards, data formats, technologies etc. that make integration of marine data from these regional systems for the purposes of basin level research difficult. Marine research at the ocean basin level requires a common global framework for marine data management which is based on existing regional marine data systems but provides an integrated solution for delivering interoperable marine data to the user. The Ocean Data Interoperability Platform (ODIP/ODIP II) project brings together those responsible for the management of the selected marine data systems and other relevant technical experts with the objective of developing interoperability across the regional e-infrastructures. The commonalities and incompatibilities between the individual data infrastructures are identified and then used as the foundation for the specification of prototype interoperability solutions which demonstrate the feasibility of sharing marine data across the regional systems and also with relevant larger global data services such as GEO, COPERNICUS, IODE, POGO etc. 
The potential impact for the individual regional data infrastructures of implementing these prototype interoperability solutions is also being evaluated to determine both the technical and financial implications of their integration within existing systems. These impact assessments form part of the strategy to encourage wider adoption of the ODIP solutions and approach beyond the current scope of the project which is focussed on regional marine data systems in Europe, Australia, the USA and, more recently, Canada.
NASA Astrophysics Data System (ADS)
Glaves, H. M.; Schaap, D.
2014-12-01
As marine research becomes increasingly multidisciplinary in its approach, there has been a corresponding rise in the demand for large quantities of high-quality interoperable data. A number of regional initiatives are already addressing this requirement through the establishment of e-infrastructures to improve the discovery of and access to marine data. Projects such as Geo-Seas and SeaDataNet in Europe, Rolling Deck to Repository (R2R) in the USA and IMOS in Australia have implemented local infrastructures to facilitate the exchange of standardised marine datasets. However, each of these regional initiatives has been developed to address its own requirements, independently of other regions. To establish a common framework for marine data management on a global scale, there is a need to develop interoperability solutions that can be implemented across these initiatives. Through a series of workshops attended by the relevant domain specialists, the Ocean Data Interoperability Platform (ODIP) project has identified areas of commonality between the regional infrastructures and used these as the foundation for the development of three prototype interoperability solutions addressing: (1) the use of brokering services to provide access to the data available in the regional data discovery and access services, including via the GEOSS portal; (2) the development of interoperability between cruise summary reporting systems in Europe, the USA and Australia for routine harvesting of cruise data for delivery via the Partnership for Observation of Global Oceans (POGO) portal; and (3) the establishment of a Sensor Observation Service (SOS) for selected sensors installed on vessels and in real-time monitoring systems using sensor web enablement (SWE). These prototypes will be used to underpin the development of a common global approach to the management of marine data which can be promoted to the wider marine research community.
ODIP is a community-led project that is currently focussed on regional initiatives in Europe, the USA and Australia, but which is seeking to expand this framework to include other regional marine data infrastructures.
NASA Technical Reports Server (NTRS)
Timmerman, J.; Jones, Denise R. (Technical Monitor)
2001-01-01
A Runway Incursion Prevention System (RIPS) was tested at the Dallas-Ft. Worth International Airport in October 2000. The system integrated airborne and ground components to provide both pilots and controllers with enhanced situational awareness, supplemental guidance cues, a real-time display of traffic information, and warning of runway incursions in order to prevent runway incidents while also improving operational capability. Rockwell Collins provided and supported a prototype Automatic Dependent Surveillance-Broadcast (ADS-B) system using 1090 MHz and a prototype Differential GPS (DGPS) system onboard the NASA Boeing 757 research aircraft. This report describes the Rockwell Collins contributions to the RIPS flight test, summarizes the development process, and analyzes both ADS-B and DGPS data collected during the flight test. In addition, results are reported for interoperability tests conducted between the NASA Advanced General Aviation Transport Experiments (AGATE) ADS-B flight test system and the NASA Boeing 757 ADS-B system.
Parel, I; Cutti, A G; Fiumana, G; Porcellini, G; Verni, G; Accardo, A P
2012-04-01
To measure the scapulohumeral rhythm (SHR) in outpatient settings, the motion analysis protocol named ISEO (INAIL Shoulder and Elbow Outpatient protocol) was developed, based on inertial and magnetic sensors. To complete the sensor-to-segment calibration, ISEO requires the involvement of an operator for sensor placement and for positioning the patient's arm in a predefined posture. Since this can affect the measurement, this study aimed at quantifying ISEO intra- and inter-operator agreement. Forty subjects were considered, together with two operators, A and B. Three measurement sessions were completed for each subject: two by A and one by B. In each session, the humerus and scapula rotations were measured during sagittal and scapular plane elevation movements. ISEO intra- and inter-operator agreement was assessed by computing, between sessions: (1) the similarity of the scapulohumeral patterns through the Coefficient of Multiple Correlation (CMC(2)), both considering and excluding the difference in the initial value of the scapula rotations between two sessions (inter-session offset); and (2) the 95% Smallest Detectable Difference (SDD(95)) in scapula range of motion. Results for CMC(2) showed that the intra- and inter-operator agreement is acceptable (median ≥ 0.85, lower whisker ≥ 0.75) for most of the scapula rotations, independently of the movement and the inter-session offset. The only exception is the agreement for scapula protraction-retraction and for scapula medio-lateral rotation during abduction (inter-operator), which is acceptable only if the inter-session offset is removed. SDD(95) values ranged from 4.4° to 8.6° for the inter-operator and between 4.9° and 8.5° for the intra-operator agreement. In conclusion, ISEO presents a high intra- and inter-operator agreement, particularly with the scapula inter-session offset removed. Copyright © 2011 Elsevier B.V. All rights reserved.
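The SDD(95) reported above has a standard test-retest formulation. A minimal sketch follows; the function name is ours, and it relies on the usual simplification SDD95 = 1.96 × √2 × SEM with SEM = SD(differences)/√2, which reduces to 1.96 × SD(differences):

```python
import math
from statistics import stdev

def sdd95(session_a, session_b):
    """95% smallest detectable difference from paired test-retest values.

    SDD95 = 1.96 * sqrt(2) * SEM, with SEM = SD(differences) / sqrt(2),
    which simplifies to 1.96 * SD(differences).
    """
    diffs = [a - b for a, b in zip(session_a, session_b)]
    return 1.96 * stdev(diffs)
```

On this formulation, the study's 4.4° to 8.6° range would correspond to between-session difference SDs of roughly 2.2° to 4.4°.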
NASA Astrophysics Data System (ADS)
Frederick, M. E.; Cox, E. L.; Friedl, L. A.
2006-12-01
NASA's Earth Science Theme is charged with implementing NASA Strategic Goal 3A to "study Earth from space to advance scientific understanding and meet societal needs." In the course of meeting this objective, NASA produces research results, such as scientific observatories, research models, advanced sensor and space system technology, active data archives and interoperability technology, high performance computing systems, and knowledge products. These research results have the potential to serve society beyond their intended purpose of answering pressing Earth system science questions. NASA's Applied Sciences Program systematically evaluates the potential of the portfolio of research results to serve society by conducting projects in partnership with regional/national scale operational partners with the statutory responsibility to inform decision makers. These projects address NASA's National Applications and the societal benefit areas under the IEOS and GEOSS. Prototyping methods are used in two ways in NASA's Applied Sciences Program. The first is part of the National Applications program element, referred to as Integrated Systems Solutions (ISS) projects. The approach for these projects is to use high fidelity prototypes to benchmark the assimilation of NASA research results into our partners' decision support systems. The outcome from ISS projects is a prototype system that has been rigorously tested with the partner to understand the scientific uncertainty and improved value of their modified system. In many cases, these completed prototypes are adopted or adapted for use by the operational partners. The second falls under the Crosscutting Solutions program element, referred to as Rapid Prototyping (RP) experiments. The approach for RP experiments is to use low fidelity prototypes that are low cost and quickly produced to evaluate the potential of the breadth of NASA research results to serve society.
The outcome from the set of RP experiments is an evaluation of many and varied NASA research results for their potential to be candidates for further development as an ISS project. The intention is to seed the community with many creative ideas for projects that use "un-applied" NASA research results to serve society, such as simulations of future missions.
2002-06-01
techniques for addressing the software component retrieval problem. Steigerwald [Ste91] introduced the use of algebraic specifications for defining the...provided in terms of a specification written using Luqi’s Prototype Specification Description Language (PSDL) [LBY88] augmented with an algebraic
Analysis of OPACITY and PLAID Protocols for Contactless Smart Cards
2012-09-01
Plugfest 2009: Global Interoperability in Telerobotics and Telemedicine
King, H. Hawkeye; Hannaford, Blake; Kwok, Ka-Wai; Yang, Guang-Zhong; Griffiths, Paul; Okamura, Allison; Farkhatdinov, Ildar; Ryu, Jee-Hwan; Sankaranarayanan, Ganesh; Arikatla, Venkata; Tadano, Kotaro; Kawashima, Kenji; Peer, Angelika; Schauß, Thomas; Buss, Martin; Miller, Levi; Glozman, Daniel; Rosen, Jacob; Low, Thomas
2014-01-01
Despite the great diversity of teleoperator designs and applications, their underlying control systems have many similarities. These similarities can be exploited to enable interoperability between heterogeneous systems. We have developed a network data specification, the Interoperable Telerobotics Protocol, that can be used for Internet-based control of a wide range of teleoperators. In this work we test interoperable telerobotics on the global Internet, focusing on the telesurgery application domain. Fourteen globally dispersed telerobotic master and slave systems were connected in thirty trials in one twenty-four-hour period. Users performed common manipulation tasks to demonstrate effective master-slave operation. With twenty-eight (93%) successful, unique connections, the results show a high potential for standardizing telerobotic operation. Furthermore, new paradigms for telesurgical operation and training are presented, including a networked surgery trainer and upper-limb exoskeleton control of micro-manipulators. PMID:24748993
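A flavor of what such a network data specification involves can be sketched as a fixed datagram layout. The field set below is purely hypothetical, not the published Interoperable Telerobotics Protocol wire format:

```python
import struct

# Hypothetical field layout (NOT the actual ITP specification): a sequence
# number, a timestamp, and a 6-DOF master pose, packed in network byte order
# so that heterogeneous master and slave systems decode it identically.
POSE_FMT = "!Id6d"  # uint32 seq, double timestamp, x, y, z, roll, pitch, yaw

def pack_pose(seq, timestamp, pose):
    """Serialize one pose sample into a fixed-size datagram."""
    return struct.pack(POSE_FMT, seq, timestamp, *pose)

def unpack_pose(datagram):
    """Recover (seq, timestamp, pose) from a datagram."""
    seq, timestamp, *pose = struct.unpack(POSE_FMT, datagram)
    return seq, timestamp, tuple(pose)
```

A fixed, byte-order-explicit layout is one common way such protocols let independently implemented masters and slaves interoperate without negotiation.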
Space Network Interoperability Panel (SNIP) study
NASA Technical Reports Server (NTRS)
Ryan, Thomas; Lenhart, Klaus; Hara, Hideo
1991-01-01
The Space Network Interoperability Panel (SNIP) study is a tripartite study that involves the National Aeronautics and Space Administration (NASA), the European Space Agency (ESA), and the National Space Development Agency (NASDA) of Japan. SNIP involves an ongoing interoperability study of the Data Relay Satellite (DRS) Systems of the three organizations. The study is broken down into two parts: Phase One deals with S-band (2 GHz) interoperability, and Phase Two deals with Ka-band (20/30 GHz) interoperability (in addition to S-band). In 1987 the SNIP formed a Working Group to define and study operations concepts and technical subjects to assure compatibility of the international data relay systems. Since that time a number of Panel and Working Group meetings have been held to continue the study. Interoperability is of interest to the three agencies because it offers a number of potential operational and economic benefits. This paper presents the history and status of the SNIP study.
Design and implementation of a CORBA-based genome mapping system prototype.
Hu, J; Mungall, C; Nicholson, D; Archibald, A L
1998-01-01
CORBA (Common Object Request Broker Architecture), as an open standard, is considered to be a good solution for the development and deployment of applications in distributed heterogeneous environments. This technology can be applied in the bioinformatics area to enhance utilization, management and interoperation between biological resources. This paper investigates issues in developing CORBA applications for genome mapping information systems in the Internet environment with emphasis on database connectivity and graphical user interfaces. The design and implementation of a CORBA prototype for an animal genome mapping database are described. The prototype demonstration is available via: http://www.ri.bbsrc.ac.uk/ark_corba/. jian.hu@bbsrc.ac.uk
Operational Interoperability Challenges on the Example of GEOSS and WIS
NASA Astrophysics Data System (ADS)
Heene, M.; Buesselberg, T.; Schroeder, D.; Brotzer, A.; Nativi, S.
2015-12-01
The following poster highlights the operational interoperability challenges using the example of the Global Earth Observation System of Systems (GEOSS) and the World Meteorological Organization Information System (WIS). At the heart of both systems is a catalogue of earth observation data, products and services, but with different metadata management concepts. While WIS applies strong governance, with its own metadata profile for its hundreds of thousands of metadata records, GEOSS adopted a more open approach for its ten million records. Furthermore, the development of WIS, as an operational system, follows a roadmap with committed backward compatibility, while the GEOSS development process is more agile. The poster discusses how interoperability can be achieved across these different metadata management concepts and how a proxy concept helps to couple two systems that follow different development methodologies. Furthermore, the poster highlights the importance of monitoring and backup concepts as a verification method for operational interoperability.
Dandanell, G
1992-01-01
The interoperator distance between a synthetic operator Os and the deoP2O2-galK fusion was varied between 46 and 176 bp. The repression of deoP2-directed galK expression as a function of the interoperator distance (center-to-center) was measured in vivo in a single-copy system. The results show that the DeoR repressor can efficiently repress transcription at all the interoperator distances tested. The degree of repression depends very little on the spacing between the operators; however, a weak periodic dependency of 8-11 bp may exist. PMID:1437558
Secure, Mobile, Wireless Network Technology Designed, Developed, and Demonstrated
NASA Technical Reports Server (NTRS)
Ivancic, William D.; Paulsen, Phillip E.
2004-01-01
The inability to seamlessly disseminate data securely over a high-integrity, wireless broadband network has been identified as a primary technical barrier to providing an order-of-magnitude increase in aviation capacity and safety. Secure, autonomous communications to and from aircraft will enable advanced, automated, data-intensive air traffic management concepts, increase National Air Space (NAS) capacity, and potentially reduce the overall cost of air travel operations. For the first time ever, secure, mobile, network technology was designed, developed, and demonstrated with state-of-the-art protocols and applications by a diverse, cooperative Government-industry team led by the NASA Glenn Research Center. This revolutionary technology solution will make fundamentally new airplane system capabilities possible by enabling secure, seamless network connections from platforms in motion (e.g., cars, ships, aircraft, and satellites) to existing terrestrial systems without the need for manual reconfiguration. Called Mobile Router, the new technology autonomously connects and configures networks as they traverse from one operating theater to another. The Mobile Router demonstration aboard the Neah Bay, a U.S. Coast Guard vessel stationed in Cleveland, Ohio, accomplished secure, seamless interoperability of mobile network systems across multiple domains without manual system reconfiguration. The Neah Bay was chosen because of its low cost and communications mission similarity to low-Earth-orbiting satellite platforms. This technology was successfully advanced from technology readiness level (TRL) 2 (concept and/or application formation) to TRL 6 (system model or prototype demonstration in a relevant environment).
The secure, seamless interoperability offered by the Mobile Router and encryption device will enable several new, vehicle-specific and systemwide technologies to perform such things as remote, autonomous aircraft performance monitoring and early detection and mitigation of potential equipment malfunctions. As an additional benefit, team advancements were incorporated into open standards, ensuring technology transfer. Low-cost, commercial products incorporating the new technology are already available. Furthermore, these products are fully interoperable with legacy network technology equipment currently being used throughout the world.
Organisational Interoperability: Evaluation and Further Development of the OIM Model
2003-06-01
an Organizational Interoperability Maturity Model (OIM) to evaluate interoperability at the organizational level. The OIM considers the human activity aspects of military operations, which are not covered in other models. This paper describes how the model has been used to identify problems and to
Bonaretti, Serena; Vilayphiou, Nicolas; Chan, Caroline Mai; Yu, Andrew; Nishiyama, Kyle; Liu, Danmei; Boutroy, Stephanie; Ghasem-Zadeh, Ali; Boyd, Steven K.; Chapurlat, Roland; McKay, Heather; Shane, Elizabeth; Bouxsein, Mary L.; Black, Dennis M.; Majumdar, Sharmila; Orwoll, Eric S.; Lang, Thomas F.; Khosla, Sundeep; Burghardt, Andrew J.
2017-01-01
Introduction: HR-pQCT is increasingly used to assess bone quality, fracture risk and anti-fracture interventions. The contribution of the operator has not been adequately accounted for in measurement precision. Operators acquire a 2D projection (“scout view image”) and define the region to be scanned by positioning a “reference line” on a standard anatomical landmark. In this study, we (i) evaluated the contribution of positioning variability to in vivo measurement precision, (ii) measured intra- and inter-operator positioning variability, and (iii) tested whether custom training software led to superior reproducibility in new operators compared to experienced operators. Methods: To evaluate the operator's contribution to in vivo measurement precision, we compared precision errors calculated in 64 co-registered and non-co-registered scan-rescan images. To quantify operator variability, we developed software that simulates the positioning process of the scanner's software. Eight experienced operators positioned reference lines on scout view images designed to test intra- and inter-operator reproducibility. Finally, we developed modules for training and evaluation of reference line positioning. We enrolled 6 new operators to participate in a common training, followed by the same reproducibility experiments performed by the experienced group. Results: In vivo precision errors were up to three-fold greater (Tt.BMD and Ct.Th) when variability in scan positioning was included. Inter-operator precision errors were significantly greater than short-term intra-operator precision (p<0.001). Newly trained operators achieved intra-operator reproducibility comparable to experienced operators, and lower inter-operator reproducibility (p<0.001). Precision errors were significantly greater for the radius than for the tibia. Conclusion: Operator reference line positioning contributes significantly to in vivo measurement precision, and its effect is significantly greater for multi-operator datasets.
Inter-operator variability can be significantly reduced using a systematic training platform, now available online (http://webapps.radiology.ucsf.edu/refline/). PMID:27475931
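Precision errors of the kind compared above are conventionally pooled across subjects as a root-mean-square standard deviation. A minimal sketch, using the generic RMS-SD formulation rather than the study's exact pipeline (the function name is ours):

```python
import math
from statistics import variance

def rms_sd(repeated_scans):
    """Pooled short-term precision error: the root-mean-square of the
    per-subject standard deviations over repeated scans.

    repeated_scans: list of lists, one inner list of repeat measurements
    per subject.
    """
    per_subject_var = [variance(scans) for scans in repeated_scans]
    return math.sqrt(sum(per_subject_var) / len(per_subject_var))
```

Comparing such a pooled error between co-registered and non-co-registered scan-rescan pairs, as the study does, isolates the contribution of positioning variability.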
Ocean Data Interoperability Platform (ODIP): using regional data systems for global ocean research
NASA Astrophysics Data System (ADS)
Schaap, D.; Thijsse, P.; Glaves, H.
2017-12-01
Ocean acidification, loss of coral reefs, and sustainable exploitation of the marine environment are just a few of the challenges researchers around the world are currently attempting to understand and address. However, studies of these ecosystem-level challenges are impossible unless researchers can discover and re-use the large volumes of interoperable multidisciplinary data that are currently only accessible through regional and global data systems serving discrete, and often discipline-specific, user communities. The many marine data systems currently in existence also use different standards, technologies and best practices, making re-use of the data problematic for those engaged in interdisciplinary marine research. The Ocean Data Interoperability Platform (ODIP) is responding to this growing demand for discoverable, accessible and reusable data by establishing the foundations for a common global framework for marine data management. But creation of such an infrastructure is a major undertaking, and one that needs to be achieved in part by establishing different levels of interoperability across existing regional and global marine e-infrastructures. Workshops organised by ODIP II facilitate dialogue between selected regional and global marine data systems in an effort to identify potential solutions that integrate these marine e-infrastructures. The outcomes of these discussions have formed the basis for a number of prototype development tasks that aim to demonstrate effective sharing of data across multiple data systems, and to allow users to access data from more than one system through a single access point. The ODIP II project is currently developing four prototype solutions that are establishing interoperability between selected regional marine data management infrastructures in Europe, the USA, Canada and Australia, and with the global POGO, IODE Ocean Data Portal (ODP) and GEOSS systems.
The potential impact of implementing these solutions for the individual marine data infrastructures is also being evaluated to determine both the technical and financial implications of their integration within existing systems. These impact assessments form part of the strategy to encourage wider adoption of the ODIP solutions and approach beyond the current scope of the project.
Facilitating Semantic Interoperability Among Ocean Data Systems: ODIP-R2R Student Outcomes
NASA Astrophysics Data System (ADS)
Stocks, K. I.; Chen, Y.; Shepherd, A.; Chandler, C. L.; Dockery, N.; Elya, J. L.; Smith, S. R.; Ferreira, R.; Fu, L.; Arko, R. A.
2014-12-01
With informatics providing an increasingly important set of tools for geoscientists, it is critical to train the next generation of scientists in information and data techniques. The NSF-supported Rolling Deck to Repository (R2R) Program works with the academic fleet community to routinely document, assess, and preserve the underway sensor data from U.S. research vessels. The Ocean Data Interoperability Platform (ODIP) is an EU-US-Australian collaboration fostering interoperability among regional e-infrastructures through workshops and joint prototype development. The need to align terminology between systems is a common challenge across all of the ODIP prototypes. Five R2R students were supported to address aspects of semantic interoperability within ODIP: (1) developing a vocabulary matching service that links terms from different vocabularies with similar concepts; the service implements the Google Refine reconciliation service interface so that users can leverage the Google Refine application as a friendly user interface while linking vocabulary terms; (2) developing Resource Description Framework (RDF) resources that map Shipboard Automated Meteorological Oceanographic System (SAMOS) vocabularies to internationally served vocabularies; each SAMOS vocabulary term (data parameter and quality control flag) will be described as an RDF resource page, enhancing the discoverability and retrieval of SAMOS data by enabling searches based on parameter; (3) improving data retrieval and interoperability by exposing data and mapped vocabularies using Semantic Web technologies; we have collaborated with ODIP participating organizations to build a generalized data model that will populate a SPARQL endpoint and provide expressive querying over our data files; (4) mapping local and regional vocabularies used by R2R to those used by ODIP partners (described more fully in a companion poster); and (5) making published Linked Data Web developer-friendly with a RESTful service, achieved by defining a proxy layer on top of the existing SPARQL endpoint that translates HTTP requests into SPARQL queries and renders the returned results as required by the request sender, using content negotiation, suffixes and parameters.
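The RESTful proxy layer described above can be sketched as a simple path-to-query rewriter. The URL scheme, the "ex:" prefix, and the property names below are invented for illustration and are not the R2R service's actual API:

```python
# Illustrative only: the path layout and "ex:" vocabulary are assumptions,
# not the real R2R endpoint. The proxy turns a REST-style path such as
# /cruise/AT26-13/parameters into a SPARQL query that an existing endpoint
# could answer; content negotiation would then shape the response format.
def path_to_sparql(path):
    resource, identifier, facet = path.strip("/").split("/")
    return (
        "SELECT ?value WHERE { "
        f"?s a ex:{resource.capitalize()} ; "
        f'ex:identifier "{identifier}" ; '
        f"ex:{facet} ?value . }}"
    )
```

In the real service, the returned bindings would be rendered as JSON, HTML, or RDF according to the request's Accept header, suffix, or query parameters.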
Exploring a model-driven architecture (MDA) approach to health care information systems development.
Raghupathi, Wullianallur; Umar, Amjad
2008-05-01
To explore the potential of the model-driven architecture (MDA) in health care information systems development, an MDA is conceptualized and developed for a health clinic system to track patient information, and a prototype of the MDA is implemented using an advanced MDA tool. The UML provides the underlying modeling support in the form of the class diagram. The PIM-to-PSM transformation rules are applied to generate the prototype application from the model. The result of the research is a complete MDA methodology for developing health care information systems. Additional insights gained include the development of transformation rules and documentation of the challenges in applying MDA to health care. Design guidelines for future MDA applications are described. The model has the potential for generalizability. The overall approach supports limited interoperability and portability. The research demonstrates the applicability of the MDA approach to health care information systems development. When properly implemented, it has the potential to overcome the challenges of platform (vendor) dependency, lack of open standards, interoperability, portability, scalability, and the high cost of implementation.
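The PIM-to-PSM idea can be given a toy illustration. The model content and the single transformation rule below are ours, far simpler than what a real MDA tool derives from UML class diagrams:

```python
# Toy PIM-to-PSM transformation (illustrative only): a platform-independent
# class model, here a dict of class names to attribute lists, is rendered
# as Python source for one target platform by a fixed rule.
PIM = {"Patient": ["name", "dob", "clinic_id"]}  # hypothetical clinic model

def transform(model):
    """Apply a single PIM-to-PSM rule: each modeled class becomes a Python
    class whose constructor stores every modeled attribute."""
    lines = []
    for cls, attrs in model.items():
        lines.append(f"class {cls}:")
        lines.append(f"    def __init__(self, {', '.join(attrs)}):")
        lines.extend(f"        self.{a} = {a}" for a in attrs)
    return "\n".join(lines)
```

Targeting a different platform would mean swapping the rule set while leaving the PIM untouched, which is the portability argument the paper examines.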
A tale of three cities--where RHIOS meet the NHIN.
DeBor, Greg; Diamond, Carol; Grodecki, Don; Halamka, John; Overhage, J Marc; Shirky, Clay
2006-01-01
Regional health information exchanges in California, Indiana, and Massachusetts have been collaborating on a prototype for a nationwide health information network, first under the auspices of the Markle Foundation's Connecting for Health program and now under contract to the Department of Health and Human Services' Office of the National Coordinator for Health Information Technology. Since mid-2004, this collaboration has evolved from a collection of regional efforts to a standards-driven cooperative and now to one of four prototype national networks fostered by federal efforts. This development reflects a maturing market for interoperability and integration in healthcare information technology, starting with RHIOs, and suggests one response to the industry's need for the type of plug-and-play information exchange available in other industries. The authors share their experiences and their views of how RHIOs and a Nationwide Health Information Network will further develop to make interoperable electronic health records a reality in coming years. The content of this article is solely the responsibility of the authors and does not necessarily represent the official view of the Office of the National Coordinator for Health Information Technology.
System and methods of resource usage using an interoperable management framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heileman, Gregory L.; Jamkhedkar, Pramod A.; Lamb, Christopher C.
A generic rights expression language allows interoperability across different computing environments, including resource usage by different applications. A formal framework for usage management provides scaffolding upon which interoperable usage management systems can be built. Certain features of the framework, such as the operational semantics, are standardized, while other areas are deliberately left free of standards to permit choice and innovation, achieving a balance of flexibility and usability for interoperability in usage management systems.
Gene Fusion Markup Language: a prototype for exchanging gene fusion data.
Kalyana-Sundaram, Shanker; Shanmugam, Achiraman; Chinnaiyan, Arul M
2012-10-16
An avalanche of next generation sequencing (NGS) studies has generated an unprecedented amount of genomic structural variation data. These studies have also identified many novel gene fusion candidates with more detailed resolution than previously achieved. However, in the excitement and necessity of publishing the observations from this recently developed cutting-edge technology, no community standardization approach has arisen to organize and represent the data with the essential attributes in an interchangeable manner. As transcriptome studies have been widely used for gene fusion discoveries, the current non-standard mode of data representation could potentially impede data accessibility, critical analyses, and further discoveries in the near future. Here we propose a prototype, Gene Fusion Markup Language (GFML), as an initiative to provide a standard format for organizing and representing the significant features of gene fusion data. GFML will offer the advantage of representing the data in a machine-readable format to enable data exchange, automated analysis interpretation, and independent verification. As this database-independent exchange initiative evolves, it will further facilitate the formation of related databases, repositories, and analysis tools. The GFML prototype is made available at http://code.google.com/p/gfml-prototype/. The Gene Fusion Markup Language (GFML) presented here could facilitate the development of a standard format for organizing, integrating and representing the significant features of gene fusion data in an interoperable and queryable fashion that will enable biologically intuitive access to gene fusion findings and expedite functional characterization. A similar model is envisaged for other NGS data analyses.
NASA Astrophysics Data System (ADS)
Wright, D. J.; Lassoued, Y.; Dwyer, N.; Haddad, T.; Bermudez, L. E.; Dunne, D.
2009-12-01
Coastal mapping plays an important role in informing marine spatial planning, resource management, maritime safety, hazard assessment and even national sovereignty. As such, there is now a plethora of data/metadata catalogs, pre-made maps, tabular and text information on resource availability and exploitation, and decision-making tools. A recent trend has been to encapsulate these in a special class of web-enabled geographic information systems called a coastal web atlas (CWA). While multiple benefits are derived from tailor-made atlases, there is great value added from the integration of disparate CWAs. CWAs linked to one another can be queried more successfully to optimize planning and decision-making. If a dataset is missing in one atlas, it may be immediately located in another. Similar datasets in two atlases may be combined to enhance study in either region. But how best to achieve semantic interoperability, mitigating vague data queries, concepts or natural-language semantics when retrieving and integrating data and information? We report on the development of a new prototype seeking to interoperate between two initial CWAs: the Marine Irish Digital Atlas (MIDA) and the Oregon Coastal Atlas (OCA). These two mature atlases are used as a testbed for more regional connections, with the intent for the OCA to use lessons learned to develop a regional network of CWAs along the west coast, and for MIDA to do the same in building and strengthening atlas networks with the UK, Belgium, and other parts of Europe. Our prototype uses semantic interoperability via services harmonization and ontology mediation, allowing local atlases to use their own data structures and vocabularies (ontologies). We use standard technologies such as OGC Web Map Services (WMS) for delivering maps, and OGC Catalogue Service for the Web (CSW) for delivering and querying ISO-19139 metadata. The metadata records of a given CWA use a given ontology of terms called a local ontology.
Human or machine users formulate their requests using a common ontology of metadata terms, called the global ontology. A CSW mediator rewrites the user's request into CSW requests over local CSWs using their own (local) ontologies, collects the results and sends them back to the user. To extend the system, we have recently added global maritime boundaries and are also considering nearshore ocean observing system data. Ongoing work includes adding WFS support, error management and exception handling, enabling smart searches, and writing full documentation. This prototype is a central research project of the new International Coastal Atlas Network (ICAN), a group of 30+ organizations from 14 nations (and growing) dedicated to seeking interoperability approaches to CWAs in support of coastal zone management and the translation of coastal science to coastal decision-making.
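The mediation step can be sketched as a term-mapping pass. The atlas keys and vocabulary terms below are invented for illustration; the actual mediator rewrites full ISO-19139 CSW requests, not bare term lists:

```python
# Invented global-to-local term maps (the real prototype mediates over CSW
# queries against each atlas's own ontology). The mediator rewrites a query
# phrased in the shared global ontology into each atlas's local vocabulary.
GLOBAL_TO_LOCAL = {
    "mida": {"coastline": "shoreline", "habitat": "biotope"},
    "oca": {"coastline": "coast_line", "habitat": "habitat_area"},
}

def rewrite_query(terms, atlas):
    """Translate global-ontology terms into one atlas's local vocabulary,
    passing unknown terms through unchanged."""
    mapping = GLOBAL_TO_LOCAL[atlas]
    return [mapping.get(t, t) for t in terms]
```

After rewriting, the mediator would issue one local CSW request per atlas and merge the responses, which is what lets each atlas keep its own data structures and vocabularies.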
Kamimura, Emi; Tanaka, Shinpei; Takaba, Masayuki; Tachi, Keita; Baba, Kazuyoshi
2017-01-01
The aim of this study was to evaluate and compare the inter-operator reproducibility of three-dimensional (3D) images of teeth captured by a digital impression technique versus a conventional impression technique in vivo. Twelve participants with complete natural dentition were included in this study. A digital impression of the mandibular molars of these participants was made by two operators with different levels of clinical experience, 3 or 16 years, using an intra-oral scanner (Lava COS, 3M ESPE). A silicone impression was also made by the same operators using the double-mix impression technique (Imprint3, 3M ESPE). Stereolithography (STL) data were directly exported from the Lava COS system, while STL data of a plaster model made from the silicone impression were captured by a three-dimensional (3D) laboratory scanner (D810, 3shape). The STL datasets recorded by the two operators were compared using 3D evaluation software and superimposed using a best-fit algorithm (least-squares method; PolyWorks, InnovMetric Software) for each impression technique. Inter-operator reproducibility, evaluated as the average discrepancy between corresponding 3D data, was compared between the two techniques (Wilcoxon signed-rank test). Visual inspection of the superimposed datasets revealed that discrepancies between repeated digital impressions were smaller than those observed with silicone impressions. Statistical analysis confirmed this, revealing a significantly smaller average inter-operator discrepancy with the digital impression technique (0.014 ± 0.02 mm) than with the conventional impression technique (0.023 ± 0.01 mm). The results of this in vivo study suggest that inter-operator reproducibility with a digital impression technique may be better than that of a conventional impression technique and is independent of the clinical experience of the operator.
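The best-fit comparison above can be given a minimal flavor as follows. This translation-only sketch is ours; a real least-squares superimposition, such as PolyWorks performs, also estimates rotation:

```python
import math

# Translation-only stand-in for a best-fit alignment (a full least-squares
# superimposition also solves for rotation): align two 3D point sets by
# their centroids, then report the mean point-to-point discrepancy.
def mean_discrepancy(reference, test):
    def centroid(pts):
        n = len(pts)
        return tuple(sum(p[i] for p in pts) / n for i in range(3))

    cr, ct = centroid(reference), centroid(test)
    shift = tuple(cr[i] - ct[i] for i in range(3))
    aligned = [tuple(p[i] + shift[i] for i in range(3)) for p in test]
    return sum(math.dist(a, b) for a, b in zip(reference, aligned)) / len(reference)
```

An average discrepancy of this kind, computed between the two operators' scans, is the quantity the study compares between the digital and conventional techniques.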
The Java Image Science Toolkit (JIST) for rapid prototyping and publishing of neuroimaging software.
Lucas, Blake C; Bogovic, John A; Carass, Aaron; Bazin, Pierre-Louis; Prince, Jerry L; Pham, Dzung L; Landman, Bennett A
2010-03-01
Non-invasive neuroimaging techniques enable extraordinarily sensitive and specific in vivo study of the structure, functional response and connectivity of biological mechanisms. With these advanced methods comes a heavy reliance on computer-based processing, analysis and interpretation. While the neuroimaging community has produced many excellent academic and commercial tool packages, new tools are often required to interpret new modalities and paradigms. Developing custom tools and ensuring interoperability with existing tools is a significant hurdle. To address these limitations, we present a new framework for algorithm development that implicitly ensures tool interoperability, generates graphical user interfaces, provides advanced batch processing tools, and, most importantly, requires minimal additional programming or computational overhead. Java-based rapid prototyping with this system is an efficient and practical approach to evaluate new algorithms since the proposed system ensures that rapidly constructed prototypes are actually fully-functional processing modules with support for multiple GUI's, a broad range of file formats, and distributed computation. Herein, we demonstrate MRI image processing with the proposed system for cortical surface extraction in large cross-sectional cohorts, provide a system for fully automated diffusion tensor image analysis, and illustrate how the system can be used as a simulation framework for the development of a new image analysis method. The system is released as open source under the Lesser GNU Public License (LGPL) through the Neuroimaging Informatics Tools and Resources Clearinghouse (NITRC).
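The central idea of the framework, modules that declare their typed inputs and outputs so the system can derive GUIs, file I/O and batch execution automatically, can be sketched as follows. This is not the actual JIST API; the class and method names are assumptions made for illustration.

```python
class ProcessingModule:
    """Minimal stand-in for a self-describing processing module:
    declared inputs/outputs let a framework generate interfaces."""
    inputs = {}    # parameter name -> type
    outputs = {}   # result name -> type

    def run(self, **kwargs):
        raise NotImplementedError

class ThresholdModule(ProcessingModule):
    """Toy voxel-wise threshold standing in for a real algorithm."""
    inputs = {"image": list, "threshold": float}
    outputs = {"mask": list}

    def run(self, image, threshold):
        return {"mask": [1 if v >= threshold else 0 for v in image]}

def describe(module_cls):
    """The metadata a framework could use to auto-generate a GUI or CLI."""
    return {"inputs": list(module_cls.inputs), "outputs": list(module_cls.outputs)}

result = ThresholdModule().run(image=[0.2, 0.8, 0.5], threshold=0.5)
```

Because every module carries its own interface description, a batch engine or GUI builder never needs module-specific code, which is the interoperability property the abstract emphasizes.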
The Java Image Science Toolkit (JIST) for Rapid Prototyping and Publishing of Neuroimaging Software
Lucas, Blake C.; Bogovic, John A.; Carass, Aaron; Bazin, Pierre-Louis; Prince, Jerry L.; Pham, Dzung
2010-01-01
Non-invasive neuroimaging techniques enable extraordinarily sensitive and specific in vivo study of the structure, functional response and connectivity of biological mechanisms. With these advanced methods comes a heavy reliance on computer-based processing, analysis and interpretation. While the neuroimaging community has produced many excellent academic and commercial tool packages, new tools are often required to interpret new modalities and paradigms. Developing custom tools and ensuring interoperability with existing tools is a significant hurdle. To address these limitations, we present a new framework for algorithm development that implicitly ensures tool interoperability, generates graphical user interfaces, provides advanced batch processing tools, and, most importantly, requires minimal additional programming or computational overhead. Java-based rapid prototyping with this system is an efficient and practical approach to evaluate new algorithms since the proposed system ensures that rapidly constructed prototypes are actually fully-functional processing modules with support for multiple GUI's, a broad range of file formats, and distributed computation. Herein, we demonstrate MRI image processing with the proposed system for cortical surface extraction in large cross-sectional cohorts, provide a system for fully automated diffusion tensor image analysis, and illustrate how the system can be used as a simulation framework for the development of a new image analysis method. The system is released as open source under the Lesser GNU Public License (LGPL) through the Neuroimaging Informatics Tools and Resources Clearinghouse (NITRC). PMID:20077162
NASA Astrophysics Data System (ADS)
Glaves, Helen
2015-04-01
Marine research is rapidly moving away from traditional discipline specific science to a wider ecosystem level approach. This more multidisciplinary approach to ocean science requires large amounts of good quality, interoperable data to be readily available for use in an increasing range of new and complex applications. Significant amounts of marine data and information are already available throughout the world as a result of e-infrastructures being established at a regional level to manage and deliver marine data to the end user. However, each of these initiatives has been developed to address specific regional requirements and independently of those in other regions. Establishing a common framework for marine data management on a global scale necessitates that there is interoperability across these existing data infrastructures and active collaboration between the organisations responsible for their management. The Ocean Data Interoperability Platform (ODIP) project is promoting co-ordination between a number of these existing regional e-infrastructures including SeaDataNet and Geo-Seas in Europe, the Integrated Marine Observing System (IMOS) in Australia, the Rolling Deck to Repository (R2R) in the USA and the international IODE initiative. To demonstrate this co-ordinated approach the ODIP project partners are currently working together to develop several prototypes to test and evaluate potential interoperability solutions for solving the incompatibilities between the individual regional marine data infrastructures. However, many of the issues being addressed by the Ocean Data Interoperability Platform are not specific to marine science. For this reason many of the outcomes of this international collaborative effort are equally relevant and transferable to other domains.
47 CFR 90.525 - Administration of interoperability channels.
Code of Federal Regulations, 2010 CFR
2010-10-01
... RADIO SERVICES PRIVATE LAND MOBILE RADIO SERVICES Regulations Governing the Licensing and Use of... meeting the requirements of § 90.523 may operate mobile or portable units on the Interoperability channels... Commission provided it holds a part 90 license. All persons operating mobile or portable units under this...
Telemedicine system interoperability architecture: concept description and architecture overview.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Craft, Richard Layne, II
2004-05-01
In order for telemedicine to realize the vision of anywhere, anytime access to care, it must address the question of how to create a fully interoperable infrastructure. This paper describes the reasons for pursuing interoperability, outlines operational requirements that any interoperability approach needs to consider, proposes an abstract architecture for meeting these needs, identifies candidate technologies that might be used for rendering this architecture, and suggests a path forward that the telemedicine community might follow.
Challenges and Approaches to Make Multidisciplinary Team Meetings Interoperable - The KIMBo Project.
Krauss, Oliver; Holzer, Karl; Schuler, Andreas; Egelkraut, Reinhard; Franz, Barbara
2017-01-01
Multidisciplinary team meetings (MDTMs) are already in use for certain areas in healthcare (e.g. treatment of cancer). Due to the lack of common standards and accessibility for the applied IT systems, their potential is not yet completely exploited. Common requirements for MDTMs shall be identified and aggregated into a process definition to be automated by an application architecture utilizing modern standards in electronic healthcare, e.g. HL7 FHIR. To identify requirements, an extensive literature review as well as semi-structured expert interviews were conducted. Results showed that interoperability and flexibility in terms of the process are key requirements to be addressed. An architecture blueprint as well as an aggregated process definition were derived from the insights gained. To evaluate the feasibility of the identified requirements, methods of explorative prototyping in software engineering were used. MDTMs will become an important part of modern and future healthcare, but the need for standardization in terms of interoperability is pressing.
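An MDTM slot exchanged over HL7 FHIR could be represented with the standard Appointment resource. The JSON below follows the published FHIR field names (`resourceType`, `status`, `participant`), but it is a trimmed illustration with invented participants, not a validated instance from the KIMBo project.

```python
import json

# Minimal FHIR-style Appointment for an MDTM case review
# (participants and description are hypothetical).
mdtm_appointment = {
    "resourceType": "Appointment",
    "status": "booked",
    "description": "Multidisciplinary team meeting: oncology case review",
    "participant": [
        {"actor": {"display": "Dr. A (oncology)"}, "status": "accepted"},
        {"actor": {"display": "Dr. B (radiology)"}, "status": "accepted"},
    ],
}

# Serialise as it would travel between the meeting systems.
payload = json.dumps(mdtm_appointment)
```

Using a standard resource rather than a proprietary meeting record is precisely what lets independently built MDTM systems interoperate.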
Chiu, Tsz-chun Roxy; Ngo, Hiu-ching; Lau, Lai-wa; Leung, King-wah; Lo, Man-him; Yu, Ho-fai; Ying, Michael
2016-01-01
Aims This study was undertaken to investigate the immediate effect of static stretching on normal Achilles tendon morphology and stiffness, and the different effect on dominant and non-dominant legs; and to evaluate inter-operator and intra-operator reliability of using shear-wave elastography in measuring Achilles tendon stiffness. Methods 20 healthy subjects (13 males, 7 females) were included in the study. Thickness, cross-sectional area and stiffness of Achilles tendons in both legs were measured before and after 5-min static stretching using grey-scale ultrasound and shear-wave elastography. Inter-operator and intra-operator reliability of tendon stiffness measurements of six operators were evaluated. Results Results showed that there was no significant change in the thickness and cross-sectional area of Achilles tendon after static stretching in both dominant and non-dominant legs (p > 0.05). Tendon stiffness showed a significant increase in non-dominant leg (p < 0.05) but not in dominant leg (p > 0.05). The inter-operator reliability of shear-wave elastography measurements was 0.749 and the intra-operator reliability ranged from 0.751 to 0.941. Conclusion Shear-wave elastography is a useful and non-invasive imaging tool to assess the immediate stiffness change of Achilles tendon in response to static stretching with high intra-operator and inter-operator reliability. PMID:27120097
Medical Device Plug-and-Play Interoperability Standards and Technology Leadership
2012-10-01
External Network Pump Adapter; PulseOx Adapter. Medical Device Mobile PnP Prototype Platform (MD MP3). • The MD MP3 cart is a platform for the development of smart pump control algorithms. • It includes ... delivery with bounded latency. • Got MDCF code to run on the BeagleBoard development boards we are
Current Efforts in European Projects to Facilitate the Sharing of Scientific Observation Data
NASA Astrophysics Data System (ADS)
Bredel, Henning; Rieke, Matthes; Maso, Joan; Jirka, Simon; Stasch, Christoph
2017-04-01
This presentation is intended to provide an overview of currently ongoing efforts in European projects to facilitate and promote the interoperable sharing of scientific observation data. This will be illustrated through two examples: a prototypical portal developed in the ConnectinGEO project for matching available (in-situ) data sources to the needs of users and a joint activity of several research projects to harmonise the usage of the OGC Sensor Web Enablement standards for providing access to marine observation data. ENEON is an activity initiated by the European ConnectinGEO project to coordinate in-situ Earth observation networks with the aim to harmonise the access to observations, improve discoverability, and identify/close gaps in European earth observation data resources. In this context, ENEON commons has been developed as a supporting Web portal for facilitating discovery, access, re-use and creation of knowledge about observations, networks, and related activities (e.g. projects). The portal is based on developments resulting from the European WaterInnEU project and has been extended to cover the requirements for handling knowledge about in-situ earth observation networks. A first prototype of the portal was completed in January 2017, which offers functionality for interactive discussion, information exchange and querying information about data delivered by different observation networks. Within this presentation, we will introduce the presented prototype and initiate a discussion about potential future work directions. The second example concerns the harmonisation of data exchange in the marine domain. There are many organisations that operate ocean observatories or data archives. In recent years, the application of the OGC Sensor Web Enablement (SWE) technology has become more and more popular to increase the interoperability between marine observation networks.
However, as the SWE standards were intentionally designed in a domain-independent manner, there is still a significant degree of freedom in how the same information can be handled in the SWE framework. Thus, further domain-specific agreements are necessary to describe more precisely how SWE standards shall be applied in specific contexts. Within this presentation we will report the current status of the marine SWE profiles initiative, which has the aim to develop guidance and recommendations for the application of SWE standards for ocean observation data. This initiative, which is supported by projects such as NeXOS, FixO3, ODIP 2, BRIDGES and SeaDataCloud, has already led to first results, which will be introduced in the proposed presentation. In summary, we will introduce two building blocks for coordinating earth observation networks: intelligent portal solutions to ensure better discoverability, and dedicated domain profiles of Sensor Web standards to ensure a common, interoperable exchange of the collected data.
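The kind of SWE request a marine profile would constrain is an OGC SOS 2.0 `GetObservation` call. The sketch below builds such a request in key-value encoding; the service endpoint, offering and observed-property identifiers are placeholders, not a real marine service.

```python
from urllib.parse import urlencode

def get_observation_url(endpoint, offering, observed_property):
    """Build an OGC SOS 2.0 GetObservation request (KVP binding).

    A domain profile would pin down which offerings and observed
    properties (e.g. standard vocabulary URIs) are allowed here.
    """
    params = {
        "service": "SOS",
        "version": "2.0.0",
        "request": "GetObservation",
        "offering": offering,
        "observedProperty": observed_property,
    }
    return endpoint + "?" + urlencode(params)

url = get_observation_url(
    "http://example.org/sos",          # placeholder endpoint
    "ctd_station_1",                    # placeholder offering
    "sea_water_temperature",            # placeholder observed property
)
```

Because the base request is standard, a profile only needs to fix the vocabulary of identifiers, which is exactly the degree of freedom the initiative is closing.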
Design and Realization of a Planar Ultrawideband Antenna with Notch Band at 3.5 GHz
2014-01-01
A small antenna with single notch band at 3.5 GHz is designed for ultrawideband (UWB) communication applications. The fabricated antenna comprises a radiating monopole element and a perfectly conducting ground plane with a wide slot. To achieve a notch band at 3.5 GHz, a parasitic element has been inserted in the same plane of the substrate along with the radiating patch. Experimental results show that, by properly adjusting the position of the parasitic element, the designed antenna can achieve an ultrawide operating band of 3.04 to 11 GHz with a notched band operating at 3.31–3.84 GHz. Moreover, the proposed antenna achieved a good gain except at the notched band and exhibits symmetric radiation patterns throughout the operating band. The prototype of the proposed antenna possesses a very compact size and uses simple structures to attain the stop band characteristic with an aim to lessen the interference between UWB and worldwide interoperability for microwave access (WiMAX) band. PMID:25133245
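As a rough sanity check on the notch design, a parasitic element about a quarter of a guided wavelength long resonates at the notch frequency. The effective permittivity below is an assumed round number, not a value from the paper, so the resulting length is only an order-of-magnitude estimate.

```python
C = 3.0e8  # speed of light in vacuum, m/s

def quarter_wave_length(freq_hz, eps_eff):
    """Approximate resonant element length (metres) for a given notch
    frequency, using the quarter-guided-wavelength rule of thumb:
    L = c / (4 * f * sqrt(eps_eff))."""
    return C / (4.0 * freq_hz * eps_eff ** 0.5)

# Assumed effective permittivity of 2.2 for a typical thin substrate.
length_mm = quarter_wave_length(3.5e9, 2.2) * 1000
```

For a 3.5 GHz notch this gives an element on the order of 14-15 mm, consistent with the "very compact" structures such designs use.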
Commanding Heterogeneous Multi-Robot Teams
2014-06-01
Coalition Battle Management Language (C-BML) Study Group Report. 2005 Fall Simulation Interoperability Workshop (05F-SIW-041), Orlando, FL, September ... NMSG-085 CIG Land Operation Demonstration. 2013 Spring Simulation Interoperability Workshop (13S-SIW-031), San Diego, CA, April 2013. [4] K ... Simulation Interoperability Workshop (10F-SIW-039), Orlando, FL, September 2010. [5] M. Langerwisch, M. Ax, S. Thamke, T. Remmersmann, A. Tiderko
NASA Technical Reports Server (NTRS)
Yin, J.; Oyaki, A.; Hwang, C.; Hung, C.
2000-01-01
The purpose of this research and study paper is to provide a summary description and results of rapid development accomplishments at NASA/JPL in the area of advanced distributed computing technology, using a Commercial-Off-The-Shelf (COTS)-based, object-oriented component approach to open, interoperable software development and software reuse.
Martin, Bryn A; Yiallourou, Theresia I; Pahlavian, Soroush Heidari; Thyagaraj, Suraj; Bunck, Alexander C; Loth, Francis; Sheffer, Daniel B; Kröger, Jan Robert; Stergiopulos, Nikolaos
2016-05-01
For the first time, inter-operator dependence of MRI based computational fluid dynamics (CFD) modeling of cerebrospinal fluid (CSF) in the cervical spinal subarachnoid space (SSS) is evaluated. In vivo MRI flow measurements and anatomy MRI images were obtained at the cervico-medullary junction of a healthy subject and a Chiari I malformation patient. 3D anatomies of the SSS were reconstructed by manual segmentation by four independent operators for both cases. CFD results were compared at nine axial locations along the SSS in terms of hydrodynamic and geometric parameters. Intraclass correlation (ICC) assessed the inter-operator agreement for each parameter over the axial locations and coefficient of variance (CV) compared the percentage of variance for each parameter between the operators. Greater operator dependence was found for the patient (0.19 < ICC < 0.99) near the craniovertebral junction compared to the healthy subject (ICC > 0.78). For the healthy subject, hydraulic diameter and Womersley number had the least variance (CV = ~2%). For the patient, peak diastolic velocity and Reynolds number had the smallest variance (CV = ~3%). These results show a high degree of inter-operator reliability for MRI-based CFD simulations of CSF flow in the cervical spine for healthy subjects and a lower degree of reliability for patients with Type I Chiari malformation.
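The coefficient of variance used above expresses the between-operator spread of a parameter as a percentage of its mean. A minimal sketch, with hypothetical hydraulic-diameter values standing in for the four operators' measurements:

```python
import statistics

def coefficient_of_variance(values):
    """CV (%) of one parameter measured by several operators:
    population standard deviation as a percentage of the mean."""
    return 100.0 * statistics.pstdev(values) / statistics.mean(values)

# Hypothetical hydraulic-diameter values (mm) from four operators'
# independent segmentations of the same subject.
cv = coefficient_of_variance([7.9, 8.0, 8.1, 8.0])
```

A CV near 2-3%, as reported for the least-variable parameters, indicates the four segmentations produced nearly interchangeable geometries at that location.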
Martin, Bryn A.; Yiallourou, Theresia I.; Pahlavian, Soroush Heidari; Thyagaraj, Suraj; Bunck, Alexander C.; Loth, Francis; Sheffer, Daniel B.; Kröger, Jan Robert; Stergiopulos, Nikolaos
2015-01-01
For the first time, inter-operator dependence of MRI based computational fluid dynamics (CFD) modeling of cerebrospinal fluid (CSF) in the cervical spinal subarachnoid space (SSS) is evaluated. In vivo MRI flow measurements and anatomy MRI images were obtained at the cervico-medullary junction of a healthy subject and a Chiari I malformation patient. 3D anatomies of the SSS were reconstructed by manual segmentation by four independent operators for both cases. CFD results were compared at nine axial locations along the SSS in terms of hydrodynamic and geometric parameters. Intraclass correlation (ICC) assessed the inter-operator agreement for each parameter over the axial locations and coefficient of variance (CV) compared the percentage of variance for each parameter between the operators. Greater operator dependence was found for the patient (0.19 < ICC < 0.99) near the craniovertebral junction compared to the healthy subject (ICC > 0.78).
INcreasing Security and Protection through Infrastructure REsilience: The INSPIRE Project
NASA Astrophysics Data System (ADS)
D'Antonio, Salvatore; Romano, Luigi; Khelil, Abdelmajid; Suri, Neeraj
The INSPIRE project aims at enhancing the European potential in the field of security by ensuring the protection of critical information infrastructures through (a) the identification of their vulnerabilities and (b) the development of innovative techniques for securing networked process control systems. To increase the resilience of such systems INSPIRE will develop traffic engineering algorithms, diagnostic processes and self-reconfigurable architectures along with recovery techniques. Hence, the core idea of the INSPIRE project is to protect critical information infrastructures by appropriately configuring, managing, and securing the communication network which interconnects the distributed control systems. A working prototype will be implemented as a final demonstrator of selected scenarios. Controls/Communication Experts will support project partners in the validation and demonstration activities. INSPIRE will also contribute to the standardization process in order to foster multi-operator interoperability and coordinated strategies for securing lifeline systems.
Warner, Jeremy L; Rioth, Matthew J; Mandl, Kenneth D; Mandel, Joshua C; Kreda, David A; Kohane, Isaac S; Carbone, Daniel; Oreto, Ross; Wang, Lucy; Zhu, Shilin; Yao, Heming; Alterovitz, Gil
2016-07-01
Precision cancer medicine (PCM) will require ready access to genomic data within the clinical workflow and tools to assist clinical interpretation and enable decisions. Since most electronic health record (EHR) systems do not yet provide such functionality, we developed an EHR-agnostic, clinico-genomic mobile app to demonstrate several features that will be needed for point-of-care conversations. Our prototype, called Substitutable Medical Applications and Reusable Technology (SMART)® PCM, visualizes genomic information in real time, comparing a patient's diagnosis-specific somatic gene mutations detected by PCR-based hotspot testing to a population-level set of comparable data. The initial prototype works for patient specimens with 0 or 1 detected mutation. Genomics extensions were created for the Health Level Seven® Fast Healthcare Interoperability Resources (FHIR)® standard; otherwise, the prototype is a normal SMART on FHIR app. The PCM prototype can rapidly present a visualization that compares a patient's somatic genomic alterations against a distribution built from more than 3000 patients, along with context-specific links to external knowledge bases. Initial evaluation by oncologists provided important feedback about the prototype's strengths and weaknesses. We added several requested enhancements and successfully demonstrated the app at the inaugural American Society of Clinical Oncology Interoperability Demonstration; we have also begun to expand visualization capabilities to include cancer specimens with multiple mutations. PCM is open-source software for clinicians to present the individual patient within the population-level spectrum of cancer somatic mutations. The app can be implemented on any SMART on FHIR-enabled EHRs, and future versions of PCM should be able to evolve in parallel with external knowledge bases. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. 
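The population comparison the PCM app visualises, placing one patient's result within a distribution built from thousands of specimens, reduces to a percentile-rank computation. The counts below are invented illustration data, not the >3000-patient set referenced above.

```python
def percentile_rank(value, population):
    """Percentage of the population at or below the patient's value --
    the position the app would highlight on its distribution plot."""
    at_or_below = sum(1 for v in population if v <= value)
    return 100.0 * at_or_below / len(population)

# Hypothetical mutations-detected-per-specimen counts for a small cohort.
population_counts = [0, 0, 1, 1, 1, 2, 3]

# Patient with 1 detected hotspot mutation (the 0-or-1 case the
# initial prototype supports).
rank = percentile_rank(1, population_counts)
```

In the app this rank would be rendered in real time alongside context-specific links into external knowledge bases.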
Gene Fusion Markup Language: a prototype for exchanging gene fusion data
2012-01-01
Background An avalanche of next generation sequencing (NGS) studies has generated an unprecedented amount of genomic structural variation data. These studies have also identified many novel gene fusion candidates with more detailed resolution than previously achieved. However, in the excitement and necessity of publishing the observations from this recently developed cutting-edge technology, no community standardization approach has arisen to organize and represent the data with the essential attributes in an interchangeable manner. As transcriptome studies have been widely used for gene fusion discoveries, the current non-standard mode of data representation could potentially impede data accessibility, critical analyses, and further discoveries in the near future. Results Here we propose a prototype, Gene Fusion Markup Language (GFML) as an initiative to provide a standard format for organizing and representing the significant features of gene fusion data. GFML will offer the advantage of representing the data in a machine-readable format to enable data exchange, automated analysis interpretation, and independent verification. As this database-independent exchange initiative evolves it will further facilitate the formation of related databases, repositories, and analysis tools. The GFML prototype is made available at http://code.google.com/p/gfml-prototype/. Conclusion The Gene Fusion Markup Language (GFML) presented here could facilitate the development of a standard format for organizing, integrating and representing the significant features of gene fusion data in an inter-operable and query-able fashion that will enable biologically intuitive access to gene fusion findings and expedite functional characterization. A similar model is envisaged for other NGS data analyses. PMID:23072312
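A GFML-style record can be sketched with Python's standard XML tooling. The element and attribute names below are guessed from the paper's description (a fusion with two partner genes); they are not taken from the actual GFML schema at the project site.

```python
import xml.etree.ElementTree as ET

# Build a minimal, machine-readable gene-fusion record
# (element names are hypothetical, not the real GFML schema).
fusion = ET.Element("geneFusion", id="GF0001")
for symbol, role in [("BCR", "5prime"), ("ABL1", "3prime")]:
    gene = ET.SubElement(fusion, "partnerGene", role=role)
    ET.SubElement(gene, "symbol").text = symbol

# Serialise for exchange between tools or repositories.
xml_text = ET.tostring(fusion, encoding="unicode")
```

Because the record round-trips through a parser, downstream tools can verify and query it independently, which is the interchangeability the markup-language proposal is after.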
NASA Astrophysics Data System (ADS)
Glaves, H. M.
2015-12-01
In recent years marine research has become increasingly multidisciplinary in its approach with a corresponding rise in the demand for large quantities of high quality interoperable data as a result. This requirement for easily discoverable and readily available marine data is currently being addressed by a number of regional initiatives with projects such as SeaDataNet in Europe, Rolling Deck to Repository (R2R) in the USA and the Integrated Marine Observing System (IMOS) in Australia, having implemented local infrastructures to facilitate the exchange of standardised marine datasets. However, each of these systems has been developed to address local requirements and created in isolation from those in other regions. Multidisciplinary marine research on a global scale necessitates a common framework for marine data management which is based on existing data systems. The Ocean Data Interoperability Platform project is seeking to address this requirement by bringing together selected regional marine e-infrastructures for the purposes of developing interoperability across them. By identifying the areas of commonality and incompatibility between these data infrastructures, and leveraging the development activities and expertise of these individual systems, three prototype interoperability solutions are being created which demonstrate the effective sharing of marine data and associated metadata across the participating regional data infrastructures as well as with other target international systems such as GEO, COPERNICUS etc. These interoperability solutions combined with agreed best practice and approved standards, form the basis of a common global approach to marine data management which can be adopted by the wider marine research community.
To encourage implementation of these interoperability solutions by other regional marine data infrastructures an impact assessment is being conducted to determine both the technical and financial implications of deploying them alongside existing services. The associated best practice and common standards are also being disseminated to the user community through relevant accreditation processes and related initiatives such as the Research Data Alliance and the Belmont Forum.
Kamimura, Emi; Tanaka, Shinpei; Takaba, Masayuki; Tachi, Keita; Baba, Kazuyoshi
2017-01-01
Purpose The aim of this study was to evaluate and compare the inter-operator reproducibility of three-dimensional (3D) images of teeth captured by a digital impression technique to a conventional impression technique in vivo. Materials and methods Twelve participants with complete natural dentition were included in this study. A digital impression of the mandibular molars of these participants was made by two operators with different levels of clinical experience, 3 or 16 years, using an intra-oral scanner (Lava COS, 3M ESPE). A silicone impression also was made by the same operators using the double mix impression technique (Imprint3, 3M ESPE). Stereolithography (STL) data were directly exported from the Lava COS system, while STL data of a plaster model made from silicone impression were captured by a three-dimensional (3D) laboratory scanner (D810, 3shape). The STL datasets recorded by two different operators were compared using 3D evaluation software and superimposed using the best-fit-algorithm method (least-squares method, PolyWorks, InnovMetric Software) for each impression technique. Inter-operator reproducibility as evaluated by average discrepancies of corresponding 3D data was compared between the two techniques (Wilcoxon signed-rank test). Results The visual inspection of superimposed datasets revealed that discrepancies between repeated digital impressions were smaller than those observed with silicone impressions. Statistical analysis confirmed significantly smaller average inter-operator discrepancies with the digital impression technique (0.014 ± 0.02 mm) than with the conventional impression technique (0.023 ± 0.01 mm). Conclusion The results of this in vivo study suggest that inter-operator reproducibility with a digital impression technique may be better than that of a conventional impression technique and is independent of the clinical experience of the operator. PMID:28636642
Ellouze, Afef Samet; Bouaziz, Rafik; Ghorbel, Hanen
2016-10-01
Integrating the semantic dimension into clinical archetypes is necessary when modeling medical records. It enables semantic interoperability, supports semantic activities on clinical data, and provides a higher design quality of Electronic Medical Record (EMR) systems. However, to obtain these advantages, designers need to use archetypes that cover the semantic features of the clinical concepts involved in their specific applications. In fact, most archetypes filed within open repositories are expressed in the Archetype Definition Language (ADL), which defines only the syntactic structure of clinical concepts, weakening semantic activities on EMR content in the semantic web environment. This paper focuses on the modeling of an EMR prototype for infants affected by Cerebral Palsy (CP), using the dual model approach and integrating semantic web technologies. Such modeling provides better delivery of quality care and ensures semantic interoperability between all involved therapies' information systems. First, data to be documented are identified and collected from the involved therapies. Subsequently, data are analyzed and arranged into archetypes expressed in accordance with ADL. During this step, open archetype repositories are explored in order to find suitable archetypes. Then, ADL archetypes are transformed into archetypes expressed in OWL-DL (Ontology Web Language - Description Language). Finally, we construct an ontological source related to these archetypes, enabling their annotation to facilitate data extraction and providing the possibility to exercise semantic activities on such archetypes. The result is the integration of the semantic dimension into EMRs modeled in accordance with the archetype approach. The feasibility of our solution is shown through the development of a prototype, baptized "CP-SMS", which ensures semantic exploitation of the CP EMR.
This prototype provides the following features: (i) creation of CP EMR instances and their checking against a knowledge base which we constructed through interviews with domain experts, (ii) translation of the initial CP ADL archetypes into CP OWL-DL archetypes, (iii) creation of an ontological source which we can use to annotate the obtained archetypes, and (iv) enrichment and supply of the ontological source and integration of semantic relations, fueling the ontology with new concepts, ensuring consistency and eliminating ambiguity between concepts. The degree of semantic interoperability that could be reached between EMR systems depends strongly on the quality of the archetypes used. Thus, the integration of the semantic dimension into the archetype modeling process is crucial. By creating an ontological source and annotating archetypes, we create a supportive platform ensuring semantic interoperability between archetype-based EMR systems. Copyright © 2016. Published by Elsevier Inc.
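The ADL-to-OWL translation step can be sketched as flattening an archetype's constrained attributes into class-and-restriction triples. The archetype identifier, attribute name and triple encoding below are simplified assumptions for illustration, not the CP-SMS mapping rules.

```python
def archetype_to_triples(archetype):
    """Flatten an archetype description into (subject, predicate, object)
    triples: the archetype becomes an OWL class, and each constrained
    attribute becomes a property restriction on that class."""
    subject = archetype["id"]
    triples = [(subject, "rdf:type", "owl:Class")]
    for attr, constraint in archetype["attributes"].items():
        triples.append((subject, "owl:onProperty", attr))
        triples.append((attr, "hasConstraint", constraint))
    return triples

# Hypothetical archetype for a gross-motor assessment of a CP patient.
gmfcs = {
    "id": "openEHR-EHR-OBSERVATION.gmfcs.v1",
    "attributes": {"gross_motor_level": "1..5"},
}
triples = archetype_to_triples(gmfcs)
```

Once in triple form, the archetype can be loaded into the ontological source and annotated like any other OWL-DL content.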
EVA safety: Space suit system interoperability
NASA Technical Reports Server (NTRS)
Skoog, A. I.; McBarron, J. W.; Abramov, L. P.; Zvezda, A. O.
1995-01-01
The results and recommendations of the International Academy of Astronautics extravehicular activities (IAA EVA) Committee work are presented. The IAA EVA protocols and operations were analyzed for harmonization procedures and for the standardization of safety-critical and operationally important interfaces. The key role of EVA was considered, along with how to improve the situation based on the identified EVA space suit system interoperability deficiencies.
Interoperability Trends in Extravehicular Activity (EVA) Space Operations for the 21st Century
NASA Technical Reports Server (NTRS)
Miller, Gerald E.
1999-01-01
No other space operations in the 21st century more comprehensively embody the challenges and dependencies of interoperability than EVA. This discipline is already functioning at an unparalleled level of interagency, inter-organizational and international cooperation. This trend will only increase as space programs endeavor to expand in the face of shrinking budgets. Among the topics examined in this paper are hardware-oriented issues. Differences in design standards among various space participants dictate differences in the EVA tools that must be manufactured, flown and maintained on-orbit. Presently only two types of functional space suits exist in the world. However, three versions of functional airlocks are in operation. Of the three airlocks, only the International Space Station (ISS) Joint Airlock can accommodate both types of suits. Due to functional differences in the suits, completely different operating protocols are required for each. Should additional space suit or airlock designs become available, the complexity will increase. The lessons learned as a result of designing and operating within such a system are explored. This paper also examines the non-hardware challenges presented by interoperability for a discipline that is as uniquely dependent upon the individual as EVA. Operation of space suits (essentially single-person spacecraft) by persons whose native language is not that of the suits' designers is explored. The intricacies of shared mission planning, shared control and shared execution of joint EVAs are explained. For example, once ISS is fully functional, the potential exists for two crewmembers of different nationality to be wearing suits manufactured and controlled by a third nation, while operating within an airlock manufactured and controlled by a fourth nation, in an effort to perform tasks upon hardware belonging to a fifth nation.
Everything from training issues, to procedures development and writing, to real-time operations is addressed. Finally, this paper looks to the management challenges presented by interoperability in general. With budgets being reduced among all space-faring nations, the need to expand cooperation in the highly expensive field of human space operations is only going to intensify. The question facing management is not if the trend toward interoperation will continue, but how to best facilitate its doing so. Real-world EVA interoperability experience throughout the Shuttle/Mir and ISS Programs is discussed to illustrate the challenges and
Approaching semantic interoperability in Health Level Seven
Alschuler, Liora
2010-01-01
‘Semantic Interoperability’ is a driving objective behind many of Health Level Seven's standards. The objective in this paper is to take a step back, and consider what semantic interoperability means, assess whether or not it has been achieved, and, if not, determine what concrete next steps can be taken to get closer. A framework for measuring semantic interoperability is proposed, using a technique called the ‘Single Logical Information Model’ framework, which relies on an operational definition of semantic interoperability and an understanding that interoperability improves incrementally. Whether semantic interoperability tomorrow will enable one computer to talk to another, much as one person can talk to another person, is a matter for speculation. It is assumed, however, that what gets measured gets improved, and in that spirit this framework is offered as a means to improvement. PMID:21106995
Use of Dynamic Models and Operational Architecture to Solve Complex Navy Challenges
NASA Technical Reports Server (NTRS)
Grande, Darby; Black, J. Todd; Freeman, Jared; Sorber, Tim; Serfaty, Daniel
2010-01-01
The United States Navy established 8 Maritime Operations Centers (MOC) to enhance the command and control of forces at the operational level of warfare. Each MOC is a headquarters manned by qualified joint operational-level staffs, and enabled by globally interoperable C4I systems. To assess and refine MOC staffing, equipment, and schedules, a dynamic software model was developed. The model leverages pre-existing operational process architecture, joint military task lists that define activities and their precedence relations, as well as Navy documents that specify manning and roles per activity. The software model serves as a "computational wind-tunnel" in which to test a MOC on a mission, and to refine its structure, staffing, processes, and schedules. More generally, the model supports resource allocation decisions concerning Doctrine, Organization, Training, Material, Leadership, Personnel and Facilities (DOTMLPF) at MOCs around the world. A rapid prototype effort efficiently produced this software in less than five months, using an integrated process team consisting of MOC military and civilian staff, modeling experts, and software developers. The work reported here was conducted for Commander, United States Fleet Forces Command in Norfolk, Virginia, code N5-OLW (Operational Level of War), which facilitates the identification, consolidation, and prioritization of MOC capabilities requirements, and implementation and delivery of MOC solutions.
National Geothermal Data System (USA): an Exemplar of Open Access to Data
NASA Astrophysics Data System (ADS)
Allison, M. Lee; Richard, Stephen; Blackman, Harold; Anderson, Arlene; Patten, Kim
2014-05-01
The National Geothermal Data System's (NGDS - www.geothermaldata.org) formal launch in April, 2014 will provide open access to millions of data records, sharing relevant geoscience and, over the longer term, land-use data to propel geothermal development and production. NGDS serves information from all of the U.S. Department of Energy's sponsored development and research projects and geologic data from all 50 states, using free and open source software. This interactive online system is opening new exploration opportunities and potentially shortening project development by making data easily discoverable, accessible, and interoperable. We continue to populate our prototype functional data system with multiple data nodes and nationwide data online and available to the public. Data from state geological surveys and partners includes more than 6 million records online, including 1.72 million well headers (oil and gas, water, geothermal), 670,000 well logs, and 497,000 borehole temperatures, and is growing rapidly. There are over 312 interoperable Web services and another 106 WMS (Web Map Services) registered in the system as of January, 2014. Companion projects run by Southern Methodist University and U.S. Geological Survey (USGS) are adding millions of additional data records. The DOE Geothermal Data Repository, currently hosted on OpenEI, is a system node and clearinghouse for data from hundreds of U.S. DOE-funded geothermal projects. NGDS is built on the US Geoscience Information Network (USGIN) data integration framework, which is a joint undertaking of the USGS and the Association of American State Geologists (AASG). NGDS complies with the White House Executive Order of May 2013, requiring all federal agencies to make their data holdings publicly accessible online in open source, interoperable formats with common core and extensible metadata.
The National Geothermal Data System is being designed, built, deployed, and populated primarily with support from the US Department of Energy, Geothermal Technologies Office. To keep this system operational after the original implementation will require four core elements: continued serving of data and applications by providers; maintenance of system operations; a governance structure; and an effective business model. Each of these presents a number of challenges currently under consideration.
Aeronautical Mobile Airport Communications System (AeroMACS)
NASA Technical Reports Server (NTRS)
Budinger, James M.; Hall, Edward
2011-01-01
To help increase the capacity and efficiency of the nation's airports, a secure wideband wireless communications system is proposed for use on the airport surface. This paper provides an overview of the research and development process for the Aeronautical Mobile Airport Communications System (AeroMACS). AeroMACS is based on a specific commercial profile of the Institute of Electrical and Electronics Engineers (IEEE) 802.16 standard known as Worldwide Interoperability for Microwave Access, or WiMAX (WiMAX Forum). The paper includes background on the need for global interoperability in air/ground data communications, describes potential AeroMACS applications, addresses allocated frequency spectrum constraints, summarizes the international standardization process, and provides findings and recommendations from the world's first AeroMACS prototype implemented in Cleveland, Ohio, USA.
Extravehicular activity space suit interoperability.
Skoog, A I; McBarron JW 2nd; Severin, G I
1995-10-01
The European Space Agency (ESA) and the Russian Space Agency (RKA) are jointly developing a new space suit system for improved extravehicular activity (EVA) capabilities in support of the MIR Space Station Programme, the EVA Suit 2000. Recent national policy agreements between the U.S. and Russia on planned cooperation in manned space also include joint extravehicular activity (EVA). With an increased number of space suit systems and a higher operational frequency towards the end of this century, improved interoperability for both routine and emergency operations is of paramount importance. It is thus timely to report the current status of ongoing work on international EVA interoperability being conducted by the Committee on EVA Protocols and Operations of the International Academy of Astronautics, initiated in 1991. This paper summarises the current EVA interoperability issues to be harmonised and presents quantified vehicle interface requirements for the current U.S. Shuttle EMU and Russian MIR Orlan DMA and the new European/Russian EVA Suit 2000 extravehicular systems. Major critical/incompatible interfaces for suits/mother-craft of different combinations are discussed, and recommendations for standardisation are given.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Widergren, Steven E.; Knight, Mark R.; Melton, Ronald B.
The Interoperability Strategic Vision whitepaper aims to promote a common understanding of the meaning and characteristics of interoperability and to provide a strategy to advance the state of interoperability as applied to integration challenges facing grid modernization. This includes addressing the quality of integrating devices and systems and the discipline to improve the process of successfully integrating these components as business models and information technology improve over time. The strategic vision for interoperability described in this document applies throughout the electric energy generation, delivery, and end-use supply chain. Its scope includes interactive technologies and business processes from bulk energy levels to lower voltage level equipment and the millions of appliances that are becoming equipped with processing power and communication interfaces. A transformational aspect of a vision for interoperability in the future electric system is the coordinated operation of intelligent devices and systems at the edges of grid infrastructure. This challenge offers an example for addressing interoperability concerns throughout the electric system.
2011-07-01
2011-01-01
Background: The practice and research of medicine generates considerable quantities of data and model resources (DMRs). Although in principle biomedical resources are re-usable, in practice few can currently be shared. In particular, the clinical communities in physiology and pharmacology research, as well as medical education (i.e. PPME communities), are facing considerable operational and technical obstacles in sharing data and models. Findings: We outline the efforts of the PPME communities to achieve automated semantic interoperability for clinical resource documentation in collaboration with the RICORDO project. Current community practices in resource documentation and knowledge management are overviewed. Furthermore, requirements and improvements sought by the PPME communities to current documentation practices are discussed. The RICORDO plan and effort in creating a representational framework and associated open software toolkit for the automated management of PPME metadata resources is also described. Conclusions: RICORDO is providing the PPME community with tools to effect, share and reason over clinical resource annotations. This work is contributing to the semantic interoperability of DMRs through ontology-based annotation by (i) supporting more effective navigation and re-use of clinical DMRs, as well as (ii) sustaining interoperability operations based on the criterion of biological similarity. Operations facilitated by RICORDO will range from automated dataset matching to model merging and managing complex simulation workflows. In effect, RICORDO is contributing to community standards for resource sharing and interoperability. PMID:21878109
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brackney, L.
Broadly accessible, low cost, accurate, and easy-to-use energy auditing tools remain out of reach for managers of the aging U.S. building population (over 80% of U.S. commercial buildings are more than 10 years old). concept3D and NREL's commercial buildings group will work to translate and extend NREL's existing spreadsheet-based energy auditing tool for a browser-friendly and mobile-computing platform. NREL will also work with concept3D to further develop a prototype geometry capture and materials inference tool operable on a smart phone/pad platform. These tools will be developed to interoperate with NREL's Building Component Library and OpenStudio energy modeling platforms, and will be marketed by concept3D to commercial developers, academic institutions and governmental agencies. concept3D is NREL's lead developer and subcontractor of the Building Component Library.
Applications of software-defined radio (SDR) technology in hospital environments.
Chávez-Santiago, Raúl; Mateska, Aleksandra; Chomu, Konstantin; Gavrilovska, Liljana; Balasingham, Ilangko
2013-01-01
A software-defined radio (SDR) is a radio communication system where the major part of its functionality is implemented by means of software in a personal computer or embedded system. Such a design paradigm has the major advantage of producing devices that can receive and transmit widely different radio protocols based solely on the software used. This flexibility opens several application opportunities in hospital environments, where a large number of wired and wireless electronic devices must coexist in confined areas like operating rooms and intensive care units. This paper outlines some possible applications in the 2360-2500 MHz frequency band. These applications include the integration of wireless medical devices in a common communication platform for seamless interoperability, and cognitive radio (CR) for body area networks (BANs) and wireless sensor networks (WSNs) for medical environmental surveillance. The description of a proof-of-concept CR prototype is also presented.
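The defining SDR idea above, that one sampled stream can be turned into different radio protocols purely in software, can be illustrated with a minimal quadrature (I/Q) downconversion sketch. This is a generic illustration, not code from the paper; the sample rate, carrier frequency, and function names are all illustrative choices:

```python
import math

def iq_downconvert(samples, fs, f_c):
    """Software demodulation step: mix real samples with a complex
    exponential at the carrier frequency f_c, then average (a crude
    low-pass filter) to recover the baseband I/Q components."""
    n = len(samples)
    i_sum = q_sum = 0.0
    for k, s in enumerate(samples):
        t = k / fs
        i_sum += s * math.cos(2 * math.pi * f_c * t)
        q_sum += -s * math.sin(2 * math.pi * f_c * t)
    # Factor 2 restores the original amplitude after mixing
    return 2 * i_sum / n, 2 * q_sum / n

# Synthesize an unmodulated carrier of amplitude 0.7 (illustrative values;
# an integer number of carrier cycles keeps the averaging exact)
fs = 1_000_000   # sample rate, Hz
f_c = 100_000    # carrier frequency, Hz
amp = 0.7
samples = [amp * math.cos(2 * math.pi * f_c * k / fs) for k in range(10_000)]
i, q = iq_downconvert(samples, fs, f_c)
print(round(math.hypot(i, q), 3))  # recovered amplitude, ≈ 0.7
```

Retuning such a receiver to a different channel in the 2360-2500 MHz band would amount to changing `f_c` in software, which is the flexibility the paper exploits for coexistence in hospital environments.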
ESnet authentication services and trust federations
NASA Astrophysics Data System (ADS)
Muruganantham, Dhivakaran; Helm, Mike; Genovese, Tony
2005-01-01
ESnet provides authentication services and trust federation support for SciDAC projects, collaboratories, and other distributed computing applications. The ESnet ATF team operates the DOEGrids Certificate Authority, available to all DOE Office of Science programs, plus several custom CAs, including one for the National Fusion Collaboratory and one for NERSC. The secure hardware and software environment developed to support CAs is suitable for supporting additional custom authentication and authorization applications that your program might require. Seamless, secure interoperation across organizational and international boundaries is vital to collaborative science. We are fostering the development of international PKI federations by founding the TAGPMA, the American regional PMA, and the worldwide IGTF Policy Management Authority (PMA), as well as participating in European and Asian regional PMAs. We are investigating and prototyping distributed authentication technology that will allow us to support the "roaming scientist" (distributed wireless via eduroam), as well as more secure authentication methods (one-time password tokens).
Department of Defense Air Traffic Control and Airspace Management Systems
1989-08-08
The potential near-term impacts of incompatible and non-interoperable systems on the Air Force are described in terms of safety and operational effectiveness. Derogation of safety, from the standpoint of aircraft collision avoidance, is probable where service-specific systems are operating in adjacent or …
Operations and Plans: International Military Rationalization, Standardization, and Interoperability
1989-02-15
Army Regulation 34-1, Operations and Plans: International Military Rationalization, Standardization, and Interoperability. Headquarters, Department of the Army.
Validation of a digital audio recording method for the objective assessment of cough in the horse.
Duz, M; Whittaker, A G; Love, S; Parkin, T D H; Hughes, K J
2010-10-01
To validate the use of digital audio recording and analysis for quantification of coughing in horses. Part A: Nine simultaneous digital audio and video recordings were collected individually from seven stabled horses over a 1 h period using a digital audio recorder attached to the halter. Audio files were analysed using audio analysis software. Video and audio recordings were analysed for cough count and timing by two blinded operators on two occasions using a randomised study design for determination of intra-operator and inter-operator agreement. Part B: Seventy-eight hours of audio recordings obtained from nine horses were analysed once by two blinded operators to assess inter-operator repeatability on a larger sample. Part A: There was complete agreement between audio and video analyses and inter- and intra-operator analyses. Part B: There was >97% agreement between operators on number and timing of 727 coughs recorded over 78 h. The results of this study suggest that the cough monitor methodology used has excellent sensitivity and specificity for the objective assessment of cough in horses and intra- and inter-operator variability of recorded coughs is minimal. Crown Copyright 2010. Published by Elsevier India Pvt Ltd. All rights reserved.
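The inter-operator comparison described above amounts to matching cough events annotated by two operators on both count and timing. The paper does not give its matching algorithm, so the following is a hedged sketch of one plausible approach: greedily pair events whose timestamps fall within a tolerance window, then report the fraction of paired events (all names and the tolerance value are illustrative):

```python
def event_agreement(times_a, times_b, tol=1.0):
    """Greedily match event times (seconds) from two annotators within
    +/- tol seconds; return matched / total-distinct-events."""
    unmatched_b = sorted(times_b)
    matched = 0
    for t in sorted(times_a):
        for i, u in enumerate(unmatched_b):
            if abs(t - u) <= tol:
                matched += 1
                del unmatched_b[i]   # each event may be matched only once
                break
    total = len(times_a) + len(times_b) - matched
    return matched / total if total else 1.0

# Illustrative timestamps: three coughs agree within 1 s, one disagrees
a = [3.2, 10.5, 47.0, 60.1]
b = [3.4, 10.4, 47.8, 88.0]
print(round(event_agreement(a, b), 2))  # 0.6
```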
Operational Plan Ontology Model for Interconnection and Interoperability
NASA Astrophysics Data System (ADS)
Long, F.; Sun, Y. K.; Shi, H. Q.
2017-03-01
To address the bottleneck that assistant decision-making systems face in processing operational plan data and information, this paper begins by analysing the limitations of traditional plan representations and the technical advantages of ontologies. It then defines the elements of an operational plan ontology model, determines the basis for its construction, and builds a semi-knowledge-level operational plan ontology model. Finally, it examines how operational plans can be expressed with the ontology model and how the model is used in application software. The work thus has theoretical significance and practical value for improving the interconnection and interoperability of operational plans among assistant decision-making systems.
Molinari, Francesco; Pirronti, Tommaso; Sverzellati, Nicola; Diciotti, Stefano; Amato, Michele; Paolantonio, Guglielmo; Gentile, Luigia; Parapatt, George K; D'Argento, Francesco; Kuhnigk, Jan-Martin
2013-01-01
We aimed to compare the intra- and interoperator variability of lobar volumetry and emphysema scores obtained by semi-automated and manual segmentation techniques in lung emphysema patients. In two sessions held three months apart, two operators performed lobar volumetry of unenhanced chest computed tomography examinations of 47 consecutive patients with chronic obstructive pulmonary disease and lung emphysema. Both operators used the manual and semi-automated segmentation techniques. The intra- and interoperator variability of the volumes and emphysema scores obtained by semi-automated segmentation was compared with the variability obtained by manual segmentation of the five pulmonary lobes. The intra- and interoperator variability of the lobar volumes decreased when using semi-automated lobe segmentation (coefficients of repeatability for the first operator: right upper lobe, 147 vs. 96.3; right middle lobe, 137.7 vs. 73.4; right lower lobe, 89.2 vs. 42.4; left upper lobe, 262.2 vs. 54.8; and left lower lobe, 260.5 vs. 56.5; coefficients of repeatability for the second operator: right upper lobe, 61.4 vs. 48.1; right middle lobe, 56 vs. 46.4; right lower lobe, 26.9 vs. 16.7; left upper lobe, 61.4 vs. 27; and left lower lobe, 63.6 vs. 27.5; coefficients of reproducibility in the interoperator analysis: right upper lobe, 191.3 vs. 102.9; right middle lobe, 219.8 vs. 126.5; right lower lobe, 122.6 vs. 90.1; left upper lobe, 166.9 vs. 68.7; and left lower lobe, 168.7 vs. 71.6). The coefficients of repeatability and reproducibility of emphysema scores also decreased when using semi-automated segmentation and had ranges that varied depending on the target lobe and selected threshold of emphysema. Semi-automated segmentation reduces the intra- and interoperator variability of lobar volumetry and provides a more objective tool than manual technique for quantifying lung volumes and severity of emphysema.
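The coefficients of repeatability and reproducibility quoted above are standard Bland-Altman quantities; the paper does not print its formula, but the usual definition is 1.96 times the standard deviation of the paired differences (values within the coefficient of an earlier measurement are expected about 95% of the time). A minimal sketch under that assumption:

```python
import math

def coefficient_of_repeatability(x1, x2):
    """Bland-Altman coefficient of repeatability for paired measurements:
    1.96 * sample standard deviation of the differences x1[i] - x2[i]."""
    d = [a - b for a, b in zip(x1, x2)]
    mean = sum(d) / len(d)
    sd = math.sqrt(sum((v - mean) ** 2 for v in d) / (len(d) - 1))
    return 1.96 * sd

# Illustrative paired lobar volumes (cm^3) from two measurement sessions
x1 = [100.0, 102.0, 98.0, 101.0]
x2 = [99.0, 103.0, 97.0, 100.0]
print(round(coefficient_of_repeatability(x1, x2), 2))  # 1.96
```

A smaller coefficient means better agreement between sessions or operators, which is why the drop in these values with semi-automated segmentation indicates reduced variability.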
An open and reconfigurable wireless sensor network for pervasive health monitoring.
Triantafyllidis, A; Koutkias, V; Chouvarda, I; Maglaveras, N
2008-01-01
Sensor networks constitute the backbone for the construction of personalized monitoring systems. Up to now, several sensor networks have been proposed for diverse pervasive healthcare applications, which are however characterized by a significant lack of open architectures, resulting in closed, non-interoperable and difficult to extend solutions. In this context, we propose an open and reconfigurable wireless sensor network (WSN) for pervasive health monitoring, with particular emphasis in its easy extension with additional sensors and functionality by incorporating embedded intelligence mechanisms. We consider a generic WSN architecture comprised of diverse sensor nodes (with communication and processing capabilities) and a mobile base unit (MBU) operating as the gateway between the sensors and the medical personnel, formulating this way a body area network (BAN). The primary focus of this work is on the intra-BAN data communication issues, adopting SensorML as the data representation mean, including the encoding of the monitoring patterns and the functionality of the sensor network. In our prototype implementation two sensor nodes are emulated; one for heart rate monitoring and the other for blood glucose observations, while the MBU corresponds to a personal digital assistant (PDA) device. Java 2 Micro Edition (J2ME) is used to implement both the sensor nodes and the MBU components. Intra-BAN wireless communication relies on the Bluetooth protocol. Via an adaptive user interface in the MBU, health professionals may specify the monitoring parameters of the WSN and define the monitoring patterns of interest in terms of rules. This work constitutes an essential step towards the construction of open, extensible, interoperable and intelligent WSNs for pervasive health monitoring.
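The rule-defined monitoring patterns described above (the prototype encodes them in SensorML and evaluates them on the MBU) can be sketched with a simplified stand-in. The tuple-based rule format, thresholds, and alert strings below are hypothetical illustrations, not the paper's actual encoding:

```python
# Hypothetical rule format: (sensor name, comparison, threshold, alert text).
# The prototype itself encodes such patterns in SensorML; this is a stand-in.
RULES = [
    ("heart_rate", ">", 120, "tachycardia alert"),
    ("blood_glucose", "<", 60, "hypoglycaemia alert"),
]

OPS = {">": lambda v, t: v > t, "<": lambda v, t: v < t}

def evaluate(reading):
    """Return the alerts triggered by one {sensor: value} reading,
    checking each configured monitoring rule in turn."""
    alerts = []
    for sensor, op, threshold, msg in RULES:
        value = reading.get(sensor)   # sensors may be absent from a reading
        if value is not None and OPS[op](value, threshold):
            alerts.append(msg)
    return alerts

print(evaluate({"heart_rate": 130, "blood_glucose": 85}))
# ['tachycardia alert']
```

Because the rules are data rather than code, adding a new sensor or pattern only means appending a rule, which mirrors the extensibility goal of the proposed WSN.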
Data Intensive Scientific Workflows on a Federated Cloud: CRADA Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garzoglio, Gabriele
The Fermilab Scientific Computing Division and the KISTI Global Science Experimental Data Hub Center have built a prototypical large-scale infrastructure to handle scientific workflows of stakeholders to run on multiple cloud resources. The demonstrations have been in the areas of (a) Data-Intensive Scientific Workflows on Federated Clouds, (b) Interoperability and Federation of Cloud Resources, and (c) Virtual Infrastructure Automation to enable On-Demand Services.
On-line access to remote sensing data with the satellite-data information system (ISIS)
NASA Astrophysics Data System (ADS)
Strunz, G.; Lotz-Iwen, H.-J.
1994-08-01
The German Remote Sensing Data Center (DFD) is developing the satellite-data information system ISIS as the central interface for users to access Earth observation data. ISIS has been designed to support international scientific research as well as operational applications by offering online database access via public networks, and is integrated in the international activities dedicated to catalogue and archive interoperability. A prototype of ISIS is already in use within the German Processing and Archiving Facility for ERS-1 for the storage and retrieval of digital SAR quicklook products and for the Radarmap of Germany. Operational status of the system is envisaged for the launch of ERS-2. This paper describes the underlying concepts of ISIS and the current state of its realization. It explains the overall structure of the system and the functionality of each of its components. Emphasis is put on the description of the advisory system, the catalogue retrieval, and the online access and transfer of image data. Finally, the integration into a future global environmental data network is outlined.
Zhang, Mingyuan; Velasco, Ferdinand T.; Musser, R. Clayton; Kawamoto, Kensaku
2013-01-01
Enabling clinical decision support (CDS) across multiple electronic health record (EHR) systems has been a desired but largely unattained aim of clinical informatics, especially in commercial EHR systems. A potential opportunity for enabling such scalable CDS is to leverage vendor-supported, Web-based CDS development platforms along with vendor-supported application programming interfaces (APIs). Here, we propose a potential staged approach for enabling such scalable CDS, starting with the use of custom EHR APIs and moving towards standardized EHR APIs to facilitate interoperability. We analyzed three commercial EHR systems for their capabilities to support the proposed approach, and we implemented prototypes in all three systems. Based on these analyses and prototype implementations, we conclude that the approach proposed is feasible, already supported by several major commercial EHR vendors, and potentially capable of enabling cross-platform CDS at scale. PMID:24551426
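The staged approach above, custom EHR APIs first and standardized APIs later, is essentially an adapter pattern: decision-support logic is written once against a vendor-neutral interface, and each EHR is wrapped behind it. The following sketch is a hypothetical illustration (class names, the observation code, and the threshold are invented, not from the paper):

```python
# Vendor-neutral interface that CDS rules are written against
class EHRAdapter:
    def get_observation(self, patient_id, code):
        raise NotImplementedError

# Hypothetical wrapper around one vendor's custom API; a standardized-API
# wrapper would subclass the same interface without changing the CDS rule.
class VendorAEHR(EHRAdapter):
    def __init__(self, records):
        self._records = records
    def get_observation(self, patient_id, code):
        return self._records.get((patient_id, code))

def cds_check_ldl(ehr, patient_id):
    """Vendor-independent CDS rule: flag LDL cholesterol above 190 mg/dL."""
    ldl = ehr.get_observation(patient_id, "ldl")
    return ldl is not None and ldl > 190

ehr = VendorAEHR({("p1", "ldl"): 205})
print(cds_check_ldl(ehr, "p1"))  # True
print(cds_check_ldl(ehr, "p2"))  # False (no data for this patient)
```

Moving to standardized EHR APIs then shrinks each adapter toward a thin shim, which is what makes the approach scale across platforms.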
NASA Technical Reports Server (NTRS)
Jones, Michael K.
1998-01-01
Various issues associated with interoperability for space mission monitor and control are presented in viewgraph form. Specific topics include: 1) Space Project Mission Operations Control Architecture (SuperMOCA) goals and methods for achieving them; 2) Specifics on the architecture: open standards and layering, enhancing interoperability, and promoting commercialization; 3) An advertisement; 4) Status of the task - government/industry cooperation and architecture and technology demonstrations; and 5) Key features of messaging services and virtual devices.
NASA Technical Reports Server (NTRS)
Stephens, J. Briscoe; Grider, Gary W.
1992-01-01
These Earth Science and Applications Division-Data and Information System (ESAD-DIS) interoperability requirements are designed to quantify the Earth Science and Applications Division's hardware and software requirements in terms of communications between personal computers, visualization workstations, and mainframe computers. The electronic mail requirements and local area network (LAN) requirements are addressed. These interoperability requirements are top-level requirements framed around defining the existing ESAD-DIS interoperability and projecting known near-term requirements for both operational support and for management planning. Detailed requirements will be submitted on a case-by-case basis. This document is also intended as an overview of ESAD-DIS interoperability for newcomers and management not familiar with these activities. It is intended as background documentation to support requests for resources and support requirements.
Big Data in the Earth Observing System Data and Information System
NASA Technical Reports Server (NTRS)
Lynnes, Chris; Baynes, Katie; McInerney, Mark
2016-01-01
Approaches that are being pursued for the Earth Observing System Data and Information System (EOSDIS) data system to address the challenges of Big Data were presented to the NASA Big Data Task Force. Cloud prototypes are underway to tackle the volume challenge of Big Data. However, advances in computer hardware or cloud won't help (much) with variety. Rather, interoperability standards, conventions, and community engagement are the key to addressing variety.
NASA Technical Reports Server (NTRS)
Lynnes, Christopher
2016-01-01
The NASA representative to the Unidata Strategic Committee presented a semiannual update on NASA's work with and use of Unidata technologies. The talk covered the program of cloud computing prototypes being undertaken for the Earth Observing System Data and Information System (EOSDIS). Also discussed were dataset interoperability recommendations ratified via the EOSDIS Standards Office and the HDF Product Designer tool with respect to its possible applicability to data in network Common Data Form (NetCDF) version 4.
Progress toward Modular UAS for Geoscience Applications
NASA Astrophysics Data System (ADS)
Dahlgren, R. P.; Clark, M. A.; Comstock, R. J.; Fladeland, M.; Gascot, H., III; Haig, T. H.; Lam, S. J.; Mazhari, A. A.; Palomares, R. R.; Pinsker, E. A.; Prathipati, R. T.; Sagaga, J.; Thurling, J. S.; Travers, S. V.
2017-12-01
Small Unmanned Aerial Systems (UAS) have become accepted tools for geoscience, ecology, agriculture, disaster response, land management, and industry. A variety of consumer UAS options exist as science and engineering payload platforms, but their incompatibilities with one another contribute to high operational costs compared with those of piloted aircraft. This research explores the concept of modular UAS, demonstrating airframes that can be reconfigured in the field for experimental optimization, to enable multi-mission support, facilitate rapid repair, or respond to changing field conditions. Modular UAS is revolutionary in allowing aircraft to be optimized around the payload, reversing the conventional wisdom of designing the payload to accommodate an unmodifiable aircraft. UAS that are reconfigurable like Legos™ are ideal for airborne science service providers, system integrators, instrument designers and end users to fulfill a wide range of geoscience experiments. Modular UAS facilitate the adoption of open-source software and rapid prototyping technology where design reuse is important in the context of a highly regulated industry like aerospace. The industry is now at a stage where consolidation, acquisition, and attrition will reduce the number of small manufacturers, with a reduction of innovation and motivation to reduce costs. Modularity leads to interface specifications, which can evolve into de facto or formal standards which contain minimum (but sufficient) details such that multiple vendors can then design to those standards and demonstrate interoperability. At that stage, vendor coopetition leads to robust interface standards, interoperability standards and multi-source agreements which in turn drive costs down significantly.
A Multi-Center Space Data System Prototype Based on CCSDS Standards
NASA Technical Reports Server (NTRS)
Rich, Thomas M.
2016-01-01
Deep space missions beyond earth orbit will require new methods of data communications in order to compensate for increasing Radio Frequency (RF) propagation delay. The Consultative Committee for Space Data Systems (CCSDS) standard protocols Spacecraft Monitor & Control (SM&C), Asynchronous Message Service (AMS), and Delay/Disruption Tolerant Networking (DTN) provide such a method. However, the maturity level of this protocol stack is insufficient for mission inclusion at this time. This Space Data System prototype is intended to provide experience which will raise the Technology Readiness Level (TRL) of this protocol set. In order to reduce costs, future missions can take advantage of these standard protocols, which will result in increased interoperability between control centers. This prototype demonstrates these capabilities by implementing a realistic space data system in which telemetry is published to control center applications at the Jet Propulsion Lab (JPL), the Marshall Space Flight Center (MSFC), and the Johnson Space Center (JSC). Reverse publishing paths for commanding from each control center are also implemented. The target vehicle consists of realistic flight computer hardware running Core Flight Software (CFS) in the Integrated Power, Avionics, and Software (iPAS) Pathfinder Lab at JSC. This prototype demonstrates a potential upgrade path for future Deep Space Network (DSN) modification, in which the automatic error recovery and communication gap compensation capabilities of DTN would be exploited. In addition, SM&C provides architectural flexibility by allowing new service providers and consumers to be added efficiently anywhere in the network using the common interface provided by SM&C's Message Abstraction Layer (MAL). In FY 2015, this space data system was enhanced by adding telerobotic operations capability provided by the Robot API Delegate (RAPID) family of protocols developed at NASA.
RAPID is one of several candidates under consideration for inclusion in a new international standard being developed by the CCSDS Telerobotic Operations Working Group. Software gateways were developed to interface RAPID messages with the existing SM&C-based infrastructure. Telerobotic monitor, control, and bridge applications were written in the RAPID framework and then tailored to the NAO telerobotic test article hardware, a product of Aldebaran Robotics.
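The decoupling the abstract describes, applications talking to a Message Abstraction Layer rather than to a concrete transport, can be illustrated with a minimal sketch. All class, method, and topic names below are hypothetical stand-ins; the real MAL defines far richer interaction patterns and bindings (DDS, AMS, etc.).

```python
from abc import ABC, abstractmethod

class Transport(ABC):
    """Abstract transport; concrete subclasses hide the wire protocol."""
    @abstractmethod
    def publish(self, topic: str, payload: dict) -> None: ...
    @abstractmethod
    def poll(self, topic: str) -> list: ...

class InMemoryTransport(Transport):
    """Toy stand-in for a DDS or AMS binding."""
    def __init__(self):
        self._queues = {}
    def publish(self, topic, payload):
        self._queues.setdefault(topic, []).append(payload)
    def poll(self, topic):
        msgs, self._queues[topic] = self._queues.get(topic, []), []
        return msgs

class TelemetryProvider:
    """Service provider: publishes parameter updates through the MAL-like layer."""
    def __init__(self, transport: Transport):
        self._t = transport
    def update_parameter(self, name: str, value: float):
        self._t.publish("telemetry", {"name": name, "value": value})

class MonitorConsumer:
    """Service consumer: monitors telemetry without knowing the transport."""
    def __init__(self, transport: Transport):
        self._t = transport
    def latest(self):
        return self._t.poll("telemetry")

transport = InMemoryTransport()
TelemetryProvider(transport).update_parameter("BATT_V", 28.1)
print(MonitorConsumer(transport).latest())  # [{'name': 'BATT_V', 'value': 28.1}]
```

Swapping `InMemoryTransport` for another `Transport` subclass changes the middleware without touching provider or consumer code, which is the property the prototype exercises.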
A component-based software environment for visualizing large macromolecular assemblies.
Sanner, Michel F
2005-03-01
The interactive visualization of large biological assemblies poses a number of challenging problems, including the development of multiresolution representations and new interaction methods for navigating and analyzing these complex systems. An additional challenge is the development of flexible software environments that will facilitate the integration and interoperation of computational models and techniques from a wide variety of scientific disciplines. In this paper, we present a component-based software development strategy centered on the high-level, object-oriented, interpretive programming language: Python. We present several software components, discuss their integration, and describe some of their features that are relevant to the visualization of large molecular assemblies. Several examples are given to illustrate the interoperation of these software components and the integration of structural data from a variety of experimental sources. These examples illustrate how combining visual programming with component-based software development facilitates the rapid prototyping of novel visualization tools.
Telescience in the Space Station era
NASA Technical Reports Server (NTRS)
Schmerling, E. R.
1988-01-01
Telescience refers to the development of systems where participants involved in research in space can access their fellow scientists and the appropriate NASA services before flight, during flight, and after flight, preferably from their home institutions and through the same equipment. Telescience requires integration of available technologies to develop computer environments that maintain interoperability across different disciplines and different portions of the lifetimes of space experiments, called teledesign, teleoperations, and teleanalysis. Participants in the NASA Telescience Testbed Program are using a rapid prototyping approach to evaluate the necessary technologies and select the options and tradeoffs that best suit their accustomed modalities. The concept of transaction management is described, where the emphasis is placed on the effects of commands, whether event-generated onboard the spacecraft or sent up from the ground. Interoperability, security, and privacy issues are also discussed, and the Telescience Testbed Pilot Program is described.
Model for Semantically Rich Point Cloud Data
NASA Astrophysics Data System (ADS)
Poux, F.; Neuville, R.; Hallot, P.; Billen, R.
2017-10-01
This paper proposes an interoperable model for managing high-dimensional point clouds while integrating semantics. Point clouds from sensors are a direct source of information physically describing a 3D state of the recorded environment. As such, they are an exhaustive representation of the real world at every scale: 3D reality-based spatial data. Their generation is increasingly fast, but processing routines and data models lack the knowledge needed to reason from information extraction rather than interpretation. The proposed smart point cloud model brings intelligence to point clouds via three connected meta-models, linking available knowledge and classification procedures to permit semantic injection. Interoperability drives the model's adaptation to potentially many applications through specialized domain ontologies. A first prototype, implemented in Python on a PostgreSQL database, combines semantic and spatial concepts for basic hybrid queries on different point clouds.
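The kind of hybrid semantic-plus-spatial query the prototype supports can be sketched without a database. The ontology fragment, class labels, and bounding-box filter below are illustrative stand-ins for the paper's meta-models and PostgreSQL back end, not its actual schema.

```python
from dataclasses import dataclass

@dataclass
class Point:
    x: float
    y: float
    z: float
    label: str  # semantic class attached by a classification procedure

# Hypothetical fragment of a domain ontology: class -> parent concept
ONTOLOGY = {"window": "opening", "door": "opening", "wall": "structure"}

def hybrid_query(points, concept, bbox):
    """Combine a semantic filter (via the ontology) with a spatial filter."""
    (xmin, ymin, zmin), (xmax, ymax, zmax) = bbox
    return [p for p in points
            if ONTOLOGY.get(p.label) == concept
            and xmin <= p.x <= xmax and ymin <= p.y <= ymax and zmin <= p.z <= zmax]

cloud = [Point(1, 1, 2, "window"), Point(9, 9, 9, "door"), Point(1, 2, 1, "wall")]
print(hybrid_query(cloud, "opening", ((0, 0, 0), (5, 5, 5))))
```

The ontology lookup is what makes the query semantic: asking for the concept "opening" retrieves window points without naming the class explicitly.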
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-24
... Docket 07-100; FCC 11-6] Implementing a Nationwide, Broadband, Interoperable Public Safety Network in the... framework for the nationwide public safety broadband network. This document considers and proposes... broadband networks operating in the 700 MHz band. This document addresses public safety broadband network...
Smart Grid Interoperability Maturity Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Widergren, Steven E.; Levinson, Alex; Mater, J.
2010-04-28
The integration of automation associated with electricity resources (including transmission and distribution automation and demand-side resources operated by end-users) is key to supporting greater efficiencies and incorporating variable renewable resources and electric vehicles into the power system. The integration problems faced by this community are analogous to those faced in the health industry, emergency services, and other complex communities with many stakeholders. To highlight this issue and encourage communication and the development of a smart grid interoperability community, the GridWise Architecture Council (GWAC) created an Interoperability Context-Setting Framework. This "conceptual model" has been helpful to explain the importance of organizational alignment in addition to technical and informational interface specifications for "smart grid" devices and systems. As a next step to building a community sensitive to interoperability, the GWAC is investigating an interoperability maturity model (IMM) based on work done by others to address similar circumstances. The objective is to create a tool or set of tools that encourages a culture of interoperability in this emerging community. The tools would measure status and progress, analyze gaps, and prioritize efforts to improve the situation.
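A maturity-measurement tool of the kind the GWAC envisions could, in its simplest form, score each interoperability category on a level scale and report gaps against a target. The category names, scores, and the min-rule below are hypothetical illustrations, not the IMM's actual metrics.

```python
# Hypothetical maturity scores (levels 1-5) for three GWAC-style categories
scores = {"technical": 4, "informational": 3, "organizational": 2}

def overall_maturity(scores):
    # The weakest category caps overall maturity (an assumed, common convention)
    return min(scores.values())

def gap_analysis(scores, target):
    # How many levels each lagging category must climb to reach the target
    return {c: target - lvl for c, lvl in scores.items() if lvl < target}

print(overall_maturity(scores))   # 2
print(gap_analysis(scores, 4))    # {'informational': 1, 'organizational': 2}
```

Even this toy version exhibits the IMM's stated purpose: measuring status, exposing gaps, and suggesting where to prioritize effort.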
NASA Astrophysics Data System (ADS)
Kibria, Mirza Golam; Villardi, Gabriel Porto; Ishizu, Kentaro; Kojima, Fumihide; Yano, Hiroyuki
2016-12-01
In this paper, we study inter-operator spectrum sharing and intra-operator resource allocation in shared spectrum access communication systems and propose efficient dynamic solutions to address both inter-operator and intra-operator resource allocation optimization problems. For inter-operator spectrum sharing, we present two effective approaches, namely subcarrier gain-based sharing and fragmentation-based sharing, which carry out fair and flexible allocation of the available shareable spectrum among the operators subject to certain well-defined sharing rules, traffic demands, and channel propagation characteristics. The subcarrier gain-based spectrum sharing scheme has been found to be more efficient in terms of achieved throughput. However, the fragmentation-based sharing is more attractive in terms of computational complexity. For intra-operator resource allocation, we consider the resource allocation problem with users' dissimilar service requirements, where the operator simultaneously supports users with delay-constrained and non-delay-constrained service requirements. This optimization problem is a non-convex mixed-integer non-linear programming problem, which is computationally very expensive, and the complexity grows exponentially with the number of integer variables. We propose a less-complex and efficient suboptimal solution based on exact linearization, linear approximation, and convexification techniques for the non-linear and/or non-convex objective functions and constraints. Extensive simulation performance analysis has been carried out that validates the efficiency of the proposed solution.
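A greedy sketch of subcarrier gain-based sharing can convey the idea (the paper's actual scheme is more elaborate): each subcarrier goes to the eligible operator that values it most, with a per-operator quota standing in for the sharing rules and traffic demands. The gain values and quotas below are invented.

```python
def share_subcarriers(gains, quotas):
    """gains[op][k]: channel gain of operator op on subcarrier k.
       quotas[op]: max subcarriers operator op may claim (sharing rule)."""
    n_sub = len(next(iter(gains.values())))
    alloc = {op: [] for op in gains}
    # Visit subcarriers in descending order of their best achievable gain
    for k in sorted(range(n_sub), key=lambda k: -max(g[k] for g in gains.values())):
        eligible = [op for op in gains if len(alloc[op]) < quotas[op]]
        if eligible:
            # Give subcarrier k to the eligible operator that values it most
            best = max(eligible, key=lambda op: gains[op][k])
            alloc[best].append(k)
    return alloc

gains = {"A": [0.9, 0.2, 0.5], "B": [0.4, 0.8, 0.6]}
print(share_subcarriers(gains, {"A": 2, "B": 1}))  # {'A': [0, 2], 'B': [1]}
```

Note how B wins subcarrier 1 on gain but loses subcarrier 2 to A once its quota is exhausted; the quota is what enforces fairness in this toy version.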
Requirements Development for Interoperability Simulation Capability for Law Enforcement
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holter, Gregory M.
2004-05-19
The National Counterdrug Center (NCC) was initially authorized by Congress in FY 1999 appropriations to create a simulation-based counterdrug interoperability training capability. As the lead organization for Research and Analysis to support the NCC, the Pacific Northwest National Laboratory (PNNL) was responsible for developing the requirements for this interoperability simulation capability. These requirements were structured to address the hardware and software components of the system, as well as the deployment and use of the system. The original set of requirements was developed through a process of conducting a user-based survey of requirements for the simulation capability, coupled with an analysis of similar development efforts. The user-based approach ensured that existing concerns with respect to interoperability within the law enforcement community would be addressed. Law enforcement agencies within the designated pilot area of Cochise County, Arizona, were surveyed using interviews and ride-alongs during actual operations. The results of this survey were then accumulated, organized, and validated with the agencies to ensure the accuracy of the results. These requirements were then supplemented by adapting operational requirements from existing systems to ensure system reliability and operability. The NCC adopted a development approach providing incremental capability through the fielding of a phased series of progressively more capable versions of the system. This allowed for feedback from system users to be incorporated into subsequent revisions of the system requirements, and also allowed the addition of new elements as needed to adapt the system to broader geographic and geopolitical areas, including areas along the southwest and northwest U.S. borders.
This paper addresses the processes used to develop and refine requirements for the NCC interoperability simulation capability, as well as the response of the law enforcement community to the use of the NCC system. The paper also addresses the applicability of such an interoperability simulation capability to a broader set of law enforcement, border protection, site/facility security, and first-responder needs.
Daboul, Amro; Ivanovska, Tatyana; Bülow, Robin; Biffar, Reiner; Cardini, Andrea
2018-01-01
Using 3D anatomical landmarks from adult human head MRIs, we assessed the magnitude of inter-operator differences in Procrustes-based geometric morphometric analyses. An in-depth analysis of both absolute and relative error was performed in a subsample of individuals with replicated digitization by three different operators. The effect of inter-operator differences was also explored in a large sample of more than 900 individuals. Although absolute error was not unusual for MRI measurements, including bone landmarks, shape was particularly affected by differences among operators, with up to more than 30% of sample variation accounted for by this type of error. The magnitude of the bias was such that it dominated the main pattern of bone and total (all landmarks included) shape variation, largely surpassing the effect of sex differences between hundreds of men and women. In contrast, however, we found higher reproducibility in soft-tissue nasal landmarks, despite relatively larger errors in estimates of nasal size. Our study exemplifies the assessment of measurement error using geometric morphometrics on landmarks from MRIs and stresses the importance of relating it to total sample variance within the specific methodological framework being used. In summary, precise landmarks may not necessarily imply negligible errors, especially in shape data; indeed, size and shape may be differentially impacted by measurement error and different types of landmarks may have relatively larger or smaller errors. Importantly, and consistently with other recent studies using geometric morphometrics on digital images (which, however, were not specific to MRI data), this study showed that inter-operator biases can be a major source of error in the analysis of large samples, as those that are becoming increasingly common in the 'era of big data'. PMID:29787586
GMPLS-based control plane for optical networks: early implementation experience
NASA Astrophysics Data System (ADS)
Liu, Hang; Pendarakis, Dimitrios; Komaee, Nooshin; Saha, Debanjan
2002-07-01
Generalized Multi-Protocol Label Switching (GMPLS) extends MPLS signaling and Internet routing protocols to provide a scalable, interoperable, distributed control plane, which is applicable to multiple network technologies such as optical cross connects (OXCs), photonic switches, IP routers, ATM switches, SONET and DWDM systems. It is intended to facilitate automatic service provisioning and dynamic neighbor and topology discovery across multi-vendor intelligent transport networks, as well as their clients. Efforts to standardize such a distributed common control plane have reached various stages in several bodies such as the IETF, ITU and OIF. This paper describes the design considerations and architecture of a GMPLS-based control plane that we have prototyped for core optical networks. Functional components of GMPLS signaling and routing are integrated in this architecture with an application layer controller module. Various requirements, including bandwidth, network protection and survivability, traffic engineering, and optimal utilization of network resources, are taken into consideration during path computation and provisioning. Initial experiments with our prototype demonstrate the feasibility and main benefits of GMPLS as a distributed control plane for core optical networks. In addition to such feasibility results, actual adoption and deployment of GMPLS as a common control plane for intelligent transport networks will depend on the successful completion of relevant standardization activities, extensive interoperability testing as well as the strengthening of appropriate business drivers.
Campagna, Giuseppe; Zampetti, Simona; Gallozzi, Alessia; Giansanti, Sara; Chiesa, Claudio; Pacifico, Lucia; Buzzetti, Raffaella
2016-01-01
In a previous study, we found that wrist circumference, in particular its bone component, was associated with insulin resistance in a population of overweight/obese children. The aim of the present study was to evaluate the intra- and inter-operator variability in wrist circumference measurement in a population of obese children and adolescents. One hundred and two (54 male and 48 female) obese children and adolescents were consecutively enrolled. In all subjects, wrist circumferences were measured twice by two different operators to assess intra- and inter-operator variability. Statistical analysis was performed using SAS v.9.4 and JMP v.12. Measurements of wrist circumference showed excellent inter-operator reliability, with Intraclass Correlation Coefficients (ICC) of 0.96 and 0.97 for the first and the second measurement, respectively. The intra-operator reliability was also very strong, with a Concordance Correlation Coefficient (CCC) of 0.98 for both operators. The high reproducibility demonstrated in our results suggests that wrist circumference measurement, being safe, non-invasive and repeatable, can be easily used in out-patient settings to identify youths with increased risk of insulin resistance. This can avoid testing the entire population of overweight/obese children for insulin resistance parameters. PMID:27294398
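Lin's concordance correlation coefficient (CCC), the intra-operator statistic reported above, can be computed directly from its definition. The sample measurements below are invented for illustration, not the study's data.

```python
from statistics import mean, pvariance

def lin_ccc(x, y):
    """Lin's concordance correlation coefficient between two sets of readings.
    CCC = 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))**2)."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / len(x)
    return 2 * cov / (pvariance(x) + pvariance(y) + (mx - my) ** 2)

# Hypothetical repeated wrist-circumference readings (cm) by two operators
op1 = [15.2, 14.8, 16.0, 15.5]
op2 = [15.1, 14.9, 16.1, 15.4]
print(round(lin_ccc(op1, op2), 3))
```

Unlike plain correlation, the CCC penalizes both scale and location shifts between raters, which is why it is the usual choice for agreement studies like this one.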
Managing Interoperability for GEOSS - A Report from the SIF
NASA Astrophysics Data System (ADS)
Khalsa, S. J.; Actur, D.; Nativi, S.; Browdy, S.; Eglitis, P.
2009-04-01
The Global Earth Observation System of Systems (GEOSS) is a coordinating and integrating framework for Earth observing and information systems, which are contributed on a voluntary basis by Members and Participating Organizations of the intergovernmental Group on Earth Observations (GEO). GEOSS exists to support informed decision making for the benefit of society, including the implementation of international environmental treaty obligations. GEO Members and Participating organizations use the GEOSS Common Infrastructure (GCI) to register their Earth observation resources, thereby making them discoverable and consumable by both humans and client applications. Essential to meeting GEO user needs is a process for supporting interoperability of observing, processing, modeling and dissemination capabilities. The GEO Standards and Interoperability Forum (SIF) was created to develop, implement and oversee this process. The SIF supports GEO organizations contributing resources to the GEOSS by helping them understand and work with the GEOSS interoperability guidelines and encouraging them to register their "interoperability arrangements" (standards or other ad hoc arrangements for interoperability) in the GEOSS standards registry, which is part of the GCI. These registered interoperability arrangements support the actual services used to achieve interoperability of systems. By making information about these interoperability arrangements available to users of the GEOSS the SIF enhances the understanding and utility of contributed resources. We describe the procedures that the SIF has enacted to carry out its work. To operate effectively the SIF uses a workflow system and is establishing a set of regional teams and domain experts. 
In the near term our work has focused on population and review of the GEOSS Standards Registry, but we are also developing approaches to achieving progressive convergence on, and uptake of, an optimal set of interoperability arrangements for all of GEOSS.
Joint Command and Control: Integration Not Interoperability
2013-03-01
separate computer and communication equipment. Besides having to engineer interoperability, the Services also must determine the level of...effects. Determines force responsiveness and allocates resources. This thesis argues Joint military operations will never be fully integrated as...processes and systems. Secondly, the limited depth of discussion risks implying (or the reader inferring) the solution is more straightforward than
Potential interoperability problems facing multi-site radiation oncology centers in The Netherlands
NASA Astrophysics Data System (ADS)
Scheurleer, J.; Koken, Ph; Wessel, R.
2014-03-01
Aim: To identify potential interoperability problems facing multi-site Radiation Oncology (RO) departments in the Netherlands and solutions for unambiguous multi-system workflows. Specific challenges confronting the RO department of VUmc (RO-VUmc), which is soon to open a satellite department, were characterized. Methods: A nationwide questionnaire survey was conducted to identify possible interoperability problems and solutions. Further detailed information was obtained by in-depth interviews at 3 Dutch RO institutes that already operate in more than one site. Results: The survey had a 100% response rate (n=21). Altogether 95 interoperability problems were described. Most reported problems were on a strategic and semantic level. The majority were DICOM(-RT) and HL7 related (n=65), primarily between treatment planning and verification systems or between departmental and hospital systems. Seven were identified as being relevant for RO-VUmc. Departments have overcome interoperability problems with their own, or with tailor-made vendor solutions. There was little knowledge about or utilization of solutions developed by Integrating the Healthcare Enterprise Radiation Oncology (IHE-RO). Conclusions: Although interoperability problems are still common, solutions have been identified. Awareness of IHE-RO needs to be raised. No major new interoperability problems are predicted as RO-VUmc develops into a multi-site department.
FLTSATCOM interoperability applications
NASA Astrophysics Data System (ADS)
Woolford, Lynn
A mobile Fleet Satellite Communications (FLTSATCOM) system called the Mobile Operational Control Center (MOCC) was developed which has demonstrated the ability to be interoperable with many of the current FLTSATCOM command and control channels. This low-cost system is secure in all its communications, is lightweight, and provides a gateway for other communications formats. The major elements of this system are made up of a personal computer, a protocol microprocessor, and off-the-shelf mobile communication components. It is concluded that with both FLTSATCOM channel protocol and data format interoperability, the MOCC has the ability to provide vital information in or near real time, which significantly improves mission effectiveness.
Crump, Jacob K.; Del Fiol, Guilherme; Williams, Marc S.; Freimuth, Robert R.
2018-01-01
Integration of genetic information is becoming increasingly important in clinical practice. However, genetic information is often ambiguous and difficult to understand, and clinicians have reported low self-efficacy in integrating genetics into their care routine. The Health Level Seven (HL7) Infobutton standard helps to integrate online knowledge resources within Electronic Health Records (EHRs) and is required for EHR certification in the US. We implemented a prototype of a standards-based genetic reporting application coupled with infobuttons leveraging the Infobutton and Fast Healthcare Interoperability Resources (FHIR) standards. Infobutton capabilities were provided by OpenInfobutton, an open source package compliant with the HL7 Infobutton standard. The resulting prototype demonstrates how standards-based reporting of genetic results, coupled with curated knowledge resources, can provide dynamic access to clinical knowledge on demand at the point of care. The proposed functionality can be enabled within any EHR system that has been certified through the US Meaningful Use program.
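An Infobutton request of the kind the prototype issues is an HTTP GET whose parameters follow the HL7 Context-Aware Knowledge Retrieval (Infobutton) URL binding. A minimal sketch of building such a request is below; the endpoint URL and the gene code are illustrative placeholders, and only a few of the standard's many context parameters are shown.

```python
from urllib.parse import urlencode

# Placeholder knowledge-resource endpoint (not a real service)
BASE = "https://example.org/infobutton/search"

def infobutton_url(code, code_system, performer="PROV"):
    """Build an HL7 Infobutton-style knowledge request URL."""
    params = {
        "mainSearchCriteria.v.c": code,         # concept of interest
        "mainSearchCriteria.v.cs": code_system, # code system identifier (OID)
        "performer": performer,                 # provider vs. patient context
        "knowledgeResponseType": "application/json",
    }
    return BASE + "?" + urlencode(params)

# Illustrative gene code; any real deployment would use codes from the EHR
print(infobutton_url("HGNC:1100", "2.16.840.1.113883.6.281"))
```

The point of the standard is that any compliant knowledge resource can answer a URL of this shape, so the EHR needs no resource-specific integration code.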
NASA Astrophysics Data System (ADS)
Vucinic, Dean; Deen, Danny; Oanta, Emil; Batarilo, Zvonimir; Lacor, Chris
This paper focuses on visualization and manipulation of graphical content in distributed network environments. The developed graphical middleware and 3D desktop prototypes were specialized for situational awareness. This research was done in the LArge Scale COllaborative decision support Technology (LASCOT) project, which explored and combined software technologies to support a human-centred decision support system for crisis management (earthquake, tsunami, flooding, airplane or oil-tanker incidents, chemical, radio-active or other pollutants spreading, etc.). The performed state-of-the-art review did not identify any publicly available large-scale distributed application of this kind. Existing proprietary solutions rely on conventional technologies and 2D representations. Our challenge was to apply the "latest" available technologies, such as Java3D, X3D and SOAP, compatible with average computer graphics hardware. The selected technologies are integrated and we demonstrate: the flow of data, which originates from heterogeneous data sources; interoperability across different operating systems; and 3D visual representations to enhance the end-users' interactions.
Tri-Band CPW-Fed Stub-Loaded Slot Antenna Design for WLAN/WiMAX Applications
NASA Astrophysics Data System (ADS)
Li, Jianxing; Guo, Jianying; He, Bin; Zhang, Anxue; Liu, Qing Huo
2016-11-01
A novel uniplanar CPW-fed tri-band stub-loaded slot antenna is proposed for wireless local area network (WLAN) and worldwide interoperability for microwave access (WiMAX) applications. Dual resonant modes were effectively excited in the upper band by using two identical pairs of slot stubs and parasitic slots symmetrically along the arms of a traditional CPW-fed slot dipole, achieving a much wider bandwidth. The middle band was realized by the fundamental mode of the slot dipole. To obtain the lower band, two identical inverted-L-shaped open-ended slots were symmetrically etched in the ground plane. A prototype was fabricated and measured, showing that tri-band operation with 10-dB return loss bandwidths of 150 MHz from 2.375 to 2.525 GHz, 725 MHz from 3.075 to 3.8 GHz, and 1.9 GHz from 5.0 to 6.9 GHz has been achieved. Details of the antenna design as well as the measured and simulated results are presented and discussed.
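The reported 10-dB return-loss bandwidths can be restated as fractional bandwidths relative to each band's center frequency, a quick consistency check that is easy to script:

```python
def fractional_bw(f_low, f_high):
    """Fractional bandwidth (%) from band-edge frequencies (same units)."""
    return 100 * 2 * (f_high - f_low) / (f_high + f_low)

# Measured 10-dB return-loss bands reported for the prototype (GHz)
for lo, hi in [(2.375, 2.525), (3.075, 3.8), (5.0, 6.9)]:
    print(f"{lo}-{hi} GHz: {fractional_bw(lo, hi):.1f}%")
```

This confirms the upper band is by far the widest in relative terms (about 32%), consistent with the dual resonant modes excited by the stub and parasitic slots.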
Message Received How to Bridge the Communication Gap and Save Lives
2004-03-01
safety during an emergency depend on the ability of first responders to talk via radio, directly, without dispatch and in real time. Many technologies are...communication interoperability for public safety first responders entails far more than finding and emplacing a technology and training the operators. The
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-02
... this document. FOR FURTHER INFORMATION CONTACT: Brenda Boykin, Wireless Telecommunications Bureau, (202... power levels of up to 1000 kW. The Lower A Block is also adjacent to the unpaired Lower 700 MHz E Block, where licensees (along with Lower 700 MHz D Block licensees) may operate at power levels up to 50...
The Long Road to Semantic Interoperability in Support of Public Health: Experiences from Two States
Vreeman, Daniel J.; Grannis, Shaun J.
2014-01-01
Proliferation of health information technologies creates opportunities to improve clinical and public health, including high quality, safer care and lower costs. To maximize such potential benefits, health information technologies must readily and reliably exchange information with other systems. However, evidence from public health surveillance programs in two states suggests that operational clinical information systems often fail to use available standards, a barrier to semantic interoperability. Furthermore, analysis of existing policies incentivizing semantic interoperability suggests they have limited impact and are fragmented. In this essay, we discuss three approaches for increasing semantic interoperability to support national goals for using health information technologies. A clear, comprehensive strategy requiring collaborative efforts by clinical and public health stakeholders is suggested as a guide for the long road towards better population health data and outcomes. PMID:24680985
Stevenson, Timothy H; Chevalier, Nicole A; Scher, Gregory R; Burke, Ronald L
2016-01-01
Effective multilateral military operations such as those conducted by the North Atlantic Treaty Organization (NATO) require close cooperation and standardization between member nations to ensure interoperability. Failure to standardize policies, procedures, and doctrine prior to the commencement of military operations will result in critical interoperability gaps, which jeopardize the health of NATO forces and mission success. To prevent these gaps from occurring, US forces must be actively involved with NATO standardization efforts such as the Committee of the Chiefs of Medical Services to ensure US interests are properly represented when NATO standards are developed and US doctrine and procedures will meet the established NATO requirements.
2005-03-24
1:45PM-3:30PM Panel: Establishing a Business Mission Area in the Department of... Minimum Maximum LEVEL OF INTEROPERABILITY Level 1 Level 2 Level 3 Level 4 10 COTS Native IP Network IP TCP UDP Network QoS Layer IIOP NTP SNMP Legacy... 3/27/2005 Page 1
Inter-operator and inter-device agreement and reliability of the SEM Scanner.
Clendenin, Marta; Jaradeh, Kindah; Shamirian, Anasheh; Rhodes, Shannon L
2015-02-01
The SEM Scanner is a medical device designed for use by healthcare providers as part of pressure ulcer prevention programs. The objective of this study was to evaluate the inter-rater and inter-device agreement and reliability of the SEM Scanner. Thirty-one (31) volunteers free of pressure ulcers or broken skin at the sternum, sacrum, and heels were assessed with the SEM Scanner. Each of three operators utilized each of three devices to collect readings from four anatomical sites (sternum, sacrum, left and right heels) on each subject, for a total of 108 readings per subject collected over approximately 30 min. For each combination of operator, device, and anatomical site, three SEM readings were collected. Inter-operator and inter-device agreement and reliability were estimated. Over the course of this study, more than 3000 SEM Scanner readings were collected. Agreement between operators was good, with mean differences ranging from -0.01 to 0.11. Inter-operator and inter-device reliability exceeded 0.80 at all anatomical sites assessed. The results of this study demonstrate the high reliability and good agreement of the SEM Scanner across different operators and different devices. Given the limitations of current methods to prevent and detect pressure ulcers, the SEM Scanner shows promise as an objective, reliable tool for assessing the presence or absence of pressure-induced tissue damage such as pressure ulcers. Copyright © 2015 Bruin Biometrics, LLC. Published by Elsevier Ltd. All rights reserved.
Data interoperability software solution for emergency reaction in the Europe Union
NASA Astrophysics Data System (ADS)
Casado, R.; Rubiera, E.; Sacristan, M.; Schütte, F.; Peters, R.
2015-07-01
Emergency management becomes more challenging in international crisis episodes because of cultural, semantic and linguistic differences between all stakeholders, especially first responders. Misunderstandings between first responders make decision making slower and more difficult. However, the spread and development of networks and IT-based emergency management systems (EMSs) have improved emergency responses, which have become more coordinated. Despite improvements made in recent years, EMSs have still not solved the problems related to cultural, semantic and linguistic differences which are the real cause of slower decision making. In addition, from a technical perspective, the consolidation of current EMSs and the different formats used to exchange information pose another problem to be solved by any solution proposed for information interoperability between heterogeneous EMSs in different contexts. To overcome these problems, we present a software solution based on semantic and mediation technologies. EMERGency ELements (EMERGEL) (Fundacion CTIC and AntwortING Ingenieurbüro PartG, 2013), a common and modular ontology shared by all the stakeholders, has been defined. It offers the best solution to gather all stakeholders' knowledge in a unique and flexible data model, taking into account different countries' cultural and linguistic issues. To deal with the diversity of data protocols and formats, we have designed a service-oriented architecture for data interoperability (named DISASTER: Data Interoperability Solution At STakeholders Emergency Reaction), providing a flexible, extensible solution to the mediation issues. Web services have been adopted as the specific technology to implement this paradigm, as the one with the most significant academic and industrial visibility and attraction.
Contributions of this work have been validated through the design and development of a cross-border realistic prototype scenario, actively involving both emergency managers and emergency-first responders: the Netherlands-Germany border fire.
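The ontology-mediated exchange the abstract describes can be caricatured as a small translation step: each agency maps its local incident code to a shared EMERGEL concept, and the receiving side maps that concept back into its own vocabulary. The concept names and code tables below are invented for illustration; they are not drawn from the actual EMERGEL ontology.

```python
# Toy sketch of ontology-based mediation between two emergency
# management systems. All agency codes and concept names are
# hypothetical, not real EMERGEL terms.
EMERGEL = {
    # (agency, local code) -> shared ontology concept
    ("NL", "B2"):    "emergel:WildFire",
    ("DE", "F-301"): "emergel:WildFire",
    ("NL", "A1"):    "emergel:Flood",
    ("DE", "F-100"): "emergel:Flood",
}

def translate(code, source, target):
    """Map a source agency's incident code to the target agency's
    code by routing through the shared ontology concept."""
    concept = EMERGEL[(source, code)]
    for (agency, local), c in EMERGEL.items():
        if agency == target and c == concept:
            return local
    raise KeyError(f"no {target} code for {concept}")
```

With these tables, a Dutch "B2" report arrives at the German side as "F-301": both resolve to the same shared concept, which is what removes the linguistic ambiguity.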
Watershed and Economic Data InterOperability (WEDO) ...
Watershed and Economic Data InterOperability (WEDO) is a system of information technologies designed to publish watershed modeling studies for reuse. WEDO facilitates three aspects of interoperability: discovery, evaluation and integration of data. This increased level of interoperability goes beyond the current practice of publishing modeling studies as reports or journal articles. Rather than summarized results, modeling studies can be published with their full complement of input data, calibration parameters and output with associated metadata for easy duplication by others. Reproducible science is possible only if researchers can find, evaluate and use complete modeling studies performed by other modelers. WEDO greatly increases transparency by making detailed data available to the scientific community. WEDO is a next-generation technology, a Web Service linked to the EPA's EnviroAtlas for discovery of modeling studies nationwide. Streams and rivers are identified using the National Hydrography Dataset network and stream IDs. Streams with modeling studies available are color coded in the EnviroAtlas. One can select streams within a watershed of interest to readily find data available via WEDO. The WEDO website is linked from the EnviroAtlas to provide a thorough review of each modeling study. WEDO currently provides modeled flow and water quality time series, designed for a broad range of watershed and economic models for nutrient trading market analysis.
End effector monitoring system: An illustrated case of operational prototyping
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Land, Sherry A.; Thronesbery, Carroll
1994-01-01
Operational prototyping is introduced to help developers apply software innovations to real-world problems, to help users articulate requirements, and to help develop more usable software. Operational prototyping has been applied to an expert system development project. The expert system supports fault detection and management during grappling operations of the Space Shuttle payload bay arm. The dynamic exchanges among operational prototyping team members are illustrated in a specific prototyping session. We discuss the requirements for operational prototyping technology, types of projects for which operational prototyping is best suited and when it should be applied to those projects.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hardin, Dave; Stephan, Eric G.; Wang, Weimin
Through its Building Technologies Office (BTO), the United States Department of Energy's Office of Energy Efficiency and Renewable Energy (DOE-EERE) is sponsoring an effort to advance interoperability for the integration of intelligent buildings equipment and automation systems, understanding the importance of integration frameworks and product ecosystems to this cause. This is important to BTO's mission to enhance energy efficiency and save energy for economic and environmental purposes. For connected-buildings ecosystems of products and services from various manufacturers to flourish, the ICT aspects of the equipment need to integrate and operate simply and reliably. Within the concepts of interoperability lie the specification, development, and certification of equipment with standards-based interfaces that connect and work. Beyond this, a healthy community of stakeholders that contribute to and use interoperability work products must be developed. On May 1, 2014, the DOE convened a technical meeting to take stock of the current state of interoperability of connected equipment and systems in buildings. Several insights from that meeting helped facilitate a draft description of the landscape of interoperability for connected buildings, which focuses mainly on small and medium commercial buildings. This document revises the February 2015 landscape document to address reviewer comments, incorporate important insights from the Buildings Interoperability Vision technical meeting, and capture thoughts from that meeting about the topics to be addressed in a buildings interoperability vision. In particular, greater attention is paid to the state of information modeling in buildings and the great potential for near-term benefits in this area from progress and community alignment.
JAXA-NASA Interoperability Demonstration for Application of DTN Under Simulated Rain Attenuation
NASA Technical Reports Server (NTRS)
Suzuki, Kiyoshisa; Inagawa, Shinichi; Lippincott, Jeff; Cecil, Andrew J.
2014-01-01
As is well known, K-band and higher-band communications in the space link segment often experience intermittent disruptions caused by heavy rainfall. With a view to preserving data integrity and establishing autonomous operations under such conditions, it is important to consider introducing a tolerance mechanism such as Delay/Disruption Tolerant Networking (DTN). The Consultative Committee for Space Data Systems (CCSDS) is studying DTN as part of its standardization activities for space data systems. As a contribution to CCSDS and a feasibility study for future utilization of DTN, the Japan Aerospace Exploration Agency (JAXA) and the National Aeronautics and Space Administration (NASA) conducted an interoperability demonstration to confirm DTN's tolerance mechanism and automatic-operation capability, using the Data Relay Test Satellite (DRTS) space link and its ground terminals. Both parties used the Interplanetary Overlay Network (ION) open source software, including the Bundle Protocol, the Licklider Transmission Protocol, and Contact Graph Routing. This paper introduces the contents of the interoperability demonstration and its results.
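Contact Graph Routing, used in the demonstration above, computes routes from a time-tagged contact plan rather than from a static topology. The following is a minimal sketch of that idea only, not ION's actual implementation: it propagates the earliest possible bundle arrival time through each contact window. Node names, window times, and light times are illustrative.

```python
from dataclasses import dataclass

@dataclass
class Contact:
    frm: str      # transmitting node
    to: str       # receiving node
    start: float  # contact window start (s)
    end: float    # contact window end (s)
    owlt: float   # one-way light time (s)

def earliest_arrival(contacts, source, dest, t0):
    """Earliest bundle arrival time at dest for a bundle ready at
    source at time t0, exploring the contact plan graph."""
    best = {source: t0}
    frontier = [source]
    while frontier:
        node = frontier.pop()
        for c in contacts:
            if c.frm != node:
                continue
            depart = max(best[node], c.start)
            if depart > c.end:
                continue  # contact window already closed
            arrive = depart + c.owlt
            if arrive < best.get(c.to, float("inf")):
                best[c.to] = arrive
                frontier.append(c.to)
    return best.get(dest)  # None if no contact path exists
```

A bundle ready at t=10 on the ground must wait for the relay's next downstream window before reaching the spacecraft; if it is ready only after the uplink window closes, no route exists and the bundle is held, which is exactly the store-and-forward tolerance behavior the demonstration tested.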
NASA Astrophysics Data System (ADS)
Teng, W.; Kempler, S.; Chiu, L.; Doraiswamy, P.; Liu, Z.; Milich, L.; Tetrault, R.
2003-12-01
Monitoring global agricultural crop conditions during the growing season and estimating potential seasonal production are critically important for market development of U.S. agricultural products and for global food security. Two major operational users of satellite remote sensing for global crop monitoring are the USDA Foreign Agricultural Service (FAS) and the U.N. World Food Programme (WFP). The primary goal of FAS is to improve foreign market access for U.S. agricultural products. The WFP uses food to meet emergency needs and to support economic and social development. Both use global agricultural decision support systems that can integrate and synthesize a variety of data sources to provide accurate and timely information on global crop conditions. The Goddard Space Flight Center Earth Sciences Distributed Active Archive Center (GES DAAC) has begun a project to provide operational solutions to FAS and WFP, by fully leveraging results from previous work, as well as from existing capabilities of the users. The GES DAAC has effectively used its recently developed prototype TRMM Online Visualization and Analysis System (TOVAS) to provide ESE data and information to the WFP for its agricultural drought monitoring efforts. This prototype system will be evolved into an Agricultural Information System (AIS), which will operationally provide ESE and other data products (e.g., rainfall, land productivity) and services, to be integrated into and thus enhance the existing GIS-based decision support systems of FAS and WFP. Agriculture-oriented ESE data products (e.g., a MODIS-based crop condition assessment product; a TRMM-derived drought index product) will be input to a crop growth model, in collaboration with the USDA Agricultural Research Service, to generate crop condition and yield prediction maps.
The AIS will have the capability to remotely access distributed data by complying with community-based interoperability standards, enabling easy access to agriculture-related products from other data producers. The AIS's system approach will provide a generic mechanism for easily incorporating new products and making them accessible to users.
Future Interoperability of Camp Protection Systems (FICAPS)
NASA Astrophysics Data System (ADS)
Caron, Sylvie; Gündisch, Rainer; Marchand, Alain; Stahl, Karl-Hermann
2013-05-01
The FICAPS Project was established as a project of the European Defence Agency, based on an initiative of Germany and France. The goal of this project was to derive guidelines which, by proper implementation in future developments, improve Camp Protection Systems (CPS) by enabling and improving interoperability between the Camp Protection Systems and equipment of the different nations involved in multinational missions. These guidelines shall allow for:
• Real-time information exchange between equipment and systems of different suppliers and nations (even via SatCom);
• Quick and easy replacement of equipment (even of different nations) at run-time in the field by means of plug-and-play capability, thus lowering operational and logistic costs and making the system highly available;
• Enhancement of system capabilities (open and modular systems) by adding new equipment with new capabilities (just plug in, with automatic adjustment of the Human Machine Interface, HMI) without costly and time-consuming validation and testing at system level (validation and testing can be done at equipment level).
Four scenarios were identified to summarize the interoperability requirements from an operational viewpoint. To prove the definitions given in the Guideline Document, a French and a German demonstration system, based on existing national assets, were realized. Demonstrations showing the capabilities provided by the defined interoperability requirements with respect to the operational scenarios were performed, including remote control of a CPS by another CPS, remote sensor control (Electro-Optic/InfraRed, EO/IR) and remote effector control; this capability can be applied to extend the protection area or to protect distant infrastructure assets. The required interoperability functionality was shown successfully.
Even though the focus of the FICAPS project was on camp protection, the solution found is also appropriate for other force protection and ISR (Intelligence, Surveillance, Reconnaissance) tasks, not only because of its flexibility but also because of the interfacing approach chosen.
Thelen, Sebastian; Czaplik, Michael; Meisen, Philipp; Schilberg, Daniel; Jeschke, Sabina
2015-01-01
In order to study new methods of telemedicine usage in the context of emergency medical services, researchers need to prototype integrated telemedicine systems. To conduct a one-year trial phase, intended to study a new application of telemedicine in German emergency medical services, we used off-the-shelf medical devices and software to realize real-time patient monitoring within an integrated telemedicine system prototype. We demonstrate its feasibility by presenting the integrated real-time patient monitoring solution, by studying signal delay and transmission robustness with respect to changing communication channel characteristics, and by evaluating issues reported by the physicians during the trial phase. Whereas standards like HL7 and the IEEE 11073 family are intended to enable interoperability of product-grade medical devices, we show that research prototypes benefit from the use of web technologies and simple device interfaces, as they simplify product development for a manufacturer and ease integration efforts for research teams. Embracing this approach for the development of new medical devices eases the constraint of using off-the-shelf products for research trials investigating innovative use of telemedicine.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stewart, John; Halbgewachs, Ron; Chavez, Adrian
The manner in which control systems in the energy sector are designed and operated is undergoing some of the most significant changes in its history, due to the evolution of technology and the increasing number of interconnections to other systems. With these changes, however, come two significant challenges that the energy sector must face: 1) cyber security is more important than ever before, and 2) cyber security is more complicated than ever before. A key requirement in helping utilities and vendors alike to meet these challenges is interoperability. While interoperability has been present in much of the discussion relating to technology utilized within the energy sector, and especially the Smart Grid, it has been absent in the context of cyber security. The Lemnos project addresses these challenges by focusing on the interoperability of devices utilized within utility control systems which support critical cyber security functions. In theory, interoperability is possible with many of the cyber security solutions available to utilities today. The reality is that the effort required to achieve cyber security interoperability is often a barrier for utilities. For example, consider IPSec, a widely used Internet Protocol suite for defining Virtual Private Networks, or "tunnels," to communicate securely through untrusted public and private networks. The IPSec protocol suite has a significant number of configuration options and encryption parameters to choose from, which must be agreed upon and adopted by both parties establishing the tunnel. The exercise of getting software or devices from different vendors to interoperate is labor intensive and requires a significant amount of security expertise by the end user. Scale this effort to a significant number of devices operating over a large geographical area and the challenge becomes so overwhelming that it often leads utilities to pursue solutions from a single vendor.
These single-vendor solutions may inadvertently lock utilities into proprietary and closed systems.
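The configuration-agreement problem described above can be illustrated with a toy negotiation step: each tunnel endpoint advertises the IPSec proposals it supports, and a tunnel is possible only if some proposal appears on both lists. The proposal strings follow a common "encryption-integrity-DH group" naming convention but are illustrative, not tied to any particular product; real IKE negotiation is far richer than this sketch.

```python
# Toy IPSec proposal matching. Each string names an encryption
# algorithm, an integrity algorithm, and a Diffie-Hellman group.
# This only shows why both sides must agree on every parameter
# before a tunnel can come up.

def negotiate(initiator, responder):
    """Return the first proposal, in the initiator's order of
    preference, that the responder also supports; None otherwise."""
    for proposal in initiator:
        if proposal in responder:
            return proposal
    return None

# Hypothetical default proposal lists from two different vendors.
vendor_a = ["aes256-sha256-modp2048", "aes128-sha1-modp1024"]
vendor_b = ["aes128-sha1-modp1024", "3des-md5-modp768"]
```

Here the two vendors only interoperate on a weaker common proposal, and a third vendor with no overlap fails entirely, which is the kind of outcome that pushes utilities toward single-vendor deployments.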
Building Future Transatlantic Interoperability Around a Robust NATO Response Force
2012-10-01
…than already traveled. However, this accrued wealth of interoperable capability may be at its apogee, soon to decline as the result of two looming… and Bydgoszcz, Poland, as well as major national training centers such as the bilateral U.S.-Romanian Joint Task Force–East at Kogalniceanu… operations. Increase U.S. and Allied exchange students at national and NATO military schools. Austerity measures may eventually affect the investment…
2006-08-01
…constellation, SAR bistatic for interferometry, L-band SAR data from Argentinean SAOCOM satellites, and optical imaging data from the French 'Pleiades'… a services federation (e.g. COSMO-SkyMed (SAR) and Pleiades (optical) constellation). Its main purpose is the elaboration of Programming Requests… on catalogue interoperability or on a federation of services (i.e. with French Pleiades optical satellites). The multi-mission objectives are…
The Macro Dynamics of Weapon System Acquisition: Shaping Early Decisions to Get Better Outcomes
2012-05-17
…defects and rework • Design tools and processes • Lack of feedback to key design and SE processes • Lack of quantified risk and uncertainty at key… Tools for Rapid Exploration of the Physical Design Space Coupling Operability, Interoperability, and Physical Feasibility Analyses – a Game Changer… Interoperability • Training • Quantified Margins and Uncertainties at Each Critical Decision Point • M&S • RDT&E • A Continuum of Tools Underpinned with…
ECHO Services: Foundational Middleware for a Science Cyberinfrastructure
NASA Technical Reports Server (NTRS)
Burnett, Michael
2005-01-01
This viewgraph presentation describes ECHO, an interoperability middleware solution. It uses open, XML-based APIs and supports net-centric architectures and solutions. ECHO has a set of interoperable registries for both data (metadata) and services, and provides user accounts and a common infrastructure for the registries. It is built upon a layered architecture with an extensible infrastructure for supporting community-unique protocols. It has been operational since November 2002 and is available as open source.
2004-06-01
…Situation Understanding) • Common Operational Pictures • Planning & Decision Support Capabilities • Message & Order Processing • Common Languages & Data Models • Modeling & Simulation Domain…
A development framework for semantically interoperable health information systems.
Lopez, Diego M; Blobel, Bernd G M E
2009-02-01
Semantic interoperability is a basic challenge to be met for new generations of distributed, communicating and co-operating health information systems (HIS) enabling shared care and e-Health. Analysis, design, implementation and maintenance of such systems and their intrinsic architectures have to follow a unified development methodology. The Generic Component Model (GCM) is used as a framework for modeling any system, to evaluate and harmonize state-of-the-art architecture development approaches and standards for health information systems, as well as to derive a coherent architecture development framework for sustainable, semantically interoperable HIS and their components. The proposed methodology is based on the Rational Unified Process (RUP), taking advantage of its flexibility to be configured for integrating other architectural approaches such as Service-Oriented Architecture (SOA), Model-Driven Architecture (MDA), ISO 10746, and the HL7 Development Framework (HDF). Existing architectural approaches have been analyzed, compared and finally harmonized towards an architecture development framework for advanced health information systems. Starting with the requirements for semantic interoperability derived from paradigm changes for health information systems, and supported by formal software process engineering methods, an appropriate development framework for semantically interoperable HIS has been provided. The usability of the framework has been exemplified in a public health scenario.
Semantic interoperability--HL7 Version 3 compared to advanced architecture standards.
Blobel, B G M E; Engel, K; Pharow, P
2006-01-01
To meet the challenge of high-quality and efficient care, highly specialized and distributed healthcare establishments have to communicate and co-operate in a semantically interoperable way. Information and communication technology must be open, flexible, scalable, knowledge-based and service-oriented as well as secure and safe. For enabling semantic interoperability, a unified process for defining and implementing the architecture, i.e. the structure and functions of the cooperating systems' components, as well as the approach for knowledge representation, i.e. the information used and its interpretation, algorithms, etc., has to be defined in a harmonized way. Deploying the Generic Component Model, systems and their components, underlying concepts and applied constraints must be formally modeled, strictly separating platform-independent from platform-specific models. As HL7 Version 3 claims to represent the most successful standard for semantic interoperability, HL7 has been analyzed regarding the requirements for model-driven, service-oriented design of semantically interoperable information systems, thereby moving from a communication to an architecture paradigm. The approach is compared with advanced architectural approaches for information systems such as OMG's CORBA 3, or EHR systems such as GEHR/openEHR and CEN EN 13606 Electronic Health Record Communication. HL7 Version 3 is maturing towards an architectural approach for semantic interoperability. Despite current differences, there is close collaboration between the teams involved, guaranteeing a convergence between the competing approaches.
Desiderata for an authoritative Representation of MeSH in RDF.
Winnenburg, Rainer; Bodenreider, Olivier
2014-01-01
The Semantic Web provides a framework for the integration of resources on the web, which facilitates information integration and interoperability. RDF is the main representation format for Linked Open Data (LOD). However, datasets are not always made available in RDF by their producers and the Semantic Web community has had to convert some of these datasets to RDF in order for these datasets to participate in the LOD cloud. As a result, the LOD cloud sometimes contains outdated, partial and even inaccurate RDF datasets. We review the LOD landscape for one of these resources, MeSH, and analyze the characteristics of six existing representations in order to identify desirable features for an authoritative version, for which we create a prototype. We illustrate the suitability of this prototype on three common use cases. NLM intends to release an authoritative representation of MeSH in RDF (beta version) in the Fall of 2014.
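To give a flavor of what an authoritative representation might look like, here is a hand-written Turtle fragment for a single MeSH descriptor. The `meshv:` class and property names follow the general shape of NLM's MeSH RDF vocabulary but should be read as illustrative, not as the released schema; the tree number shown is likewise an assumption.

```turtle
@prefix meshv: <http://id.nlm.nih.gov/mesh/vocab#> .
@prefix rdfs:  <http://www.w3.org/2000/01/rdf-schema#> .

# One topical descriptor; vocabulary terms and tree number are
# illustrative of the style, not copied from the released dataset.
<http://id.nlm.nih.gov/mesh/D000001>
    a meshv:TopicalDescriptor ;
    rdfs:label "Calcimycin"@en ;
    meshv:identifier "D000001" ;
    meshv:treeNumber <http://id.nlm.nih.gov/mesh/D03.633.100.221.173> .
```

The point of an authoritative version is that such triples come directly from the producer, so consumers no longer depend on third-party conversions that may be outdated or partial.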
NASA Technical Reports Server (NTRS)
Bradley, Arthur; Dubowsky, Steven; Quinn, Roger; Marzwell, Neville
2005-01-01
Robots that operate independently of one another will not be adequate to accomplish the future exploration tasks of long-distance autonomous navigation, habitat construction, resource discovery, and material handling. Such activities will require that systems widely share information, plan and divide complex tasks, share common resources, and physically cooperate to manipulate objects. Recognizing the need for interoperable robots to accomplish the new exploration initiative, NASA's Office of Exploration Systems Research & Technology recently funded the development of the Joint Technical Architecture for Robotic Systems (JTARS). The JTARS charter is to identify the interface standards necessary to achieve interoperability among space robots. A JTARS working group (JTARS-WG) has been established, comprising recognized leaders in the field of space robotics, including representatives from seven NASA centers along with academia and private industry. The working group's early accomplishments include addressing key issues required for interoperability, defining which systems are within the project's scope, and framing the JTARS manuals around classes of robotic systems.
Aniba, Mohamed Radhouene; Siguenza, Sophie; Friedrich, Anne; Plewniak, Frédéric; Poch, Olivier; Marchler-Bauer, Aron; Thompson, Julie Dawn
2009-01-01
The traditional approach to bioinformatics analyses relies on independent task-specific services and applications, using different input and output formats, often idiosyncratic, and frequently not designed to inter-operate. In general, such analyses were performed by experts who manually verified the results obtained at each step in the process. Today, the amount of bioinformatics information continuously being produced means that handling the various applications used to study this information presents a major data management and analysis challenge to researchers. It is now impossible to manually analyse all this information and new approaches are needed that are capable of processing the large-scale heterogeneous data in order to extract the pertinent information. We review the recent use of integrated expert systems aimed at providing more efficient knowledge extraction for bioinformatics research. A general methodology for building knowledge-based expert systems is described, focusing on the unstructured information management architecture, UIMA, which provides facilities for both data and process management. A case study involving a multiple alignment expert system prototype called AlexSys is also presented.
OR.NET: a service-oriented architecture for safe and dynamic medical device interoperability.
Kasparick, Martin; Schmitz, Malte; Andersen, Björn; Rockstroh, Max; Franke, Stefan; Schlichting, Stefan; Golatowski, Frank; Timmermann, Dirk
2018-02-23
Modern surgical departments are characterized by a high degree of automation supporting complex procedures. It recently became apparent that integrated operating rooms can improve the quality of care, simplify clinical workflows, and mitigate equipment-related incidents and human errors. Particularly using computer assistance based on data from integrated surgical devices is a promising opportunity. However, the lack of manufacturer-independent interoperability often prevents the deployment of collaborative assistive systems. The German flagship project OR.NET has therefore developed, implemented, validated, and standardized concepts for open medical device interoperability. This paper describes the universal OR.NET interoperability concept enabling a safe and dynamic manufacturer-independent interconnection of point-of-care (PoC) medical devices in the operating room and the whole clinic. It is based on a protocol specifically addressing the requirements of device-to-device communication, yet also provides solutions for connecting the clinical information technology (IT) infrastructure. We present the concept of a service-oriented medical device architecture (SOMDA) as well as an introduction to the technical specification implementing the SOMDA paradigm, currently being standardized within the IEEE 11073 service-oriented device connectivity (SDC) series. In addition, the Session concept is introduced as a key enabler for safe device interconnection in highly dynamic ensembles of networked medical devices; and finally, some security aspects of a SOMDA are discussed.
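The service-oriented medical device architecture described above can be caricatured in a few lines: devices publish named operations into a discovery mechanism, and a consumer binds to whatever matching device is currently present, which is what makes the ensemble dynamic. The device names and operations below are invented for illustration and are not part of the IEEE 11073 SDC specification; real SDC uses WS-Discovery-style protocols and rich device descriptions rather than an in-process registry.

```python
# Toy sketch of a service-oriented medical device architecture
# (SOMDA). A plain list stands in for network discovery.

class DeviceService:
    def __init__(self, name, operations):
        self.name = name
        self.operations = operations  # operation name -> callable

registry = []

def publish(service):
    """A device announces itself and its operations."""
    registry.append(service)

def discover(op_name):
    """Bind to the first currently published device offering op_name."""
    for svc in registry:
        if op_name in svc.operations:
            return svc
    return None

# A hypothetical infusion pump exposing a remote-control operation.
pump = DeviceService("infusion-pump-01", {
    "set_rate_ml_per_h": lambda rate: f"rate set to {rate} ml/h",
})
publish(pump)
```

Because consumers bind by operation rather than by vendor, a pump from another manufacturer that publishes the same operation can be swapped in at runtime, which is the manufacturer-independence goal of the project.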
Open Architecture SDR for Space
NASA Technical Reports Server (NTRS)
Smith, Carl; Long, Chris; Liebetreu, John; Reinhart, Richard C.
2005-01-01
This paper describes an open-architecture SDR (software-defined radio) infrastructure that is suitable for space-based operations (Space-SDR). SDR technologies will endow space and planetary exploration systems with dramatically increased capability, reduced power consumption, and significantly less mass than conventional systems, at costs reduced by vigorous competition, hardware commonality, dense integration, reduced obsolescence, interoperability, and software re-use. Significant progress has been recorded on developments like the Joint Tactical Radio System (JTRS) Software Communication Architecture (SCA), which is oriented toward reconfigurable radios for defense forces operating in multiple theaters of engagement. The JTRS-SCA presents a consistent software interface for waveform development, and facilitates interoperability, waveform portability, software re-use, and technology evolution.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Halbgewachs, Ronald D.; Chavez, Adrian R.
Process Control System (PCS) and Industrial Control System (ICS) security is critical to our national security. But there are a number of technological, economic, and educational impediments to PCS owners implementing effective security on their systems. Sandia National Laboratories has performed the research and development of OPSAID (Open PCS Security Architecture for Interoperable Design), a project sponsored by the US Department of Energy Office of Electricity Delivery and Energy Reliability (DOE/OE), to address this issue. OPSAID is an open-source architecture for PCS/ICS security that provides a design basis for vendors to build add-on security devices for legacy systems, while providing a path forward for the development of inherently secure PCS elements in the future. Using standardized hardware, a proof-of-concept prototype system was also developed. This report describes the improvements and capabilities that have been added to OPSAID since an initial report was released. Testing and validation of this architecture has been conducted in another project, the Lemnos Interoperable Security Project, sponsored by DOE/OE and managed by the National Energy Technology Laboratory (NETL).
Common Data Model for Neuroscience Data and Data Model Exchange
Gardner, Daniel; Knuth, Kevin H.; Abato, Michael; Erde, Steven M.; White, Thomas; DeBellis, Robert; Gardner, Esther P.
2001-01-01
Objective: Generalizing the data models underlying two prototype neurophysiology databases, the authors describe and propose the Common Data Model (CDM) as a framework for federating a broad spectrum of disparate neuroscience information resources. Design: Each component of the CDM derives from one of five superclasses—data, site, method, model, and reference—or from relations defined between them. A hierarchic attribute-value scheme for metadata enables interoperability with variable tree depth to serve specific intra- or broad inter-domain queries. To mediate data exchange between disparate systems, the authors propose a set of XML-derived schema for describing not only data sets but data models. These include biophysical description markup language (BDML), which mediates interoperability between data resources by providing a meta-description for the CDM. Results: The set of superclasses potentially spans the data needs of contemporary neuroscience. Data elements abstracted from neurophysiology time series and histogram data represent data sets that differ in dimension and concordance. Site elements transcend neurons to describe subcellular compartments, circuits, regions, or slices; non-neuroanatomic sites range from sequences to patients. Methods and models are highly domain-dependent. Conclusions: True federation of data resources requires explicit public description, in a metalanguage, of the contents, query methods, data formats, and data models of each data resource. Any data model that can be derived from the defined superclasses is potentially conformant, and interoperability can be enabled by recognition of BDML-described compatibilities. Such metadescriptions can buffer technologic changes. PMID:11141510
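The hierarchic attribute-value scheme described for CDM metadata can be sketched as a nested tree queried by path, with tree depth varying per record: deep paths serve narrow intra-domain queries, shallow paths serve broad inter-domain ones. The attribute names below are made up for illustration and are not taken from the CDM itself.

```python
# Toy hierarchic attribute-value metadata in the spirit of the CDM.
# Attribute names and values are illustrative.
metadata = {
    "site": {
        "region": "hippocampus",
        "compartment": {"type": "dendrite", "distance_um": 120},
    },
    "method": {"recording": "patch-clamp"},
}

def lookup(tree, path):
    """Resolve a slash-separated attribute path; None if absent."""
    node = tree
    for key in path.split("/"):
        if not isinstance(node, dict) or key not in node:
            return None
        node = node[key]
    return node
```

A broad query asks only for `site/region`, while a domain-specific one can descend to `site/compartment/distance_um`; records lacking the deeper attributes simply return nothing rather than breaking the query.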
Watershed and Economic Data InterOperability (WEDO)
The annual public meeting of the Federal Interagency Steering Committee on Multimedia Environmental Modeling (ISCMEM) will convene to discuss some of the latest developments in environmental modeling applications, tools and frameworks, as well as new operational initiatives for F...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Today, increasing numbers of intermittent generation sources (e.g., wind and photovoltaic) and new mobile intermittent loads (e.g., electric vehicles) can significantly affect traditional utility business practices and operations. At the same time, a growing number of technologies and devices, from appliances to lighting systems, are being deployed at consumer premises that have more sophisticated controls and information that remain underused for anything beyond basic building equipment operations. The intersection of these two drivers is an untapped opportunity and underused resource that, if appropriately configured and realized in open standards, can provide significant energy efficiency and commensurate savings on utility bills, enhanced and lower cost reliability to utilities, and national economic benefits in the creation of new markets, sectors, and businesses being fueled by the seamless coordination of energy and information through device and technology interoperability. Or, as the Quadrennial Energy Review puts it, “A plethora of both consumer-level and grid-level devices are either in the market, under development, or at the conceptual stage. When tied together through the information technology that is increasingly being deployed on electric utilities’ distribution grids, they can be an important enabling part of the emerging grid of the future. However, what is missing is the ability for all of these devices to coordinate and communicate their operations with the grid, and among themselves, in a common language — an open standard.” In this paper, we define interoperability as the ability to exchange actionable information between two or more systems within a home or building, or across and within organizational boundaries. Interoperability relies on the shared meaning of the exchanged information, with agreed-upon expectations and consequences, for the response to the information exchange.
MQ-4C Triton Unmanned Aircraft System (MQ-4C Triton)
2015-12-01
will respond to theater-level operational or national strategic taskings. (MQ-4C Triton, December 2015 Selected Acquisition Report, UNCLASSIFIED; recoverable performance-table entries:) UA Mission Radius: objective >=3,000 nm, threshold >=2,000 nm; Level of Interoperability: 2; BLOS and LOS operation from MOB/FOB (land based) via the MCS; capability: LOS/BLOS multi-ISR payload reception.
Secure, Autonomous, Intelligent Controller for Integrating Distributed Sensor Webs
NASA Technical Reports Server (NTRS)
Ivancic, William D.
2007-01-01
This paper describes the infrastructure and protocols necessary to enable near-real-time commanding, access to space-based assets, and the secure interoperation between sensor webs owned and controlled by various entities. Select terrestrial and aeronautics-based sensor webs will be used to demonstrate time-critical interoperability between integrated, intelligent sensor webs, both among terrestrial assets and between terrestrial and space-based assets. For this work, a Secure, Autonomous, Intelligent Controller and knowledge generation unit is implemented using Virtual Mission Operation Center technology.
2010-10-01
the 2004 Fall Simulation Interoperability Workshop, Orlando, Florida, USA, September 2004, 04F-SIW-090. [Blacklock (2007)] - Blacklock, J. and Zalcman...Valley, CA, USA, March 2009, 09S-SIW-084. [DIS (1995)] - IEEE Standard – Protocols for Distributed Interactive Simulation Application (1995), IEEE...Workshop, Orlando, FL, USA, September 2007, 07F-SIW-111. [Gresche] - Gresche, D. et al, (2006), “International Mission Training Research
1992-09-01
59) The problem could be attributed to too many sophisticated systems, the lack of a true interservice focal point with the ability to control DoD...the study and yet still provide a true picture of interservice communication interoperability problems within USSOCOM, the study was limited to...they referred to as the signs of the Zodiac, and are still cited today in astronomy and horoscopes (8:22-36). Ptolemy of Alexandria is given credit
3D facial landmarks: Inter-operator variability of manual annotation
2014-01-01
Background: Manual annotation of landmarks is a known source of variance, which exists in all fields of medical imaging, influencing the accuracy and interpretation of results. However, the variability of human facial landmarks is only sparsely addressed in the current literature, as opposed to, e.g., the research fields of orthodontics and cephalometrics. We present a full facial 3D annotation procedure and a sparse set of manually annotated landmarks, in an effort to reduce operator time and minimize the variance. Method: Facial scans from 36 voluntary unrelated blood donors from the Danish Blood Donor Study were randomly chosen. Six operators twice manually annotated 73 anatomical and pseudo-landmarks, using a three-step scheme producing a dense point correspondence map. We analyzed both the intra- and inter-operator variability using mixed-model ANOVA. We then compared four sparse sets of landmarks in order to construct a dense correspondence map of the 3D scans with a minimum point variance. Results: The anatomical landmarks of the eye were associated with the lowest variance, particularly the centers of the pupils, whereas points on the jaw and eyebrows had the highest variation. We saw marginal variability with regard to intra-operator effects and portraits. Using a sparse set of landmarks (n=14) that captures the whole face, the dense point mean variance was reduced from 1.92 to 0.54 mm. Conclusion: The inter-operator variability was primarily associated with particular landmarks, where more leniently defined landmarks had the highest variability. The variables embedded in the portrait and the reliability of a trained operator had only marginal influence on the variability. Further, using 14 of the annotated landmarks we were able to reduce the variability and create a dense correspondence mesh capturing all facial features. PMID:25306436
Simplified Virtualization in a HEP/NP Environment with Condor
NASA Astrophysics Data System (ADS)
Strecker-Kellogg, W.; Caramarcu, C.; Hollowell, C.; Wong, T.
2012-12-01
In this work we will address the development of a simple prototype virtualized worker node cluster, using Scientific Linux 6.x as a base OS, KVM and the libvirt API for virtualization, and the Condor batch software to manage virtual machines. The discussion in this paper provides details on our experience with building, configuring, and deploying the various components from bare metal, including the base OS, creation and distribution of the virtualized OS images and the integration of batch services with the virtual machines. Our focus was on simplicity and interoperability with our existing architecture.
Retrieving and Indexing Spatial Data in the Cloud Computing Environment
NASA Astrophysics Data System (ADS)
Wang, Yonggang; Wang, Sheng; Zhou, Daliang
In order to solve the drawbacks of spatial data storage in common cloud computing platforms, we design and present a framework for retrieving, indexing, accessing and managing spatial data in the Cloud environment. An interoperable spatial data object model is provided based on the Simple Feature coding rules from the OGC, such as Well Known Binary (WKB) and Well Known Text (WKT), and classic spatial indexing algorithms like the Quad-Tree and R-Tree are re-designed for the Cloud Computing environment. Finally, we develop a prototype software system based on Google App Engine to implement the proposed model.
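The OGC Simple Features coding rules mentioned above fix a concrete byte layout for geometries. As an illustrative sketch (not taken from the paper's prototype), a 2D point can be encoded to and from WKB, with its WKT counterpart, using nothing more than Python's struct module:

```python
import struct

def point_to_wkb(x: float, y: float) -> bytes:
    """Encode a 2D point as OGC Well Known Binary (little-endian).

    Layout: 1 byte byte-order flag (1 = little-endian),
    uint32 geometry type (1 = Point), then two IEEE 754 doubles.
    """
    return struct.pack("<BIdd", 1, 1, x, y)

def wkb_to_point(buf: bytes):
    """Decode a little-endian WKB Point back to an (x, y) tuple."""
    order, gtype, x, y = struct.unpack("<BIdd", buf)
    if order != 1 or gtype != 1:
        raise ValueError("only little-endian WKB Points are handled here")
    return (x, y)

def point_to_wkt(x: float, y: float) -> str:
    """The equivalent human-readable Well Known Text form."""
    return f"POINT ({x:g} {y:g})"
```

The 21-byte buffer round-trips exactly because the coordinates stay in IEEE 754 doubles; a Quad-Tree or R-Tree index of the kind the paper re-designs would then be built over such encoded records.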
An ontology-based framework for bioinformatics workflows.
Digiampietri, Luciano A; Perez-Alcazar, Jose de J; Medeiros, Claudia Bauzer
2007-01-01
The proliferation of bioinformatics activities brings new challenges - how to understand and organise these resources, how to exchange and reuse successful experimental procedures, and how to provide interoperability among data and tools. This paper describes an effort in these directions. It is based on combining research on ontology management, AI and scientific workflows to design, reuse and annotate bioinformatics experiments. The resulting framework supports automatic or interactive composition of tasks based on AI planning techniques and takes advantage of ontologies to support the specification and annotation of bioinformatics workflows. We validate our proposal with a prototype running on real data.
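The automatic composition of tasks described in this abstract can be illustrated with a toy sketch: breadth-first chaining of tools whose output type matches the next tool's input type. The tool names and I/O types below are hypothetical, not from the paper, and a real planner would of course reason over a richer ontology:

```python
from collections import deque

def compose(tools, start_type, goal_type):
    """Breadth-first chaining of tools whose input/output data types match:
    a toy stand-in for AI-planning-based workflow composition."""
    frontier = deque([(start_type, [])])
    seen = {start_type}
    while frontier:
        current, path = frontier.popleft()
        if current == goal_type:
            return path                      # shortest tool chain found
        for name, (inp, out) in tools.items():
            if inp == current and out not in seen:
                seen.add(out)
                frontier.append((out, path + [name]))
    return None                              # no chain reaches the goal

# Hypothetical tools annotated with (input type, output type), in the way
# an ontology might annotate them:
TOOLS = {
    "blast":     ("dna_sequence", "alignment"),
    "clustal":   ("fasta_set", "alignment"),
    "translate": ("dna_sequence", "protein_sequence"),
    "fold":      ("protein_sequence", "structure"),
}

plan = compose(TOOLS, "dna_sequence", "structure")  # ['translate', 'fold']
```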
Geoscience Information Network (USGIN) Solutions for Interoperable Open Data Access Requirements
NASA Astrophysics Data System (ADS)
Allison, M. L.; Richard, S. M.; Patten, K.
2014-12-01
The geosciences are leading the development of free, interoperable open access to data. The US Geoscience Information Network (USGIN) is a freely available data integration framework, jointly developed by the USGS and the Association of American State Geologists (AASG) in compliance with international standards and protocols, to provide easy discovery, access, and interoperability for geoscience data. USGIN standards include the geologic exchange language GeoSciML (v3.2), which enables instant interoperability of geologic formation data and is also the base standard used by the 117-nation OneGeology consortium. The USGIN deployment of the National Geothermal Data System (NGDS) serves as a continent-scale operational demonstration of the expanded OneGeology vision to provide access to all geoscience data worldwide. USGIN is developed to accommodate a variety of applications; for example, the International Renewable Energy Agency streams data live to the Global Atlas of Renewable Energy. Alternatively, users without robust data sharing systems can download and implement a free software packet, "GINstack", to easily deploy web services for exposing data online for discovery and access. The White House Open Data Access Initiative requires all federally funded research projects and federal agencies to make their data publicly accessible in an open source, interoperable format, with metadata. USGIN currently incorporates all aspects of the Initiative, as it emphasizes interoperability. The system is successfully deployed as NGDS, officially launched at the White House Energy Datapalooza in May 2014. The USGIN Foundation has been established to ensure this technology continues to be accessible and available.
Wireless Sensor Networks for Developmental and Flight Instrumentation
NASA Technical Reports Server (NTRS)
Alena, Richard; Figueroa, Fernando; Becker, Jeffrey; Foster, Mark; Wang, Ray; Gamudevelli, Suman; Studor, George
2011-01-01
Wireless sensor networks (WSN) based on the IEEE 802.15.4 Personal Area Network and ZigBee Pro 2007 standards are finding increasing use in the home automation and smart energy markets, providing a framework for interoperable software. The Wireless Connections in Space Project, funded by the NASA Engineering and Safety Center, is developing technology, metrics and requirements for next-generation spacecraft avionics incorporating wireless data transport. The team from Stennis Space Center and Mobitrum Corporation, working under a NASA SBIR grant, has developed techniques for embedding plug-and-play software into ZigBee WSN prototypes implementing the IEEE 1451 Transducer Electronic Datasheet (TEDS) standard. The TEDS provides meta-information regarding sensors such as serial number, calibration curve and operational status. Incorporation of TEDS into wireless sensors leads directly to building application-level software that can recognize sensors at run time, dynamically instantiating sensors as they are added or removed. The Ames Research Center team has been experimenting with this technology, building demonstration prototypes for on-board health monitoring. Innovations in technology, software and process can lead to dramatic improvements for managing sensor systems applied to Developmental and Flight Instrumentation (DFI) aboard aerospace vehicles. A brief overview of the plug-and-play ZigBee WSN technology is presented along with specific targets for application within the aerospace DFI market. The software architecture for the sensor nodes incorporating the TEDS information is described along with the functions of the Network Capable Gateway processor, which bridges the 802.15.4 PAN to the TCP/IP network. Client application software connects to the Gateway and is used to display TEDS information and real-time sensor data values updated every few seconds, incorporating error detection and logging to help measure performance and reliability in relevant target environments.
Test results from our prototype WSN running the Mobitrum software system are summarized and the implications to the scalability and reliability for DFI applications are discussed. Our demonstration system, incorporating sensors for life support system and structural health monitoring is described along with test results obtained by running the demonstration prototype in relevant environments such as the Wireless Habitat Testbed at Johnson Space Center in Houston. An operations concept for improved sensor process flow from design to flight test is outlined specific to the areas of Environmental Control and Life Support System performance characterization and structural health monitoring of human-rated spacecraft. This operations concept will be used to highlight the areas where WSN technology, particularly plug-and-play software based on IEEE 1451, can improve the current process, resulting in significant reductions in the technical effort, overall cost and schedule for providing DFI capability for future spacecraft.
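The role this abstract gives TEDS, carrying serial number, calibration curve and operational status so that application software can recognize sensors at run time, can be sketched as follows. This is an illustrative Python model only, not the IEEE 1451 binary TEDS encoding; the field names, registry and calibration values are assumptions:

```python
from dataclasses import dataclass

@dataclass
class TEDS:
    """Illustrative subset of IEEE 1451 TEDS metadata (not the binary format)."""
    serial_number: str
    units: str
    calibration: list  # polynomial coefficients, lowest order first
    status: str = "operational"

    def apply_calibration(self, raw: float) -> float:
        """Convert a raw reading to engineering units via the stored polynomial."""
        return sum(c * raw ** i for i, c in enumerate(self.calibration))

class SensorRegistry:
    """Run-time plug-and-play: sensors announce their TEDS as they join the
    network and are dropped again when they leave."""
    def __init__(self):
        self.sensors = {}

    def announce(self, teds: TEDS):
        self.sensors[teds.serial_number] = teds

    def remove(self, serial_number: str):
        self.sensors.pop(serial_number, None)

registry = SensorRegistry()
registry.announce(TEDS("SN-0042", "degC", [-5.0, 0.5]))  # temp = -5 + 0.5*raw
reading = registry.sensors["SN-0042"].apply_calibration(60)  # -> 25.0 degC
```

Because the metadata travels with the sensor, the client application never needs a hard-coded sensor list, which is the property that makes dynamic instantiation possible.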
M and S supporting unmanned autonomous systems (UAxS) concept development and experimentation
NASA Astrophysics Data System (ADS)
Biagini, Marco; Scaccianoce, Alfio; Corona, Fabio; Forconi, Sonia; Byrum, Frank; Fowler, Olivia; Sidoran, James L.
2017-05-01
The development of the next generation of multi-domain unmanned semi- and fully-autonomous C4ISR systems involves a multitude of security concerns and interoperability challenges. Conceptual solutions to capability shortfalls and gaps can be identified through Concept Development and Experimentation (CD and E) cycles. Modelling and Simulation (M and S) is a key tool in supporting unmanned autonomous systems (UAxS) CD and E activities and addressing associated security challenges. This paper serves to illustrate the application of M and S to UAxS development and highlight initiatives made by the North Atlantic Treaty Organization (NATO) M and S Centre of Excellence (CoE) to facilitate interoperability. The NATO M and S CoE collaborates with other NATO and national bodies to develop UAxS projects such as the Allied Command Transformation Counter Unmanned Autonomous Systems (CUAxS) project or the work of Science and Technology Organization (STO) panels. Some initiatives, such as the Simulated Interactive Robotics Initiative (SIRI), formed the baseline for further developments and for studying emerging technologies in the M and S and robotics fields. Artificial Intelligence algorithm modelling, Robot Operating Systems (ROS), network operations, cyber security, interoperable languages and related data models are some of the main aspects considered in this paper. In particular, the implementation of interoperable languages like C-BML and NIEM MilOps is discussed in relation to a Command and Control - Simulation Interoperability (C2SIM) paradigm.
All these technologies are used to build a conceptual architecture to support UAxS CD and E. In addition, other projects that the NATO M and S CoE is involved in, such as the NATO Urbanization Project, could provide credible future operational environments and benefit UAxS project development through dual application of UAxS technology in large urbanized areas. In conclusion, this paper contains a detailed overview of how applying Modelling and Simulation to support CD and E activities is a valid approach to developing and validating future capability requirements in general and the next generation of UAxS in particular.
Relevance of eHealth standards for big data interoperability in radiology and beyond.
Marcheschi, Paolo
2017-06-01
The aim of this paper is to report on the implementation of radiology and related information technology standards to feed big data repositories, and so to be able to create a solid substrate on which to operate with analysis software. Digital Imaging and Communications in Medicine (DICOM) and Health Level 7 (HL7) are the major standards for radiology and medical information technology. They define formats and protocols to transmit medical images, signals, and patient data inside and outside hospital facilities. These standards can be implemented today, but big data expectations are stimulating a new approach that simplifies data collection and interoperability and seeks to reduce the time to full implementation inside health organizations. Virtual Medical Record, DICOM Structured Reporting and HL7 Fast Healthcare Interoperability Resources (FHIR) are changing the way medical data are shared among organizations, and they will be the keys to big data interoperability. Until we find simple and comprehensive methods to store and disseminate detailed information on the patient's health, we will not be able to get optimum results from the analysis of those data.
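As a concrete illustration of the FHIR direction discussed above, a minimal HL7 FHIR R4 Observation resource can be assembled as plain JSON. The patient reference, timestamp and the choice of LOINC code 15074-8 (blood glucose) are illustrative assumptions, not taken from the paper:

```python
import json

def glucose_observation(patient_id: str, mmol_per_l: float, when: str) -> dict:
    """Build a minimal FHIR R4 Observation resource (illustrative subset)."""
    return {
        "resourceType": "Observation",
        "status": "final",
        "code": {
            "coding": [{
                "system": "http://loinc.org",
                "code": "15074-8",  # Glucose [Moles/volume] in Blood
                "display": "Glucose",
            }]
        },
        "subject": {"reference": f"Patient/{patient_id}"},
        "effectiveDateTime": when,
        "valueQuantity": {
            "value": mmol_per_l,
            "unit": "mmol/L",
            "system": "http://unitsofmeasure.org",
            "code": "mmol/L",
        },
    }

obs = glucose_observation("example", 6.3, "2017-06-01T08:30:00Z")
payload = json.dumps(obs)  # the JSON body a client would POST to a FHIR server
```

Because the resource is self-describing JSON with standard codings, any FHIR-aware repository can index it, which is the interoperability property the paper emphasizes.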
Guided mass spectrum labelling in atom probe tomography.
Haley, D; Choi, P; Raabe, D
2015-12-01
Atom probe tomography (APT) is a valuable near-atomic scale imaging technique, which yields mass spectrographic data. Experimental correctness can often pivot on the identification of peaks within a dataset; this is a manual process in which subjectivity and errors can arise. The limitations of manual procedures complicate APT experiments for the operator and, furthermore, are a barrier to technique standardisation. In this work we explore the capabilities of computer-guided ranging to aid identification and analysis of mass spectra. We propose a fully robust algorithm for enumeration of the possible identities of detected peak positions, which assists labelling. Furthermore, a simple ranking scheme is developed to allow for evaluation of the likelihood of each possible identity being the likely assignment from the enumerated set. We demonstrate a simple, yet complete work-chain that allows for the conversion of mass spectra to fully identified APT spectra, with the goal of minimising identification errors and the inter-operator variance within APT experiments. This work-chain is compared to current procedures via experimental trials with different APT operators, to determine the relative effectiveness and precision of the two approaches. It is found that there is little loss of precision (and occasionally a gain) when participants are given computer assistance. We find that, in either case, inter-operator precision for ranging varies between 0 and 2 "significant figures" (2σ confidence in the first n digits of the reported value) when reporting compositions. Intra-operator precision was tested less extensively and found to vary between 1 and 3 significant figures, depending upon species composition levels. Finally, it is suggested that inconsistencies in inter-operator peak labelling may be the largest source of scatter when reporting composition data in APT. Copyright © 2015 Elsevier B.V. All rights reserved.
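The enumeration-and-ranking idea, listing every (isotope, charge-state) pair whose mass-to-charge ratio falls within a tolerance of a detected peak and ordering candidates by distance, can be sketched as follows. The isotope table, tolerance and charge states are illustrative assumptions, not the paper's actual algorithm:

```python
# A small assumed isotope lookup (masses in Da); a real tool uses a full table.
ISOTOPES = {"Al-27": 26.9815, "Cr-52": 51.9405, "Fe-56": 55.9349, "Ni-58": 57.9353}

def enumerate_identities(peak_mz, charges=(1, 2, 3), tol=0.05):
    """List (isotope, charge) pairs whose mass-to-charge ratio matches the
    detected peak within `tol` Da/e, ranked nearest-first as a crude
    likelihood ordering."""
    candidates = []
    for name, mass in ISOTOPES.items():
        for z in charges:
            err = abs(mass / z - peak_mz)
            if err <= tol:
                candidates.append((err, name, z))
    return [(name, z) for err, name, z in sorted(candidates)]

# A peak detected near 27.97 Da/e is consistent with Fe-56 in the 2+ state:
matches = enumerate_identities(27.97)
```

Presenting such a ranked shortlist, rather than leaving the operator to recall candidate ions unaided, is the kind of guidance that can narrow inter-operator labelling differences.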
NASA Technical Reports Server (NTRS)
Mandl, Daniel; Unger, Stephen; Ames, Troy; Frye, Stuart; Chien, Steve; Cappelaere, Pat; Tran, Danny; Derezinski, Linda; Paules, Granville
2007-01-01
This paper will describe the progress of a three-year research award from the NASA Earth Science Technology Office (ESTO) that began October 1, 2006, in response to a NASA Announcement of Research Opportunity on the topic of sensor webs. The key goal of this research is to prototype an interoperable sensor architecture that will enable interoperability between a heterogeneous set of space-based, Unmanned Aerial System (UAS)-based and ground-based sensors. Among the key capabilities being pursued is the ability to automatically discover and task the sensors via the Internet and to automatically discover and assemble the necessary science processing algorithms into workflows in order to transform the sensor data into valuable science products. Our first set of sensor web demonstrations will prototype science products useful in managing wildfires and will use such assets as the Earth Observing 1 spacecraft, managed out of NASA/GSFC, a UAS-based instrument, managed out of Ames, and some automated ground weather stations, managed by the Forest Service. Also, we are collaborating with some of the other ESTO awardees to expand this demonstration and create synergy between our research efforts. Finally, we are making use of the Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) suite of standards and some Web 2.0 capabilities to leverage emerging technologies and standards. This research will demonstrate and validate a path for rapid, low-cost sensor integration which is not tied to a particular system, and thus be able to absorb new assets in an easily evolvable, coordinated manner. This in turn will help to facilitate the United States contribution to the Global Earth Observation System of Systems (GEOSS), as agreed by the U.S. and 60 other countries at the third Earth Observation Summit held in February of 2005.
SDI-based business processes: A territorial analysis web information system in Spain
NASA Astrophysics Data System (ADS)
Béjar, Rubén; Latre, Miguel Á.; Lopez-Pellicer, Francisco J.; Nogueras-Iso, Javier; Zarazaga-Soria, F. J.; Muro-Medrano, Pedro R.
2012-09-01
Spatial Data Infrastructures (SDIs) provide access to geospatial data and operations through interoperable Web services. These data and operations can be chained to set up specialized geospatial business processes, and these processes can give support to different applications. End users can benefit from these applications, while experts can integrate the Web services in their own business processes and developments. This paper presents an SDI-based territorial analysis Web information system for Spain, which gives access to land cover, topography and elevation data, as well as to a number of interoperable geospatial operations by means of a Web Processing Service (WPS). Several examples illustrate how different territorial analysis business processes are supported. The system has been established by the Spanish National SDI (Infraestructura de Datos Espaciales de España, IDEE) both as an experimental platform for geoscientists and geoinformation system developers, and as a mechanism to contribute to Spanish citizens' knowledge about their territory.
Doiron, Dany; Marcon, Yannick; Fortier, Isabel; Burton, Paul; Ferretti, Vincent
2017-01-01
Motivation: Improving the dissemination of information on existing epidemiological studies and facilitating the interoperability of study databases are essential to maximizing the use of resources and accelerating improvements in health. To address this, Maelstrom Research proposes Opal and Mica, two interoperable open-source software packages providing out-of-the-box solutions for epidemiological data management, harmonization and dissemination. Implementation: Opal and Mica are two standalone but interoperable web applications written in Java, JavaScript and PHP. They provide web services and modern user interfaces to access them. General features: Opal allows users to import, manage, annotate and harmonize study data. Mica is used to build searchable web portals disseminating study and variable metadata. When used conjointly, Mica users can securely query and retrieve summary statistics on geographically dispersed Opal servers in real time. Integration with the DataSHIELD approach allows conducting more complex federated analyses involving statistical models. Availability: Opal and Mica are open-source and freely available at [www.obiba.org] under a General Public License (GPL) version 3, and the metadata models and taxonomies that accompany them are available under a Creative Commons licence. PMID:29025122
Telemonitoring of patients with Parkinson's disease using inertia sensors.
Piro, N E; Baumann, L; Tengler, M; Piro, L; Blechschmidt-Trapp, R
2014-01-01
Medical treatment in patients suffering from Parkinson's disease is very difficult, as dose-finding is mainly based on selective and subjective impressions by the physician. To allow for the objective evaluation of patients' symptoms required for optimal dose-finding, a telemonitoring system tracks the motion of patients in their surroundings. The system focuses on providing interoperability and usability in order to ensure high acceptance. Patients wear inertia sensors and perform standardized motor tasks. Data are recorded, processed and then presented to the physician in a 3D animated form. In addition, the same data are rated based on the UPDRS score. Interoperability is realized by developing the system in compliance with the recommendations of the Continua Health Alliance. Detailed requirements analysis and continuous collaboration with respective user groups help to achieve high usability. A sensor platform was developed that is capable of measuring acceleration and angular rate of motions as well as the absolute orientation of the device itself through an included compass sensor. The system architecture was designed, and the required infrastructure and essential parts of the communication between the system components were implemented following Continua guidelines. Moreover, preliminary data analysis based on three-dimensional acceleration and angular rate data could be established. A prototype system for the telemonitoring of Parkinson's disease patients was successfully developed. The developed sensor platform fully satisfies the needs of monitoring patients with Parkinson's disease and is comparable to other sensor platforms, although these sensor platforms have yet to be tested rigorously against each other. Suitable approaches to provide interoperability and usability were identified and realized, and remain to be tested in the field.
Bleser, Gabriele; Damen, Dima; Behera, Ardhendu; Hendeby, Gustaf; Mura, Katharina; Miezal, Markus; Gee, Andrew; Petersen, Nils; Maçães, Gustavo; Domingues, Hugo; Gorecky, Dominic; Almeida, Luis; Mayol-Cuevas, Walterio; Calway, Andrew; Cohn, Anthony G.; Hogg, David C.; Stricker, Didier
2015-01-01
Today, the workflows that are involved in industrial assembly and production activities are becoming increasingly complex. To efficiently and safely perform these workflows is demanding on the workers, in particular when it comes to infrequent or repetitive tasks. This burden on the workers can be eased by introducing smart assistance systems. This article presents a scalable concept and an integrated system demonstrator designed for this purpose. The basic idea is to learn workflows from observing multiple expert operators and then transfer the learnt workflow models to novice users. Being entirely learning-based, the proposed system can be applied to various tasks and domains. The above idea has been realized in a prototype, which combines components pushing the state of the art of hardware and software designed with interoperability in mind. The emphasis of this article is on the algorithms developed for the prototype: 1) fusion of inertial and visual sensor information from an on-body sensor network (BSN) to robustly track the user’s pose in magnetically polluted environments; 2) learning-based computer vision algorithms to map the workspace, localize the sensor with respect to the workspace and capture objects, even as they are carried; 3) domain-independent and robust workflow recovery and monitoring algorithms based on spatiotemporal pairwise relations deduced from object and user movement with respect to the scene; and 4) context-sensitive augmented reality (AR) user feedback using a head-mounted display (HMD). A distinguishing key feature of the developed algorithms is that they all operate solely on data from the on-body sensor network and that no external instrumentation is needed. The feasibility of the chosen approach for the complete action-perception-feedback loop is demonstrated on three increasingly complex datasets representing manual industrial tasks. 
These limited size datasets indicate and highlight the potential of the chosen technology as a combined entity as well as point out limitations of the system. PMID:26126116
Zhou, Yuan; Ancker, Jessica S; Upadhye, Mandar; McGeorge, Nicolette M; Guarrera, Theresa K; Hegde, Sudeep; Crane, Peter W; Fairbanks, Rollin J; Bisantz, Ann M; Kaushal, Rainu; Lin, Li
2013-01-01
The effect of health information technology (HIT) on efficiency and workload among clinical and nonclinical staff has been debated, with conflicting evidence about whether electronic health records (EHRs) increase or decrease effort. No work to date, however, examines the effect of interoperability quantitatively using discrete event simulation techniques. Our objective was to estimate the impact of EHR systems with various levels of interoperability on the day-to-day tasks and operations of ambulatory physician offices. Interviews and observations were used to collect workflow data from 12 adult primary and specialty practices. A discrete event simulation model was constructed to represent patient flows and the clinical and administrative tasks of physicians and staff members. High levels of EHR interoperability were associated with reduced time spent by providers on four tasks: preparing lab reports, requesting lab orders, prescribing medications, and writing referrals. The implementation of an EHR was associated with less time spent by administrators but more time spent by physicians, compared with time spent at paper-based practices. In addition, the presence of EHRs and of interoperability did not significantly affect the time usage of registered nurses or the total visit time and waiting time of patients. This paper suggests that the impact of using HIT on clinical and nonclinical staff work efficiency varies; however, overall it appears to improve time efficiency more for administrators than for physicians and nurses.
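A discrete event simulation of the kind this study used can be sketched in a few lines: a single provider serving patients FIFO, where an interoperable EHR is assumed (hypothetically) to trim the provider's per-visit task time. The arrival spacing and service times below are made-up parameters for illustration only, not the study's calibrated values:

```python
import heapq

def simulate(n_patients, service_time, interarrival=10.0):
    """Single-provider discrete event simulation: patients arrive at fixed
    intervals and are served FIFO. Returns (provider busy time, mean wait),
    both in minutes."""
    events = [(interarrival * i, i) for i in range(n_patients)]  # arrival events
    heapq.heapify(events)                                        # event queue by time
    provider_free_at = 0.0
    busy = 0.0
    waits = []
    while events:
        arrival, _pid = heapq.heappop(events)
        start = max(arrival, provider_free_at)   # wait if the provider is busy
        waits.append(start - arrival)
        provider_free_at = start + service_time
        busy += service_time
    return busy, sum(waits) / len(waits)

# Made-up scenario: suppose interoperability trims lab/referral/prescription
# work from 12 to 9 minutes per visit.
paper_busy, paper_wait = simulate(20, service_time=12.0)
ehr_busy, ehr_wait = simulate(20, service_time=9.0)
```

Even this toy run shows the qualitative effect the study measured: when service time exceeds the arrival spacing, patient waits grow steadily, and shaving a few minutes per visit eliminates the queue entirely.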
Interoperable web applications for sharing data and products of the International DORIS Service
NASA Astrophysics Data System (ADS)
Soudarin, L.; Ferrage, P.
2017-12-01
The International DORIS Service (IDS) was created in 2003 under the umbrella of the International Association of Geodesy (IAG) to foster scientific research related to the French satellite tracking system DORIS and to deliver scientific products, mostly related to the International Earth rotation and Reference systems Service (IERS). Since its start, the organization has continuously evolved, leading to additional and improved operational products from an expanded set of DORIS Analysis Centers. In addition, IDS has developed services for sharing data and products with the users. Metadata and interoperable web applications are proposed to explore, visualize and download key products such as the position time series of the geodetic points materialized at the ground tracking stations. The Global Geodetic Observing System (GGOS) encourages the IAG Services to develop such interoperable facilities on their websites. The objective for GGOS is to set up an interoperable portal through which the data and products produced by the IAG Services can be served to the user community. We present the web applications proposed by IDS to visualize time series of geodetic observables or to get information about the tracking ground stations and the tracked satellites, and we discuss the future plans for IDS to meet the recommendations of GGOS. The presentation also addresses the need for the IAG Services to adopt a common metadata thesaurus to describe data and products, and interoperability standards to share them.
Design Reference Missions (DRM): Integrated ODM 'Air-Taxi' Mission Features
NASA Technical Reports Server (NTRS)
Kloesel, Kurt; Starr, Ginn; Saltzman, John A.
2017-01-01
Design Reference Missions (DRM): Integrated ODM Air-Taxi Mission Features, Hybrid Electric Integrated System Testbed (HEIST) flight control. Structural Health, Energy Storage, Electric Components, Loss of Control, Degraded Systems, System Health, Real-Time IO Operator Geo-Fencing, Regional Noise Abatement and Trusted Autonomy Interoperability.
Meyer, Markus; Donsa, Klaus; Truskaller, Thomas; Frohner, Matthias; Pohn, Birgit; Felfernig, Alexander; Sinner, Frank; Pieber, Thomas
2018-01-01
Fast and accurate data transmission from glucose meter to clinical decision support systems (CDSSs) is crucial for the management of type 2 diabetes mellitus, since almost all therapeutic interventions are derived from glucose measurements. The aim was to develop a prototype of an automated glucose measurement transmission protocol based on the Continua Design Guidelines and to embed the protocol into a CDSS used by healthcare professionals. Literature and market research were performed to analyze the state of the art and thereupon develop, integrate and validate an automated glucose measurement transmission protocol in an iterative process. Findings from the literature and market research guided the development of a standardized glucose measurement transmission protocol using a middleware. The interface description for communicating with the glucose meter was illustrated and embedded into a CDSS. A prototype of an interoperable transmission of glucose measurements was developed and implemented in a CDSS, presenting a promising way to reduce medication errors and improve user satisfaction.
Leveraging Terminology Services for Extract-Transform-Load Processes: A User-Centered Approach
Peterson, Kevin J.; Jiang, Guoqian; Brue, Scott M.; Liu, Hongfang
2016-01-01
Terminology services serve an important role supporting clinical and research applications, and underpin a diverse set of processes and use cases. Through standardization efforts, terminology service-to-system interactions can leverage well-defined interfaces and predictable integration patterns. Often, however, users interact more directly with terminologies, and no such blueprints are available for describing terminology service-to-user interactions. In this work, we explore the main architecture principles necessary to build a user-centered terminology system, using an Extract-Transform-Load process as our primary usage scenario. To analyze our architecture, we present a prototype implementation based on the Common Terminology Services 2 (CTS2) standard using the Patient-Centered Network of Learning Health Systems (LHSNet) project as a concrete use case. We perform a preliminary evaluation of our prototype architecture using three architectural quality attributes: interoperability, adaptability and usability. We find that a design-time focus on user needs, cognitive models, and existing patterns is essential to maximize system utility. PMID:28269898
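The terminology-resolution step in such an ETL pipeline can be sketched minimally as below. This assumes a local lookup table standing in for a live CTS2 service; the code system URI is a placeholder, and the single entry is purely illustrative.

```python
# Hypothetical code system: the URI and table below are illustrative,
# not a real CTS2 deployment or its API.
TERMINOLOGY = {
    ("http://example.org/cs/lab", "2345-7"):
        "Glucose [Mass/volume] in Serum or Plasma",
}

def resolve(code_system: str, code: str) -> str:
    """Look up a human-readable display name during the Transform step.

    A production system would call out to a terminology service (e.g. a
    CTS2 entity read) instead of consulting a local table, but the shape
    of the interaction - (system, code) in, display name out - is the same.
    """
    try:
        return TERMINOLOGY[(code_system, code)]
    except KeyError:
        raise ValueError(f"Unmapped code {code!r} in {code_system!r}")

row = {"system": "http://example.org/cs/lab", "code": "2345-7"}
row["display"] = resolve(row["system"], row["code"])
print(row["display"])
```

Centralizing the lookup behind one function is what lets the ETL process swap the local table for a standards-based service without touching the rest of the pipeline.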
An overview on STEP-NC compliant controller development
NASA Astrophysics Data System (ADS)
Othman, M. A.; Minhat, M.; Jamaludin, Z.
2017-10-01
The capabilities of conventional Computer Numerical Control (CNC) machine tools to fabricate high-quality parts promptly, economically and precisely are undeniable. To date, most CNCs follow the programming standard ISO 6983, also called G & M code. In a fluctuating shop-floor environment, however, the flexibility and interoperability of current CNC systems, and their ability to react dynamically and adaptively, are believed to be limited. This outdated programming language describes only block-by-block motion; its statements do not explicitly relate to one another, and it offers no control of arbitrary locations beyond that motion. To address this limitation, a new standard known as STEP-NC was developed in the late 1990s and formalized as ISO 14649. It adds intelligence to the CNC in terms of interoperability, flexibility, adaptability and openness. This paper presents an overview of the research work that has been done in developing STEP-NC compliant controllers and of the capabilities of STEP-NC to meet modern manufacturing demands. The review indicates that most existing STEP-NC controller prototypes are based on type 1 and type 2 implementation levels; little effort has yet been made to develop type 3 and type 4 STEP-NC compliant controllers.
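The block-by-block limitation of ISO 6983 can be illustrated with a toy parser. The sketch below is a simplification, not a controller implementation: each G-code block is parsed in isolation into address words, and the controller never sees the feature (pocket, hole, ...) being machined, which is the gap STEP-NC closes by carrying feature-level workingsteps instead.

```python
def parse_block(block: str) -> dict:
    """Parse one ISO 6983 (G-code) block into its address words.

    Each block stands alone: only axis targets and feed rates are
    visible, with no link to neighbouring blocks or to the part model.
    (Real G-code also has modal state, comments, etc., omitted here.)
    """
    words = {}
    for word in block.split():
        words[word[0]] = float(word[1:])   # e.g. "X10.0" -> {"X": 10.0}
    return words

# Two consecutive linear moves; nothing in the data says they cut
# one and the same feature.
program = ["G01 X10.0 Y5.0 F200", "G01 X20.0 Y5.0 F200"]
for block in program:
    print(parse_block(block))
```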
Turning Interoperability Operational with GST
NASA Astrophysics Data System (ADS)
Schaeben, Helmut; Gabriel, Paul; Gietzel, Jan; Le, Hai Ha
2013-04-01
GST - Geosciences in space and time is being developed and implemented as a hub to facilitate the exchange of spatially and temporally indexed multi-dimensional geoscience data and corresponding geomodels amongst partners. It originates from TUBAF's contribution to the EU project "ProMine", and its prospective extensions are TUBAF's contribution to the current EU project "GeoMol". As of today, it provides basic components of a geodata infrastructure as required to establish interoperability with respect to geosciences. Generally, interoperability means the facilitation of cross-border and cross-sector information exchange, taking into account legal, organisational, semantic and technical aspects, cf. Interoperability Solutions for European Public Administrations (ISA), cf. http://ec.europa.eu/isa/. Practical interoperability for partners of a joint geoscience project, say European Geological Surveys acting in a border region, means in particular provision of IT technology to exchange spatially and maybe additionally temporally indexed multi-dimensional geoscience data and corresponding models, i.e. the objects composing geomodels capturing the geometry, topology, and various geoscience contents. Geodata Infrastructure (GDI) and interoperability are objectives of several initiatives, e.g. INSPIRE, OneGeology-Europe, and most recently EGDI-SCOPE, to name just the most prominent ones. Then there are quite a few markup languages (ML) related to geographical or geological information like GeoSciML, EarthResourceML, BoreholeML, ResqML for reservoir characterization, earth and reservoir models, and many others featuring geoscience information. Several Web Services are focused on geographical or geoscience information. The Open Geospatial Consortium (OGC) promotes specifications of a Web Feature Service (WFS), a Web Map Service (WMS), a Web Coverage Service (WCS), a Web 3D Service (W3DS), and many more.
It will be clarified how GST is related to these initiatives, especially how it complies with existing or developing standards or quasi-standards, and how it applies and extends services towards interoperability in the Earth sciences.
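As a rough illustration of the OGC services mentioned above, the snippet below builds a WFS 2.0 GetFeature request URL. The endpoint and feature type name are placeholders; the query parameter names, however, follow the WFS 2.0 specification.

```python
from urllib.parse import urlencode

def wfs_get_feature_url(base_url: str, type_name: str, count: int = 10) -> str:
    """Build an OGC WFS 2.0 GetFeature request URL.

    KVP-encoded requests like this are how clients pull vector features
    (boreholes, geological units, ...) from an interoperable endpoint.
    """
    params = {
        "service": "WFS",
        "version": "2.0.0",
        "request": "GetFeature",
        "typeNames": type_name,   # WFS 2.0 spells this 'typeNames'
        "count": count,           # limit the number of features returned
    }
    return base_url + "?" + urlencode(params)

# Endpoint and feature type are invented for the example.
print(wfs_get_feature_url("https://example.org/geoserver/wfs", "gsml:Borehole"))
```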
47 CFR 90.203 - Certification required.
Code of Federal Regulations, 2010 CFR
2010-10-01
... States after March 15, 1988. Marketing of these transmitters shall not be permitted after March 15, 1989... capability to be programmed for operation on the mutual aid channels as designated in § 90.617(a)(1) of the... is capable of operating on the nationwide public safety interoperability calling channel in the 150...
Design and Development of an Integrated Workstation Automation Hub
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weber, Andrew; Ghatikar, Girish; Sartor, Dale
Miscellaneous Electronic Loads (MELs) account for one third of all electricity consumption in U.S. commercial buildings, and are drivers for a significant energy use in India. Many of the MEL-specific plug-load devices are concentrated at workstations in offices. The use of intelligence, and integrated controls and communications at the workstation for an Office Automation Hub, offers the opportunity to improve both energy efficiency and occupant comfort, along with services for Smart Grid operations. Software and hardware solutions are available from a wide array of vendors for the different components, but an integrated system with interoperable communications is yet to be developed and deployed. In this study, we propose system- and component-level specifications for the Office Automation Hub, their functions, and a prioritized list for the design of a proof-of-concept system. Leveraging the strength of both the U.S. and India technology sectors, this specification serves as a guide for researchers and industry in both countries to support the development, testing, and evaluation of a prototype product. Further evaluation of such integrated technologies for performance and cost is necessary to identify the potential to reduce energy consumptions in MELs and to improve occupant comfort.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chakraborty, S.; Kroposki, B.; Kramer, W.
Integrating renewable energy and distributed generations into the Smart Grid architecture requires power electronic (PE) for energy conversion. The key to reaching successful Smart Grid implementation is to develop interoperable, intelligent, and advanced PE technology that improves and accelerates the use of distributed energy resource systems. This report describes the simulation, design, and testing of a single-phase DC-to-AC inverter developed to operate in both islanded and utility-connected mode. It provides results on both the simulations and the experiments conducted, demonstrating the ability of the inverter to provide advanced control functions such as power flow and VAR/voltage regulation. This report also analyzes two different techniques used for digital signal processor (DSP) code generation. Initially, the DSP code was written in C programming language using Texas Instrument's Code Composer Studio. In a later stage of the research, the Simulink DSP toolbox was used to self-generate code for the DSP. The successful tests using Simulink self-generated DSP codes show promise for fast prototyping of PE controls.
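A minimal sketch of the kind of control function such DSP code implements: a discrete PI loop of the sort used for voltage regulation. The gains, time step, and first-order plant below are invented for the example and are not taken from the report.

```python
def pi_controller(kp: float, ki: float, dt: float):
    """Return a discrete PI control step (illustrative gains only).

    A loop of this shape, regulating voltage or reactive power, is the
    typical payload of inverter DSP firmware, whether hand-written in C
    or auto-generated from a Simulink model.
    """
    integral = 0.0
    def step(setpoint: float, measured: float) -> float:
        nonlocal integral
        error = setpoint - measured
        integral += error * dt          # accumulate error (anti-windup omitted)
        return kp * error + ki * integral
    return step

# Drive a toy first-order plant toward a 230 V setpoint.
ctrl = pi_controller(kp=0.5, ki=20.0, dt=1e-4)
v = 200.0
for _ in range(2000):
    v += ctrl(230.0, v) * 0.05
print(round(v, 1))   # settles near the 230 V setpoint
```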
Transition of advanced technology to military, homeland security, and law enforcement users
NASA Astrophysics Data System (ADS)
Jarrett, Stephen M.
2004-09-01
With the attack on the United States and the subsequent war on terror and the wars in Afghanistan and Iraq, a need has been exposed for the transition of technology to all of our defenders, both combat forces on the foreign battlefield and domestic forces here at home. The establishment of the Department of Homeland Security has also provided a focus on inserting technology to dramatically improve the capability of airport security forces, law enforcement, and all first responder networks. The drastic increase in the use of Special Forces in combat has likewise required new advanced technology capabilities at a much faster rate of development than the standard military procurement system provides. Technology developers must address questions of interoperability, cost, and commercialization; of how these groups will use the technology delivered; and of the adoption criteria of users in the deployment environment. Successful transition to the field must address the formation of complex concepts of operations in the user's adoption criteria. Prototype transition for two systems, a pocket infrared camera and an acoustic/seismic detector, will be highlighted, along with their effect on the wars in Iraq and Afghanistan and on the heightening of homeland security.
NASA Astrophysics Data System (ADS)
Babu Roshni, Satheesh; Jayakrishnan, M. P.; Mohanan, P.; Peethambharan Surendran, Kuzhichalil
2017-10-01
In this paper, we investigated the simulation and fabrication of an E-shaped microstrip patch antenna realized on multilayered polyester fabric suitable for WiMAX (Worldwide Interoperability for Microwave Access) applications. The main challenges while designing a textile antenna were to provide adequate thickness, surface uniformity and water wettability to the textile substrate. Here, three layers of polyester fabric were stacked together in order to obtain sufficient thickness, and were subsequently dip coated with polyvinyl butyral (PVB) solution. The PVB-coated polyester fabric showed a hydrophobic nature with a contact angle of 91°. The RMS roughness of the uncoated and PVB-coated polyester fabric was about 341 nm and 15 nm respectively. The promising properties, such as their flexibility, light weight and cost effectiveness, enable effortless integration of the proposed antenna into clothes like polyester jackets. Simulated and measured results in terms of return loss as well as gain were showcased to confirm the usefulness of the fabricated prototype. The fabricated antenna successfully operates at 3.37 GHz with a return loss of 21 dB and a maximum measured gain of 3.6 dB.
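For context, the reported 21 dB return loss can be converted to a reflection coefficient and VSWR with the standard relation |Γ| = 10^(−RL/20). The arithmetic below is illustrative and not part of the paper itself.

```python
def vswr_from_return_loss(rl_db: float) -> float:
    """Convert return loss (dB, positive convention) to VSWR.

    |Gamma| = 10**(-RL/20);  VSWR = (1 + |Gamma|) / (1 - |Gamma|).
    """
    gamma = 10 ** (-rl_db / 20)
    return (1 + gamma) / (1 - gamma)

# The fabricated antenna's 21 dB return loss corresponds to a VSWR of
# roughly 1.2, i.e. only about 0.8% of incident power is reflected.
print(round(vswr_from_return_loss(21.0), 2))
```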
Rollout Strategy to Implement Interoperable Traceability in the Seafood Industry.
Gooch, Martin; Dent, Benjamin; Sylvia, Gilbert; Cusack, Christopher
2017-08-01
Verifying the accuracy and rigor of data exchanged within and between businesses for the purposes of traceability rests on the existence of effective and efficient interoperable information systems that meet users' needs. Interoperability, particularly given the complexities intrinsic to the seafood industry, requires that the systems used by businesses operating along the supply chain share a common technology architecture that is robust, resilient, and evolves as industry needs change. Technology architectures are developed through engaging industry stakeholders in understanding why an architecture is required, the benefits provided to the industry and individual businesses and supply chains, and how the architecture will translate into practical results. This article begins by reiterating the benefits that the global seafood industry can capture by implementing interoperable chain-length traceability and the reason for basing the architecture on a peer-to-peer networked database concept versus more traditional centralized or linear approaches. A summary of capabilities that already exist within the seafood industry that the proposed architecture uses is discussed; and a strategy for implementing the architecture is presented. The 6-step strategy is presented in the form of a critical path. © 2017 Institute of Food Technologists®.
46 CFR 8.570 - Interim approval of prototype SIP company or vessel plans.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 46 Shipping 1 2010-10-01 2010-10-01 false Interim approval of prototype SIP company or vessel... of prototype SIP company or vessel plans. (a) A company operating under an approved prototype SIP... continue operating under the plans while revisions are developed to bring the prototype SIP company or...
NASA Technical Reports Server (NTRS)
Bradford, Robert N.; Nichols, Kelvin F.
2006-01-01
To date very little effort has been made to provide interoperability between various space agency projects. To effectively get to the Moon and beyond, systems must interoperate. To provide interoperability, standardization and registries of various technologies will be required; these registries will be created as they relate to space flight. With the new NASA Moon/Mars initiative, a requirement to standardize and control the naming conventions of very disparate systems and technologies is emerging. The need to provide numbering for the many processes, schemas, vehicles, robots, space suits and technologies (e.g. versions), to name a few, in the highly complex Constellation Initiative is imperative. The numbers of corporations, developer personnel, system interfaces and human interfaces will require standardization and registries on a scale not currently envisioned. It would take only one exception (stove-piped system development) to weaken, if not destroy, interoperability. To start, a standardized registry process must be defined that allows many differing engineers, organizations and operators to easily access disparate registry information across numerous technological and scientific disciplines. Once registries are standardized, the need to provide registry support in terms of setup and operations, resolution of conflicts between registries, and other issues will need to be addressed. Registries should not be confused with repositories: no end-user data is "stored" in a registry, nor is a registry a configuration control system. Once a registry standard is created and approved, the technologies that should be registered must be identified and prioritized. In this paper, we will identify and define a registry process that is compatible with the Constellation Initiative and other, unrelated space activities and organizations. We will then identify and define the various technologies that should use a registry to provide interoperability.
The first set of technologies will be those that are currently in need of expansion namely the assignment of satellite designations and the process which controls assignments. Second, we will analyze the technologies currently standardized under the Consultative Committee for Space Data Systems (CCSDS) banner. Third, we will analyze the current CCSDS working group and birds of a feather activities to ascertain registry requirements. Lastly, we will identify technologies that are either currently under the auspices of another
INL Control System Situational Awareness Technology Annual Report 2012
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gordon Rueff; Bryce Wheeler; Todd Vollmer
The overall goal of this project is to develop an interoperable set of tools to provide a comprehensive, consistent implementation of cyber security and overall situational awareness of control and sensor network implementations. The operation and interoperability of these tools will fill voids in current technological offerings and address issues that remain an impediment to the security of control systems. This report provides an FY 2012 update on the Sophia, Mesh Mapper, Intelligent Cyber Sensor, and Data Fusion projects with respect to the year-two tasks and annual reporting requirements of the INL Control System Situational Awareness Technology report (July 2010).
NASA Astrophysics Data System (ADS)
Thomas, Paul A.; Marshall, Gillian; Faulkner, David; Kent, Philip; Page, Scott; Islip, Simon; Oldfield, James; Breckon, Toby P.; Kundegorski, Mikolaj E.; Clark, David J.; Styles, Tim
2016-05-01
Currently, most land Intelligence, Surveillance and Reconnaissance (ISR) assets (e.g. EO/IR cameras) are simply data collectors. Understanding, decision making and sensor control are performed by the human operators, involving high cognitive load. Any automation in the system has traditionally involved bespoke design of centralised systems that are highly specific for the assets/targets/environment under consideration, resulting in complex, non-flexible systems that exhibit poor interoperability. We address a concept of Autonomous Sensor Modules (ASMs) for land ISR, where these modules have the ability to make low-level decisions on their own in order to fulfil a higher-level objective, and plug in, with the minimum of preconfiguration, to a High Level Decision Making Module (HLDMM) through a middleware integration layer. The dual requisites of autonomy and interoperability create challenges around information fusion and asset management in an autonomous hierarchical system, which are addressed in this work. This paper presents the results of a demonstration system, known as Sensing for Asset Protection with Integrated Electronic Networked Technology (SAPIENT), which was shown in realistic base protection scenarios with live sensors and targets. The SAPIENT system performed sensor cueing, intelligent fusion, sensor tasking, target hand-off and compensation for compromised sensors, without human control, and enabled rapid integration of ISR assets at the time of system deployment, rather than at design-time. Potential benefits include rapid interoperability for coalition operations, situation understanding with low operator cognitive burden and autonomous sensor management in heterogeneous sensor systems.
The Future of Geospatial Standards
NASA Astrophysics Data System (ADS)
Bermudez, L. E.; Simonis, I.
2016-12-01
The OGC is an international not-for-profit standards development organization (SDO) committed to making quality standards for the geospatial community. A community of more than 500 member organizations with more than 6,000 people registered at the OGC communication platform drives the development of standards that are freely available for anyone to use and to improve sharing of the world's geospatial data. OGC standards are applied in a variety of application domains including Environment, Defense and Intelligence, Smart Cities, Aviation, Disaster Management, Agriculture, Business Development and Decision Support, and Meteorology. Profiles help to apply information models to different communities, thus adapting to particular needs of that community while ensuring interoperability by using common base models and appropriate support services. Other standards address orthogonal aspects such as handling of Big Data, Crowd-sourced information, Geosemantics, or container for offline data usage. Like most SDOs, the OGC develops and maintains standards through a formal consensus process under the OGC Standards Program (OGC-SP) wherein requirements and use cases are discussed in forums generally open to the public (Domain Working Groups, or DWGs), and Standards Working Groups (SWGs) are established to create standards. However, OGC is unique among SDOs in that it also operates the OGC Interoperability Program (OGC-IP) to provide real-world testing of existing and proposed standards. The OGC-IP is considered the experimental playground, where new technologies are researched and developed in a user-driven process. Its goal is to prototype, test, demonstrate, and promote OGC Standards in a structured environment. Results from the OGC-IP often become requirements for new OGC standards or identify deficiencies in existing OGC standards that can be addressed. 
This presentation will provide an analysis of the work advanced in the OGC consortium, including standards and testbeds, from which we can extract trends for the future of geospatial standards. We see a number of key elements in focus, but simultaneously a broadening of standards to address particular communities' needs.
Report to Congress on Sustainable Ranges, 2011
2011-07-01
Interoperation of live participants and their operational systems. Realistic LVC representations of non-participant friendly warfighting capabilities ...across the full range of military operations (ROMO). Realistic LVC representations of opposing forces (OPFOR), neutral, and factional entities that ...entities. Suitable representations of the real world environment where the warfighting capabilities exist. (Table 2-2: Live, Virtual, and ...)
Doiron, Dany; Marcon, Yannick; Fortier, Isabel; Burton, Paul; Ferretti, Vincent
2017-10-01
Improving the dissemination of information on existing epidemiological studies and facilitating the interoperability of study databases are essential to maximizing the use of resources and accelerating improvements in health. To address this, Maelstrom Research proposes Opal and Mica, two interoperable open-source software packages providing out-of-the-box solutions for epidemiological data management, harmonization and dissemination. Opal and Mica are two standalone but interoperable web applications written in Java, JavaScript and PHP. They provide web services and modern user interfaces to access them. Opal allows users to import, manage, annotate and harmonize study data. Mica is used to build searchable web portals disseminating study and variable metadata. When used conjointly, Mica users can securely query and retrieve summary statistics on geographically dispersed Opal servers in real-time. Integration with the DataSHIELD approach allows conducting more complex federated analyses involving statistical models. Opal and Mica are open-source and freely available at [www.obiba.org] under a General Public License (GPL) version 3, and the metadata models and taxonomies that accompany them are available under a Creative Commons licence. © The Author 2017; all rights reserved. Published by Oxford University Press on behalf of the International Epidemiological Association
Designing for Change: Interoperability in a scaling and adapting environment
NASA Astrophysics Data System (ADS)
Yarmey, L.
2015-12-01
The Earth Science cyberinfrastructure landscape is constantly changing. Technologies advance and technical implementations are refined or replaced. Data types, volumes, packaging, and use cases evolve. Scientific requirements emerge and mature. Standards shift while systems scale and adapt. In this complex and dynamic environment, interoperability remains a critical component of successful cyberinfrastructure. Through the resource- and priority-driven iterations on systems, interfaces, and content, questions fundamental to stable and useful Earth Science cyberinfrastructure arise. For instance, how are sociotechnical changes planned, tracked, and communicated? How should operational stability balance against 'new and shiny'? How can ongoing maintenance and mitigation of technical debt be managed in an often short-term resource environment? The Arctic Data Explorer is a metadata brokering application developed to enable discovery of international, interdisciplinary Arctic data across distributed repositories. Completely dependent on interoperable third party systems, the Arctic Data Explorer publicly launched in 2013 with an original 3000+ data records from four Arctic repositories. Since then the search has scaled to 25,000+ data records from thirteen repositories at the time of writing. In the final months of original project funding, priorities shift to lean operations with a strategic eye on the future. Here we present lessons learned from four years of Arctic Data Explorer design, development, communication, and maintenance work along with remaining questions and potential directions.
Data interoperability software solution for emergency reaction in the Europe Union
NASA Astrophysics Data System (ADS)
Casado, R.; Rubiera, E.; Sacristan, M.; Schütte, F.; Peters, R.
2014-09-01
Emergency management becomes more challenging in international crisis episodes because of cultural, semantic and linguistic differences between all stakeholders, especially first responders. Misunderstandings between first responders make decision-making slower and more difficult. However, the spread and development of networks and IT-based Emergency Management Systems (EMS) has improved emergency responses, making them more coordinated. Despite improvements made in recent years, EMS have still not solved the problems arising from cultural, semantic and linguistic differences, which are the real cause of slower decision-making. In addition, from a technical perspective, the consolidation of current EMS and the different formats used to exchange information pose another problem to be solved by any solution proposed for information interoperability between heterogeneous EMS in different contexts. To overcome these problems we present a software solution based on semantic and mediation technologies. EMERGency ELements (EMERGEL) (Fundacion CTIC and AntwortING Ingenieurbüro PartG 2013), a common and modular ontology shared by all stakeholders, has been defined. It offers the best solution to gather all stakeholders' knowledge in a unique and flexible data model, taking into account the cultural and linguistic issues of different countries. To deal with the diversity of data protocols and formats, we have designed a Service Oriented Architecture for Data Interoperability (named DISASTER), providing a flexible, extensible solution to the mediation issues. Web Services have been adopted as the specific technology to implement this paradigm, as the one with the most significant academic and industrial visibility and attraction. The contributions of this work have been validated through the design and development of a realistic cross-border prototype scenario, actively involving both emergency managers and emergency first responders: the Netherlands-Germany border fire.
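The mediation idea can be sketched as a two-step lookup through a shared concept. The concept identifiers and terms below are invented for illustration and are not EMERGEL's actual vocabulary.

```python
# Illustrative only: the 'emergel:' identifiers and term tables below
# are made up; the real ontology classes are not reproduced here.
CONCEPTS = {
    "emergel:FireEngine": {
        "en": "fire engine",
        "de": "Löschfahrzeug",
        "nl": "brandweerwagen",
    },
}
LOCAL_TERMS = {
    ("de", "Löschfahrzeug"): "emergel:FireEngine",
    ("nl", "brandweerwagen"): "emergel:FireEngine",
}

def mediate(term: str, source_lang: str, target_lang: str) -> str:
    """Map a responder's local term to a shared concept, then render it
    in the target agency's vocabulary - the mediation pattern described
    in the abstract, reduced to dictionary lookups."""
    concept = LOCAL_TERMS[(source_lang, term)]
    return CONCEPTS[concept][target_lang]

print(mediate("Löschfahrzeug", "de", "nl"))
```

Routing every translation through the shared concept, rather than keeping pairwise dictionaries, is what keeps the approach modular as languages and agencies are added.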
From Information Management to Information Visualization
Karami, Mahtab
2016-01-01
Summary. Objective: The development and implementation of a dashboard of medical imaging department (MID) performance indicators. Method: Several articles discussing performance measures of imaging departments were reviewed, and all related measures were extracted. A panel of imaging experts was then asked to rate these measures, with an open-ended question to elicit further potential indicators; a second round was performed to confirm the ratings. The indicators and their ratings were then reviewed by an executive panel, and based on the final panel's ratings a list of indicators to be used was developed. A team of information technology consultants was asked to determine a set of user interface requirements for building the dashboard; in the first round, based on the panel's rating, a list of main features and requirements was determined. Qlikview was then utilized to implement the dashboard and visualize a set of selected KPI metrics. Finally, an evaluation of the dashboard was performed. Results: 92 MID indicators were identified, along with 53 main user interface requirements for building the dashboard prototype. The project team successfully implemented a prototype of the radiology management dashboard at the study site. The visual display that was designed was rated highly by users. Conclusion: To develop a dashboard, management of information is essential. It is recommended that a quality map be designed for the MID; it can be used to specify the sequence of activities, their related indicators, and the data required for calculating those indicators. To achieve both an effective dashboard and a comprehensive view of operations, it is necessary to design a data warehouse for gathering data from a variety of systems. Utilizing interoperability standards for exchanging data among different systems can also be effective in this regard. PMID:27437043
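One such performance indicator can be sketched as follows. The field names and the choice of report turnaround time as the example KPI are assumptions for illustration, not taken from the paper's indicator list.

```python
from datetime import datetime

def mean_turnaround_hours(exams) -> float:
    """Mean exam-completed-to-report-verified interval in hours.

    A typical imaging-department KPI of the kind a dashboard would
    surface; field names here are hypothetical.
    """
    deltas = [
        (e["report_verified"] - e["exam_completed"]).total_seconds() / 3600
        for e in exams
    ]
    return sum(deltas) / len(deltas)

# Two toy exam records drawn from an imagined data warehouse extract.
exams = [
    {"exam_completed": datetime(2016, 1, 4, 9, 0),
     "report_verified": datetime(2016, 1, 4, 13, 0)},   # 4 h
    {"exam_completed": datetime(2016, 1, 4, 10, 0),
     "report_verified": datetime(2016, 1, 4, 12, 0)},   # 2 h
]
print(mean_turnaround_hours(exams))  # -> 3.0
```

Computing each indicator from timestamps in a central warehouse, rather than in each source system, is what makes the dashboard's numbers consistent across systems.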
NASA Technical Reports Server (NTRS)
Hall, Edward; Magner, James
2011-01-01
This report is provided as part of ITT's NASA Glenn Research Center Aerospace Communication Systems Technical Support (ACSTS) contract NNC05CA85C, Task 7: New ATM Requirements-Future Communications, C-Band and L-Band Communications Standard Development, and was based on direction provided by FAA project-level agreements for New ATM Requirements-Future Communications. Task 7 included two subtasks. Subtask 7-1 addressed C-band (5091- to 5150-MHz) airport surface data communications standards development, systems engineering, test bed and prototype development, and tests and demonstrations to establish operational capability for the Aeronautical Mobile Airport Communications System (AeroMACS). Subtask 7-2 focused on systems engineering and development support of the L-band digital aeronautical communications system (L-DACS). Subtask 7-1 consisted of two phases. Phase I included development of AeroMACS concepts of use, requirements, architecture, and initial high-level safety risk assessment. Phase II builds on Phase I results and is presented in two volumes. Volume I is devoted to concepts of use, system requirements, and architecture, including AeroMACS design considerations. Volume II (this document) describes an AeroMACS prototype evaluation and presents final AeroMACS recommendations. This report also describes airport categorization and channelization methodologies. The purposes of the airport categorization task were (1) to facilitate initial AeroMACS architecture designs and enable budgetary projections by creating a set of airport categories based on common airport characteristics and design objectives, and (2) to offer high-level guidance to potential AeroMACS technology and policy development sponsors and service providers. A channelization plan methodology was developed because a common global methodology is needed to assure seamless interoperability among diverse AeroMACS services potentially supplied by multiple service providers.
NASA Technical Reports Server (NTRS)
Hall, Edward; Isaacs, James; Henriksen, Steve; Zelkin, Natalie
2011-01-01
This report is provided as part of ITT's NASA Glenn Research Center Aerospace Communication Systems Technical Support (ACSTS) contract NNC05CA85C, Task 7: New ATM Requirements-Future Communications, C-Band and L-Band Communications Standard Development, and was based on direction provided by FAA project-level agreements for New ATM Requirements-Future Communications. Task 7 included two subtasks. Subtask 7-1 addressed C-band (5091- to 5150-MHz) airport surface data communications standards development, systems engineering, test bed and prototype development, and tests and demonstrations to establish operational capability for the Aeronautical Mobile Airport Communications System (AeroMACS). Subtask 7-2 focused on systems engineering and development support of the L-band digital aeronautical communications system (L-DACS). Subtask 7-1 consisted of two phases. Phase I included development of AeroMACS concepts of use, requirements, architecture, and initial high-level safety risk assessment. Phase II builds on Phase I results and is presented in two volumes. Volume I (this document) is devoted to concepts of use, system requirements, and architecture, including AeroMACS design considerations. Volume II describes an AeroMACS prototype evaluation and presents final AeroMACS recommendations. This report also describes airport categorization and channelization methodologies. The purposes of the airport categorization task were (1) to facilitate initial AeroMACS architecture designs and enable budgetary projections by creating a set of airport categories based on common airport characteristics and design objectives, and (2) to offer high-level guidance to potential AeroMACS technology and policy development sponsors and service providers. A channelization plan methodology was developed because a common global methodology is needed to ensure seamless interoperability among diverse AeroMACS services potentially supplied by multiple service providers.
Mahoney, Diane Feeney; Burleson, Winslow; Lozano, Cecil; Ravishankar, Vijay; Mahoney, Edward Leo
2015-01-01
Background Prior research has critiqued the lack of attention to the stressors associated with dementia-related dressing issues, stigmatizing patient clothing, and wearable technology challenges. This paper describes the conceptual development and feasibility testing of an innovative ‘smart dresser’ context-aware affective system (DRESS) to enable dressing by people with moderate memory loss through individualized audio and visual task prompting in real time. Methods Mixed-methods feasibility study involving qualitative focus groups with 25 Alzheimer’s family caregivers experiencing dressing difficulties to iteratively inform system design, and a quantitative usability trial with 10 healthy subjects in a controlled laboratory setting to assess validity of technical operations. Results Caregivers voiced the need for tangible dressing assistance to reduce their frustration from time spent in repetitive cueing and power struggles over dressing. They contributed six changes that influenced the prototype development, most notably adding a dresser-top iPad to mimic a familiar ‘TV screen’ for the audio and visual cueing. DRESS demonstrated promising overall functionality; however, the validity of identification of dressing status ranged from 0% for correct pants dressing to 100% for all shirt-dressing scenarios. Adjustments were made to the detection components of the system, raising the accuracy of detection of all acted dressing scenarios for pants from 50% to 82%. Conclusions Findings demonstrate family caregiver acceptability of the proposed system, the successful interoperability of the built system’s components, and the system’s ability to interpret correct and incorrect dressing actions in controlled laboratory simulations. Future research will advance the system to the alpha stage and subsequent testing with end users in real-world settings. PMID:26321895
Rea, Susan; Pathak, Jyotishman; Savova, Guergana; Oniki, Thomas A; Westberg, Les; Beebe, Calvin E; Tao, Cui; Parker, Craig G; Haug, Peter J; Huff, Stanley M; Chute, Christopher G
2012-08-01
The Strategic Health IT Advanced Research Projects (SHARP) Program, established by the Office of the National Coordinator for Health Information Technology in 2010, supports research findings that remove barriers to increased adoption of health IT. The improvements envisioned by the SHARP Area 4 Consortium (SHARPn) will enable the use of the electronic health record (EHR) for secondary purposes, such as care process and outcomes improvement, biomedical research and epidemiologic monitoring of the nation's health. One of the primary informatics problem areas in this endeavor is the standardization of disparate health data from the nation's many health care organizations and providers. The SHARPn team is developing open source services and components to support the ubiquitous exchange, sharing and reuse or 'liquidity' of operational clinical data stored in electronic health records. One year into the design and development of the SHARPn framework, we demonstrated end-to-end data flow and a prototype SHARPn platform, using thousands of patient electronic records sourced from two large healthcare organizations: Mayo Clinic and Intermountain Healthcare. The platform was deployed to (1) receive source EHR data in several formats, (2) generate structured data from EHR narrative text, and (3) normalize the EHR data using common detailed clinical models and Consolidated Health Informatics standard terminologies, which were (4) accessed by a phenotyping service using normalized data specifications. The architecture of this prototype SHARPn platform is presented. The EHR data throughput demonstration showed success in normalizing native EHR data, both structured and narrative, from two independent organizations and EHR systems. Based on the demonstration, observed challenges for standardization of EHR data for interoperable secondary use are discussed. Copyright © 2012 Elsevier Inc. All rights reserved.
Rea, Susan; Pathak, Jyotishman; Savova, Guergana; Oniki, Thomas A.; Westberg, Les; Beebe, Calvin E.; Tao, Cui; Parker, Craig G.; Haug, Peter J.; Huff, Stanley M.; Chute, Christopher G.
2016-01-01
The Strategic Health IT Advanced Research Projects (SHARP) Program, established by the Office of the National Coordinator for Health Information Technology in 2010, supports research findings that remove barriers to increased adoption of health IT. The improvements envisioned by the SHARP Area 4 Consortium (SHARPn) will enable the use of the electronic health record (EHR) for secondary purposes, such as care process and outcomes improvement, biomedical research and epidemiologic monitoring of the nation’s health. One of the primary informatics problem areas in this endeavor is the standardization of disparate health data from the nation’s many health care organizations and providers. The SHARPn team is developing open source services and components to support the ubiquitous exchange, sharing and reuse or ‘liquidity’ of operational clinical data stored in electronic health records. One year into the design and development of the SHARPn framework, we demonstrated end-to-end data flow and a prototype SHARPn platform, using thousands of patient electronic records sourced from two large healthcare organizations: Mayo Clinic and Intermountain Healthcare. The platform was deployed to (1) receive source EHR data in several formats, (2) generate structured data from EHR narrative text, and (3) normalize the EHR data using common detailed clinical models and Consolidated Health Informatics standard terminologies, which were (4) accessed by a phenotyping service using normalized data specifications. The architecture of this prototype SHARPn platform is presented. The EHR data throughput demonstration showed success in normalizing native EHR data, both structured and narrative, from two independent organizations and EHR systems. Based on the demonstration, observed challenges for standardization of EHR data for interoperable secondary use are discussed. PMID:22326800
Advanced Communications Technology Satellite Now Operating in an Inclined Orbit
NASA Technical Reports Server (NTRS)
Bauer, Robert A.
1999-01-01
The Advanced Communications Technology Satellite (ACTS) system has been modified to support operation in an inclined orbit that is virtually transparent to users, and plans are to continue this final phase of its operation through September 2000. The next 2 years of ACTS will provide a new opportunity for using the technologies that this system brought online over 5 years ago and that are still being used to resolve the technical issues that face NASA and the satellite industry in the area of seamless networking and interoperability with terrestrial systems. New goals for ACTS have been defined that align the program with recent changes in NASA and industry. ACTS will be used as a testbed to (1) show how NASA and other Government agencies can use commercial systems for future support of their operations; (2) test, characterize, and resolve technical issues in using advanced communications protocols such as asynchronous transfer mode (ATM) and transmission control protocol/Internet protocol (TCP/IP) over long-latency links, as found when interoperating satellites with terrestrial systems; (3) evaluate narrow-spot-beam Ka-band satellite operation in an inclined orbit; and (4) verify Ka-band satellite technologies, since no other Ka-band system is yet available in the United States.
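The long-latency protocol issue named in goal (2) above turns on a concrete constraint: over a geostationary link, window-limited TCP throughput is the window size divided by the round-trip time. A minimal sketch of that arithmetic, using textbook orbital geometry rather than any ACTS measurement:

```python
# Bandwidth-delay product for a geostationary satellite link.
# Illustrative textbook values, not ACTS experiment data.

C = 299_792_458            # speed of light, m/s
GEO_ALT_M = 35_786_000     # geostationary altitude, m

# Up + down per hop, twice per round trip (data out, ack back),
# ignoring ground-segment and processing delay.
rtt_s = 4 * GEO_ALT_M / C  # ~0.477 s

def max_tcp_throughput(window_bytes: int, rtt: float) -> float:
    """Window-limited TCP throughput in bits per second."""
    return window_bytes * 8 / rtt

print(f"RTT ~= {rtt_s * 1000:.0f} ms")
# Classic 64 KiB window without window scaling:
print(f"64 KiB window -> {max_tcp_throughput(65535, rtt_s) / 1e6:.2f} Mbit/s")

# Window needed to fill a 155 Mbit/s (OC-3-class) ATM link:
bdp_bytes = 155e6 * rtt_s / 8
print(f"BDP at 155 Mbit/s: {bdp_bytes / 1e6:.1f} MB")
```

The roughly 9 MB of in-flight data needed to fill an OC-3-class link dwarfs the classic 64 KiB TCP window, which is why window scaling and protocol tuning were central to satellite-terrestrial interoperability experiments of this kind.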
Connecting Hazard Analysts and Risk Managers to Sensor Information.
Le Cozannet, Gonéri; Hosford, Steven; Douglas, John; Serrano, Jean-Jacques; Coraboeuf, Damien; Comte, Jérémie
2008-06-11
Hazard analysts and risk managers of natural perils, such as earthquakes, landslides and floods, need to access information from sensor networks surveying their regions of interest. However, information about these networks is currently difficult to obtain and is available in varying formats, thereby restricting access and possibly leading to decision-making based on limited information. In response to this issue, state-of-the-art interoperable catalogues are currently being developed within the framework of the Group on Earth Observations (GEO) workplan. This article provides an overview of the prototype catalogue that was developed to improve access to information about the sensor networks surveying geological hazards (geohazards), such as earthquakes, landslides and volcanoes.
Connecting Hazard Analysts and Risk Managers to Sensor Information
Le Cozannet, Gonéri; Hosford, Steven; Douglas, John; Serrano, Jean-Jacques; Coraboeuf, Damien; Comte, Jérémie
2008-01-01
Hazard analysts and risk managers of natural perils, such as earthquakes, landslides and floods, need to access information from sensor networks surveying their regions of interest. However, information about these networks is currently difficult to obtain and is available in varying formats, thereby restricting access and possibly leading to decision-making based on limited information. In response to this issue, state-of-the-art interoperable catalogues are currently being developed within the framework of the Group on Earth Observations (GEO) workplan. This article provides an overview of the prototype catalogue that was developed to improve access to information about the sensor networks surveying geological hazards (geohazards), such as earthquakes, landslides and volcanoes. PMID:27879915
2006-09-01
Control Force Agility Shared Situational Awareness Attentional Demand Interoperability Network Based Operations Effect Based Operations Speed of...Command Self Synchronization Reach Back Reach Forward Information Superiority Increased Mission Effectiveness Humansystems® Team Modelling...communication effectiveness and Distributed Mission Training (DMT) effectiveness. The NASA Ames Centre - Distributed Research Facilities platform could
2009-06-12
Phasing Model ... Figure 2. The Continuum of...the communist periphery. In a high-intensity conflict, doctrine at the time called for conventional forces to fight the traditional, linear fight...operations and proximity of cross-component forces in a non-linear battlespace – rigid business rules, translator applications, or manual workarounds to
Interoperable PKI Data Distribution in Computational Grids
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pala, Massimiliano; Cholia, Shreyas; Rea, Scott A.
One of the most successful working examples of virtual organizations, computational grids need authentication mechanisms that inter-operate across domain boundaries. Public Key Infrastructures (PKIs) provide sufficient flexibility to allow resource managers to securely grant access to their systems in such distributed environments. However, as PKIs grow and services are added to enhance both security and usability, users and applications must struggle to discover available resources, particularly when the Certification Authority (CA) is alien to the relying party. This article presents how to overcome these limitations of the current grid authentication model by integrating the PKI Resource Query Protocol (PRQP) into the Grid Security Infrastructure (GSI).
Search for supporting methodologies - Or how to support SEI for 35 years
NASA Technical Reports Server (NTRS)
Handley, Thomas H., Jr.; Masline, Richard C.
1991-01-01
Concepts relevant to the development of an evolvable information management system are examined in terms of support for the Space Exploration Initiative. The issues of interoperability within NASA and industry initiatives are studied, including the Open Systems Interconnection standard and the operating system of the Open Software Foundation. The requirements of partitioning functionality into separate areas are determined, with attention given to the infrastructure required to ensure system-wide compliance. The need for a decision-making context is key to the distributed implementation of the program, and this environment is concluded to be the next step in developing an evolvable, interoperable, and securable support network.
NASA Technical Reports Server (NTRS)
2006-01-01
The Global Change Master Directory (GCMD) has been one of the best-known Earth science and global change data discovery online resources throughout its extended operational history. The growing popularity of the system since its introduction on the World Wide Web in 1994 has created an environment where resolving issues of scalability, security, and interoperability has been critical to providing the best available service to the users and partners of the GCMD. Innovative approaches developed at the GCMD in these areas will be presented with a focus on how they relate to current and future GO-ESSP community needs.
Watershed and Economic Data InterOperability (WEDO) System
Hydrologic modeling is essential for environmental, economic, and human health decision-making. However, sharing of modeling studies is limited within the watershed modeling community. Distribution of hydrologic modeling research typically involves publishing summarized data in p...
Multi-view Decision Making (MVDM) Workshop
2009-02-01
reflect the realities of system-of-systems development, acquisition, fielding and support: multi-view decision making (MVDM). MVDM addresses the...including mission risk, interoperable acquisition, and operational security and survivability. Hence, a multi-view approach to decision making is
Watershed and Economic Data InterOperability (WEDO) System (presentation)
Hydrologic modeling is essential for environmental, economic, and human health decision- making. However, sharing of modeling studies is limited within the watershed modeling community. Distribution of hydrologic modeling research typically involves publishing summarized data in ...
Interoperability Assets for Patient Summary Components: A Gap Analysis.
Heitmann, Kai U; Cangioli, Giorgio; Melgara, Marcello; Chronaki, Catherine
2018-01-01
The International Patient Summary (IPS) standards aim to define the specifications for a minimal and non-exhaustive Patient Summary, which is specialty-agnostic and condition-independent, but still clinically relevant. Meanwhile, health systems are developing and implementing their own variations of a patient summary while the eHealth Digital Services Infrastructure (eHDSI) initiative is deploying patient summary services across countries in Europe. In the spirit of co-creation, flexible governance, and continuous alignment advocated by eStandards, the Trillium-II initiative promotes adoption of the patient summary by engaging standards organizations and interoperability practitioners in a community of practice for digital health to share best practices, tools, data, specifications, and experiences. This paper compares operational aspects of patient summaries in 14 case studies in Europe, the United States, and across the world, focusing on how patient summary components are used in practice, to promote alignment and a joint understanding that will improve the quality of standards and lower the costs of interoperability.
NASA and Industry Benefits of ACTS High Speed Network Interoperability Experiments
NASA Technical Reports Server (NTRS)
Zernic, M. J.; Beering, D. R.; Brooks, D. E.
2000-01-01
This paper provides synopses of the design, implementation, and results of key high-data-rate communications experiments utilizing the technologies of NASA's Advanced Communications Technology Satellite (ACTS). Specifically, the network protocol and interoperability performance aspects will be highlighted. The objectives of these key experiments will be discussed in their relevant context to NASA missions, as well as to the broader communications industry. Discussion of the experiment implementation will highlight the technical aspects of hybrid network connectivity, a variety of high-speed interoperability architectures, a variety of network node platforms, protocol layers, Internet-based applications, and new work focused on distinguishing between link errors and congestion. In addition, this paper describes the impact of leveraging government-industry partnerships to achieve technical progress and forge synergistic relationships. These relationships will be the key to success as NASA seeks to combine commercially available technology with its own internal technology developments to realize more robust and cost-effective communications for space operations.
Integrated Nationwide Electronic Health Records system: Semi-distributed architecture approach.
Fragidis, Leonidas L; Chatzoglou, Prodromos D; Aggelidis, Vassilios P
2016-11-14
The integration of heterogeneous electronic health record systems by building an interoperable nationwide electronic health record system provides undisputable benefits in health care, such as superior health information quality, medical error prevention, and cost savings. This paper proposes a semi-distributed system architecture approach for an integrated national electronic health record system, incorporating the advantages of the two dominant approaches: the centralized architecture and the distributed architecture. The high-level design of the main elements for the proposed architecture is provided, along with diagrams of execution and operation and the data synchronization architecture for the proposed solution. The proposed approach effectively handles issues related to redundancy, consistency, security, privacy, availability, load balancing, maintainability, complexity and interoperability of citizens' health data. The proposed semi-distributed architecture offers a robust interoperability framework without requiring healthcare providers to change their local EHR systems. It is a pragmatic approach that takes into account the characteristics of the Greek national healthcare system along with the national public administration data communication network infrastructure, for achieving EHR integration with acceptable implementation cost.
Family of Beyond Line-of-Sight - Terminals (FAB-T)
2013-12-01
Interoperable with the AEHF, APS, Milstar, and UFO-E/EE. Milstar connectivity has been extensively tested; partial AEHF on-orbit testing has been conducted...Program SR-3300. This performance parameter only applies to the CPT configuration. 8. Interoperability with UFO/E and UFO/EE is predicated on
The Combat Cloud: Enabling Multi-Domain Command and Control Across the Range of Military Operations
2017-03-01
and joint by their very nature.3 The Combat Cloud architecture will enable MDC2 by increasing the interoperability of existing networks...order to provide operating platforms with a robust architecture that communicates with relevant players, operates at reduced levels of connectivity...responsibility or aircraft platform, and a Combat Cloud architecture helps focus thought toward achieving efficient MDC2 and effects rather than
Small Countries’ Special Operations Forces Contribution to the NATO Response Force
2014-06-13
tasks: Military Assistance, Direct Action and Special Reconnaissance. The Jackal Stone exercise serves as a bedrock designed to build special...operations capabilities and improve interoperability among European partner nations. In Jackal Stone 2012, Army Major General Michael S. Repass, Commander...coalition or NATO operation, and the significance of teaming up with another capable partner. Exercise Jackal Stone is an annual event and it is
NASA Astrophysics Data System (ADS)
Arko, R. A.; Stocks, K.; Chandler, C. L.; Smith, S. R.; Miller, S. P.; Maffei, A. R.; Glaves, H. M.; Carbotte, S. M.
2013-12-01
The U.S. National Science Foundation supports a fleet of academic research vessels operating throughout the world's oceans. In addition to supporting the mission-specific goals of each expedition, these vessels routinely deploy a suite of underway environmental sensors, operating like mobile observatories. Recognizing that the data from these instruments have value beyond each cruise, NSF funded the Rolling Deck to Repository (R2R) program in 2009 to ensure that these data are routinely captured, cataloged and described, and submitted to the appropriate national repository for long-term public access. In 2013, R2R joined the Ocean Data Interoperability Platform (ODIP; http://odip.org/). The goal of ODIP is to remove barriers to the effective sharing of data across scientific domains and international boundaries, by providing a forum to harmonize diverse regional systems. To advance this goal, ODIP organizes international workshops to foster the development of common standards and develop prototypes to evaluate and test potential standards and interoperability solutions. ODIP includes major organizations engaged in ocean data stewardship in the EU, US, and Australia, supported by the International Oceanographic Data and Information Exchange (IODE). Within the broad scope of ODIP, R2R focuses on contributions in four key areas: ● Implement a 'Linked Open Data' approach to disseminate data and documentation, using existing World Wide Web Consortium (W3C) specifications and machine-readable formats. Exposing content as Linked Open Data will provide a simple mechanism for ODIP collaborators to browse and compare data sets among repositories. ● Map key vocabularies used by R2R to their European and Australian counterparts. The existing heterogeneity among terms inhibits data discoverability, as a user searching on the term with which s/he is familiar may not find all data of interest. 
Mapping key terms across the different ODIP partners, relying on the backbone thesaurus provided by the NERC Vocabulary Server (http://vocab.nerc.ac.uk/), is a first step towards wider data discoverability. ● Upgrade existing R2R ISO metadata records to be compatible with the new SeaDataNet II Cruise Summary Report (CSR) profile, and publish the records in a standards-compliant Web portal, built on the GeoNetwork open-source package. ● Develop the future workforce. R2R is enlisting and exposing a group of five students to new informatics technologies and international collaboration. Students are undertaking a coordinated series of projects in 2013 and 2014 at each of the R2R partner institutions, combined with travel to selected meetings where they will engage in the ODIP Workshop process; present results; and exchange ideas with working scientists and software developers in Europe and Australia. Students work closely with staff at the R2R partner institutions, in projects that build on the R2R-ODIP technical work components described above.
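The 'Linked Open Data' contribution described above amounts to publishing catalog records as RDF triples in W3C machine-readable formats, with local terms linked to shared vocabulary concepts. A minimal stdlib-only sketch of the idea; the URIs, property names, and cruise identifier are illustrative assumptions, not the actual R2R vocabulary:

```python
# Sketch: expose a cruise record as Linked Open Data in N-Triples,
# a W3C machine-readable RDF serialization. All example.org URIs and
# the cruise ID are hypothetical, invented for illustration.

CRUISE = "http://example.org/cruise/KN195-05"     # hypothetical cruise ID
DCT = "http://purl.org/dc/terms/"                 # Dublin Core terms
NERC = "http://vocab.nerc.ac.uk/collection/L05/current/130/"

triples = [
    (CRUISE, "http://www.w3.org/1999/02/22-rdf-syntax-ns#type",
     "http://example.org/schema/Cruise"),
    (CRUISE, DCT + "title", '"Example expedition"'),
    # Link a local instrument term to a shared vocabulary concept,
    # the kind of alignment the NERC Vocabulary Server backbone enables.
    (CRUISE, "http://example.org/schema/usedInstrument", NERC),
]

def to_ntriples(trps):
    """Serialize (subject, predicate, object) tuples to N-Triples.
    Literal objects arrive pre-quoted; everything else is a URI."""
    lines = []
    for s, p, o in trps:
        obj = o if o.startswith('"') else f"<{o}>"
        lines.append(f"<{s}> <{p}> {obj} .")
    return "\n".join(lines)

print(to_ntriples(triples))
```

Because every subject and object is a resolvable URI, a harvester at another ODIP repository can follow the links and merge records without agreeing on a shared database schema first, which is exactly the browse-and-compare mechanism the abstract describes.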
Secure and interoperable communication infrastructures for PPDR organisations
NASA Astrophysics Data System (ADS)
Müller, Wilmuth; Marques, Hugo; Pereira, Luis; Rodriguez, Jonathan; Brouwer, Frank; Bouwers, Bert; Politis, Ilias; Lykourgiotis, Asimakis; Ladas, Alexandros; Adigun, Olayinka; Jelenc, David
2016-05-01
The growing number of events affecting public safety and security (PS&S) on a regional scale, with the potential to grow into large-scale cross-border disasters, puts increased pressure on the agencies and organisations responsible for PS&S. In order to respond timely and adequately to such events, Public Protection and Disaster Relief (PPDR) organisations need to cooperate, align their procedures and activities, share the needed information and be interoperable. Existing PPDR/PMR technologies, such as TETRA, TETRAPOL or P25, do not currently provide broadband capability, nor are such technologies expected to be upgraded in the future. This presents a major limitation in supporting new services and information flows. Furthermore, there is no known standard that addresses interoperability of these technologies. In this contribution the design of a next-generation communication infrastructure for PPDR organisations, which fulfills the requirements of secure and seamless end-to-end communication and interoperable information exchange within the deployed communication networks, is presented. Based on the Enterprise Architecture of PPDR organisations, a next-generation PPDR network that is backward compatible with legacy communication technologies is designed and implemented, capable of providing security, privacy, seamless mobility, QoS and reliability support for mission-critical Private Mobile Radio (PMR) voice and broadband data services. The designed solution provides a robust, reliable, and secure mobile broadband communications system for a wide variety of PMR applications and services on PPDR broadband networks, including the ability of inter-system, inter-agency and cross-border operations, with emphasis on interoperability between users in PMR and LTE.
Interoperability: A Necessary Means Towards Operational Success in NATO.
1987-05-05
the Germans reorganized along the Chir river to make preparations for the conduct of a relief operation for 6th Army. The 6th Army did make plans to...was forced to alter the original offensive plan (Operation Saturn) when operations at the Chir river failed and the German relief effort commenced...April 1986. Higgins, George A., MAJ, US Army. Operational Tenets of Generals Heinz Guderian and George S. Patton, Jr. MMAS Thesis, Ft. Leavenworth
Facilitating secondary use of medical data by using openEHR archetypes.
Kohl, Christian D; Garde, Sebastian; Knaup, Petra
2010-01-01
Clinical trials are of high importance for medical progress. But even though more and more clinical data is available in electronic patient records (EPRs) and more and more electronic data capture (EDC) systems are used in trials, there is still a gap which makes EPR / EDC interoperability difficult and hampers secondary use of medical routine data. The openEHR architecture for Electronic Health Records is based on a two level modeling approach which makes use of 'archetypes'. We want to analyze whether archetypes can help to bridge this gap by building an integrated EPR / EDC system based on openEHR archetypes. We used the 'openEHR Reference Framework and Application' (Opereffa) and existing archetypes for medical data. Furthermore, we developed dedicated archetypes to document study meta data. We developed a first prototype implementation of an archetype based integrated EPR / EDC system. Next steps will be the evaluation of an extended prototype in a real clinical trial scenario. Opereffa was a good starting point for our work. OpenEHR archetypes proved useful for secondary use of health data.
Prototype development and demonstration for integrated dynamic transit operations.
DOT National Transportation Integrated Search
2016-01-01
This document serves as the Final Report specific to the Integrated Dynamic Transit Operations (IDTO) Prototype Development and Deployment Project, hereafter referred to as IDTO Prototype Deployment or IDTO PD project. This project was performed unde...
Integrated Semantics Service Platform for the Internet of Things: A Case Study of a Smart Office
Ryu, Minwoo; Kim, Jaeho; Yun, Jaeseok
2015-01-01
The Internet of Things (IoT) allows machines and devices in the world to connect with each other and generate a huge amount of data, which has a great potential to provide useful knowledge across service domains. Combining the context of IoT with semantic technologies, we can build integrated semantic systems to support semantic interoperability. In this paper, we propose an integrated semantic service platform (ISSP) to support ontological models in various IoT-based service domains of a smart city. In particular, we address three main problems for providing integrated semantic services together with IoT systems: semantic discovery, dynamic semantic representation, and semantic data repository for IoT resources. To show the feasibility of the ISSP, we develop a prototype service for a smart office using the ISSP, which can provide a preset, personalized office environment by interpreting user text input via a smartphone. We also discuss a scenario to show how the ISSP-based method would help build a smart city, where services in each service domain can discover and exploit IoT resources that are wanted across domains. We expect that our method could eventually contribute to providing people in a smart city with more integrated, comprehensive services based on semantic interoperability. PMID:25608216
Integrated semantics service platform for the Internet of Things: a case study of a smart office.
Ryu, Minwoo; Kim, Jaeho; Yun, Jaeseok
2015-01-19
The Internet of Things (IoT) allows machines and devices in the world to connect with each other and generate a huge amount of data, which has a great potential to provide useful knowledge across service domains. Combining the context of IoT with semantic technologies, we can build integrated semantic systems to support semantic interoperability. In this paper, we propose an integrated semantic service platform (ISSP) to support ontological models in various IoT-based service domains of a smart city. In particular, we address three main problems for providing integrated semantic services together with IoT systems: semantic discovery, dynamic semantic representation, and semantic data repository for IoT resources. To show the feasibility of the ISSP, we develop a prototype service for a smart office using the ISSP, which can provide a preset, personalized office environment by interpreting user text input via a smartphone. We also discuss a scenario to show how the ISSP-based method would help build a smart city, where services in each service domain can discover and exploit IoT resources that are wanted across domains. We expect that our method could eventually contribute to providing people in a smart city with more integrated, comprehensive services based on semantic interoperability.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-10
... New Hampshire InterOperability Laboratory, Durham, NH; Verimatrix, Inc., San Diego, CA; Vobile, Inc... Inc., San Diego, CA; Vidiator, Bellevue, WA; Virtual Logix, Montigny-le Bretonneux, France; Vishwak...
Rapid prototyping of soil moisture estimates using the NASA Land Information System
NASA Astrophysics Data System (ADS)
Anantharaj, V.; Mostovoy, G.; Li, B.; Peters-Lidard, C.; Houser, P.; Moorhead, R.; Kumar, S.
2007-12-01
The Land Information System (LIS), developed at the NASA Goddard Space Flight Center, is a functional Land Data Assimilation System (LDAS) that incorporates a suite of land models in an interoperable computational framework. LIS has been integrated into a computational Rapid Prototyping Capabilities (RPC) infrastructure. LIS consists of a core, a number of community land models, data servers, and visualization systems, integrated in a high-performance computing environment. The land surface models (LSMs) in LIS incorporate surface and atmospheric parameters of temperature, snow/water, vegetation, albedo, soil conditions, topography, and radiation. Many of these parameters are available from in-situ observations, numerical model analyses, and NASA, NOAA, and other remote sensing satellite platforms at various spatial and temporal resolutions. The computational resources available to LIS via the RPC infrastructure support e-Science experiments involving the global modeling of land-atmosphere interactions at 1-km spatial resolution as well as regional studies at finer resolutions. The Noah Land Surface Model, available within LIS, is being used to rapidly prototype soil moisture estimates in order to evaluate the viability of other science applications for decision-making purposes. For example, LIS has been used to further extend the utility of the USDA Soil Climate Analysis Network of in-situ soil moisture observations. In addition, LIS supports data assimilation capabilities that are used to assimilate remotely sensed soil moisture retrievals from the AMSR-E instrument onboard the Aqua satellite. The rapid prototyping of soil moisture estimates using LIS and their applications will be illustrated during the presentation.
The EuroGEOSS Advanced Operating Capacity
NASA Astrophysics Data System (ADS)
Nativi, S.; Vaccari, L.; Stock, K.; Diaz, L.; Santoro, M.
2012-04-01
The concept of multidisciplinary interoperability for managing societal issues is a major challenge presently faced by the Earth and Space Science Informatics community. With this in mind, the EuroGEOSS project was launched on 1 May 2009 for a three-year period, aiming to demonstrate the added value to the scientific community and society of providing existing Earth observing systems and applications in an interoperable manner, used within the GEOSS and INSPIRE frameworks. In the first period, the project built an Initial Operating Capability (IOC) in the three strategic areas of Drought, Forestry and Biodiversity; this was then enhanced into an Advanced Operating Capacity (AOC) for multidisciplinary interoperability. Finally, the project extended the infrastructure to other scientific domains (geology, hydrology, etc.). The EuroGEOSS multidisciplinary AOC is based on the Brokering Approach. This approach aims to achieve multidisciplinary interoperability by developing an extended SOA (Service Oriented Architecture) in which a new type of "expert" component is introduced: the broker. Brokers implement all the mediation and distribution functionalities needed to interconnect the distributed and heterogeneous resources that characterize a System of Systems (SoS) environment.
The EuroGEOSS AOC comprises the following components:
• EuroGEOSS Discovery Broker: provides harmonized discovery functionality by mediating and distributing user queries against tens of heterogeneous services;
• EuroGEOSS Access Broker: enables users to seamlessly access and use heterogeneous remote resources via a single standard service;
• EuroGEOSS Web 2.0 Broker: enhances the capabilities of the Discovery Broker with queries towards new Web 2.0 services;
• EuroGEOSS Semantic Discovery Broker: enhances the capabilities of the Discovery Broker with semantic query expansion;
• EuroGEOSS Natural Language Search Component: lets users search for resources using natural language queries;
• Service Composition Broker: allows users to compose and execute complex business processes, based on technology developed by the FP7 UncertWeb project.
Recently, the EuroGEOSS Brokering framework was presented at the GEO-VIII Plenary and Exhibition in Istanbul and introduced into the GEOSS Common Infrastructure.
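The mediation pattern behind the Discovery Broker described above can be sketched in a few lines: a single user query fans out to heterogeneous catalog services, and each service's native response schema is mapped onto one harmonized record format. The adapter functions and field names below are illustrative assumptions, not the actual EuroGEOSS interfaces.

```python
# Minimal sketch of a discovery broker: one query is distributed to
# several heterogeneous "catalog" services, and their differently
# shaped responses are mediated into a common record format.
# Service behavior and field names are illustrative assumptions.

def csw_style_service(query):
    # stands in for an OGC CSW-like catalog returning Dublin Core keys
    return [{"dc:title": f"{query} map", "dc:identifier": "csw-001"}]

def opensearch_style_service(query):
    # stands in for an OpenSearch-like endpoint with its own schema
    return [{"title": f"{query} dataset", "id": "os-42"}]

ADAPTERS = [
    # each adapter pairs a service with a mediation function that
    # maps its native schema onto the broker's harmonized record
    (csw_style_service, lambda r: {"title": r["dc:title"], "id": r["dc:identifier"]}),
    (opensearch_style_service, lambda r: {"title": r["title"], "id": r["id"]}),
]

def broker_discover(query):
    """Distribute the query to every service and harmonize the results."""
    results = []
    for service, mediate in ADAPTERS:
        for record in service(query):
            results.append(mediate(record))
    return results

if __name__ == "__main__":
    for rec in broker_discover("drought"):
        print(rec["id"], "-", rec["title"])
```

The key design point is that clients see only the harmonized schema; adding a new heterogeneous service means writing one adapter, not changing every client.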
Test Protocols for Advanced Inverter Interoperability Functions – Main Document
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Jay Dean; Gonzalez, Sigifredo; Ralph, Mark E.
2013-11-01
Distributed energy resources (DER) such as photovoltaic (PV) systems, when deployed at a large scale, are capable of significantly influencing the operation of power systems. Looking to the future, stakeholders are working on standards to make it possible to manage the potentially complex interactions between DER and the power system. In 2009, the Electric Power Research Institute (EPRI), Sandia National Laboratories (SNL) with the U.S. Department of Energy (DOE), and the Solar Electric Power Association (SEPA) initiated a large industry collaborative to identify and standardize definitions for a set of DER grid support functions. While the initial effort concentrated on grid-tied PV inverters and energy storage systems, the concepts have applicability to all DER. A partial product of this ongoing effort is a reference definitions document (IEC TR 61850-90-7, Object models for power converters in distributed energy resources (DER) systems) that has become a basis for expansion of related International Electrotechnical Commission (IEC) standards, and is supported by the US National Institute of Standards and Technology (NIST) Smart Grid Interoperability Panel (SGIP). Some industry-led organizations advancing communications protocols have also embraced this work. As standards continue to evolve, it is necessary to develop test protocols to independently verify that inverters are properly executing the advanced functions. Interoperability is assured by establishing common definitions for the functions and a method to test compliance with operational requirements. This document describes test protocols developed by SNL to evaluate the electrical performance and operational capabilities of PV inverters and energy storage, as described in IEC TR 61850-90-7. While many of these functions are not currently required by existing grid codes or may not be widely available commercially, the industry is rapidly moving in that direction.
Interoperability issues are already apparent as some of these inverter capabilities are being incorporated in large demonstration and commercial projects. The test protocols are intended to be used to verify acceptable performance of inverters within the standard framework described in IEC TR 61850-90-7. These test protocols, as they are refined and validated over time, can become precursors for future certification test procedures for DER advanced grid support functions.
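One of the grid support functions modeled in IEC TR 61850-90-7 is volt-var control, in which an inverter's reactive power output follows a piecewise-linear curve of terminal voltage. A conformance test can drive the voltage and compare measured reactive power against the commanded curve; the sketch below evaluates such a curve. The specific curve points are illustrative examples, not values taken from the standard or from these test protocols.

```python
# Sketch of evaluating a volt-var curve: reactive power (as a fraction
# of available vars) is a piecewise-linear function of per-unit voltage.
# The curve points below are illustrative, not values from the standard.

def volt_var(v_pu, points):
    """Linearly interpolate reactive power for voltage v_pu.

    points: list of (voltage_pu, q_fraction) pairs sorted by voltage.
    Outside the curve, output is clamped to the end values.
    """
    if v_pu <= points[0][0]:
        return points[0][1]
    if v_pu >= points[-1][0]:
        return points[-1][1]
    for (v1, q1), (v2, q2) in zip(points, points[1:]):
        if v1 <= v_pu <= v2:
            frac = (v_pu - v1) / (v2 - v1)
            return q1 + frac * (q2 - q1)

# Example curve: inject vars at low voltage, absorb at high voltage,
# with a deadband around nominal voltage (1.0 pu).
CURVE = [(0.95, 1.0), (0.98, 0.0), (1.02, 0.0), (1.05, -1.0)]

if __name__ == "__main__":
    for v in (0.94, 0.98, 1.00, 1.035, 1.06):
        print(f"V = {v:.3f} pu -> Q = {volt_var(v, CURVE):+.2f}")
```

A test harness would sweep voltage setpoints through the source simulator, sample the inverter's reactive power at each point, and assert that it tracks `volt_var(v, commanded_curve)` within a tolerance band.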
Test Protocols for Advanced Inverter Interoperability Functions - Appendices
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Jay Dean; Gonzalez, Sigifredo; Ralph, Mark E.
2013-11-01
Distributed energy resources (DER) such as photovoltaic (PV) systems, when deployed at a large scale, are capable of significantly influencing the operation of power systems. Looking to the future, stakeholders are working on standards to make it possible to manage the potentially complex interactions between DER and the power system. In 2009, the Electric Power Research Institute (EPRI), Sandia National Laboratories (SNL) with the U.S. Department of Energy (DOE), and the Solar Electric Power Association (SEPA) initiated a large industry collaborative to identify and standardize definitions for a set of DER grid support functions. While the initial effort concentrated on grid-tied PV inverters and energy storage systems, the concepts have applicability to all DER. A partial product of this ongoing effort is a reference definitions document (IEC TR 61850-90-7, Object models for power converters in distributed energy resources (DER) systems) that has become a basis for expansion of related International Electrotechnical Commission (IEC) standards, and is supported by the US National Institute of Standards and Technology (NIST) Smart Grid Interoperability Panel (SGIP). Some industry-led organizations advancing communications protocols have also embraced this work. As standards continue to evolve, it is necessary to develop test protocols to independently verify that inverters are properly executing the advanced functions. Interoperability is assured by establishing common definitions for the functions and a method to test compliance with operational requirements. This document describes test protocols developed by SNL to evaluate the electrical performance and operational capabilities of PV inverters and energy storage, as described in IEC TR 61850-90-7. While many of these functions are not currently required by existing grid codes or may not be widely available commercially, the industry is rapidly moving in that direction.
Interoperability issues are already apparent as some of these inverter capabilities are being incorporated in large demonstration and commercial projects. The test protocols are intended to be used to verify acceptable performance of inverters within the standard framework described in IEC TR 61850-90-7. These test protocols, as they are refined and validated over time, can become precursors for future certification test procedures for DER advanced grid support functions.
Development and Operations of the Astrophysics Data System
NASA Technical Reports Server (NTRS)
Murray, Stephen S.
1998-01-01
Preparations for the AAS meeting in January are progressing. We will have a talk, a poster, and a demonstration. We organized a meeting during the AAS conference to discuss bibliographic codes in order to make sure the different information providers can interoperate. Our new server should be online for the AAS meeting. This will improve the search speed considerably.
Lee, M-Y; Won, H-S; Jeon, E-J; Yoon, H C; Choi, J Y; Hong, S J; Kim, M-J
2014-06-01
To evaluate the reproducibility of measurement of the fetal left modified myocardial performance index (Mod-MPI) determined using a novel automated system. This was a prospective study of 116 ultrasound examinations from 110 normal singleton pregnancies at 12 + 1 to 37 + 1 weeks' gestation. Two experienced operators each measured the left Mod-MPI twice manually and twice automatically using the Auto Mod-MPI system. Intra- and inter-operator reproducibility were assessed using intraclass correlation coefficients (ICCs), and the manual and automated measurements obtained by the more experienced operator were compared using Bland-Altman plots and ICCs. Both operators successfully measured the left Mod-MPI in all cases using the Auto Mod-MPI system. For both operators, intraoperator reproducibility was higher when performing automated measurements (ICC = 0.967 and 0.962 for Operators 1 and 2, respectively) than when performing manual measurements (ICC = 0.857 and 0.856 for Operators 1 and 2, respectively). Interoperator agreement was also better for automated than for manual measurements (ICC = 0.930 vs 0.723). There was good agreement between the automated and manual values measured by the more experienced operator. The Auto Mod-MPI system is a reliable technique for measuring fetal left Mod-MPI and demonstrates excellent reproducibility.
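The agreement statistic used throughout this record, the intraclass correlation coefficient, can be computed in several variants; the sketch below implements the common one-way random-effects form, ICC(1,1), from the between- and within-subject mean squares. The study above may well have used a different ICC model, so treat this as an illustrative computation, not a reconstruction of its analysis. The sample values are invented for demonstration.

```python
# Sketch of the intraclass correlation coefficient (ICC), the statistic
# used above to quantify intra- and inter-operator agreement. This is
# the one-way random-effects form ICC(1,1) = (MSB - MSW) / (MSB + (k-1)*MSW);
# the study may have used a different ICC variant.

def icc_oneway(ratings):
    """ratings: list of subjects, each a list of k repeated measurements."""
    n = len(ratings)       # number of subjects
    k = len(ratings[0])    # measurements per subject
    grand = sum(sum(r) for r in ratings) / (n * k)
    subject_means = [sum(r) / k for r in ratings]
    # between-subject mean square
    msb = k * sum((m - grand) ** 2 for m in subject_means) / (n - 1)
    # within-subject mean square
    msw = sum((x - m) ** 2
              for r, m in zip(ratings, subject_means)
              for x in r) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

if __name__ == "__main__":
    # two repeated Mod-MPI-like measurements per subject (invented data)
    consistent = [[0.41, 0.42], [0.55, 0.54], [0.47, 0.48], [0.60, 0.61]]
    noisy = [[0.41, 0.60], [0.55, 0.43], [0.47, 0.58], [0.60, 0.45]]
    print(f"consistent pairs: ICC = {icc_oneway(consistent):.3f}")
    print(f"noisy pairs:      ICC = {icc_oneway(noisy):.3f}")
```

Repeatable measurements drive the within-subject mean square toward zero, pushing the ICC toward 1, which is why automated measurement pipelines tend to score higher than manual ones.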
2008-01-01
Special Operations Medicine Staphylococcus, E. coli, and alimentary tract anaerobes (AI). Initial antibacterial activity should not be directed at multidrug... active service. 6. Hospital ships (T-AH) may provide support to major amphibious operations or be designated as a theater-level 3 support capability...USSOCOM is actively engaged in the JLLP, where the goal is to improve interoperability between all LL centers in the community of practice to enhance
Menezes, Pedro Monteiro; Cook, Timothy Wayne; Cavalini, Luciana Tricai
2016-01-01
To present the technical background and the development of a procedure that enriches the semantics of Health Level Seven version 2 (HL7v2) messages for software-intensive systems in telemedicine trauma care. This study followed a multilevel model-driven approach for the development of semantically interoperable health information systems. The Pre-Hospital Trauma Life Support (PHTLS) ABCDE protocol was adopted as the use case. A prototype application embedded the semantics into an HL7v2 message as an eXtensible Markup Language (XML) file, which was validated against an XML schema that defines constraints on a common reference model. This message was exchanged with a second prototype application, developed on the Mirth middleware, which was also used to parse and validate both the original and the hybrid messages. Both versions of the data instance (one pure XML, one embedded in the HL7v2 message) were equally validated, and the RDF-based semantics were recovered by the receiving side of the prototype from the shared XML schema. This study demonstrated the semantic enrichment of HL7v2 messages for software-intensive telemedicine systems for trauma care, by validating components of extracts generated in various computing environments. The adoption of the method proposed in this study ensures the compliance of the HL7v2 standard with Semantic Web technologies.
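The core move described above, carrying an XML payload inside a pipe-delimited HL7v2 message and validating it on receipt, can be sketched with the standard library. The segment layout, the use of an OBX field for the payload, and the ABCDE element names are illustrative assumptions, not the paper's actual message design; and since the Python stdlib has no XSD validator, a simple structural check stands in for true schema validation against the reference model.

```python
# Sketch of embedding an XML payload in an HL7v2-style message and
# checking it on the receiving side. Segment layout, field positions,
# and element names are illustrative assumptions. A structural check
# stands in for real XML-schema validation (stdlib has no XSD support).
import xml.etree.ElementTree as ET

# HL7v2 convention: carriage-return-separated segments, pipe-separated fields
MESSAGE = "\r".join([
    "MSH|^~\\&|TRAUMA_APP|FIELD|HOSPITAL|ER|20160101120000||ORU^R01|42|P|2.5",
    "OBX|1|ED|PHTLS^ABCDE||<assessment><airway>clear</airway>"
    "<breathing>rate=18</breathing></assessment>",
])

def extract_payload(message):
    """Find the OBX segment and parse the XML embedded in its value field."""
    for segment in message.split("\r"):
        fields = segment.split("|")
        if fields[0] == "OBX":
            return ET.fromstring(fields[5])
    raise ValueError("no OBX segment found")

def check_structure(root, required=("airway", "breathing")):
    """Stand-in for schema validation: required child elements must exist."""
    return all(root.find(tag) is not None for tag in required)

if __name__ == "__main__":
    root = extract_payload(MESSAGE)
    print("root element:", root.tag)
    print("structurally valid:", check_structure(root))
```

In a production pipeline the parse-and-validate step would live in middleware such as Mirth, and validation would run against the shared XML schema rather than a hard-coded element list.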
An Interoperable, Agricultural Information System Based on Satellite Remote Sensing Data
NASA Technical Reports Server (NTRS)
Teng, William; Chiu, Long; Doraiswamy, Paul; Kempler, Steven; Liu, Zhong; Pham, Long; Rui, Hualan
2005-01-01
Monitoring global agricultural crop conditions during the growing season and estimating potential seasonal production are critically important for market development of U.S. agricultural products and for global food security. The Goddard Space Flight Center Earth Sciences Data and Information Services Center Distributed Active Archive Center (GES DISC DAAC) is developing an Agricultural Information System (AIS), evolved from an existing TRMM Online Visualization and Analysis System (TOVAS), which will operationally provide satellite remote sensing data products (e.g., rainfall) and services. The data products will include crop condition and yield prediction maps, generated from a crop growth model with satellite data inputs, in collaboration with the USDA Agricultural Research Service. The AIS will enable remote, interoperable access to distributed data by using the GrADS-DODS Server (GDS) and by being compliant with Open GIS Consortium standards. Users will be able to download individual files, perform interactive online analysis, and receive operational data flows. AIS outputs will be integrated into existing operational decision support systems for global crop monitoring, such as those of the USDA Foreign Agricultural Service and the U.N. World Food Program.
Distributed controller clustering in software defined networks.
Abdelaziz, Ahmed; Fong, Ang Tan; Gani, Abdullah; Garba, Usman; Khan, Suleman; Akhunzada, Adnan; Talebian, Hamid; Choo, Kim-Kwang Raymond
2017-01-01
Software Defined Networking (SDN) is an emerging and promising paradigm for network management because of its centralized network intelligence. However, the centralized control architecture of software-defined networks (SDNs) brings novel challenges of reliability, scalability, fault tolerance and interoperability. In this paper, we propose a novel clustered distributed controller architecture in a realistic SDN setting. The distributed cluster implementation comprises multiple popular SDN controllers. The proposed mechanism is evaluated using a real-world network topology running on top of an emulated SDN environment. The results show that the proposed distributed controller clustering mechanism is able to significantly reduce the average latency from 8.1% to 1.6% and the packet loss from 5.22% to 4.15%, compared to distributed controllers without clustering running on HP Virtual Application Network (VAN) SDN and Open Network Operating System (ONOS) controllers, respectively. Moreover, the proposed method also shows reasonable CPU utilization. Furthermore, the proposed mechanism makes it possible to handle unexpected load fluctuations while maintaining continuous network operation, even when there is a controller failure. The paper is a potential contribution stepping towards addressing the issues of reliability, scalability, fault tolerance, and interoperability.
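One ingredient of surviving a controller failure, as the abstract describes, is a policy for assigning switches to live controllers and re-homing them when a controller drops out. The hash-based assignment below is an illustrative sketch of that idea, not the clustering mechanism of the paper; controller names are invented.

```python
# Sketch of switch-to-controller assignment in a controller cluster,
# with deterministic re-homing when a controller fails. The hashing
# policy is an illustrative assumption, not the paper's mechanism.
import hashlib

class ControllerCluster:
    def __init__(self, controllers):
        self.live = sorted(controllers)   # currently reachable controllers

    def assign(self, switch_id):
        """Deterministically map a switch to one of the live controllers."""
        digest = hashlib.sha256(switch_id.encode()).hexdigest()
        return self.live[int(digest, 16) % len(self.live)]

    def fail(self, controller):
        """Remove a failed controller; its switches re-home on next assign."""
        self.live.remove(controller)

if __name__ == "__main__":
    cluster = ControllerCluster(["ctrl-1", "ctrl-2", "ctrl-3"])
    switches = [f"s{i}" for i in range(6)]
    before = {s: cluster.assign(s) for s in switches}
    cluster.fail("ctrl-2")
    after = {s: cluster.assign(s) for s in switches}
    moved = [s for s in switches if before[s] != after[s]]
    print("re-homed after failure:", moved)
```

A real deployment would add state synchronization between controllers (so a re-homed switch's flow state survives the move) and consistent hashing to minimize how many switches move per failure.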
Tool and data interoperability in the SSE system
NASA Technical Reports Server (NTRS)
Shotton, Chuck
1988-01-01
Information is given in viewgraph form on tool and data interoperability in the Software Support Environment (SSE). Information is given on industry problems, SSE system interoperability issues, SSE solutions to tool and data interoperability, and attainment of heterogeneous tool/data interoperability.
Public Key Infrastructure Study
1994-04-01
commerce. This Public Key Infrastructure (PKI) study focuses on the United States Federal Government operations, but also addresses national and global ... issues in order to facilitate the interoperation of protected electronic commerce among the various levels of government in the U.S., private citizens
Socio-Technical Considerations for the Use of Blockchain Technology in Healthcare.
Wong, Ming Chao; Yee, Kwang Chien; Nøhr, Christian
2018-01-01
Blockchain technology is often considered part of the fourth industrial revolution that will change the world. Enthusiasm for the transformative nature of blockchain technology has infiltrated healthcare. Blockchain is often seen as the much-needed and perfect technology for healthcare, addressing the difficult and complex issues of security and interoperability. More importantly, the "value"- and trust-based system can deliver automated action and response via its smart contract mechanism. Healthcare, however, is a complex system. Health information technology (HIT) has so far not delivered its promise of transforming healthcare, owing to its complex socio-technical and context-sensitive interactions. The introduction of blockchain technology will need to consider a whole range of socio-technical issues in order to improve the quality and safety of patient care. This paper presents a discussion of these socio-technical issues. More importantly, this paper argues that in order to achieve the best outcome from blockchain technology, there is a need to consider a clinical transformation from "information" to "value" and trust. This paper argues that urgent research is needed to address these socio-technical issues in order to facilitate the best outcomes for blockchain in healthcare. These socio-technical issues must then be further evaluated by means of working prototypes in the medical domain in coming years.
Hudson, John M; Milot, Laurent; Parry, Craig; Williams, Ross; Burns, Peter N
2013-06-01
This study assessed the reproducibility of shear wave elastography (SWE) in the liver of healthy volunteers. Intra- and inter-operator reliability and repeatability were quantified in three different liver segments in a sample of 15 subjects, scanned during four independent sessions (two scans on day 1, two scans 1 wk later) by two operators. A total of 1440 measurements were made. Reproducibility was assessed using the intra-class correlation coefficient (ICC) and a repeated-measures analysis of variance. The shear wave speed was measured and used to estimate Young's modulus using the SuperSonic Imagine Aixplorer. The median Young's modulus measured through the inter-costal space was 5.55 ± 0.74 kPa. Intra-operator reliability for same-day evaluations (ICC = 0.91) was better than inter-operator reliability (ICC = 0.78). Intra-observer agreement decreased when scans were repeated on a different day. Inter-session repeatability was between 3.3% and 9.9% for intra-day repeated scans, compared with 6.5%-12% for inter-day repeated scans. No significant difference was observed in subjects with a body mass index greater or less than 25 kg/m(2).
Towards technical interoperability in telemedicine.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Craft, Richard Layne, II
2004-05-01
For telemedicine to realize the vision of anywhere, anytime access to care, the question of how to create a fully interoperable technical infrastructure must be addressed. After briefly discussing how 'technical interoperability' compares with other types of interoperability being addressed in the telemedicine community today, this paper describes reasons for pursuing technical interoperability, presents a proposed framework for realizing technical interoperability, identifies key issues that will need to be addressed if technical interoperability is to be achieved, and suggests a course of action that the telemedicine community might follow to accomplish this goal.
The interoperability force in the ERP field
NASA Astrophysics Data System (ADS)
Boza, Andrés; Cuenca, Llanos; Poler, Raúl; Michaelides, Zenon
2015-04-01
Enterprise resource planning (ERP) systems participate in interoperability projects, and this participation sometimes leads to new proposals for the ERP field. The aim of this paper is to identify the role that interoperability plays in the evolution of ERP systems. To this end, ERP systems were first identified within interoperability frameworks. Second, the initiatives in the ERP field driven by interoperability requirements were identified from two perspectives: technological and business. The ERP field is evolving from classical ERP as information system integrators to a new generation of fully interoperable ERP. Interoperability is changing the way business is run, and ERP systems are changing to adapt to the current stream of interoperability.
IEEE 1547 Standards Advancing Grid Modernization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Basso, Thomas; Chakraborty, Sudipta; Hoke, Andy
Technology advances, including the development of advanced distributed energy resources (DER) and grid-integrated operations and controls functionalities, have surpassed the requirements in current standards and codes for DER interconnection with the distribution grid. The full revisions of IEEE Standards 1547 (requirements for DER-grid interconnection and interoperability) and 1547.1 (test procedures for conformance to 1547) are establishing requirements and best practices for state-of-the-art DER, including variable renewable energy sources. The revised standards will also address challenges associated with interoperability and transmission-level effects, in addition to strictly addressing distribution grid needs. This paper provides the status and future direction of the ongoing development focus for the 1547 standards.
Interplanetary Overlay Network Bundle Protocol Implementation
NASA Technical Reports Server (NTRS)
Burleigh, Scott C.
2011-01-01
The Interplanetary Overlay Network (ION) system's BP package, an implementation of the Delay-Tolerant Networking (DTN) Bundle Protocol (BP) and supporting services, has been specifically designed to be suitable for use on deep-space robotic vehicles. Although the ION BP implementation is unique in its use of zero-copy objects for high performance, and in its use of resource-sensitive rate control, it is fully interoperable with other implementations of the BP specification (Internet RFC 5050). The ION BP implementation is built using the same software infrastructure that underlies the implementation of the CCSDS (Consultative Committee for Space Data Systems) File Delivery Protocol (CFDP) built into the flight software of Deep Impact. It is designed to minimize resource consumption, while maximizing operational robustness. For example, no dynamic allocation of system memory is required. Like all the other ION packages, ION's BP implementation is designed to port readily between Linux and Solaris (for easy development and for ground system operations) and VxWorks (for flight systems operations). The exact same source code is exercised in both environments. Initially included in the ION BP implementation are the following: libraries of functions used in constructing bundle forwarders and convergence-layer (CL) input and output adapters; a simple prototype bundle forwarder and associated CL adapters designed to run over an IP-based local area network; administrative tools for managing a simple DTN infrastructure built from these components; a background daemon process that silently destroys bundles whose time-to-live intervals have expired; a library of functions exposed to applications, enabling them to issue and receive data encapsulated in DTN bundles; and some simple applications that can be used for system checkout and benchmarking.
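The expired-bundle daemon mentioned above is easy to picture as a reaper over a bundle store: each bundle carries a creation time and a time-to-live, and a background task silently destroys any bundle whose TTL has elapsed. The sketch below illustrates that behavior; the field names are illustrative, not ION's actual data structures.

```python
# Sketch of TTL-based bundle expiry, analogous to the ION daemon that
# silently destroys bundles whose time-to-live has elapsed. Field names
# are illustrative assumptions, not ION's actual data structures.
import time
from dataclasses import dataclass, field

@dataclass
class Bundle:
    payload: bytes
    ttl_s: float                                  # time-to-live, seconds
    created: float = field(default_factory=time.monotonic)

    def expired(self, now):
        return now - self.created > self.ttl_s

def reap_expired(store, now=None):
    """Remove expired bundles in place; return how many were destroyed."""
    now = time.monotonic() if now is None else now
    keep = [b for b in store if not b.expired(now)]
    destroyed = len(store) - len(keep)
    store[:] = keep                               # mutate the caller's list
    return destroyed

if __name__ == "__main__":
    t0 = time.monotonic()
    store = [Bundle(b"telemetry", ttl_s=10.0, created=t0),
             Bundle(b"stale-data", ttl_s=1.0, created=t0 - 5.0)]
    print("destroyed:", reap_expired(store, now=t0))
    print("remaining:", len(store))
```

In a delay-tolerant network this expiry is essential: a bundle that can no longer reach its destination in useful time must be destroyed locally rather than consume storage on resource-constrained nodes.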
2011-12-01
Task Based Approach to Planning." Paper 08F-SIW-033. In Proceedings of the Fall Simulation Interoperability Workshop. Simulation Interoperability...Paper 06F-SIW-003. In Proceedings of the Fall Simulation Interoperability Workshop. Simulation Interoperability Standards Organi...MSDL)." Paper 10S-SIW-003. In Proceedings of the Spring Simulation Interoperability Workshop. Simulation Interoperability Standards Organization
Maturity model for enterprise interoperability
NASA Astrophysics Data System (ADS)
Guédria, Wided; Naudet, Yannick; Chen, David
2015-01-01
Historically, progress occurs when entities communicate, share information and together create something that no one individually could do alone. Moving beyond people to machines and systems, interoperability is becoming a key factor of success in all domains. In particular, interoperability has become a challenge for enterprises, to exploit market opportunities, to meet their own objectives of cooperation or simply to survive in a growing competitive world where the networked enterprise is becoming a standard. Within this context, many research works have been conducted over the past few years and enterprise interoperability has become an important area of research, ensuring the competitiveness and growth of European enterprises. Among others, enterprises have to control their interoperability strategy and enhance their ability to interoperate. This is the purpose of the interoperability assessment. Assessing interoperability maturity allows a company to know its strengths and weaknesses in terms of interoperability with its current and potential partners, and to prioritise actions for improvement. The objective of this paper is to define a maturity model for enterprise interoperability that takes into account existing maturity models while extending the coverage of the interoperability domain. The assessment methodology is also presented. Both are demonstrated with a real case study.
2016-12-07
Phasing Model WESBROCK, HARNED, AND PLOUS | FEATURES PRISM 6, no. 3 views on how to design, plan, and execute operations and campaigns. The..."population-centric" operational environment. SOF views campaign design differently from the six-phase model in joint doctrine depicted above...U.S. interests. This difference between SOF and CF views of campaigning can hamper integration from the start of an operation if components of the
Defense Acquisitions: Assessments of Selected Weapon Programs
2011-03-01
Frequency (UHF) Follow-On (UFO) satellite system currently in operation and provide interoperability with legacy terminals. MUOS consists of a...delivery of MUOS capabilities is time-critical due to the operational failures of two UFO satellites. The MUOS program has taken several steps to...launch increased due to the unexpected failures of two UFO satellites. Based on the current health of on-orbit satellites, UHF communication
Defense Acquisitions: Assessments of Selected Weapon Programs
2013-03-01
UHF) Follow-On (UFO) satellite system currently in operation and provide interoperability with legacy terminals. MUOS consists of a network of...MUOS satellites remain important due to the past operational failures of two UFO satellites and predicted end-of-life of on-orbit UFO satellites...Despite the delay and earlier, unexpected failures of two UFO satellites, the required availability level of UHF communication capabilities has been
2011-07-01
Jack E. Edwards, Director, Defense Capabilities and Management (appendix III). Page 36 GAO-11-569 Defense Logistics List of Committees...Washington Headquarters Services, Directorate for Information Operations and Reports, 1215 Jefferson Davis Highway, Suite 1204, Arlington VA 22202-4302... operations. DOD faces asset visibility challenges due, in part, to a lack of interoperability among information technology
Command and Control for Joint Air Operations
2010-01-12
systems, to include collaborative air planning tools such as the theater battle management core system (TBMCS). Operational-level air planning occurs in...sight communications and data exchange equipment in order to respond to joint force requirements. For example, the TBMCS is often used. The use of ATO...generation and dissemination software portions of TBMCS has been standardized. This ATO feature allows the JAOC to be interoperable with other
Command and Control Common Semantic Core Required to Enable Net-centric Operations
2008-05-20
automated processing capability. A former US Marine Corps component C4 director during Operation Iraqi Freedom identified the problems of 1) uncertainty...interoperability improvements to warfighter community processes, thanks to ubiquitous automated processing, are likely high and somewhat easier to quantify. A...synchronized with the actions of other partners/warfare communities. This requires high-quality information, rapid sharing and automated processing – which
NASA Astrophysics Data System (ADS)
McDonald, K. R.; Faundeen, J. L.; Petiteville, I.
2005-12-01
The Committee on Earth Observation Satellites (CEOS) was established in 1984 in response to a recommendation from the Economic Summit of Industrialized Nations Working Group on Growth, Technology, and Employment's Panel of Experts on Satellite Remote Sensing. CEOS participants are Members, national or international governmental organizations that operate civil spaceborne Earth observation satellites, and Associates, governmental organizations with civil space programs in development or international scientific or governmental bodies that have an interest in and support CEOS objectives. The primary objective of CEOS is to optimize the benefits of satellite Earth observations through cooperation of its participants in mission planning and in the development of compatible data products, formats, services, applications and policies. To pursue its objectives, CEOS establishes working groups and associated subgroups that focus on relevant areas of interest. While the structure of CEOS has evolved over its lifetime, today there are three permanent working groups. One is the Working Group on Calibration and Validation, which addresses sensor-specific calibration and validation and geophysical parameter validation. A second is the Working Group on Education, Training, and Capacity Building, which facilitates activities that enhance international education and training in Earth observation techniques, data analysis, interpretation and applications, with a particular focus on developing countries. The third permanent working group is the Working Group on Information Systems and Services (WGISS). The purpose of WGISS is to promote collaboration in the development of systems and services, based on international standards, that manage and supply the Earth observation data and information from participating agencies' missions.
WGISS places great emphasis on the use of demonstration projects involving user groups to solve the critical interoperability issues associated with the achievement of global services and its structure reflects that objective. The Technology and Services Subgroup initiates tasks to explore emerging technologies that can be employed to create data and information systems and to develop interoperable services. The interests of the subgroup span the full range of the information processing chain from the initial ingestion of satellite data into archives through to the incorporation of derived information into end-user applications. The subgroup has overseen the creation of an Interoperable Directory Network and an Interoperable Catalog System and has tasks that are investigating the use of new technologies such as Web Services, Grid, and Open Geographical Information Systems to provide enhanced capabilities. The WGISS Projects and Applications Subgroup works with outside organizations to understand their requirements and then helps them to exploit the tools and services available through WGISS and its members and associates. WGISS has instituted the concept of a WGISS Test Facility to test and develop information systems and services prototypes collaboratively with these organizations to meet their specific requirements. This approach has the dual benefit of addressing real information systems and services needs of science and applications projects and helping WGISS to expand and improve its capabilities based on the experience and lessons learned from working with the projects.
EO Domain Specific Knowledge Enabled Services (KES-B)
NASA Astrophysics Data System (ADS)
Varas, J.; Busto, J.; Torguet, R.
2004-09-01
This paper recovers and describes a number of major statements with respect to the vision, mission and technological approaches of the Technological Research Project (TRP) "EO Domain Specific Knowledge Enabled Services" (project acronym KES-B), currently under development at the European Space Research Institute (ESRIN) under contract "16397/02/I-SB". Building on the ongoing R&D activities, the KES-B project aims to demonstrate with a prototype system the feasibility of applying innovative knowledge-based technologies to provide services for easy, scheduled and controlled exploitation of EO resources (e.g. data, algorithms, procedures, storage, processors, ...), to automate the generation of products, and to support users in easily identifying and accessing the required information or products by using their own vocabulary, domain knowledge and preferences. The ultimate goals of KES-B are summarized in the provision of the two main types of KES services: first, the Search service (also referred to as Product Exploitation or Information Retrieval); and second, the Production service (also referred to as Information Extraction), with the strategic advantage that both are enabled by knowledge consolidated (formalized) within the system. The KES-B technical solution is driven by a strong commitment to the adoption of industry (XML-based) language standards, aiming at an interoperable, scalable and flexible operational prototype. In that sense, the Search KES services build on consolidated and/or emergent W3C semantic-web standards. Notably, the languages/models Dublin Core (DC), Universal Resource Identifier (URI), Resource Description Framework (RDF) and Ontology Web Language (OWL), and COTS like Protege [1] and JENA [2], are being integrated in the system as building blocks for the construction of the KES-based Search services. 
The Production KES services, on the other hand, build on workflow management standards and tools. On this side, the Business Process Execution Language (BPEL), the Web Services Definition Language (WSDL), and the Collaxa [3] COTS tool for workflow management are being integrated for the construction of the KES-B Production services. The KES-B platform (web portal and web server) architecture is built on the J2EE reference architecture. These languages are the means for codifying the different types of knowledge to be formalized in the system, and together they constitute its ontological architecture. This in turn enables interoperability with other KES-based systems that also commit to those standards. The motivation behind this vision points towards the construction of a semantic-web-based Grid supply-chain infrastructure for EO services, in line with the suggestions of the INSPIRE initiative.
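The subject-predicate-object metadata at the heart of the Search services can be illustrated with a toy triple store; this stands in only loosely for an RDF toolkit such as JENA, and the Dublin Core-style triples below are invented for the sketch:

```python
# Toy triple store: a list of (subject, predicate, object) tuples with
# wildcard pattern matching, mimicking the shape of an RDF query.
# All resource names and metadata values are hypothetical.

def match(triples, s=None, p=None, o=None):
    """Return triples matching the pattern; None acts as a wildcard."""
    return [(ts, tp, to) for (ts, tp, to) in triples
            if (s is None or ts == s)
            and (p is None or tp == p)
            and (o is None or to == o)]

triples = [
    ("eo:scene42", "dc:title", "SAR scene over the Alps"),
    ("eo:scene42", "dc:creator", "ESA"),
    ("eo:scene17", "dc:creator", "ESA"),
]

# "Which resources were created by ESA?"
creators = [s for s, _, _ in match(triples, p="dc:creator", o="ESA")]
print(creators)  # -> ['eo:scene42', 'eo:scene17']
```

A real Search service would express the same pattern as a SPARQL query over an OWL-backed model; the wildcard-matching idea carries over directly.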
ACTS 118x: High Speed TCP Interoperability Testing
NASA Technical Reports Server (NTRS)
Brooks, David E.; Buffinton, Craig; Beering, Dave R.; Welch, Arun; Ivancic, William D.; Zernic, Mike; Hoder, Douglas J.
1999-01-01
With the recent explosion of the Internet and the enormous business opportunities available to communication system providers, great interest has developed in improving the efficiency of data transfer over satellite links using the Transmission Control Protocol (TCP) of the Internet Protocol (IP) suite. NASA's ACTS experiments program initiated a series of TCP experiments to demonstrate the scalability of TCP/IP and determine to what extent the protocol can be optimized over a 622 Mbps satellite link. Through partnerships with government technology-oriented labs and the computer, telecommunication, and satellite industries, NASA Glenn was able to: (1) promote the development of interoperable, high-performance TCP/IP implementations across multiple computing/operating platforms; (2) work with the satellite industry to answer outstanding questions regarding the use of standard protocols (TCP/IP and ATM) for the delivery of advanced data services, and for use in spacecraft architectures; and (3) conduct a series of TCP/IP interoperability tests over OC-12 ATM over a satellite network in a multi-vendor environment using ACTS. The experiments' various network configurations and the results are presented.
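Why a 622 Mbps satellite link stresses stock TCP can be made concrete with the bandwidth-delay product. A minimal sketch, assuming a typical geostationary round-trip time (the 550 ms figure is an assumption, not from the abstract):

```python
# Bandwidth-delay product (BDP) for a GEO satellite TCP link, and the
# RFC 1323-style window-scale shift needed to keep the pipe full.
# The 550 ms RTT is an assumed typical GEO value.

LINK_RATE_BPS = 622_000_000   # OC-12 line rate, bits per second
RTT_S = 0.550                 # assumed geostationary round-trip time, seconds

def bdp_bytes(rate_bps: float, rtt_s: float) -> float:
    """Bytes that must be in flight to fully utilize the link."""
    return rate_bps * rtt_s / 8

def window_scale_factor(bdp: float, max_unscaled_window: int = 65_535) -> int:
    """Smallest window-scale shift whose scaled window covers the BDP
    (RFC 1323 caps the shift at 14)."""
    shift = 0
    while (max_unscaled_window << shift) < bdp and shift < 14:
        shift += 1
    return shift

bdp = bdp_bytes(LINK_RATE_BPS, RTT_S)
print(f"BDP: {bdp / 1e6:.1f} MB, window scale shift: {window_scale_factor(bdp)}")
```

With the classic 64 KB maximum window, throughput would be capped near 65,535 / 0.55 ≈ 119 KB/s per connection, which is why window scaling was central to these experiments.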
NASA Astrophysics Data System (ADS)
Tisdale, M.
2017-12-01
NASA's Atmospheric Science Data Center (ASDC) is operationally using the Esri ArcGIS Platform to improve data discoverability, accessibility and interoperability to meet the diversifying user requirements from government, private, public and academic communities. The ASDC is actively working to provide their mission essential datasets as ArcGIS Image Services, Open Geospatial Consortium (OGC) Web Mapping Services (WMS), and OGC Web Coverage Services (WCS) while leveraging the ArcGIS multidimensional mosaic dataset structure. Science teams at ASDC are utilizing these services through the development of applications using the Web AppBuilder for ArcGIS and the ArcGIS API for Javascript. These services provide greater exposure of ASDC data holdings to the GIS community and allow for broader sharing and distribution to various end users. These capabilities provide interactive visualization tools and improved geospatial analytical tools for a mission critical understanding in the areas of the earth's radiation budget, clouds, aerosols, and tropospheric chemistry. The presentation will cover how the ASDC is developing geospatial web services and applications to improve data discoverability, accessibility, and interoperability.
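An OGC WMS GetMap request of the kind the ASDC exposes can be sketched with the standard library; the endpoint URL and layer name below are hypothetical placeholders, not actual ASDC services:

```python
# Sketch of building an OGC WMS 1.3.0 GetMap request URL.
# The base URL and layer name are invented for illustration.
from urllib.parse import urlencode

def wms_getmap_url(base: str, layer: str, bbox: tuple, size=(1024, 512)) -> str:
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": "EPSG:4326",
        # In WMS 1.3.0 with EPSG:4326 the axis order is lat,lon.
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": size[0],
        "HEIGHT": size[1],
        "FORMAT": "image/png",
    }
    return f"{base}?{urlencode(params)}"

url = wms_getmap_url("https://example.gov/asdc/wms", "toa_radiation",
                     (-90, -180, 90, 180))
print(url)
```

Fetching the URL returns a rendered PNG map tile; a WCS request would follow the same pattern but return the underlying coverage data rather than an image.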
A step-by-step methodology for enterprise interoperability projects
NASA Astrophysics Data System (ADS)
Chalmeta, Ricardo; Pazos, Verónica
2015-05-01
Enterprise interoperability is one of the key factors for enhancing enterprise competitiveness. Achieving enterprise interoperability is an extremely complex process which involves different technological, human and organisational elements. In this paper we present a framework to support enterprise interoperability. The framework has been developed taking into account the three domains of interoperability: Enterprise Modelling, Architecture and Platform, and Ontologies. The main novelty of the framework in comparison to existing ones is that it includes a step-by-step methodology that explains how to carry out an enterprise interoperability project taking into account different interoperability views, such as business, process, human resources, technology, knowledge and semantics.
RTO Technical Publications: A Quarterly Listing
NASA Technical Reports Server (NTRS)
2004-01-01
This is a listing of recent unclassified RTO technical publications processed by the NASA Center for AeroSpace Information. Contents include the following: RTO Technical Publications: A Quarterly Listing. Implications of Multilingual Interoperability of Speech Technology for Military Use. Non-Lethal Weapons and Future Peace Enforcement Operations.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-16
... sensor network and provide extended sensor network and components to fill critical situational awareness... different agencies), and share resources. The IOCs will improve tactical decision-making, situational awareness, operations monitoring/interoperability, rules-based processing, and joint planning in a...
47 CFR 27.4 - Terms and definitions.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES MISCELLANEOUS WIRELESS... interoperable wireless broadband network operating on the 758-763 MHz and 788-793 MHz bands and the 763-768 MHz and 793-798 MHz bands in accordance with the Commission's rules. Advanced wireless service (AWS). A...
Emergence of a Common Modeling Architecture for Earth System Science (Invited)
NASA Astrophysics Data System (ADS)
Deluca, C.
2010-12-01
Common modeling architecture can be viewed as a natural outcome of common modeling infrastructure. The development of model utility and coupling packages (ESMF, MCT, OpenMI, etc.) over the last decade represents the realization of a community vision for common model infrastructure. The adoption of these packages has led to increased technical communication among modeling centers and newly coupled modeling systems. However, adoption has also exposed aspects of interoperability that must be addressed before easy exchange of model components among different groups can be achieved. These aspects include common physical architecture (how a model is divided into components) and model metadata and usage conventions. The National Unified Operational Prediction Capability (NUOPC), an operational weather prediction consortium, is collaborating with weather and climate researchers to define a common model architecture that encompasses these advanced aspects of interoperability and looks to future needs. The nature and structure of the emergent common modeling architecture will be discussed along with its implications for future model development.
NASA Technical Reports Server (NTRS)
Bradford, Robert N.; Nichols, Kelvin F.; Witherspoon, Keith R.
2006-01-01
To date, very little effort has been made to provide interoperability between various space agency projects. To effectively get to the Moon and beyond, systems must interoperate. Providing that interoperability will require standardization and registries of various technologies, created as they relate to space flight. With the new NASA Moon/Mars initiative, a requirement to standardize and control the naming conventions of very disparate systems and technologies is emerging. The need to provide numbering for the many processes, schemas, vehicles, robots, space suits and technologies (e.g. versions), to name a few, in the highly complex Constellation initiative is imperative. The number of corporations, developer personnel, system interfaces, and human interfaces will require standardization and registries on a scale not currently envisioned. It would take only one exception (stove-piped system development) to weaken, if not destroy, interoperability. To start, a standardized registry process must be defined that allows many differing engineers, organizations and operators to easily access disparate registry information across numerous technological and scientific disciplines. Once registries are standardized, registry support in terms of setup and operations, resolution of conflicts between registries, and other issues will need to be addressed. Registries should not be confused with repositories: no end-user data is "stored" in a registry, nor is it a configuration control system. Once a registry standard is created and approved, the technologies that should be registered must be identified and prioritized. In this paper, we will identify and define a registry process that is compatible with the Constellation initiative and with other, non-related space activities and organizations. We will then identify and define the various technologies that should use a registry to provide interoperability. 
The first set of technologies will be those that are currently in need of expansion namely the assignment of satellite designations and the process which controls assignments. Second, we will analyze the technologies currently standardized under the Consultative Committee for Space Data Systems (CCSDS) banner. Third, we will analyze the current CCSDS working group and Birds of a Feather (BoF) activities to ascertain registry requirements. Lastly, we will identify technologies that are either currently under the auspices of another standards body or technologies that are currently not standardized. For activities one through three, we will provide the analysis by either discipline or technology with rationale, identification and brief description of requirements and precedence. For activity four, we will provide a list of current standards bodies e.g. IETF and a list of potential candidates.
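The registry-not-repository distinction drawn above can be sketched as a conflict-checked name table: it records controlled names and who is authoritative for them, but stores no end-user data. The namespaces, names, and authorities below are hypothetical illustrations, not part of any proposed standard:

```python
# Minimal registry sketch: controlled (namespace, name) pairs mapped to an
# authoritative owner, with conflict detection. No payload data is stored,
# so this is a registry rather than a repository or a configuration
# control system. All example names are invented.

class Registry:
    def __init__(self):
        self._entries = {}  # (namespace, name) -> authority

    def register(self, namespace: str, name: str, authority: str) -> None:
        key = (namespace, name)
        if key in self._entries and self._entries[key] != authority:
            raise ValueError(
                f"conflict: {namespace}:{name} already registered "
                f"to {self._entries[key]}")
        self._entries[key] = authority

    def resolve(self, namespace: str, name: str) -> str:
        """Look up the authority for a registered name."""
        return self._entries[(namespace, name)]

reg = Registry()
reg.register("spacecraft", "CEV-01", "NASA/JSC")
print(reg.resolve("spacecraft", "CEV-01"))  # -> NASA/JSC
```

The conflict check in `register` is where the inter-registry conflict-resolution process discussed in the paper would hook in; a second agency claiming an already-assigned name is rejected rather than silently overwriting it.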
NASA Astrophysics Data System (ADS)
Fink, Wolfgang; George, Thomas; Tarbell, Mark A.
2007-04-01
Robotic reconnaissance operations are called for in extreme environments: not only in space, including planetary atmospheres, surfaces, and subsurfaces, but also in potentially hazardous or inaccessible operational areas on Earth, such as mine fields, battlefield environments, enemy-occupied territories, terrorist-infiltrated environments, or areas that have been exposed to biochemical agents or radiation. Real-time reconnaissance enables the identification and characterization of transient events. A fundamentally new mission concept for tier-scalable reconnaissance of operational areas, originated by Fink et al., is aimed at replacing the engineering- and safety-constrained mission designs of the past. The tier-scalable paradigm integrates multi-tier (orbit, atmosphere, surface/subsurface) and multi-agent (satellite, UAV/blimp, surface/subsurface sensing platforms) hierarchical mission architectures, introducing not only mission redundancy and safety, but also enabling and optimizing intelligent, less constrained, and distributed reconnaissance in real time. Given the mass, size, and power constraints faced by such a multi-platform approach, this is an ideal application scenario for a diverse set of MEMS sensors. To support such mission architectures, a high degree of operational autonomy is required. Essential elements of such operational autonomy are: (1) automatic mapping of an operational area from different vantage points (including vehicle health monitoring); (2) automatic feature extraction and target/region-of-interest identification within the mapped operational area; and (3) automatic target prioritization for close-up examination. These requirements imply the optimal deployment of MEMS sensors and sensor platforms, sensor fusion, and sensor interoperability.
Pickersgill, C H; Marr, C M; Reid, S W
2001-01-01
A quantitative investigation of the variation that can occur during the course of ultrasonography of the equine superficial digital flexor tendon (SDFT) was undertaken. The aim was to use an objective measure, namely tendon cross-sectional area (CSA), to quantify the variability occurring during ultrasonographic assessment of the equine SDFT. The effects of 3 variables on the CSA measurements were determined: 1) image acquisition operator (IAc): two different operators undertaking the ultrasonographic examination; 2) image analysis operator (IAn): two different operators undertaking the calculation of CSA values from previously stored images; and 3) analytical equipment used during CSA measurement (IEq): the use of 2 different sets of equipment during calculation of CSA values. CSA measurements were thus used as the comparative variable across 3 potential sources of variation: interoperator, during image acquisition; interoperator, during CSA measurement; and intraoperator, when using different analytical equipment. Two operators obtained transverse ultrasonographic images from the forelimb SDFTs of 16 National Hunt (NH) Thoroughbred (TB) racehorses, each undertaking analysis of their own and the other operator's images. One operator undertook analysis of their images using 2 sets of equipment. There was no statistically significant difference in the results obtained when different operators undertook image acquisition (P>0.05). At all but the most distal level, there was no significant difference when different equipment was used during analysis (P>0.05). A significant difference (P<0.01) was found when different operators undertook image analysis, one operator consistently returning larger measurements. Different operators undertaking different stages of an examination can thus introduce significant variability. 
To reduce confounding during ultrasonographic investigations involving multiple persons, one operator should undertake image analysis, although different operators may undertake image acquisition.
2005-06-01
virtualisation of distributed computing and data resources such as processing, network bandwidth, and storage capacity, to create a single system...and Simulation (M&S) will be integrated into this heterogeneous SOA. M&S functionality will be available in the form of operational M&S services. One...documents defining net centric warfare, the use of M&S functionality is a common theme. Alberts and Hayes give a good overview on net centric operations
Building the Synergy between Public Sector and Research Data Infrastructures
NASA Astrophysics Data System (ADS)
Craglia, Massimo; Friis-Christensen, Anders; Ostländer, Nicole; Perego, Andrea
2014-05-01
INSPIRE is a European Directive aiming to establish an EU-wide spatial data infrastructure to give cross-border access to information that can be used to support EU environmental policies, as well as other policies and activities having an impact on the environment. In order to ensure cross-border interoperability of data infrastructures operated by EU Member States, INSPIRE sets out a framework based on common specifications for metadata, data, network services, data and service sharing, monitoring and reporting. The implementation of INSPIRE has reached important milestones: the INSPIRE Geoportal was launched in 2011, providing a single access point for the discovery of INSPIRE data and services across EU Member States (currently, about 300K), while all the technical specifications for the interoperability of data across the 34 INSPIRE themes were adopted at the end of 2013. During this period a number of EU and international initiatives have been launched concerning cross-domain interoperability and (Linked) Open Data. In particular, the EU Open Data Portal, launched in December 2012, made provisions to access government and scientific data from EU institutions and bodies, and the EU ISA Programme (Interoperability Solutions for European Public Administrations) promotes cross-sector interoperability by sharing and re-using EU-wide and national standards and components. Moreover, the Research Data Alliance (RDA), an initiative jointly funded by the European Commission, the US National Science Foundation and the Australian Research Council, was launched in March 2013 to promote scientific data sharing and interoperability. 
The Joint Research Centre of the European Commission (JRC), besides being the technical coordinator of the implementation of INSPIRE, is also actively involved in the initiatives promoting cross-sector re-use in INSPIRE, and sustainable approaches to address the evolution of technologies - in particular, how to support Linked Data in INSPIRE and the use of global persistent identifiers. It is evident that government and scientific data infrastructures are currently facing a number of issues that have already been addressed in INSPIRE. Sharing experiences and competencies will avoid re-inventing the wheel, and help promote the cross-domain adoption of consistent solutions. Indeed, one of the lessons learnt from INSPIRE and the initiatives in which JRC is involved is that government and research data are not two separate worlds. Government data are commonly used as a basis to create scientific data, and vice versa. Consequently, it is fundamental to adopt a consistent approach to address interoperability and data management issues shared by both government and scientific data. The presentation illustrates some of the lessons learnt during the implementation of INSPIRE and in work on data and service interoperability coordinated with European and international initiatives. We describe a number of critical interoperability issues and barriers affecting both scientific and government data, concerning, e.g., data terminologies, quality and licensing, and propose how these problems could be effectively addressed by a closer collaboration of the government and scientific communities, and the sharing of experiences and practices.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1992-06-01
Proposed action is to construct at BNL a 5,600-ft² support building, install and operate within it a prototypic 200 MeV accelerator and a prototypic 700 MeV storage ring, and construct and operate a 15 kV substation to power the building. The accelerator and storage ring would comprise the x-ray lithography source, or XLS.
A Dynamic Infrastructure for Interconnecting Disparate ISR/ISTAR Assets (the ITA Sensor Fabric)
2009-07-01
areas of sensor identification, classification, interoperability and sensor data sharing, dissemination and consumability. This paper presents the ITA...sensors in the area of operations. This paper also presents a use case scenario developed in partnership with the U.S. Army Research Laboratory (ARL) and... paper we describe the Fabric, and its application to a simulated representative coalition operation scenario. The Fabric spans the network from the
NASA Technical Reports Server (NTRS)
Feng, C.; Sun, X.; Shen, Y. N.; Lombardi, Fabrizio
1992-01-01
This paper covers verification and protocol validation for distributed computer and communication systems using a computer-aided testing approach. Validation and verification make up the so-called process of conformance testing. Protocol implementations which pass conformance testing are then checked to see whether they can operate together; this is referred to as interoperability testing. A new comprehensive approach to protocol testing is presented which addresses: (1) modeling with an inter-layer representation for compatibility between conformance and interoperability testing; (2) computational improvement over current testing methods by using the proposed model, inclusive of the formulation of new qualitative and quantitative measures and time-dependent behavior; and (3) analysis and evaluation of protocol behavior for interactive testing without extensive simulation.
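The conformance half of such testing can be illustrated with a toy finite-state model: an implementation trace conforms if the specification's state machine accepts every event in it. The tiny handshake protocol below is invented for the sketch, not taken from the paper:

```python
# Conformance check against a protocol spec modeled as a finite-state machine.
# The states and events describe a made-up connection handshake.

SPEC = {  # (state, event) -> next state
    ("CLOSED", "connect"): "SYN_SENT",
    ("SYN_SENT", "ack"): "OPEN",
    ("OPEN", "data"): "OPEN",
    ("OPEN", "close"): "CLOSED",
}

def conforms(trace, start="CLOSED") -> bool:
    """True if the spec model accepts every event in the observed trace."""
    state = start
    for event in trace:
        nxt = SPEC.get((state, event))
        if nxt is None:
            return False  # event not allowed in this state: nonconformant
        state = nxt
    return True

print(conforms(["connect", "ack", "data", "close"]))  # valid handshake
print(conforms(["connect", "data"]))  # data before handshake completes
```

Interoperability testing then goes a step further: two implementations that each pass this kind of check are run against one another to see whether their combined behavior is still accepted.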
Enabling communication concurrency through flexible MPI endpoints
Dinan, James; Grant, Ryan E.; Balaji, Pavan; ...
2014-09-23
MPI defines a one-to-one relationship between MPI processes and ranks. This model captures many use cases effectively; however, it also limits communication concurrency and interoperability between MPI and programming models that utilize threads. Our paper describes the MPI endpoints extension, which relaxes the longstanding one-to-one relationship between MPI processes and ranks. Using endpoints, an MPI implementation can map separate communication contexts to threads, allowing them to drive communication independently. Endpoints also enable threads to be addressable in MPI operations, enhancing interoperability between MPI and other programming models. These characteristics are illustrated through several examples and an empirical study that contrasts current multithreaded communication performance with the need for high degrees of communication concurrency to achieve peak communication performance.
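The endpoints idea of giving each thread its own addressable rank and communication context can be mimicked in plain Python; this is only a threads-and-queues analogy, not real MPI or the authors' extension:

```python
# Analogy for MPI endpoints: each thread owns an addressable "rank" with its
# own inbox, so threads drive communication independently instead of
# funneling through one per-process rank. Pure Python, no MPI involved.
import threading
import queue

N_ENDPOINTS = 4
mailboxes = [queue.Queue() for _ in range(N_ENDPOINTS)]  # one inbox per rank

def send(dest_rank: int, payload) -> None:
    mailboxes[dest_rank].put(payload)

def endpoint(rank: int, results: list) -> None:
    # Ranks 1..N-1 each send a message to rank 0, which collects them.
    if rank == 0:
        for _ in range(N_ENDPOINTS - 1):
            results.append(mailboxes[0].get())  # blocks until a message arrives
    else:
        send(0, f"hello from endpoint {rank}")

results: list = []
threads = [threading.Thread(target=endpoint, args=(r, results))
           for r in range(N_ENDPOINTS)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(sorted(results))
```

In standard MPI all four of these "endpoints" would share a single process rank and typically serialize on the library; the extension described in the abstract makes each thread a first-class communication target, as each mailbox is here.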
MSAT and cellular hybrid networking
NASA Astrophysics Data System (ADS)
Baranowsky, Patrick W., II
Westinghouse Electric Corporation is developing both the Communications Ground Segment and the Series 1000 Mobile Phone for American Mobile Satellite Corporation's (AMSC's) Mobile Satellite (MSAT) system. The success of the voice services portion of this system depends, to some extent, upon the interoperability of the cellular network and the satellite circuit-switched communication channels. This paper describes the set of user-selectable cellular interoperable modes (cellular first/satellite second, etc.) provided by the Mobile Phone and how they are implemented with the ground segment. Topics including roaming registration and cellular-to-satellite 'seamless' call handoff are discussed, along with the relevant Interim Standard IS-41 Revision B Cellular Radiotelecommunications Intersystem Operations and IOS-553 Mobile Station - Land Station Compatibility Specification.
Integrated multi-sensor package (IMSP) for unmanned vehicle operations
NASA Astrophysics Data System (ADS)
Crow, Eddie C.; Reichard, Karl; Rogan, Chris; Callen, Jeff; Seifert, Elwood
2007-10-01
This paper describes recent efforts to develop integrated multi-sensor payloads for small robotic platforms for improved operator situational awareness and ultimately for greater robot autonomy. The focus is on enhancements to perception through integration of electro-optic, acoustic, and other sensors for navigation and inspection. The goals are to provide easier control and operation of the robot through fusion of multiple sensor outputs, to improve interoperability of the sensor payload package across multiple platforms through the use of open standards and architectures, and to reduce integration costs by embedded sensor data processing and fusion within the sensor payload package. The solutions investigated in this project to be discussed include: improved capture, processing and display of sensor data from multiple, non-commensurate sensors; an extensible architecture to support plug and play of integrated sensor packages; built-in health, power and system status monitoring using embedded diagnostics/prognostics; sensor payload integration into standard product forms for optimized size, weight and power; and the use of the open Joint Architecture for Unmanned Systems (JAUS)/ Society of Automotive Engineers (SAE) AS-4 interoperability standard. This project is in its first of three years. This paper will discuss the applicability of each of the solutions in terms of its projected impact to reducing operational time for the robot and teleoperator.
NASA Technical Reports Server (NTRS)
Gong, Chester; Wu, Minghong G.; Santiago, Confesor
2016-01-01
The Unmanned Aircraft Systems Integration in the National Airspace System project, or UAS Integration in the NAS, aims to reduce technical barriers related to safety and operational challenges associated with enabling routine UAS access to the NAS. The UAS Integration in the NAS Project conducted a flight test activity, referred to as Flight Test 3 (FT3), involving several Detect-and-Avoid (DAA) research prototype systems between June 15, 2015 and August 12, 2015 at the Armstrong Flight Research Center (AFRC). This report documents the flight testing and analysis results for the NASA Ames-developed JADEM-Autoresolver DAA system, referred to as 'Autoresolver' herein. Four flight test days (June 17, 18, 22, and July 22) were dedicated to Autoresolver testing. The objectives of this test were as follows: 1. Validate CPA prediction accuracy and detect-and-avoid (DAA, formerly known as self-separation) alerting logic in realistic flight conditions. 2. Validate DAA trajectory model including maneuvers. 3. Evaluate TCAS/DAA interoperability. 4. Inform final Minimum Operating Performance Standards (MOPS). Flight test scenarios were designed to collect data to directly address the objectives 1-3. Objective 4, inform final MOPS, was a general objective applicable to the UAS in the NAS project as a whole, of which flight test is a subset. This report presents analysis results completed in support of the UAS in the NAS project FT3 data review conducted on October 20, 2015. Due to time constraints and, to a lesser extent, TCAS data collection issues, objective 3 was not evaluated in this analysis.
An incremental database access method for autonomous interoperable databases
NASA Technical Reports Server (NTRS)
Roussopoulos, Nicholas; Sellis, Timos
1994-01-01
We investigated a number of design and performance issues of interoperable database management systems (DBMS's). The major results of our investigation were obtained in the areas of client-server database architectures for heterogeneous DBMS's, incremental computation models, buffer management techniques, and query optimization. We finished a prototype of an advanced client-server workstation-based DBMS which allows access to multiple heterogeneous commercial DBMS's. Experiments and simulations were then run to compare its performance with that of standard client-server architectures. The focus of this research was on adaptive optimization methods for heterogeneous database systems. Adaptive buffer management accounts for the random and object-oriented access methods for which no known characterization of the access patterns exists. Adaptive query optimization means that value distributions and selectivities, which play the most significant role in query plan evaluation, are continuously refined to reflect the actual values, as opposed to static ones computed off-line. Query feedback is a concept that was first introduced to the literature by our group. We employed query feedback both for adaptive buffer management and for computing value distributions and selectivities. For adaptive buffer management, we use the page faults of prior executions to achieve more 'informed' management decisions. For the estimation of the distributions and the selectivities, we use curve-fitting techniques, such as least squares and splines, for regressing on these values.
NASA Astrophysics Data System (ADS)
Pang, Zhibo; Zheng, Lirong; Tian, Junzhe; Kao-Walter, Sharon; Dubrova, Elena; Chen, Qiang
2015-01-01
In-home health care services based on the Internet-of-Things are promising to resolve the challenges caused by the ageing of the population. However, the existing research is rather scattered and shows a lack of interoperability. In this article, a business-technology co-design methodology is proposed for cross-boundary integration of in-home health care devices and services. In this framework, three key elements of a solution (business model, device and service integration architecture, and information system integration architecture) are organically integrated and aligned. In particular, a cooperative Health-IoT ecosystem is formulated, and the information systems of all stakeholders are integrated in a cooperative health cloud as well as extended to patients' homes through the in-home health care station (IHHS). Design principles of the IHHS include the reuse of the 3C platform, certification of the Health Extension, interoperability and extendibility, convenient and trusted software distribution, standardised and secured electronic health care record handling, effective service composition, and efficient data fusion. These principles are applied to the design of an IHHS solution called iMedBox. The detailed device and service integration architecture and the hardware and software architecture are presented and verified by an implemented prototype. Quantitative performance analysis and field trials have confirmed the feasibility of the proposed design methodology and solution.
NASA Astrophysics Data System (ADS)
Fulker, D. W.; Gallagher, J. H. R.
2015-12-01
OPeNDAP's Hyrax data server is an open-source framework fostering interoperability via easily-deployed Web services. Compatible with solutions listed in the (PA001) session description—federation, rigid standards and brokering/mediation—the framework can support tight or loose coupling, even with dependence on community-contributed software. Hyrax is a Web-services framework with a middleware-like design and a handler-style architecture that together reduce the interoperability challenge (for N datatypes and M user contexts) to an O(N+M) problem, similar to brokering. Combined with an open-source ethos, this reduction makes Hyrax a community tool for gaining interoperability. For example, in its response to the Big Earth Data Initiative (BEDI), NASA references OPeNDAP-based interoperability. Assuming its suitability, the question becomes: how sustainable is OPeNDAP, a small not-for-profit that produces open-source software, i.e., has no software sales? In other words, if geoscience interoperability depends on OPeNDAP and similar organizations, are those entities in turn sustainable? Jim Collins (in Good to Great) highlights three questions that successful companies can answer (paraphrased here): What is your passion? Where is your world-class excellence? What drives your economic engine? We attempt to shed light on OPeNDAP sustainability by examining these. Passion: OPeNDAP has a focused passion for improving the effectiveness of scientific data sharing and use, as deeply-cooperative community endeavors. Excellence: OPeNDAP has few peers in remote, scientific data access. Skills include computer science with experience in data science, (operational, secure) Web services, and software design (for servers and clients, where the latter vary from Web pages to standalone apps and end-user programs). Economic Engine: OPeNDAP is an engineering services organization more than a product company, despite software being key to OPeNDAP's reputation.
In essence, provision of engineering expertise, via contracts and grants, is the economic engine. Hence sustainability, as needed to address global grand challenges in geoscience, depends on agencies' and others' abilities and willingness to offer grants and let contracts for continually upgrading open-source software from OPeNDAP and others.
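The O(N+M) handler-style design mentioned above can be sketched generically (this is an illustration of the pattern, not Hyrax's actual code; the "csv" and "json" handlers are invented for the example): every input handler parses into one common internal model and every output handler renders from it, so a new datatype or response format costs one handler, never N×M pairwise translators.

```python
import json

# Generic sketch of a handler-style server: N input handlers and M output
# handlers meet at a single common internal model, so serving N datatypes
# in M user contexts needs N + M handlers rather than N * M translators.
input_handlers = {}   # datatype name -> parser into the common model
output_handlers = {}  # response format -> renderer from the common model

def register_input(datatype, parse):
    input_handlers[datatype] = parse

def register_output(fmt, render):
    output_handlers[fmt] = render

def serve(datatype, raw, fmt):
    common = input_handlers[datatype](raw)  # one translation in...
    return output_handlers[fmt](common)     # ...one translation out

# Example handlers (illustrative formats only).
register_input("csv", lambda raw: [line.split(",") for line in raw.splitlines()])
register_output("json", json.dumps)

print(serve("csv", "a,b\n1,2", "json"))  # prints [["a", "b"], ["1", "2"]]
```

Adding a second output format (say, HTML) touches only `register_output`, which is the essence of the O(N+M) claim.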
Designing learning management system interoperability in semantic web
NASA Astrophysics Data System (ADS)
Anistyasari, Y.; Sarno, R.; Rochmawati, N.
2018-01-01
The extensive adoption of learning management systems (LMS) has put the focus on the interoperability requirement. Interoperability is the ability of different computer systems, applications or services to communicate, share and exchange data, information, and knowledge in a precise, effective and consistent way. Semantic web technology and the use of ontologies are able to provide the required computational semantics and interoperability for the automation of tasks in an LMS. The purpose of this study is to design learning management system interoperability in the semantic web, which has not yet been investigated deeply. Moodle is utilized to design the interoperability: several database tables of Moodle are enhanced and some features are added. Semantic web interoperability is provided by an ontology exploited in the content materials. The ontology is further utilized as a searching tool to match users' queries with available courses. It is concluded that LMS interoperability in the semantic web is feasible.
AN OVERVIEW OF THE INTEROPERABILITY ROADMAP FOR COM/.NET-BASED CAPE-OPEN
The CAPE-OPEN standard interfaces have been designed to permit flexibility and modularization of process modelling environments (PMEs), in order to use process modeling components such as unit operation or thermodynamic property models across a range of tools employed in the life...
Human activities involving significant terrain alteration (e.g., earthworks operations associated with mines, urban development, landslides) can lead to broad-ranging changes in the surrounding terrestrial and aquatic environments. Potential aesthetic impacts can be associated wi...
Prototyping with AIDA for a hospital pharmacy system.
Molenaar, G C; Boon, W M
1987-01-01
The CENTRASYS system for the Hospital Pharmacy, developed as part of a research project of the Department of Medical Informatics, is described. The role of AIDA, a fourth-generation software package, as a prototyping tool is discussed. It is concluded that AIDA facilitates prototyping and is also very suitable as a vehicle for systems in operation. It is further concluded that prototyping is of great help in the developmental phase of a project, but that great care has to be taken during evaluation of the prototypes: minimize the number of test sites and try to prevent users from becoming dependent on the system, because every prototype needs further tuning before it really becomes an operational system.
Prototype design based on NX subdivision modeling application
NASA Astrophysics Data System (ADS)
Zhan, Xianghui; Li, Xiaoda
2018-04-01
Prototype design is an important part of product design, providing a quick and easy way to produce a three-dimensional product prototype. Combined with actual production, the prototype can be modified several times, resulting in a highly efficient and reasonable design before the formal design. Subdivision modeling is a common method of modeling product prototypes: with a few simple operations, a three-dimensional model of the product prototype can be obtained in a short time. This paper discusses the operation method of subdivision modeling for geometry. Taking a vacuum cleaner as an example, the NX subdivision modeling functions are applied. Finally, the development of subdivision modeling is forecast.
Medical Device Plug-and-Play Interoperability Standards and Technology Leadership
2017-10-01
Award Number: W81XWH-09-1-0705. Title: "Medical Device Plug-and-Play Interoperability Standards and Technology Leadership." Principal Investigator: ... Reporting period: 21 Sept 2016 - 20 Sept 2017. ...efficiency through interoperable medical technologies. We played a leadership role on interoperability safety standards (AAMI, AAMI/UL Joint
Moving Towards a Science-Driven Workbench for Earth Science Solutions
NASA Astrophysics Data System (ADS)
Graves, S. J.; Djorgovski, S. G.; Law, E.; Yang, C. P.; Keiser, K.
2017-12-01
The NSF-funded EarthCube Integration and Test Environment (ECITE) prototype was proposed as a 2015 Integrated Activities project and resulted in the prototyping of an EarthCube federated cloud environment and the Integration and Testing Framework. The ECITE team has worked with EarthCube science and technology governance committees to define the types of integration, testing and evaluation necessary to achieve and demonstrate interoperability and functionality that benefit and support the objectives of the EarthCube cyber-infrastructure. The scope of ECITE also includes reaching beyond NSF and EarthCube to work with the broader Earth science community, such as the Earth Science Information Partners (ESIP) to incorporate lessons learned from other testbed activities, and ultimately provide broader community benefits. This presentation will discuss evolving ECITE ideas for a science-driven workbench that will start with documented science use cases, map the use cases to solution scenarios that identify the available technology and data resources that match the use case, the generation of solution workflows and test plans, the testing and evaluation of the solutions in a cloud environment, and finally the documentation of identified technology and data gaps that will assist with driving the development of additional EarthCube resources.
Dynamic Communication Resource Negotiations
NASA Technical Reports Server (NTRS)
Chow, Edward; Vatan, Farrokh; Paloulian, George; Frisbie, Steve; Srostlik, Zuzana; Kalomiris, Vasilios; Apgar, Daniel
2012-01-01
Today's advanced network management systems can automate many aspects of tactical networking operations within a military domain. However, automation of joint and coalition tactical networking across multiple domains remains challenging. Due to potentially conflicting goals and priorities, human agreement is often required before implementation into the network operations. This is further complicated by incompatible network management systems and security policies, which make automatic network management difficult to implement and so require manual human intervention in the communication protocols used at various network routers and endpoints. This process of manual human intervention is tedious, error-prone, and slow. In order to facilitate a better solution, we are pursuing a technology which makes network management automated, reliable, and fast. Automating the negotiation of common network communication parameters between different parties is the subject of this paper. We present the technology that enables inter-force dynamic communication resource negotiations, allowing ad-hoc inter-operation in the field between force domains without pre-planning. It will also enable a dynamic response to changing conditions within the area of operations. Our solution enables the rapid blending of intra-domain policies so that the forces involved are able to inter-operate effectively without overwhelming each other's networks with inappropriate or unwarranted traffic. It evaluates the policy rules and configuration data for each of the domains, then generates a compatible inter-domain policy and configuration that will update the gateway systems between the two domains.
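A minimal sketch of the policy-blending step, under a deliberately simplified assumption (each domain's policy is reduced to a per-traffic-class bandwidth cap; the class names and numbers are invented): the generated inter-domain policy takes the more restrictive limit for each class, so neither force's network is flooded.

```python
# Simplified illustration of inter-domain policy blending: each policy is
# assumed to be a dict of per-traffic-class kbit/s caps, and the blended
# policy takes the more restrictive cap for every class either side knows.

def blend_policies(domain_a, domain_b):
    classes = set(domain_a) | set(domain_b)
    return {c: min(domain_a.get(c, float("inf")),
                   domain_b.get(c, float("inf")))
            for c in classes}

policy_a = {"voice": 64, "video": 2000, "bulk": 10000}  # kbit/s caps
policy_b = {"voice": 128, "video": 1000}
blended = blend_policies(policy_a, policy_b)
print(blended["video"])  # prints 1000
```

The real negotiation described in the paper covers far more than bandwidth caps (security policies, router configuration, gateway updates), but the "most restrictive wins" shape of the merge is the same.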
Galarraga, M; Serrano, L; Martinez, I; de Toledo, P; Reynolds, Melvin
2007-01-01
Advances in Information and Communication Technologies (ICT) are bringing new opportunities and use cases in the field of systems and personal health devices used for the telemonitoring of citizens in home or mobile scenarios. At a time of such challenges, this review arises from the need to identify robust technical telemonitoring solutions that are both open and interoperable. These systems demand standardized solutions to be cost-effective and to take advantage of standardized operation and interoperability. Thus, the fundamental challenge is to design plug-and-play devices that, either as individual elements or as components, can be incorporated in a simple way into different telecare systems, perhaps configuring a personal user network. Moreover, there is increasing market pressure from companies not traditionally involved in medical markets, asking for a standard for personal health devices, foreseeing a vast demand for telemonitoring, wellness, Ambient Assisted Living (AAL) and e-health applications. However, the newly emerging situations imply very strict requirements for the protocols involved in the communication. The ISO/IEEE 11073 family of standards is adapting and moving in order to face this challenge, and appears to be the best-positioned set of international standards to reach this goal. This work presents an updated survey of these standards, tracking the changes that are being made, and aims to serve as a starting point for those who want to familiarize themselves with them.
Distributed controller clustering in software defined networks
Gani, Abdullah; Akhunzada, Adnan; Talebian, Hamid; Choo, Kim-Kwang Raymond
2017-01-01
Software Defined Networking (SDN) is an emerging and promising paradigm for network management because of its centralized network intelligence. However, the centralized control architecture of software-defined networks (SDNs) brings novel challenges of reliability, scalability, fault tolerance and interoperability. In this paper, we propose a novel clustered distributed controller architecture in a real setting of SDNs. The distributed cluster implementation comprises multiple popular SDN controllers. The proposed mechanism is evaluated using a real-world network topology running on top of an emulated SDN environment. The results show that the proposed distributed controller clustering mechanism is able to significantly reduce the average latency from 8.1% to 1.6% and the packet loss from 5.22% to 4.15%, compared to distributed controllers without clustering running on HP Virtual Application Network (VAN) SDN and Open Network Operating System (ONOS) controllers respectively. Moreover, the proposed method also shows reasonable CPU utilization results. Furthermore, the proposed mechanism makes it possible to handle unexpected load fluctuations while maintaining continuous network operation, even when there is a controller failure. The paper is a potential contribution towards addressing the issues of reliability, scalability, fault tolerance, and interoperability. PMID:28384312
NASA Astrophysics Data System (ADS)
Murarka, Naveen; Chambers, Jon
2012-06-01
Multiple sensors, providing actionable intelligence to the war fighter, often have difficulty interoperating with each other. Northrop Grumman (NG) is dedicated to solving these problems and providing complete solutions for persistent surveillance. In August, 2011, NG was invited to participate in the Tactical Network Topology (TNT) Capabilities Based Experimentation at Camp Roberts, CA to demonstrate integrated system capabilities providing Forward Operating Base (FOB) protection. This experiment was an opportunity to leverage previous efforts from NG's Rotorcraft Avionics Innovation Laboratory (RAIL) to integrate five prime systems with widely different capabilities. The five systems included a Hostile Fire and Missile Warning Sensor System, SCORPION II Unattended Ground Sensor system, Smart Integrated Vehicle Area Network (SiVAN), STARLite Synthetic Aperture Radar (SAR)/Ground Moving Target Indications (GMTI) radar system, and a vehicle with Target Location Module (TLM) and Laser Designation Module (LDM). These systems were integrated with each other and a Tactical Operations Center (TOC) equipped with RaptorX and Falconview providing a Common Operational Picture (COP) via Cursor on Target (CoT) messages. This paper will discuss this exercise, and the lessons learned, by integrating these five prime systems for persistent surveillance and FOB protection.
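Cursor on Target (CoT) messages like those feeding the Common Operational Picture above are small XML events. The sketch below builds a schematic one with the standard library; the UID, event type, and coordinates are illustrative values, not data from the TNT experiment.

```python
import xml.etree.ElementTree as ET
from datetime import datetime, timedelta, timezone

# Schematic CoT event: an <event> element with identity/time attributes and
# a child <point> carrying the position. All field values are made up here.
def make_cot_event(uid, lat, lon, cot_type="a-f-G", stale_s=60):
    now = datetime.now(timezone.utc)
    fmt = "%Y-%m-%dT%H:%M:%SZ"
    event = ET.Element("event", {
        "version": "2.0", "uid": uid, "type": cot_type,
        "time": now.strftime(fmt), "start": now.strftime(fmt),
        "stale": (now + timedelta(seconds=stale_s)).strftime(fmt),
        "how": "m-g",  # machine-generated position
    })
    ET.SubElement(event, "point", {
        "lat": str(lat), "lon": str(lon),
        "hae": "0.0", "ce": "9999999.0", "le": "9999999.0",
    })
    return ET.tostring(event, encoding="unicode")

msg = make_cot_event("SENSOR-01", 35.789, -120.737)
```

Because every integrated system emits the same small schema, a COP tool such as RaptorX or Falconview can plot any of the five systems' tracks without pairwise adapters.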
NASA Astrophysics Data System (ADS)
Shiklomanov, A. I.; Okladnikov, I.; Gordov, E. P.; Proussevitch, A. A.; Titov, A. G.
2016-12-01
Presented is a collaborative project carried out by a joint team of researchers from the Institute of Monitoring of Climatic and Ecological Systems, Russia and the Earth Systems Research Center, University of New Hampshire, USA. Its main objective is the development of a hardware and software prototype of a Distributed Research Center (DRC) for monitoring and projecting regional climatic changes and their impacts on the environment over the Northern extratropical areas. In the framework of the project, new approaches to "cloud" processing and analysis of large geospatial datasets (big geospatial data) are being developed. It will be deployed on the technical platforms of both institutions and applied in research on climate change and its consequences. Datasets available at NCEI and IMCES include multidimensional arrays of climatic, environmental, demographic, and socio-economic characteristics. The project is aimed at solving several major research and engineering tasks: 1) structure analysis of the huge heterogeneous climate and environmental geospatial datasets used in the project, and their preprocessing and unification; 2) development of a new distributed storage and processing model based on a "shared nothing" paradigm; 3) development of a dedicated database of metadata describing the geospatial datasets used in the project; 4) development of a dedicated geoportal and a high-end graphical frontend providing an intuitive user interface, internet-accessible online tools for analysis of geospatial data, and web services for interoperability with other geoprocessing software packages. The DRC will operate as a single access point to distributed archives of spatial data and online tools for their processing. A flexible modular computational engine running verified data processing routines will provide solid results of geospatial data analysis. The "cloud" data analysis and visualization approach will guarantee access to the DRC online tools and data from all over the world.
Additionally, exporting of data processing results through WMS and WFS services will be used to provide their interoperability. Financial support of this activity by the RF Ministry of Education and Science under Agreement 14.613.21.0037 (RFMEFI61315X0037) and by the Iola Hubbard Climate Change Endowment is acknowledged.
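The WMS interoperability mentioned above amounts to clients issuing standard HTTP requests. A hypothetical GetMap request can be assembled as below; the endpoint and layer name are invented for illustration, but the parameters follow the OGC WMS 1.3.0 convention.

```python
from urllib.parse import urlencode

# Hypothetical WMS GetMap request for a geoportal like the DRC's; endpoint
# and layer name are made up, parameter names follow OGC WMS 1.3.0.
def wms_getmap_url(base, layer, bbox, width=800, height=600):
    params = {
        "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
        "LAYERS": layer, "CRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),  # min/max lat-lon corners
        "WIDTH": width, "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return base + "?" + urlencode(params)

url = wms_getmap_url("https://drc.example/wms", "air_temperature",
                     (50.0, -180.0, 75.0, 180.0))
```

Because the request shape is standardized, any WMS-aware GIS client can pull map layers from such a geoportal with no custom integration, which is exactly the interoperability the abstract claims for the exported services.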
Enriched biodiversity data as a resource and service.
Vos, Rutger Aldo; Biserkov, Jordan Valkov; Balech, Bachir; Beard, Niall; Blissett, Matthew; Brenninkmeijer, Christian; van Dooren, Tom; Eades, David; Gosline, George; Groom, Quentin John; Hamann, Thomas D; Hettling, Hannes; Hoehndorf, Robert; Holleman, Ayco; Hovenkamp, Peter; Kelbert, Patricia; King, David; Kirkup, Don; Lammers, Youri; DeMeulemeester, Thibaut; Mietchen, Daniel; Miller, Jeremy A; Mounce, Ross; Nicolson, Nicola; Page, Rod; Pawlik, Aleksandra; Pereira, Serrano; Penev, Lyubomir; Richards, Kevin; Sautter, Guido; Shorthouse, David Peter; Tähtinen, Marko; Weiland, Claus; Williams, Alan R; Sierra, Soraya
2014-01-01
Recent years have seen a surge in projects that produce large volumes of structured, machine-readable biodiversity data. To make these data amenable to processing by generic, open source "data enrichment" workflows, they are increasingly being represented in a variety of standards-compliant interchange formats. Here, we report on an initiative in which software developers and taxonomists came together to address the challenges and highlight the opportunities in the enrichment of such biodiversity data by engaging in intensive, collaborative software development: The Biodiversity Data Enrichment Hackathon. The hackathon brought together 37 participants (including developers and taxonomists, i.e. scientific professionals that gather, identify, name and classify species) from 10 countries: Belgium, Bulgaria, Canada, Finland, Germany, Italy, the Netherlands, New Zealand, the UK, and the US. The participants brought expertise in processing structured data, text mining, development of ontologies, digital identification keys, geographic information systems, niche modeling, natural language processing, provenance annotation, semantic integration, taxonomic name resolution, web service interfaces, workflow tools and visualisation. Most use cases and exemplar data were provided by taxonomists. One goal of the meeting was to facilitate re-use and enhancement of biodiversity knowledge by a broad range of stakeholders, such as taxonomists, systematists, ecologists, niche modelers, informaticians and ontologists. The suggested use cases resulted in nine breakout groups addressing three main themes: i) mobilising heritage biodiversity knowledge; ii) formalising and linking concepts; and iii) addressing interoperability between service platforms. Another goal was to further foster a community of experts in biodiversity informatics and to build human links between research projects and institutions, in response to recent calls to further such integration in this research domain. 
Beyond deriving prototype solutions for each use case, areas of inadequacy were discussed and are being pursued further. It was striking how many possible applications for biodiversity data there were, and how quickly solutions could be put together when the normal constraints on collaboration were broken down for a week. Conversely, mobilising biodiversity knowledge out of its silos in heritage literature and natural history collections will continue to require formalisation of the concepts (and the links between them) that define the research domain, as well as increased interoperability between the software platforms that operate on these concepts.
Cook, Timothy Wayne; Cavalini, Luciana Tricai
2016-01-01
Objectives: To present the technical background and the development of a procedure that enriches the semantics of Health Level Seven version 2 (HL7v2) messages for software-intensive systems in telemedicine trauma care. Methods: This study followed a multilevel model-driven approach for the development of semantically interoperable health information systems. The Pre-Hospital Trauma Life Support (PHTLS) ABCDE protocol was adopted as the use case. A prototype application embedded the semantics into an HL7v2 message as an eXtensible Markup Language (XML) file, which was validated against an XML schema that defines constraints on a common reference model. This message was exchanged with a second prototype application, developed on the Mirth middleware, which was also used to parse and validate both the original and the hybrid messages. Results: Both versions of the data instance (one pure XML, one embedded in the HL7v2 message) were equally validated, and the RDF-based semantics were recovered by the receiving side of the prototype from the shared XML schema. Conclusions: This study demonstrated the semantic enrichment of HL7v2 messages for software-intensive telemedicine systems for trauma care, by validating components of extracts generated in various computing environments. The adoption of the method proposed in this study ensures the compliance of HL7v2 messages with Semantic Web technologies. PMID:26893947
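One simplified way to picture the hybrid message (the segment layout and payload below are assumptions for illustration, not the study's actual extract): the XML rides inside an observation-value field of the pipe-delimited HL7v2 message, and the receiving side parses it back out before schema validation.

```python
import xml.etree.ElementTree as ET

# Simplified illustration (not the study's actual extract): an HL7v2 message
# is pipe-delimited, and here an XML payload is embedded in the OBX segment's
# observation value field (OBX-5).
message = ("MSH|^~\\&|TRAUMA|FIELD|ER|HOSP|20160101||ORU^R01|1|P|2.5\r"
           "OBX|1|ED|PHTLS^ABCDE||<abcde><airway>clear</airway></abcde>")

def extract_embedded_xml(msg):
    for segment in msg.split("\r"):
        fields = segment.split("|")
        if fields[0] == "OBX":
            return ET.fromstring(fields[5])  # OBX-5 holds the payload
    return None

root = extract_embedded_xml(message)
print(root.find("airway").text)  # prints clear
```

In the actual study the recovered XML is then validated against the shared schema (a step omitted here, since XSD validation needs a third-party library), which is what carries the semantics across the two prototypes.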
Big issues, small systems: managing with information in medical research.
Jones, J; Preston, H
2000-08-01
The subject of this article is the design of a database system for handling files related to the work of the Molecular Genetics Department of the International Blood Group Reference Laboratory. It examines specialist information needs identified within this organization and indicates how the design of the Rhesus Information Tracking System was able to meet current needs. Rapid Applications Development prototyping forms the basis of the investigation, linked to interview, questionnaire, and observation techniques in order to establish requirements for interoperability. In particular, the place of this specialist database within the much broader information strategy of the National Blood Service is examined. This unique situation is analogous to management activities in broader environments, and a number of generic issues are highlighted by the research.
The challenges facing wearable sensor systems.
McAdams, Eric; Gehin, Claudine; Massot, Bertrand; McLaughlin, James
2012-01-01
It has been pointed out that, in spite of significant national and international funding programmes, there is a dearth of successfully commercialised wearable monitoring systems. Although problems such as financial reimbursement, device interoperability and the present lack of the required connected healthcare infrastructure are major hurdles to the provision of remote clinical monitoring of home-based patients, the "Mount Everest" of monitoring applications, why are wearable systems not already commercialised and used in less demanding applications? The numerous wearable systems which appear on the Web and even in the literature are, for the most part, basic prototypes unsuited to the demands of real-life applications. SMEs which do seek to commercialise clinically promising systems are unfortunately faced with many challenges and few as yet have survived long enough to successfully commercialise their innovations.
A New Design for Airway Management Training with Mixed Reality and High Fidelity Modeling.
Shen, Yunhe; Hananel, David; Zhao, Zichen; Burke, Daniel; Ballas, Crist; Norfleet, Jack; Reihsen, Troy; Sweet, Robert
2016-01-01
Restoring airway function is a vital task in many medical scenarios. Although various simulation tools have been available for learning such skills, recent research indicated that fidelity in simulating airway management deserves further improvement. In this study, we designed and implemented a new prototype for practicing relevant tasks including laryngoscopy, intubation and cricothyrotomy. A large number of anatomical details and landmarks were meticulously selected and reconstructed from medical scans, and 3D-printed or molded into the airway intervention model. This training model was augmented by virtually and physically presented interactive modules, which are interoperable with motion tracking and sensor data feedback. Implementation results showed that this design is a feasible approach to developing higher-fidelity airway models that can be integrated with mixed reality interfaces.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-27
... works such as video games and slide presentations). B. Computer programs that enable wireless telephone... enabling interoperability of such applications, when they have been lawfully obtained, with computer... new printer driver to a computer constitutes a `modification' of the operating system already...
Smart Grid | Climate Neutral Research Campuses | NREL
Many research campuses have begun to build smart grids. Most operate electricity grids that include power generation and load control; plant managers use these communications for energy management and load shedding, and are familiar with equipment interoperability, central dispatch, and load shedding. These are common in smart grids.
Continuation of the interoperable coordinated signal system deployment in White Plains, New York.
DOT National Transportation Integrated Search
2015-12-01
The City of White Plains, NY owns and operates an advanced traffic control system (TCS) that monitors and controls over 130 intersections in real time. Its Traffic Department facility is not staffed 24 hours a day, 7 days a week, but two other ce...
Opening up Library Automation Software
ERIC Educational Resources Information Center
Breeding, Marshall
2009-01-01
Throughout the history of library automation, the author has seen a steady advancement toward more open systems. In the early days of library automation, when proprietary systems dominated, the need for standards was paramount since other means of inter-operability and data exchange weren't possible. Today's focus on Application Programming…
Grossman, Robert L.; Heath, Allison; Murphy, Mark; Patterson, Maria; Wells, Walt
2017-01-01
Data commons collocate data, storage, and computing infrastructure with core services and commonly used tools and applications for managing, analyzing, and sharing data to create an interoperable resource for the research community. An architecture for data commons is described, as well as some lessons learned from operating several large-scale data commons. PMID:29033693
ERIC Educational Resources Information Center
Thornton, Bradley D.; Smalley, Robert A.
2008-01-01
Building information modeling (BIM) uses three-dimensional modeling concepts, information technology and interoperable software to design, construct and operate a facility. However, BIM can be more than a tool for virtual modeling--it can provide schools with a 3-D walkthrough of a project while it still is on the electronic drawing board. BIM can…
The Mars Express/NASA Project at JPL
NASA Technical Reports Server (NTRS)
Thompson, Thomas W.; Horttor, R. L.; Acton, C. H., Jr.; Zamani, P.; Johnson, W. T. K.; Plaut, J. J.; Holmes, D. P.; No, S.; Asmar, S. W.; Goltz, G.
2005-01-01
An overview of the Mars Express/NASA Project at JPL is presented. The topics include: 1) Mars Express Mission Experiments and Investigators; 2) Mars Advanced Radar for Subsurface and Ionospheric Sounding (MARSIS) Overview; 3) MARSIS Experiment Overview; 4) Interoperability Concept; 5) Mars Express Science Operations; 6) Mars Express Schedule (2003-2007);
A Future Vision for Remotely Piloted Aircraft: Leveraging Interoperability and Networked Operations
2013-06-21
over the next 25 years Balances the effects envisioned in the USAF UAS Flight Plan with the reality of constrained resources and ambitious...theater-level unmanned systems must detect, avoid, or counter threats – operating from permissive to highly contested access in all weather...Rapid Reaction Group II/III SUAS Unit Light Footprint, Low Cost ISR Option Networked Autonomous C2 System Air-Launched SUAS Common
Training Challenges for the U.S. Army in the Pacific
2013-03-01
each other on a frequent enough basis to be thought of as “interoperable.” All the parties involved have different standard operating procedures ...13 All have different command and control technologies and all have different procedures to plan, prepare, coordinate and synchronize operations...does not fix it. Joint Task Force tactics, techniques and procedures are eventually developed but they take time and amount to a band aid as opposed
Graphics-oriented Battlefield Tracking Systems: U.S. Army and Air Force Interoperability
2010-12-10
systems. There are hundreds of applications used to control joint air operations, to synchronize land forces, and to develop a common operational...plan and synchronize the employment of combat power (Department of Defense 2010c, II-9). The subsystem that facilitates graphics-oriented...during the battle recounts, “we had so many different assets up in the air . . . they were stacked up on so many different levels” (Jung 2009
The Army's Armored Multi-Purpose Vehicle (AMPV): Background and Issues for Congress
2017-01-11
M-113 personnel carriers, which are still in service in a variety of support capacities in Armored Brigade Combat Teams (ABCTs). While M-113s no...reliability, and interoperability by mission role variant within the Heavy Brigade Combat Team (HBCT) [now known as the Armored Brigade Combat Team – ABCT... teams within complex operational environments. For example, “commanders will not allow them to leave Forward Operating Bases (FOBs) or enter
2004-11-01
fixed schedule checkups and overhauls), Underway Replenishment (the goal being operation in sea states 3 and higher via technical improvements to...determine whether sufficient local resources are available to deal with current conditions. Scheduling Agent: Assists the Emergency Operations Bureau to...will commence per predefined schedule within 15 minutes) and subsequently alerts its subscribers that the rolling power blackout has commenced. The
Human Factors and Technical Considerations for a Computerized Operator Support System Prototype
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ulrich, Thomas Anthony; Lew, Roger Thomas; Medema, Heather Dawne
2015-09-01
A prototype computerized operator support system (COSS) has been developed in order to demonstrate the concept and provide a test bed for further research. The prototype is based on four underlying elements consisting of a digital alarm system, computer-based procedures, P&ID system representations, and a recommender module for mitigation actions. At this point, the prototype simulates an interface to a sensor validation module and a fault diagnosis module. These two modules will be fully integrated in the next version of the prototype. The initial version of the prototype is now operational at the Idaho National Laboratory using the U.S. Department of Energy's Light Water Reactor Sustainability (LWRS) Human Systems Simulation Laboratory (HSSL). The HSSL is a full-scope, full-scale glass top simulator capable of simulating existing and future nuclear power plant main control rooms. The COSS is interfaced to the Generic Pressurized Water Reactor (gPWR) simulator with industry-typical control board layouts. The glass top panels display realistic images of the control boards that can be operated by touch gestures. A section of the simulated control board was dedicated to the COSS human-system interface (HSI), which resulted in a seamless integration of the COSS into the normal control room environment. A COSS demonstration scenario has been developed for the prototype involving the Chemical & Volume Control System (CVCS) of the PWR simulator. It involves a primary coolant leak outside of containment that would require tripping the reactor if not mitigated in a very short timeframe. The COSS prototype presents a series of operator screens that provide the needed information and soft controls to successfully mitigate the event.
Interoperability and information discovery
Christian, E.
2001-01-01
In the context of information systems, there is interoperability when the distinctions between separate information systems are not a barrier to accomplishing a task that spans those systems. Interoperability so defined implies that there are commonalities among the systems involved and that one can exploit such commonalities to achieve interoperability. The challenge of a particular interoperability task is to identify relevant commonalities among the systems involved and to devise mechanisms that exploit those commonalities. The present paper focuses on the particular interoperability task of information discovery. The Global Information Locator Service (GILS) is described as a policy, standards, and technology framework for addressing interoperable information discovery on a global and long-term basis. While there are many mechanisms for people to discover and use all manner of data and information resources, GILS initiatives exploit certain key commonalities that seem to be sufficient to realize useful information discovery interoperability at a global, long-term scale. This paper describes ten of the specific commonalities that are key to GILS initiatives. It presents some of the practical implications for organizations in various roles: content provider, system engineer, intermediary, and searcher. The paper also provides examples of interoperable information discovery as deployed using GILS in four types of information communities: bibliographic, geographic, environmental, and government.
Federal Register 2010, 2011, 2012, 2013, 2014
2015-08-03
...] Promoting Semantic Interoperability of Laboratory Data; Public Workshop; Request for Comments AGENCY: Food... workshop entitled ``FDA/CDC/NLM Workshop on Promoting Semantic Interoperability of Laboratory Data.'' The... to promoting the semantic interoperability of laboratory data between in vitro diagnostic devices and...
Biometric identification: a holistic perspective
NASA Astrophysics Data System (ADS)
Nadel, Lawrence D.
2007-04-01
Significant advances continue to be made in biometric technology. However, the global war on terrorism and our increasingly electronic society have created the societal need for large-scale, interoperable biometric capabilities that challenge the capabilities of current off-the-shelf technology. At the same time, there are concerns that large-scale implementation of biometrics will infringe our civil liberties and offer increased opportunities for identity theft. This paper looks beyond the basic science and engineering of biometric sensors and fundamental matching algorithms and offers approaches for achieving greater performance and acceptability of applications enabled with currently available biometric technologies. The discussion focuses on three primary biometric system aspects: performance and scalability, interoperability, and cost benefit. Significant improvements in system performance and scalability can be achieved through careful consideration of the following elements: biometric data quality, human factors, operational environment, workflow, multibiometric fusion, and integrated performance modeling. Application interoperability hinges upon some of the factors noted above as well as adherence to interface, data, and performance standards. However, there are times when the price of conforming to such standards can be a decrease in local system performance. The development of biometric performance-based cost benefit models can help determine realistic requirements and acceptable designs.
Expert operator's associate: A knowledge based system for spacecraft control
NASA Technical Reports Server (NTRS)
Nielsen, Mogens; Grue, Klaus; Lecouat, Francois
1991-01-01
The Expert Operator's Associate (EOA) project is presented, which studies the applicability of expert systems to day-to-day space operations. A prototype expert system is developed which operates on-line with an existing spacecraft control system at the European Space Operations Centre and functions as an 'operator's assistant' in controlling satellites. The prototype is demonstrated using an existing real-time simulation model of the MARECS-B2 telecommunication satellite. By developing a prototype system, the extent to which the reliability and effectiveness of operations can be enhanced by AI-based support is examined. In addition, the study examines the questions of acquisition and representation of the 'knowledge' for such systems, and the feasibility of 'migration' of some (currently) ground-based functions into future spaceborne autonomous systems.
75 FR 63462 - Smart Grid Interoperability Standards; Notice of Docket Designation for Smart Grid...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-15
... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. RM11-2-000] Smart Grid Interoperability Standards; Notice of Docket Designation for Smart Grid Interoperability Standards October 7, 2010... directs the development of a framework to achieve interoperability of smart grid devices and systems...
Federal Register 2010, 2011, 2012, 2013, 2014
2016-10-04
...] Workshop on Promoting Semantic Interoperability of Laboratory Data; Public Workshop; Request for Comments... Semantic Interoperability of Laboratory Data.'' The purpose of this public workshop is to receive and... Semantic Interoperability of Laboratory Data.'' Received comments will be placed in the docket and, except...
Building a Global Earth Observation System of Systems (GEOSS) and Its Interoperability Challenges
NASA Astrophysics Data System (ADS)
Ryan, B. J.
2015-12-01
Launched in 2005 by industrialized nations, the Group on Earth Observations (GEO) began building the Global Earth Observation System of Systems (GEOSS). Consisting of both a policy framework and an information infrastructure, GEOSS was intended to link and/or integrate the multitude of Earth observation systems, primarily operated by its Member Countries and Participating Organizations, so that users could more readily benefit from global information assets for a number of society's key environmental issues. It was recognized that having ready access to observations from multiple systems was a prerequisite for both environmental decision-making and economic development. From the very start, it was also recognized that the sheer complexity of the Earth's system cannot be captured by any single observation system, and that a federated, interoperable approach was necessary. While this international effort has met with much success, primarily in advancing broad, open data policies and practices, challenges remain. In 2014 (Geneva, Switzerland) and 2015 (Mexico City, Mexico), Ministers from GEO's Member Countries, including the European Commission, came together to assess progress made during the first decade (2005 to 2015) and approve implementation strategies and mechanisms for the second decade (2016 to 2025), respectively. The approved implementation strategies and mechanisms are intended to advance GEOSS development, thereby facilitating the increased uptake of Earth observations for informed decision-making. Clearly there are interoperability challenges that are technological in nature, and several will be discussed in this presentation. There are, however, interoperability challenges that can be better characterized as economic, governmental and/or political in nature, and these will be discussed as well. 
With the emergence of the Sustainable Development Goals (SDGs), the World Conference on Disaster Risk Reduction (WCDRR), and the United Nations Framework Convention on Climate Change (UNFCCC) having occurred this year, it will be essential that the interoperability challenges described herein, regardless of their nature, be expeditiously addressed so that Earth observations can indeed inform societal decision-making.
NASA Astrophysics Data System (ADS)
Mazzetti, P.; Nativi, S.; Verlato, M.; Angelini, V.
2009-04-01
In the context of the EU co-funded project CYCLOPS (http://www.cyclops-project.eu) the problem of designing an advanced e-Infrastructure for Civil Protection (CP) applications has been addressed. As a preliminary step, some studies of European CP systems and operational applications were performed in order to define their specific system requirements. At a higher level it was verified that CP applications are usually conceived to map CP Business Processes involving different levels of processing, including data access, data processing, and output visualization. At their core they usually run one or more Earth Science models for information extraction. The traditional approach based on the development of monolithic applications presents some limitations related to flexibility (e.g. the possibility of running the same models with different input data sources, or different models with the same data sources) and scalability (e.g. launching several runs for different scenarios, or implementing more accurate and computing-demanding models). Flexibility can be addressed by adopting a modular design based on a SOA and standard services and models, such as OWS and ISO for geospatial services. Distributed computing and storage solutions could improve scalability. Based on such considerations, an architectural framework has been defined. It is made of a Web Service layer providing advanced services for CP applications (e.g. standard geospatial data sharing and processing services) working on the underlying Grid platform. This framework has been tested through the development of prototypes as proof-of-concept. These theoretical studies and proofs-of-concept demonstrated that although Grid and geospatial technologies would be able to provide significant benefits to CP applications in terms of scalability and flexibility, current platforms are designed taking into account requirements different from those of CP. 
In particular CP applications have strict requirements in terms of: a) Real-Time capabilities, privileging time-of-response instead of accuracy, b) Security services to support complex data policies and trust relationships, c) Interoperability with existing or planned infrastructures (e.g. e-Government, INSPIRE compliant, etc.). Actually these requirements are the main reason why CP applications differ from Earth Science applications. Therefore further research is required to design and implement an advanced e-Infrastructure satisfying those specific requirements. In particular five themes where further research is required were identified: Grid Infrastructure Enhancement, Advanced Middleware for CP Applications, Security and Data Policies, CP Applications Enablement, and Interoperability. For each theme several research topics were proposed and detailed. They are targeted to solve specific problems for the implementation of an effective operational European e-Infrastructure for CP applications.
Villain, Max A; Greenfield, David S
2003-01-01
To assess reproducibility of quadrantic and clock hour sectors of retinal nerve fiber layer thickness in normal eyes using optical coherence tomography. Normal eyes of healthy volunteers meeting eligibility criteria were imaged by two inexperienced operators. Six 360-degree circular scans with a diameter of 3.4 mm centered on the optic disc were obtained during each scanning session, and a baseline image was formed using 3 high-quality images defined by the software. Images were obtained on three different days within a 4-week period. Variance and coefficient of variation (CV) were calculated for quadrantic and retinal nerve fiber layer clock hour sectors obtained from the baseline image. Five normal eyes were scanned. Intraoperator reproducibility was high. The mean (+/- SD) CV for total retinal nerve fiber layer thickness was 5.3 +/- 3.82% and 4.33 +/- 3.7% for operators 1 and 2, respectively. Interoperator reproducibility was good, with statistically similar variance for all quadrantic and clock hour retinal nerve fiber layer parameters (P = .42 to .99). The nasal retinal nerve fiber layer was the most variable sector for both operators (mean CV: 10.42% and 7.83% for operators 1 and 2, respectively). Differences in mean total, nasal, temporal, and superior retinal nerve fiber layer thickness were not statistically significant between operators for all eyes; however, for inferior retinal nerve fiber layer thickness, there was a significant (P = .0007) difference between operators in one eye. Peripapillary retinal nerve fiber layer thickness assessments using optical coherence tomography have good intraoperator and interoperator reproducibility. Inexperienced operators can generate useful measurement data with acceptable levels of variance.
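The coefficient of variation reported in the abstract above is simply the sample standard deviation expressed as a percentage of the mean. A minimal sketch of the computation, using hypothetical repeated thickness readings (the values below are illustrative, not data from the study):

```python
import statistics

def coefficient_of_variation(measurements):
    """CV (%) = sample standard deviation / mean * 100."""
    mean = statistics.mean(measurements)
    sd = statistics.stdev(measurements)  # sample SD (n-1 denominator)
    return sd / mean * 100.0

# Hypothetical repeated total RNFL thickness readings (micrometers)
scans = [102.0, 98.5, 100.3, 101.1, 99.2, 100.9]
print(f"CV = {coefficient_of_variation(scans):.2f}%")
```

A low CV across repeated scans of the same eye is what the study means by high intraoperator reproducibility.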
NASA Astrophysics Data System (ADS)
Kenney, M. A.
2014-12-01
Climate and environmental decisions require science that couples human and natural systems to quantify or articulate the observed physical, natural, and societal changes or the likely consequences of different decision options. Despite the need for such policy-relevant research, multidisciplinary collaborations can be fraught with challenges of data integration, model interoperability, and communication across disciplinary divides. In this talk, I will present several examples where I have collaborated with colleagues from the physical, natural, and social sciences to develop novel, actionable science to inform decision-making. Specifically, I will discuss a cost analysis of water and sediment diversions to optimize land building in the Mississippi River delta (winner of American Geophysical Union Water Resources Research Editor's Choice Award 2014) and the development of a National Climate Indicator System that uses knowledge across the physical, natural, and social sciences to establish an end-to-end indicator system of climate changes, impacts, vulnerabilities, and responses. The latter project is in the process of moving from research to operations, an additional challenge and opportunity, as we work with the U.S. Global Change Research Program and their affiliated Federal agencies to establish it beyond the research prototype. Using these examples, I will provide some lessons learned that would have general applicability to socio-environmental research collaborations and integration of data, models, and information systems to support climate and environmental decision-making.
Robotics Systems Joint Project Office (RSJPO) Interoperability Profiles (IOPS) 101
2012-07-01
interoperability, although they are supported by some interoperability attributes. For example, stair climbing: stair climbing is not something that...IOPs need to specify; however, the mobility & actuation related interoperable messages can be used to provide stair climbing. Also...interoperability can enable management of different poses or modes, one of which may be stair climbing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Universal Common Communication Substrate (UCCS) is a low-level communication substrate that exposes high-performance communication primitives while providing network interoperability. It is intended to support multiple upper layer protocols (ULPs) or programming models, including SHMEM, UPC, Titanium, Co-Array Fortran, Global Arrays, MPI, GASNet, and File I/O. It provides various communication operations, including one-sided and two-sided point-to-point, collectives, and remote atomic operations. In addition to operations for ULPs, it provides the out-of-band communication channel typically required to wire up communication libraries.
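The abstract above distinguishes one-sided from two-sided point-to-point operations. The substrate's actual API is not shown there, so the sketch below models only the semantic difference in plain Python: a one-sided put writes directly into a remotely exposed memory window with no matching call at the target, while a two-sided exchange requires a send paired with a receive. All class and function names are illustrative, not UCCS identifiers.

```python
# Minimal model of the two point-to-point styles; names are illustrative.

class Window:
    """Remotely accessible memory region (one-sided target)."""
    def __init__(self, size):
        self.buf = [0] * size

def put(window, offset, values):
    """One-sided: the writer updates remote memory; no receive call needed."""
    window.buf[offset:offset + len(values)] = values

class Channel:
    """Two-sided: both endpoints participate (each send matched by a recv)."""
    def __init__(self):
        self._queue = []
    def send(self, msg):
        self._queue.append(msg)
    def recv(self):
        return self._queue.pop(0)

win = Window(8)
put(win, 2, [7, 7])        # one-sided write into the window
ch = Channel()
ch.send("hello")           # two-sided: send...
msg = ch.recv()            # ...must be paired with a matching recv
print(msg, win.buf)
```

The practical consequence for ULPs such as SHMEM or MPI is that one-sided operations decouple data movement from synchronization at the target, which is why a substrate exposing both styles can back such different programming models.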
Research into display sharing techniques for distributed computing environments
NASA Technical Reports Server (NTRS)
Hugg, Steven B.; Fitzgerald, Paul F., Jr.; Rosson, Nina Y.; Johns, Stephen R.
1990-01-01
The X-based Display Sharing solution for distributed computing environments is described. The Display Sharing prototype includes the base functionality for telecast and display copy requirements. Since the prototype implementation is modular and the system design provides flexibility for Mission Control Center Upgrade (MCCU) operational considerations, the prototype implementation can serve as the baseline for a production Display Sharing implementation. To facilitate the process the following discussions are presented: theory of operation; system architecture; using the prototype; software description; research tools; prototype evaluation; and outstanding issues. The prototype is based on the concept of a dedicated central host performing the majority of the Display Sharing processing, allowing minimal impact on each individual workstation. Each workstation participating in Display Sharing hosts programs that facilitate the user's access to the Display Sharing host machine.
Sauer, Juergen; Sonderegger, Andreas
2009-07-01
An empirical study examined the impact of prototype fidelity on user behaviour, subjective user evaluation and emotion. The independent factors of prototype fidelity (paper prototype, computer prototype, fully operational appliance) and aesthetics of design (high vs. moderate) were varied in a between-subjects design. The 60 participants of the experiment were asked to complete two typical tasks of mobile phone usage: sending a text message and suppressing a phone number. Both performance data and a number of subjective measures were recorded. The results suggested that task completion time may be overestimated when a computer prototype is being used. Furthermore, users appeared to compensate for deficiencies in aesthetic design by overrating the aesthetic qualities of reduced fidelity prototypes. Finally, user emotions were more positively affected by the operation of the more attractive mobile phone than by the less appealing one.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-25
...-01] NIST Framework and Roadmap for Smart Grid Interoperability Standards, Release 2.0 (Draft... draft version of the NIST Framework and Roadmap for Smart Grid Interoperability Standards, Release 2.0... Roadmap for Smart Grid Interoperability Standards, Release 2.0 (Release 2.0) (Draft) for public review and...
Warfighter IT Interoperability Standards Study
2012-07-22
data (e.g. messages) between systems? ii) What process did you use to validate and certify semantic interoperability between your...other systems at this time There was no requirement to validate and certify semantic interoperability The DLS program exchanges data with... semantics Testing for System Compliance with Data Models Verify and Certify Interoperability Using Data
A new environment for multiple spacecraft power subsystem mission operations
NASA Technical Reports Server (NTRS)
Bahrami, K. A.
1990-01-01
The engineering analysis subsystem environment (EASE) is being developed to enable fewer controllers to monitor and control power and other spacecraft engineering subsystems. The EASE prototype has been developed to support simultaneous real-time monitoring of several spacecraft engineering subsystems. It is being designed to assist with offline analysis of telemetry data to determine trends, and to help formulate uplink commands to the spacecraft. An early version of the EASE prototype has been installed in the JPL Space Flight Operations Facility for online testing. The EASE prototype is installed in the Galileo Mission Support Area. The underlying concept, development, and testing of the EASE prototype and how it will aid in the ground operations of spacecraft power subsystems are discussed.
Enabling interoperability in planetary sciences and heliophysics: The case for an information model
NASA Astrophysics Data System (ADS)
Hughes, J. Steven; Crichton, Daniel J.; Raugh, Anne C.; Cecconi, Baptiste; Guinness, Edward A.; Isbell, Christopher E.; Mafi, Joseph N.; Gordon, Mitchell K.; Hardman, Sean H.; Joyner, Ronald S.
2018-01-01
The Planetary Data System has developed the PDS4 Information Model to enable interoperability across diverse science disciplines. The Information Model is based on an integration of International Organization for Standardization (ISO) level standards for trusted digital archives, information model development, and metadata registries. Whereas controlled vocabularies provide a basic level of interoperability by supplying a common set of terms for communication between both machines and humans, the Information Model improves interoperability by means of an ontology that provides semantic information, or additional related context, for the terms. The information model was defined by a team of computer scientists and science experts from each of the diverse disciplines in the Planetary Science community, including Atmospheres, Geosciences, Cartography and Imaging Sciences, Navigational and Ancillary Information, Planetary Plasma Interactions, Ring-Moon Systems, and Small Bodies. The model was designed to be extensible beyond the Planetary Science community; for example, there are overlaps between certain PDS disciplines and the Heliophysics and Astrophysics disciplines. "Interoperability" can apply to many aspects of both the developer and the end-user experience, for example agency-to-agency, semantic level, and application level interoperability. We define these types of interoperability and focus on semantic level interoperability, the type of interoperability most directly enabled by an information model.
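The distinction the abstract draws between a controlled vocabulary and an ontology can be sketched in a few lines: a vocabulary fixes the terms, while an ontology adds typed relationships between them, so software can navigate a term's context. The terms and relations below are illustrative placeholders, not actual PDS4 classes.

```python
# A controlled vocabulary fixes the shared terms.
vocabulary = {"Target", "Instrument", "Observation", "Spacecraft"}

# An ontology adds (subject, relation, object) triples giving terms context.
ontology = [
    ("Observation", "made_by", "Instrument"),
    ("Observation", "of", "Target"),
    ("Instrument", "mounted_on", "Spacecraft"),
]

def related(term):
    """Everything directly connected to a term, with the relation used."""
    out = [(rel, obj) for subj, rel, obj in ontology if subj == term]
    out += [(rel, subj) for subj, rel, obj in ontology if obj == term]
    return out

# A vocabulary alone only answers "is this a known term?"; the ontology
# also answers "how does this term relate to the others?".
print(related("Instrument"))
```

This is the sense in which semantic-level interoperability goes beyond agreeing on terminology: two archives sharing the triples can infer context that a flat word list cannot carry.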
2014-12-12
Intercultural Factors ... Allied Administrative...infrastructure and communities. Kosovo’s final status resolution went through a number of developmental phases. In 2005, the UN assigned a Norwegian diplomat...sovereignty and territorial integrity 2. Support to civilian authorities 3. Support to communities 4. Participation in international and peace
2006-06-01
systems. Cyberspace is the electronic medium of net-centric operations, communications systems, and computers, in which horizontal integration and online...will be interoperable, more robust, responsive, and able to support faster spacecraft initialization times. This Integrated Satellite Control... horizontally and vertically integrated information through machine-to-machine conversations enabled by a peer-based network of sensors, command
Code of Federal Regulations, 2012 CFR
2012-10-01
... physical relationships in the subsystem or system; (2) A description of the railroad operation or... requirements; (5) A preliminary human factors analysis, including a complete description of all human-machine interfaces and the impact of interoperability requirements on the same; (6) An analysis of the applicability...
Code of Federal Regulations, 2011 CFR
2011-10-01
... physical relationships in the subsystem or system; (2) A description of the railroad operation or... requirements; (5) A preliminary human factors analysis, including a complete description of all human-machine interfaces and the impact of interoperability requirements on the same; (6) An analysis of the applicability...
Code of Federal Regulations, 2013 CFR
2013-10-01
... physical relationships in the subsystem or system; (2) A description of the railroad operation or... requirements; (5) A preliminary human factors analysis, including a complete description of all human-machine interfaces and the impact of interoperability requirements on the same; (6) An analysis of the applicability...
Code of Federal Regulations, 2014 CFR
2014-10-01
... physical relationships in the subsystem or system; (2) A description of the railroad operation or... requirements; (5) A preliminary human factors analysis, including a complete description of all human-machine interfaces and the impact of interoperability requirements on the same; (6) An analysis of the applicability...
2017-02-22
manages operations through guidance, policies, programs, and organizations. The NSG is designed to be a mutually supportive enterprise that...deliberate technical design and deliberate human actions. Geospatial engineer teams (GETs) within the geospatial intelligence cells are the day-to-day...standards working group and are designated by the AGC Geospatial Acquisition Support Directorate as required for interoperability. Applicable standards
NASA Astrophysics Data System (ADS)
Tisdale, M.
2016-12-01
NASA's Atmospheric Science Data Center (ASDC) is operationally using the Esri ArcGIS Platform to improve data discoverability, accessibility and interoperability in order to meet requirements driven by its diversifying government, private, public and academic user communities. The ASDC is actively working to provide its mission essential datasets as ArcGIS Image Services, Open Geospatial Consortium (OGC) Web Map Services (WMS), and OGC Web Coverage Services (WCS), and is leveraging the ArcGIS multidimensional mosaic dataset structure. Science teams and the ASDC are utilizing these services, developing applications with the Web AppBuilder for ArcGIS and the ArcGIS API for JavaScript, and evaluating restructuring their data production and access scripts within the ArcGIS Python Toolbox framework and Geoprocessing service environment. These capabilities yield greater usage and exposure of ASDC data holdings and provide improved geospatial analytical tools for mission-critical understanding in the areas of the earth's radiation budget, clouds, aerosols, and tropospheric chemistry.
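The OGC WMS services mentioned above are consumed through standardized HTTP requests. A minimal sketch of building a WMS 1.3.0 GetMap request is below; the endpoint URL and layer name are hypothetical, while the request parameters follow the OGC WMS 1.3.0 specification.

```python
from urllib.parse import urlencode

# Hypothetical service endpoint (not an actual ASDC URL).
base = "https://example.gov/arcgis/services/asdc/MapServer/WMSServer"

params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "cloud_fraction",   # illustrative layer name
    "STYLES": "",
    "CRS": "EPSG:4326",
    "BBOX": "-90,-180,90,180",    # WMS 1.3.0 + EPSG:4326 uses lat,lon order
    "WIDTH": "1024",
    "HEIGHT": "512",
    "FORMAT": "image/png",
}

url = base + "?" + urlencode(params)
print(url)
```

Because the request shape is fixed by the standard, the same client code works against any conformant server, which is exactly the interoperability benefit the abstract describes.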
The Italian Cloud-based brokering Infrastructure to sustain Interoperability for Operative Hydrology
NASA Astrophysics Data System (ADS)
Boldrini, E.; Pecora, S.; Bussettini, M.; Bordini, F.; Nativi, S.
2015-12-01
This work presents the informatics platform developed to implement the National Hydrological Operative Information System of Italy. In particular, the presentation will focus on the governance aspects of the cloud infrastructure and the brokering software that make it possible to sustain the hydrology data flow between heterogeneous user clients and data providers. The Institute for Environmental Protection and Research, ISPRA (Istituto Superiore per la Protezione e la Ricerca Ambientale), in collaboration with the Regional Agency for Environmental Protection in the Emilia-Romagna region, ARPA-ER (Agenzia Regionale per la Prevenzione e l´Ambiente dell´Emilia-Romagna), and CNR-IIA (National Research Council of Italy), designed and developed an innovative platform for the discovery and access of hydrological data coming from 19 Italian administrative regions and 2 Italian autonomous provinces in near real time. ISPRA has deployed and governs this system. The presentation will introduce and discuss the technological barriers to interoperability as well as social and policy ones. The adopted solutions will be described, outlining the sustainability challenges and benefits.
Cloud-based Communications Planning Collaboration and Interoperability
2012-06-01
battle concept is derived from the observation that all actions in the battle space have the ability to affect other areas or functions in the battle... space . This is equally true for tactical networks, which grow and transform dynamically as an operation evolves. Changes in one aspect of the network...availability of any updated network plans not only to the local SYSCON and TECHCON, but to all other units operating in the battle space (keeping in mind
NASA Astrophysics Data System (ADS)
Pulsifer, P. L.; Parsons, M. A.; Duerr, R. E.; Fox, P. A.; Khalsa, S. S.; McCusker, J. P.; McGuinness, D. L.
2012-12-01
To address interoperability, we first need to understand how human perspectives and worldviews influence the way people conceive of and describe geophysical phenomena. There is never a single, unambiguous description of a phenomenon - the terminology used is based on the relationship people have with it and what their interests are. So how can these perspectives be reconciled in a way that is not only clear to different people but also formally described so that information systems can interoperate? In this paper we explore conceptions of Arctic sea ice as a means of exploring these issues. We examine multiple conceptions of sea ice and related processes as fundamental components of the Earth system. Arctic sea ice is undergoing rapid and dramatic decline. This will have huge impact on climate and biological systems as well as on shipping, exploration, human culture, and geopolitics. Local hunters, operational shipping forecasters, global climate researchers, and others have critical needs for sea ice data and information, but they conceive of, and describe sea ice phenomena in very different ways. Our hypothesis is that formally representing these diverse conceptions in a suite of formal ontologies can help facilitate sharing of information across communities and enhance overall Arctic data interoperability. We present initial work to model operational, research, and Indigenous (Iñupiat and Yup'ik) concepts of sea ice phenomena and data. Our results illustrate important and surprising differences in how these communities describe and represent sea ice, and we describe our approach to resolving incongruities and inconsistencies. We begin by exploring an intriguing information artifact, the World Meteorological Organization "egg code". The egg code is a compact, information rich way of illustrating detailed ice conditions that has been used broadly for a century. 
There is much agreement on construction and content encoding, but there are important regional differences in its application. Furthermore, it is an analog encoding scheme whose meaning has evolved over time. By semantically modeling the egg code, its subtle variations, and how it connects to other data, we illustrate a mechanism for translating across data formats and representations. But there are limits to what semantically modeling the egg code can achieve. The egg code and common operational sea ice formats do not address community needs, notably the timing and processes of sea ice freeze-up and break-up, which have profound impact on local hunting, shipping, oil exploration, and safety. We work with local experts from four very different Indigenous communities and scientific creators of sea ice forecasts to establish an understanding of concepts and terminology related to fall freeze-up and spring break-up from the individually represented regions. This helps expand our conceptions of sea ice while also aiding in understanding across cultures and communities, and in passing knowledge to younger generations. This is an early step in expanding concepts of interoperability to very different ways of knowing, to make data truly relevant and locally useful.
Development of 3000 m Subsea Blowout Preventer Experimental Prototype
NASA Astrophysics Data System (ADS)
Cai, Baoping; Liu, Yonghong; Huang, Zhiqian; Ma, Yunpeng; Zhao, Yubin
2017-12-01
A subsea blowout preventer experimental prototype was developed to meet the requirement of training operators. The prototype consists of a hydraulic control system, an electronic control system, and a small-sized blowout preventer stack. Both the hydraulic and the electronic control systems are dual-mode redundant: each works independently and can be switched over if a malfunction occurs, which significantly improves the operational reliability of the equipment.
NASA Technical Reports Server (NTRS)
Fern, Lisa
2017-01-01
The Phase 1 DAA Minimum Operational Performance Standards (MOPS) provided requirements for two classes of DAA equipment: equipment Class 1 contains the basic DAA equipment required to assist a pilot in remaining well clear, while equipment Class 2 integrates the Traffic Alert and Collision Avoidance System (TCAS) II. Thus, the Class 1 system provides RWC functionality only, while the Class 2 system is intended to provide both RWC and Collision Avoidance (CA) functionality, in compliance with the Minimum Aviation System Performance Standards (MASPS) for the Interoperability of Airborne Collision Avoidance Systems. The Federal Aviation Administration's (FAA) TCAS Program Office is currently developing the Airborne Collision Avoidance System X (ACAS X) to support the objectives of the FAA's Next Generation Air Transportation System Program (NextGen). ACAS X has a suite of variants with a common underlying design that are intended to be optimized for their intended airframes and operations. ACAS Xu is being designed for UAS and allows for new surveillance technologies and tailored logic for platforms with different performance characteristics. In addition to Collision Avoidance (CA) alerting and guidance, ACAS Xu is being tuned to provide RWC alerting and guidance in compliance with the SC-228 DAA MOPS. With a single logic performing both RWC and CA functions, ACAS Xu will provide industry with an integrated DAA solution that addresses many of the interoperability shortcomings of Phase 1 systems. While the MOPS for ACAS Xu will specify an integrated DAA system, it will need to show compliance with the RWC alerting thresholds and alerting requirements defined in the DAA Phase 2 MOPS. Further, some functional components of the ACAS Xu system, such as the remote pilot's displayed guidance, might be mostly references to the corresponding requirements in the DAA MOPS.
To provide a seamless, integrated, RWC-CA system to assist the pilot in remaining well clear and avoiding collisions, several issues need to be addressed within the Phase 2 SC-228 DAA efforts. Interoperability of the RWC and CA alerting and guidance, and ensuring pilot comprehension, compliance and performance, will be a primary research area.
Yang, Long; Shang, Xian-Wen; Fan, Jian-Nan; He, Zhi-Xu; Wang, Jian-Ji; Liu, Miao; Zhuang, Yong; Ye, Chuan
2016-01-01
To evaluate the effect of 3D printing in treating trimalleolar fractures and its roles in physician-patient communication, thirty patients with trimalleolar fractures were randomly divided into the 3D printing assisted-design operation group (Group A) and the no-3D printing assisted-design group (Group B). In Group A, 3D printing was used by the surgeons to produce a prototype of the actual fracture to guide the surgical treatment. All patients underwent open reduction and internal fixation. A questionnaire was designed for doctors and patients to verify the verisimilitude and effectiveness of the 3D-printed prototype. Meanwhile, the operation time and the intraoperative blood loss were compared between the two groups. The fracture prototypes were accurately printed, and the average overall score of the verisimilitude and effectiveness of the 3D-printed prototypes was relatively high. Both the operation time and the intraoperative blood loss in Group A were less than those in Group B (P < 0.05). Patient satisfaction using the 3D-printed prototype and the communication score were 9.3 ± 0.6 points. A 3D-printed prototype can faithfully reflect the anatomy of the fracture site; it can effectively help the doctors plan the operation and represent an effective tool for physician-patient communication.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-18
... Docket 07-100; FCC 11-6] Implementing a Nationwide, Broadband, Interoperable Public Safety Network in the... interoperable public safety broadband network. The establishment of a common air interface for 700 MHz public safety broadband networks will create a foundation for interoperability and provide a clear path for the...
Juzwishin, Donald W M
2009-01-01
Achieving effective health informatics interoperability in a fragmented and uncoordinated health system is by definition not possible. Interoperability requires the simultaneous integration of health care processes and information across different types and levels of care (systems thinking). The fundamental argument of this paper is that information system interoperability will remain an unfulfilled hope until health reforms effectively address the governance (accountability), structural and process barriers to interoperability of health care delivery. The ascendency of Web 2.0 and 3.0, although still unproven, signals the opportunity to accelerate patients' access to health information and their health record. Policy suggestions for simultaneously advancing health system delivery and information system interoperability are posited.
D-ATM, a working example of health care interoperability: From dirt path to gravel road.
DeClaris, John-William
2009-01-01
For many years, there have been calls for interoperability within health care systems. The technology currently exists and is being used in business areas like banking and commerce, to name a few. Yet the question remains, why has interoperability not been achieved in health care? This paper examines issues encountered and success achieved with interoperability during the development of the Digital Access To Medication (D-ATM) project, sponsored by the Substance Abuse and Mental Health Services Administration (SAMHSA). D-ATM is the first government funded interoperable patient management system. The goal of this paper is to provide lessons learned and propose one possible road map for health care interoperability within private industry and how government can help.
Mission Control Technologies: A New Way of Designing and Evolving Mission Systems
NASA Technical Reports Server (NTRS)
Trimble, Jay; Walton, Joan; Saddler, Harry
2006-01-01
Current mission operations systems are built as a collection of monolithic software applications. Each application serves the needs of a specific user base associated with a discipline or functional role. Built to accomplish specific tasks, each application embodies specialized functional knowledge and has its own data storage, data models, programmatic interfaces, user interfaces, and customized business logic. In effect, each application creates its own walled-off environment. While individual applications are sometimes reused across multiple missions, it is expensive and time consuming to maintain these systems, and both costly and risky to upgrade them in the light of new requirements or modify them for new purposes. It is even more expensive to achieve new integrated activities across a set of monolithic applications. These problems impact the lifecycle cost (especially design, development, testing, training, maintenance, and integration) of each new mission operations system. They also inhibit system innovation and evolution. This in turn hinders NASA's ability to adopt new operations paradigms, including increasingly automated space systems, such as autonomous rovers, autonomous onboard crew systems, and integrated control of human and robotic missions. Hence, in order to achieve NASA's vision affordably and reliably, we need to consider and mature new ways to build mission control systems that overcome the problems inherent in systems of monolithic applications. The keys to the solution are modularity and interoperability. Modularity will increase extensibility (evolution), reusability, and maintainability. Interoperability will enable composition of larger systems out of smaller parts, and enable the construction of new integrated activities that tie together, at a deep level, the capabilities of many of the components. Modularity and interoperability together contribute to flexibility. 
The Mission Control Technologies (MCT) Project, a collaboration of multiple NASA Centers, led by NASA Ames Research Center, is building a framework to enable software to be assembled from flexible collections of components and services.
Impacts Assessment of Integrated Dynamic Transit Operations : Final Report
DOT National Transportation Integrated Search
2016-03-02
This document details the impact assessment conducted by the Volpe Center for the Integrated Dynamic Transit Operations (IDTO) prototype demonstrations in Columbus, Ohio and Central Florida. The prototype is one result of the U.S. Department of Trans...
Operations management system advanced automation: Fault detection isolation and recovery prototyping
NASA Technical Reports Server (NTRS)
Hanson, Matt
1990-01-01
The purpose of this project is to address the global fault detection, isolation, and recovery (FDIR) requirements for Operations Management System (OMS) automation within the Space Station Freedom program. This shall be accomplished by developing a selected FDIR prototype for the Space Station Freedom distributed processing systems. The prototype shall be based on advanced automation methodologies, in addition to traditional software methods, to meet the requirements for automation. A secondary objective is to expand the scope of the prototyping to encompass multiple aspects of station-wide fault management (SWFM) as discussed in OMS requirements documentation.
Multiple video sequences synchronization during minimally invasive surgery
NASA Astrophysics Data System (ADS)
Belhaoua, Abdelkrim; Moreau, Johan; Krebs, Alexandre; Waechter, Julien; Radoux, Jean-Pierre; Marescaux, Jacques
2016-03-01
Hybrid operating rooms are an important development in the medical ecosystem. They allow integrating, in the same procedure, the advantages of radiological imaging and surgical tools. However, one of the challenges faced by clinical engineers is to support the connectivity and interoperability of medical-electrical point-of-care devices. A system that could enable plug-and-play connectivity and interoperability for medical devices would improve patient safety, save hospitals time and money, and provide data for electronic medical records. In this paper, we propose a hardware platform dedicated to collecting and synchronizing, in real time, multiple videos captured from medical equipment. The final objective is to integrate augmented reality technology into an operating room (OR) in order to assist the surgeon during a minimally invasive operation. To the best of our knowledge, there is no prior work dealing with hardware-based video synchronization for augmented reality applications in the OR. Hardware synchronization methods can embed a temporal value, a so-called timestamp, into each sequence on-the-fly and require no post-processing, but they do require specialized hardware. The design of our hardware, however, is simple and generic. This approach was adopted and implemented in this work, and its performance is evaluated by comparison to state-of-the-art methods.
Interoperability of Information Systems Managed and Used by the Local Health Departments.
Shah, Gulzar H; Leider, Jonathon P; Luo, Huabin; Kaur, Ravneet
2016-01-01
In the post-Affordable Care Act era marked by interorganizational collaborations and availability of large amounts of electronic data from other community partners, it is imperative to assess the interoperability of information systems used by the local health departments (LHDs). To describe the level of interoperability of LHD information systems and identify factors associated with lack of interoperability. This mixed-methods research uses data from the 2015 Informatics Capacity and Needs Assessment Survey, with a target population of all LHDs in the United States. A representative sample of 650 LHDs was drawn using a stratified random sampling design. A total of 324 completed responses were received (50% response rate). Qualitative data were used from a key informant interview study of LHD informatics staff from across the United States. Qualitative data were independently coded by 2 researchers and analyzed thematically. Survey data were cleaned, bivariate comparisons were conducted, and a multivariable logistic regression was run to characterize factors associated with interoperability. For 30% of LHDs, no systems were interoperable, and 38% of LHD respondents indicated some of the systems were interoperable. Significant determinants of interoperability included LHDs having leadership support (adjusted odds ratio [AOR] = 3.54), control of information technology budget allocation (AOR = 2.48), control of data systems (AOR = 2.31), having a strategic plan for information systems (AOR = 1.92), and existence of business process analysis and redesign (AOR = 1.49). Interoperability of all systems may be an informatics goal, but only a small proportion of LHDs reported having interoperable systems, pointing to a substantial need among LHDs nationwide.
SP-100 GES/NAT radiation shielding systems design and development testing
NASA Astrophysics Data System (ADS)
Disney, Richard K.; Kulikowski, Henry D.; McGinnis, Cynthia A.; Reese, James C.; Thomas, Kevin; Wiltshire, Frank
1991-01-01
Advanced Energy Systems (AES) of Westinghouse Electric Corporation is under subcontract to the General Electric Company to supply nuclear radiation shielding components for the SP-100 Ground Engineering System (GES) Nuclear Assembly Test to be conducted at Westinghouse Hanford Company at Richland, Washington. The radiation shielding components are integral to the Nuclear Assembly Test (NAT) assembly and include prototypic and non-prototypic radiation shielding components which provide prototypic test conditions for the SP-100 reactor subsystem and reactor control subsystem components during the GES/NAT operations. W-AES is designing three radiation shield components for the NAT assembly; a prototypic Generic Flight System (GFS) shield, the Lower Internal Facility Shield (LIFS), and the Upper Internal Facility Shield (UIFS). This paper describes the design approach and development testing to support the design, fabrication, and assembly of these three shield components for use within the vacuum vessel of the GES/NAT. The GES/NAT shields must be designed to operate in a high vacuum which simulates space operations. The GFS shield and LIFS must provide prototypic radiation/thermal environments and mechanical interfaces for reactor system components. The NAT shields, in combination with the test facility shielding, must provide adequate radiation attenuation for overall test operations. Special design considerations account for the ground test facility effects on the prototypic GFS shield. Validation of the GFS shield design and performance will be based on detailed Monte Carlo analyses and developmental testing of design features. Full scale prototype testing of the shield subsystems is not planned.
Creating executable architectures using Visual Simulation Objects (VSO)
NASA Astrophysics Data System (ADS)
Woodring, John W.; Comiskey, John B.; Petrov, Orlin M.; Woodring, Brian L.
2005-05-01
Investigations have been performed to identify a methodology for creating executable models of architectures, and simulations of architectures, that lead to an understanding of their dynamic properties. Colored Petri Nets (CPNs) are used to describe architecture because of their strong mathematical foundations, the existence of techniques for their verification, and graph theory's well-established history of success in modern science. CPNs have been extended to interoperate with legacy simulations via a High Level Architecture (HLA) compliant interface. It has also been demonstrated that an architecture created as a CPN can be integrated with Department of Defense Architecture Framework products to ensure consistency between static and dynamic descriptions. A computer-aided tool, Visual Simulation Objects (VSO), which aids analysts in specifying, composing, and executing architectures, has been developed to verify the methodology and as a prototype commercial product.
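The executable-architecture idea above can be made concrete with a few lines of code: a Petri net is just a marking (tokens per place) plus transitions that consume and produce tokens. The sketch below is a deliberately simplified, uncolored net, not the CPN/HLA machinery VSO itself uses, and all place and transition names are hypothetical.

```python
# Minimal Petri net executor (uncolored): a sketch of how an architecture
# model becomes executable. Place/transition names are illustrative only.

class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)   # place -> token count
        self.transitions = {}          # name -> (input arcs, output arcs)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= n for p, n in inputs.items())

    def fire(self, name):
        if not self.enabled(name):
            raise ValueError(f"transition {name} is not enabled")
        inputs, outputs = self.transitions[name]
        for p, n in inputs.items():
            self.marking[p] -= n
        for p, n in outputs.items():
            self.marking[p] = self.marking.get(p, 0) + n

# A two-place message flow: a request is processed by an idle server.
net = PetriNet({"request_queue": 1, "server_idle": 1})
net.add_transition("process",
                   {"request_queue": 1, "server_idle": 1},
                   {"reply_queue": 1, "server_idle": 1})
net.fire("process")
print(net.marking)  # {'request_queue': 0, 'server_idle': 1, 'reply_queue': 1}
```

Executing the net (firing enabled transitions) is what exposes the dynamic properties, e.g. deadlocks show up as markings where no transition is enabled.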
CCSDS SOIS Subnetwork Services: A First Reference Implementation
NASA Astrophysics Data System (ADS)
Gunes-Lasnet, S.; Notebaert, O.; Farges, P.-Y.; Fowell, S.
2008-08-01
The CCSDS SOIS working groups are developing a range of standards for spacecraft onboard interfaces with the intention of promoting reuse of hardware and software designs across a range of missions while enabling interoperability of onboard systems from diverse sources. The CCSDS SOIS working groups released their red books for both the Subnetwork and application support layers in June 2007. In order to allow the verification of these recommended standards and to pave the way for future implementation on board spacecraft, it is essential for these standards to be prototyped on a representative spacecraft platform, to provide valuable feedback to the SOIS working group. A first reference implementation of both Subnetwork and Application Support SOIS services over SpaceWire and the Mil-Std-1553 bus is thus being realised by SciSys Ltd and Astrium under an ESA contract.
A risk management approach to CAIS development
NASA Technical Reports Server (NTRS)
Hart, Hal; Kerner, Judy; Alden, Tony; Belz, Frank; Tadman, Frank
1986-01-01
The proposed DoD standard Common APSE Interface Set (CAIS) was developed as a framework set of interfaces that will support the transportability and interoperability of tools in the support environments of the future. While the current CAIS version is a promising start toward fulfilling those goals, and current prototypes provide adequate testbeds for investigations in support of completing specifications for a full CAIS, there are many reasons why the proposed CAIS might fail to become a usable product and the foundation of next-generation (1990s) project support environments such as NASA's Space Station software support environment. The most critical threats to the viability and acceptance of the CAIS include performance issues (especially in piggybacked implementations), transportability, and security requirements. To make the situation worse, the solution to some of these threats appears to be in conflict with the solutions to others.
An infrastructure for ontology-based information systems in biomedicine: RICORDO case study.
Wimalaratne, Sarala M; Grenon, Pierre; Hoehndorf, Robert; Gkoutos, Georgios V; de Bono, Bernard
2012-02-01
The article presents an infrastructure for supporting the semantic interoperability of biomedical resources based on the management (storing and inference-based querying) of their ontology-based annotations. This infrastructure consists of: (i) a repository to store and query ontology-based annotations; (ii) a knowledge base server with an inference engine to support the storage of and reasoning over ontologies used in the annotation of resources; (iii) a set of applications and services allowing interaction with the integrated repository and knowledge base. The infrastructure is being prototyped, developed, and evaluated by the RICORDO project in support of the knowledge management of biomedical resources, including physiology and pharmacology models and associated clinical data. The RICORDO toolkit and its source code are freely available from http://ricordo.eu/relevant-resources. Contact: sarala@ebi.ac.uk.
Donati, Marco; Camomilla, Valentina; Vannozzi, Giuseppe; Cappozzo, Aurelio
2008-07-19
The quantitative description of joint mechanics during movement requires the reconstruction of the position and orientation of selected anatomical axes with respect to a laboratory reference frame. These anatomical axes are identified through an ad hoc anatomical calibration procedure and their position and orientation are reconstructed relative to bone-embedded frames normally derived from photogrammetric marker positions and used to describe movement. The repeatability of anatomical calibration, both within and between subjects, is crucial for kinematic and kinetic end results. This paper illustrates an anatomical calibration approach, which does not require anatomical landmark manual palpation, described in the literature to be prone to great indeterminacy. This approach allows for the estimate of subject-specific bone morphology and automatic anatomical frame identification. The experimental procedure consists of digitization through photogrammetry of superficial points selected over the areas of the bone covered with a thin layer of soft tissue. Information concerning the location of internal anatomical landmarks, such as a joint center obtained using a functional approach, may also be added. The data thus acquired are matched with the digital model of a deformable template bone. Consequently, the repeatability of pelvis, knee and hip joint angles is determined. Five volunteers, each of whom performed five walking trials, and six operators, with no specific knowledge of anatomy, participated in the study. Descriptive statistics analysis was performed during upright posture, showing a limited dispersion of all angles (less than 3 deg) except for hip and knee internal-external rotation (6 deg and 9 deg, respectively). During level walking, the ratio of inter-operator and inter-trial error and an absolute subject-specific repeatability were assessed. 
For the pelvic and hip angles, and for knee flexion-extension, the inter-operator error was equal to the inter-trial error, with the absolute error ranging from 0.1 deg to 0.9 deg. Knee internal-external rotation and ab-adduction showed, on average, inter-operator errors that were 8% and 28% greater than the relevant inter-trial errors, respectively. The absolute error was in the range 0.9-2.9 deg.
National electronic health record interoperability chronology.
Hufnagel, Stephen P
2009-05-01
The federal initiative for electronic health record (EHR) interoperability began in 2000 and set the stage for the establishment of the 2004 Executive Order for EHR interoperability by 2014. This article discusses the chronology from the 2001 e-Government Consolidated Health Informatics (CHI) initiative through the current congressional mandates for an aligned, interoperable, and agile DoD AHLTA and VA VistA.
On the formal definition of the systems' interoperability capability: an anthropomorphic approach
NASA Astrophysics Data System (ADS)
Zdravković, Milan; Luis-Ferreira, Fernando; Jardim-Goncalves, Ricardo; Trajanović, Miroslav
2017-03-01
The extended view of enterprise information systems in the Internet of Things (IoT) introduces additional complexity to the interoperability problems. In response to this, the problem of systems' interoperability is revisited by taking into account different aspects of philosophy, psychology, linguistics and artificial intelligence, namely by analysing the potential analogies between the processes of human and system communication. Then the capability to interoperate, as a property of the system, is defined as a complex ability to seamlessly sense and perceive a stimulus from its environment (assumedly, a message from any other system), make an informed decision about this perception and, consequently, articulate a meaningful and useful action or response based on this decision. Although this capability is defined on the basis of the existing interoperability theories, the proposed approach to its definition excludes the assumption on the awareness of co-existence of two interoperating systems. Thus, it establishes the links between the research of interoperability of systems and intelligent software agents, as one of the systems' digital identities.
Employing Semantic Technologies for the Orchestration of Government Services
NASA Astrophysics Data System (ADS)
Sabol, Tomáš; Furdík, Karol; Mach, Marián
The main aim of eGovernment is to provide efficient, secure, inclusive services for citizens and businesses. The necessity to integrate services and information resources, to increase accessibility, and to reduce the administrative burden on citizens and enterprises: these are only a few reasons why the eGovernment paradigm has shifted from the supply-driven approach toward connected governance, emphasizing the concept of interoperability (Archmann and Nielsen 2008). On the EU level, interoperability is explicitly addressed as one of the four main challenges, for example in the i2010 strategy (i2010 2005). The Commission's Communication (Interoperability for Pan-European eGovernment Services 2006) strongly emphasizes the necessity of interoperable eGovernment services, based on standards, open specifications, and open interfaces. The Pan-European interoperability initiatives, such as the European Interoperability Framework (2004) and IDABC, as well as many projects supported by the European Commission within the IST Program and the Competitiveness and Innovation Program (CIP), illustrate the importance of interoperability on the EU level.
Empowering open systems through cross-platform interoperability
NASA Astrophysics Data System (ADS)
Lyke, James C.
2014-06-01
Most of the motivations for open systems lie in the expectation of interoperability, sometimes referred to as "plug-and-play". Nothing in the notion of "open-ness", however, guarantees this outcome, which makes the increased interest in open architecture more perplexing. In this paper, we explore certain themes of open architecture. We introduce the concept of "windows of interoperability", which can be used to align disparate portions of architecture. Such "windows of interoperability", which concentrate on a reduced set of protocol and interface features, might achieve many of the broader purposes assigned as benefits in open architecture. Since it is possible to engineer proprietary systems that interoperate effectively, this nuanced definition of interoperability may in fact be a more important concept to understand and nurture for effective systems engineering and maintenance.
In-field Access to Geoscientific Metadata through GPS-enabled Mobile Phones
NASA Astrophysics Data System (ADS)
Hobona, Gobe; Jackson, Mike; Jordan, Colm; Butchart, Ben
2010-05-01
Fieldwork is an integral part of much geosciences research. But whilst geoscientists have physical or online access to data collections in the laboratory or at base stations, equivalent in-field access is not standard or straightforward. The increasing availability of mobile internet and GPS-supported mobile phones, however, now provides the basis for addressing this issue. The SPACER project was commissioned by the Rapid Innovation initiative of the UK Joint Information Systems Committee (JISC) to explore the potential for GPS-enabled mobile phones to access geoscientific metadata collections. Metadata collections within the geosciences and the wider geospatial domain can be disseminated through web services based on the Catalogue Service for the Web (CSW) standard of the Open Geospatial Consortium (OGC), a global grouping of over 380 private, public and academic organisations aiming to improve interoperability between geospatial technologies. CSW offers an XML-over-HTTP interface for querying and retrieval of geospatial metadata. By default, the metadata returned by CSW is based on the ISO 19115 standard and encoded in XML conformant to ISO 19139. The SPACER project has created a prototype application that enables mobile phones to send CSW queries containing user-defined keywords and coordinates acquired from the GPS receivers built into the phones. The prototype has been developed using the free and open source Google Android platform. The mobile application offers views for listing titles, presenting multiple metadata elements, and a Google Map with an overlay of the bounding coordinates of datasets. The presentation will describe the architecture and approach applied in the development of the prototype.
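A keyword-plus-location CSW query of the kind described above can be sketched as a CSW 2.0.2 GetRecords document combining an `AnyText` keyword filter with a GPS-derived bounding box. The element names below follow the OGC CSW and Filter encodings, but the abstract does not give the project's actual query, so treat this construction (and the `build_getrecords` helper) as an illustrative assumption.

```python
# Schematic CSW 2.0.2 GetRecords request builder: keyword + GPS bounding box.
# The helper name and the radius default are hypothetical.

def build_getrecords(keyword, lat, lon, radius_deg=0.5):
    # Derive a simple search box around the phone's GPS fix (minx miny maxx maxy).
    minx, miny = lon - radius_deg, lat - radius_deg
    maxx, maxy = lon + radius_deg, lat + radius_deg
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<csw:GetRecords xmlns:csw="http://www.opengis.net/cat/csw/2.0.2"
                xmlns:ogc="http://www.opengis.net/ogc"
                xmlns:gml="http://www.opengis.net/gml"
                service="CSW" version="2.0.2" resultType="results">
  <csw:Query typeNames="csw:Record">
    <csw:Constraint version="1.1.0">
      <ogc:Filter>
        <ogc:And>
          <ogc:PropertyIsLike wildCard="%" singleChar="_" escapeChar="\\">
            <ogc:PropertyName>AnyText</ogc:PropertyName>
            <ogc:Literal>%{keyword}%</ogc:Literal>
          </ogc:PropertyIsLike>
          <ogc:BBOX>
            <ogc:PropertyName>ows:BoundingBox</ogc:PropertyName>
            <gml:Envelope>
              <gml:lowerCorner>{minx} {miny}</gml:lowerCorner>
              <gml:upperCorner>{maxx} {maxy}</gml:upperCorner>
            </gml:Envelope>
          </ogc:BBOX>
        </ogc:And>
      </ogc:Filter>
    </csw:Constraint>
  </csw:Query>
</csw:GetRecords>"""

request_xml = build_getrecords("sea ice", lat=52.95, lon=-1.15)
```

The document would then be POSTed over HTTP to the catalogue endpoint, and the ISO 19139 records in the response rendered in the phone's list and map views.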
Lin, Shih-Sung; Hung, Min-Hsiung; Tsai, Chang-Lung; Chou, Li-Ping
2012-12-01
The study aims to provide an ease-of-use approach for senior patients to utilize remote healthcare systems. An ease-of-use remote healthcare system (RHS) architecture using RFID (Radio Frequency Identification) and networking technologies is developed. Specifically, the codes in RFID tags are used for authenticating the patients' IDs, to secure and ease the login process. The patient needs to take only one action, i.e., placing an RFID tag onto the reader, to automatically log in to and start the RHS and then acquire automatic medical services. An ease-of-use emergency monitoring and reporting mechanism is developed as well, to monitor and protect the safety of senior patients who have to be left alone at home. By just pressing a single button, the RHS can automatically report the patient's emergency information to the clinic side so that the responsible medical personnel can take proper urgent actions for the patient. In addition, Web services technology is used to build the Internet communication scheme of the RHS so that the interoperability and data transmission security between the home server and the clinical server can be enhanced. A prototype RHS is constructed to validate the effectiveness of our designs. Testing results show that the proposed RHS architecture possesses the characteristics of ease of use, simplicity of operation, promptness in login, and no need to preserve identity information. The proposed RHS architecture can effectively increase the willingness of senior patients who act slowly or are unfamiliar with computer operations to use the RHS. The research results can be used as an add-on for developing future remote healthcare systems.
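The one-action, tag-as-credential login described above can be sketched in a few lines: the code read from the tag is looked up in a clinic-provisioned registry and, if known, a session is opened with no typed credentials. Everything here, the registry, the function names, the token handling, is hypothetical, not the paper's implementation.

```python
# Hypothetical sketch of RFID-tag-based login for a remote healthcare system.
# TAG_REGISTRY, login_with_tag, and the session scheme are illustrative only.

import hashlib
import secrets

TAG_REGISTRY = {            # tag code -> patient ID, provisioned by the clinic
    "04A2245F1B": "patient-0007",
}

SESSIONS = {}               # sha256(token) -> patient ID

def login_with_tag(tag_code):
    """Authenticate by RFID tag code alone; return a session token or None."""
    patient_id = TAG_REGISTRY.get(tag_code)
    if patient_id is None:
        return None         # unknown tag: reject silently
    token = secrets.token_hex(16)
    # Store only a hash of the token server-side, never the token itself.
    SESSIONS[hashlib.sha256(token.encode()).hexdigest()] = patient_id
    return token

token = login_with_tag("04A2245F1B")   # a registered tag opens a session
assert login_with_tag("DEADBEEF") is None
```

In the paper's architecture this lookup would live behind the Web services layer between the home server and the clinical server, with the transport secured separately.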
Java Architecture for Detect and Avoid Extensibility and Modeling
NASA Technical Reports Server (NTRS)
Santiago, Confesor; Mueller, Eric Richard; Johnson, Marcus A.; Abramson, Michael; Snow, James William
2015-01-01
Unmanned aircraft will be equipped with a detect-and-avoid (DAA) system that enables them to comply with the requirement to "see and avoid" other aircraft, an important layer in the overall set of procedural, strategic and tactical separation methods designed to prevent mid-air collisions. This paper describes a capability called Java Architecture for Detect and Avoid Extensibility and Modeling (JADEM), developed to prototype and help evaluate various DAA technological requirements by providing a flexible and extensible software platform that models all major detect-and-avoid functions. Figure 1 illustrates JADEM's architecture. The surveillance module can be actual equipment on the unmanned aircraft or simulators that model the process by which on-board sensors detect other aircraft and provide track data to the traffic display. The track evaluation function evaluates each detected aircraft and decides whether to provide an alert to the pilot, and at what severity. Guidance is a combination of intruder track information, alerting, and the avoidance/advisory algorithms behind the tools shown on the traffic display to aid the pilot in determining a maneuver to avoid a loss of well clear. All these functions are designed with a common interface and configurable implementation, which is critical in exploring DAA requirements. To date, JADEM has been utilized in three computer simulations of the National Airspace System, three pilot-in-the-loop experiments using a total of 37 professional UAS pilots, and two flight tests using NASA's Predator-B unmanned aircraft, named Ikhana. The data collected have directly informed the quantitative separation standard for "well clear", the safety case, requirements development, and the operational environment for the DAA minimum operational performance standards. This work was performed by the Separation Assurance/Sense and Avoid Interoperability team under NASA's UAS Integration in the NAS project.
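The track-evaluation step that JADEM makes pluggable, score each intruder track and map it to an alert severity, can be illustrated with a toy version. The thresholds, alert levels, and the time-to-closest-approach shortcut below are invented for illustration; the real DAA well-clear logic uses far richer geometry and alerting criteria.

```python
# Toy track-evaluation sketch: intruder track -> notional alert level.
# Track fields, thresholds, and levels are hypothetical, not JADEM's logic.

from dataclasses import dataclass

@dataclass
class Track:
    range_nmi: float    # horizontal range to the intruder, nautical miles
    closure_kt: float   # closure rate in knots (positive = converging)

def time_to_cpa_min(track):
    """Rough time to closest point of approach, in minutes."""
    if track.closure_kt <= 0:
        return float("inf")     # diverging traffic never closes
    return track.range_nmi / track.closure_kt * 60.0

def evaluate(track):
    """Map a track to an alert level: 0 = none, 1 = preventive, 2 = warning."""
    tau = time_to_cpa_min(track)
    if tau < 1.0 or track.range_nmi < 1.0:
        return 2
    if tau < 3.0:
        return 1
    return 0

print(evaluate(Track(range_nmi=10.0, closure_kt=300.0)))   # tau = 2 min -> 1
print(evaluate(Track(range_nmi=10.0, closure_kt=-50.0)))   # diverging  -> 0
```

Because the function sits behind a narrow interface (track in, alert level out), alternative alerting algorithms can be swapped in and compared, which is the design property the abstract highlights.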
Advanced component testing : Kaskasia handbrake test.
DOT National Transportation Integrated Search
2016-07-01
The evaluation of a prototype remote operation handbrake showed that it can be installed on a car with only minor modifications to connect the air. This prototype did not set the emergency during any of the testing performed. The operation of the pro...
Design, Prototyping and Control of a Flexible Cystoscope for Biomedical Applications
NASA Astrophysics Data System (ADS)
Sozer, Canberk; Ghorbani, Morteza; Alcan, Gokhan; Uvet, Huseyin; Unel, Mustafa; Kosar, Ali
2017-07-01
Kidney stones and prostate hyperplasia are very common urogenital diseases all over the world. To treat these diseases, ESWL (Extracorporeal Shock Wave Lithotripsy), PCNL (Percutaneous Nephrolithotomy), cystoscopy, or open surgery techniques can be used. Devices called cystoscopes are used for in-vivo intervention: a flexible or rigid cystoscope is inserted into the human body and operates on the area of interest. In this study, a flexible cystoscope prototype has been developed. The prototype is able to bend up to ±40° in the X and Y axes, has a hydrodynamic cavitation probe for rounding the sharp edges of kidney stones or resecting prostate tissue with the hydrodynamic cavitation method, and contains a waterproof medical camera to give visual feedback to the operator. The operator steers the flexible end-effector via joystick toward the target region. This paper presents the design, manufacturing, control, and experimental setup of the tendon-driven flexible cystoscope prototype. The prototype is 10 mm in outer diameter, 70 mm long in its flexible part, and 120 mm in total length including the flexible part and rigid tube. The experimental results show that the prototype bending mechanism, control system, manufactured prototype parts, and experimental setup function properly. A small piece of real kidney stone was broken in the targeted area.
Personal communications: An extension to the mobile satellite
NASA Technical Reports Server (NTRS)
Epstein, Murray; Draper, Francois
1990-01-01
As time progresses, customer demands become far more universal, involving integrated, simple to operate, cost effective services, with technology virtually transparent to the operator. Industry will be in a position of providing the necessary services to meet the subscribers' needs. Our resource based industries, transportation services, and utilities in the more rural and unserviced areas will require quality and affordable services that can only be supplied via satellite. One answer to these needs will be one- and two-way interoperable data messaging.
Making Stability Operations Less Complex While Improving Interoperability
2008-06-01
Concepts, Theory, Policy; Multinational Endeavors; Civil Military Endeavors. By Erik Chaum (Naval Undersea Warfare Center, ChaumE@npt.nuwc.navy.mil) and Gerard Christman (OASD NII (IICT)). Point of contact: Gerard Christman.
47 CFR 90.421 - Operation of mobile station units not under the control of the licensee.
Code of Federal Regulations, 2010 CFR
2010-10-01
... medical services activities. (3) On the Interoperability Channels in the 700 MHz Public Safety Band (See... in the 700 MHz Public Safety Band or by any licensee holding a license for any other public safety... hand-held and vehicular transmitters in the 700 MHz Band. (b) Industrial/Business Pool. Mobile units...
49 CFR 232.603 - Design, interoperability, and configuration management requirements.
Code of Federal Regulations, 2012 CFR
2012-10-01
... requirements. (a) General. A freight car or freight train equipped with an ECP brake system shall, at a minimum...) Approval. A freight train or freight car equipped with an ECP brake system and equipment covered by the AAR...) Configuration management. A railroad operating a freight train or freight car equipped with ECP brake systems...
49 CFR 232.603 - Design, interoperability, and configuration management requirements.
Code of Federal Regulations, 2014 CFR
2014-10-01
... requirements. (a) General. A freight car or freight train equipped with an ECP brake system shall, at a minimum...) Approval. A freight train or freight car equipped with an ECP brake system and equipment covered by the AAR...) Configuration management. A railroad operating a freight train or freight car equipped with ECP brake systems...
49 CFR 232.603 - Design, interoperability, and configuration management requirements.
Code of Federal Regulations, 2011 CFR
2011-10-01
... requirements. (a) General. A freight car or freight train equipped with an ECP brake system shall, at a minimum...) Approval. A freight train or freight car equipped with an ECP brake system and equipment covered by the AAR...) Configuration management. A railroad operating a freight train or freight car equipped with ECP brake systems...
49 CFR 232.603 - Design, interoperability, and configuration management requirements.
Code of Federal Regulations, 2013 CFR
2013-10-01
... requirements. (a) General. A freight car or freight train equipped with an ECP brake system shall, at a minimum...) Approval. A freight train or freight car equipped with an ECP brake system and equipment covered by the AAR...) Configuration management. A railroad operating a freight train or freight car equipped with ECP brake systems...
Central Corneal Thickness Reproducibility among Ten Different Instruments.
Pierro, Luisa; Iuliano, Lorenzo; Gagliardi, Marco; Ambrosi, Alessandro; Rama, Paolo; Bandello, Francesco
2016-11-01
To assess agreement between one ultrasonic (US) and nine optical instruments for the measurement of central corneal thickness (CCT), and to evaluate intra- and inter-operator reproducibility. In this observational cross-sectional study, two masked operators measured CCT thickness twice in 28 healthy eyes. We used seven spectral-domain optical coherence tomography (SD-OCT) devices, one time-domain OCT, one Scheimpflug camera, and one US-based instrument. Inter- and intra-operator reproducibility was evaluated by intraclass correlation coefficient (ICC), coefficient of variation (CV), and Bland-Altman test analysis. Instrument-to-instrument reproducibility was determined by ANOVA for repeated measurements. We also tested how the devices disagreed regarding systemic bias and random error using a structural equation model. Mean CCT of all instruments ranged from 536 ± 42 μm to 577 ± 40 μm. An instrument-to-instrument correlation test showed high values among the 10 investigated devices (correlation coefficient range 0.852-0.995; p values <0.0001 in all cases). The highest correlation coefficient values were registered between 3D OCT-2000 Topcon-Spectral OCT/SLO Opko (0.995) and Cirrus HD-OCT Zeiss-RS-3000 Nidek (0.995), whereas the lowest were seen between SS-1000 CASIA and Spectral OCT/SLO Opko (0.852). ICC and CV showed excellent inter- and intra-operator reproducibility for all optic-based devices, except for the US-based device. Bland-Altman analysis demonstrated low mean biases between operators. Despite highlighting good intra- and inter-operator reproducibility, we found that a scale bias between instruments might interfere with thorough CCT monitoring. We suggest that optimal monitoring is achieved with the same operator and the same device.
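The two reproducibility statistics used in this study, the coefficient of variation and the Bland-Altman mean bias with limits of agreement, can be sketched in a few lines. The CCT measurement values below are invented for illustration, not from the study's data.

```python
import statistics

# Minimal sketch of the reproducibility statistics named in the abstract.
# The measurement values are made up for illustration.
def coefficient_of_variation(measurements):
    """CV in percent: SD of repeated measurements over their mean."""
    return 100.0 * statistics.stdev(measurements) / statistics.mean(measurements)

def bland_altman(op1, op2):
    """Mean bias and 95% limits of agreement between two operators."""
    diffs = [a - b for a, b in zip(op1, op2)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Two operators measuring CCT (micrometers) on the same five eyes:
op1 = [540, 551, 562, 533, 575]
op2 = [542, 549, 560, 536, 574]
bias, (lo, hi) = bland_altman(op1, op2)
print(round(bias, 2))                          # -> 0.0 (no systematic bias here)
print(round(coefficient_of_variation(op1), 2)) # between-eye spread in percent
```

A low mean bias with narrow limits of agreement is what "low mean biases between operators" in the abstract refers to; the ICC would additionally partition variance between eyes and operators.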
Interoperability of Information Systems Managed and Used by the Local Health Departments
Leider, Jonathon P.; Luo, Huabin; Kaur, Ravneet
2016-01-01
Background: In the post-Affordable Care Act era marked by interorganizational collaborations and availability of large amounts of electronic data from other community partners, it is imperative to assess the interoperability of information systems used by the local health departments (LHDs). Objectives: To describe the level of interoperability of LHD information systems and identify factors associated with lack of interoperability. Data and Methods: This mixed-methods research uses data from the 2015 Informatics Capacity and Needs Assessment Survey, with a target population of all LHDs in the United States. A representative sample of 650 LHDs was drawn using a stratified random sampling design. A total of 324 completed responses were received (50% response rate). Qualitative data were used from a key informant interview study of LHD informatics staff from across the United States. Qualitative data were independently coded by 2 researchers and analyzed thematically. Survey data were cleaned, bivariate comparisons were conducted, and a multivariable logistic regression was run to characterize factors associated with interoperability. Results: For 30% of LHDs, no systems were interoperable, and 38% of LHD respondents indicated some of the systems were interoperable. Significant determinants of interoperability included LHDs having leadership support (adjusted odds ratio [AOR] = 3.54), control of information technology budget allocation (AOR = 2.48), control of data systems (AOR = 2.31), having a strategic plan for information systems (AOR = 1.92), and existence of business process analysis and redesign (AOR = 1.49). Conclusion: Interoperability of all systems may be an informatics goal, but only a small proportion of LHDs reported having interoperable systems, pointing to a substantial need among LHDs nationwide. PMID:27684616
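The adjusted odds ratios (AORs) reported above come from a multivariable logistic regression, where the AOR for a factor is the exponential of its fitted coefficient. The counts and coefficient below are hypothetical, for illustration only.

```python
import math

# Sketch of where the reported AORs come from. All numbers here are
# hypothetical, not from the survey data.
def odds_ratio(a, b, c, d):
    """Unadjusted odds ratio from a 2x2 table:
    a = exposed & interoperable, b = exposed & not,
    c = unexposed & interoperable, d = unexposed & not."""
    return (a * d) / (b * c)

# Hypothetical counts: LHDs with/without leadership support vs. interoperability.
print(round(odds_ratio(60, 40, 30, 70), 2))   # -> 3.5

# In a fitted logistic model, an AOR of 3.54 corresponds to a coefficient of
beta = math.log(3.54)
print(round(math.exp(beta), 2))               # -> 3.54
```

The "adjusted" in AOR means the regression holds the other covariates fixed, which is why the multivariable estimates differ from raw 2x2-table ratios.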
A Fault-Oblivious Extreme-Scale Execution Environment (FOX)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Van Hensbergen, Eric; Speight, William; Xenidis, Jimi
IBM Research's contribution to the Fault Oblivious Extreme-scale Execution Environment (FOX) revolved around three core research deliverables: • collaboration with Boston University around the Kittyhawk cloud infrastructure, which both enabled a development and deployment platform for the project team and provided a fault-injection testbed to evaluate prototypes; • operating systems research focused on exploring role-based operating system technologies through collaboration with Sandia National Labs on the NIX research operating system and collaboration with the broader IBM Research community around a hybrid operating system model which became known as FusedOS; • participation in an advisory capacity with the Boston University SESA project, the core of which was derived from the K42 operating system research project funded in part by DARPA's HPCS program. Both of these contributions were built on a foundation of previous operating systems research funded by the Department of Energy's FastOS Program. Through the course of the X-stack funding we were able to develop prototypes, deploy them on production clusters at scale, and make them available to other researchers. As newer hardware, in the form of BlueGene/Q, came online, we were able to port the prototypes to the new hardware and release the source code for the resulting prototypes as open source to the community. In addition to the open source code for the Kittyhawk and NIX prototypes, we were able to bring the BlueGene/Q Linux patches up to a more recent kernel and contribute them for inclusion by the broader Linux community. The lasting impact of the IBM Research work on FOX can be seen in its effect on the shift of IBM's approach to HPC operating systems from Linux and Compute Node Kernels to role-based approaches as prototyped by the NIX and FusedOS work.
This impact can be seen beyond IBM in follow-on ideas being incorporated into the proposals for the Exascale Operating Systems/Runtime program.
Meshkati, Najmedin; Tabibzadeh, Maryam; Farshid, Ali; Rahimi, Mansour; Alhanaee, Ghena
2016-02-01
The aim of this study is to identify the interdependencies of the human and organizational subsystems of multiple complex, safety-sensitive technological systems and their interoperability in the context of the sustainability and resilience of an ecosystem. Recent technological disasters with severe environmental impact are attributed to human factors and safety culture causes. One of the most populous and environmentally sensitive regions in the world, the (Persian) Gulf, is at the confluence of two exponentially growing industries--nuclear power and seawater desalination plants--that are changing its land- and seascape. Building upon Rasmussen's model, a macrosystem integrative framework, based on the broader context of human factors, is developed, which can be considered in this context as a "meta-ergonomics" paradigm, for the analysis of interactions, design of interoperability, and integration of decisions of major actors whose actions can affect the safety and sustainability of the focused industries during routine and nonroutine (emergency) operations. Based on the emerging realities in the Gulf region, it is concluded that without such a systematic approach toward addressing the interdependencies of water and energy sources, sustainability will be only a short-lived dream and prosperity a disappearing mirage for millions of people in the region. This multilayered framework for the integration of people, technology, and ecosystem--which has been applied to the (Persian) Gulf--offers a viable and vital approach to the design and operation of large-scale complex systems wherever the nexus of water, energy, and food sources is concerned, such as the Black Sea. © 2016, Human Factors and Ergonomics Society.
A distributed framework for health information exchange using smartphone technologies.
Abdulnabi, Mohamed; Al-Haiqi, Ahmed; Kiah, M L M; Zaidan, A A; Zaidan, B B; Hussain, Muzammil
2017-05-01
Nationwide health information exchange (NHIE) continues to be a persistent concern for government agencies, despite the many efforts and the conceived benefits of sharing patient data among healthcare providers. Difficulties in ensuring global connectivity and interoperability, and concerns about security, have always hampered the government from successfully deploying NHIE. By looking at NHIE from a fresh perspective and bearing in mind the pervasiveness and power of modern mobile platforms, this paper proposes a new approach to NHIE that builds on the notion of consumer-mediated HIE, albeit without the focus on central health record banks. With the growing acceptance of smartphones as reliable, indispensable, and most personal devices, we suggest taking the concept of mobile personal health records (PHRs installed on smartphones) to the next level. We envision mPHRs that take the form of distributed storage units for health information, under the full control and direct possession of patients, who can have ready access to their personal data whenever needed. However, for the actual exchange of data with health information systems managed by healthcare providers, the latter have to be interoperable with patient-carried mPHRs. The computer industry long ago solved a similar problem of interoperability between peripheral devices and operating systems. We borrow from that solution the idea of providing special interfaces between mPHRs and provider systems. This interface enables the two entities to communicate with no change to either end. The design and operation of the proposed approach are explained. Additional pointers on potential implementations are provided, and issues that pertain to any solution to implement NHIE are discussed. Copyright © 2017 Elsevier Inc. All rights reserved.
Towards a Brokering Framework for Business Process Execution
NASA Astrophysics Data System (ADS)
Santoro, Mattia; Bigagli, Lorenzo; Roncella, Roberto; Mazzetti, Paolo; Nativi, Stefano
2013-04-01
Advancing our knowledge of environmental phenomena and their interconnections requires an intensive use of environmental models. Due to the complexity of the Earth system, the representation of complex environmental processes often requires the use of more than one model (often from different disciplines). The Group on Earth Observation (GEO) launched the Model Web initiative to increase the present accessibility and interoperability of environmental models, allowing their flexible composition into complex Business Processes (BPs). A few basic principles are at the base of the Model Web concept (Nativi et al.): (i) open access, (ii) minimal entry barriers, (iii) a service-driven approach, and (iv) scalability. This work proposes an architectural solution, based on the Brokering approach for multidisciplinary interoperability, aiming to contribute to the Model Web vision. The Brokering approach is currently adopted in the new GEOSS Common Infrastructure (GCI), as was presented at the last GEO Plenary meeting in Istanbul, November 2011. We designed and prototyped a component called BP Broker. The high-level functionalities provided by the BP Broker are: • discover the needed model implementations in an open, distributed, and heterogeneous environment; • check the I/O consistency of BPs and provide suggestions for resolving mismatches; • publish the eBP as a standard model resource for re-use; • submit the compiled BP (eBP) to a WF-engine for execution. A BP Broker has the following features: • support for multiple abstract BP specifications; • support for encoding in multiple WF-engine languages. According to the Brokering principles, the designed system is flexible enough to support the use of multiple BP design (visual) tools, heterogeneous Web interfaces for model execution (e.g. OGC WPS, WSDL, etc.), and different Workflow engines.
The present implementation makes use of BPMN 2.0 notation for BP design and the jBPM workflow engine for eBP execution; however, the strong decoupling which characterizes the design of the BP Broker easily allows supporting other technologies. The main benefits of the proposed approach are: (i) no need for a composition infrastructure, (ii) alleviation from the technicalities of workflow definitions, (iii) support for incomplete BPs, and (iv) the reuse of existing BPs as atomic processes. The BP Broker was designed and prototyped in the EC-funded projects EuroGEOSS (http://www.eurogeoss.eu) and UncertWeb (http://www.uncertweb.org); the latter project also provided the use scenarios that were used to test the framework: the eHabitat scenario (calculation of habitat similarity likelihood) and the FERA scenario (impact of climate change on land use and crop yield). Three more scenarios are presently under development. The research leading to these results has received funding from the European Community's Seventh Framework Programme (FP7/2007-2013) under Grant Agreements n. 248488 and n. 226487. References: Nativi, S., Mazzetti, P., & Geller, G. (2012), "Environmental model access and interoperability: The GEO Model Web initiative". Environmental Modelling & Software, 1-15.
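The broker's I/O consistency check can be sketched as a type-checked walk over the workflow steps. The step representation, type names, and scenario variables below are illustrative assumptions, not the broker's actual BPMN-based implementation.

```python
# Hypothetical sketch of a BP Broker-style I/O consistency check: each step
# declares typed inputs/outputs, and the checker verifies every input is
# produced, with a matching type, by an earlier step.
def check_io_consistency(steps):
    """steps: list of (name, inputs, outputs) tuples; inputs and outputs
    map a variable name to a type label. Returns a list of mismatch
    messages (an empty list means the BP is consistent)."""
    available = {}            # variable name -> type produced so far
    problems = []
    for name, inputs, outputs in steps:
        for var, typ in inputs.items():
            if var not in available:
                problems.append(f"{name}: missing input '{var}'")
            elif available[var] != typ:
                problems.append(f"{name}: '{var}' is {available[var]}, expected {typ}")
        available.update(outputs)
    return problems

# Toy two-step workflow loosely inspired by the eHabitat scenario:
workflow = [
    ("eHabitat", {},                          {"similarity_map": "raster"}),
    ("reporter", {"similarity_map": "raster",
                  "threshold": "float"},      {"report": "document"}),
]
print(check_io_consistency(workflow))   # 'threshold' is never produced
```

This is the kind of mismatch for which the broker would "provide suggestions", e.g. prompting the user to supply the missing parameter or insert a mediating step.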
A Brokering Solution for Business Process Execution
NASA Astrophysics Data System (ADS)
Santoro, M.; Bigagli, L.; Roncella, R.; Mazzetti, P.; Nativi, S.
2012-12-01
Predicting the climate change impact on biodiversity and ecosystems, advancing our knowledge of the interconnection of environmental phenomena, assessing the validity of simulations, and other key challenges of the Earth Sciences require intensive use of environmental modeling. The complexity of the Earth system requires the use of more than one model (often from different disciplines) to represent complex processes. The identification of appropriate mechanisms for the reuse, chaining, and composition of environmental models is considered a key enabler for an effective uptake of a global Earth Observation infrastructure, currently pursued by the international geospatial research community. The Group on Earth Observation (GEO) Model Web initiative aims to increase the present accessibility and interoperability of environmental models, allowing their flexible composition into complex Business Processes (BPs). A few basic principles are at the base of the Model Web concept (Nativi et al.): 1. open access; 2. minimal entry barriers; 3. a service-driven approach; 4. scalability. In this work we propose an architectural solution aiming to contribute to the Model Web vision. This solution applies the Brokering approach for facilitating complex multidisciplinary interoperability. The Brokering approach is currently adopted in the new GEOSS Common Infrastructure (GCI), as was presented at the last GEO Plenary meeting in Istanbul, November 2011. According to the Brokering principles, the designed system is flexible enough to support the use of multiple BP design (visual) tools, heterogeneous Web interfaces for model execution (e.g. OGC WPS, WSDL, etc.), and different Workflow engines.
We designed and prototyped a component called BP Broker that is able to: (i) read an abstract BP, (ii) "compile" the abstract BP into an executable one (eBP) - in this phase the BP Broker might also provide recommendations for incomplete BPs and parameter mismatch resolution - and (iii) finally execute the eBP using a Workflow engine. The present implementation makes use of BPMN 2.0 notation for BP design and the jBPM workflow engine for eBP execution; however, the strong decoupling which characterizes the design of the BP Broker easily allows supporting other technologies. The main benefits of the proposed approach are: (i) no need for a composition infrastructure, (ii) alleviation from the technicalities of workflow definitions, (iii) support for incomplete BPs, and (iv) the reuse of existing BPs as atomic processes. The BP Broker was designed and prototyped in the EC-funded projects EuroGEOSS (http://www.eurogeoss.eu) and UncertWeb (http://www.uncertweb.org); the latter project also provided the use scenarios that were used to test the framework: the eHabitat scenario (calculation of habitat similarity likelihood) and the FERA scenario (impact of climate change on land use and crop yield). Three more scenarios are presently under development. The research leading to these results has received funding from the European Community's Seventh Framework Programme (FP7/2007-2013) under Grant Agreements n. 248488 and n. 226487. References: Nativi, S., Mazzetti, P., & Geller, G. (2012), "Environmental model access and interoperability: The GEO Model Web initiative". Environmental Modelling & Software, 1-15.
NASA Technical Reports Server (NTRS)
Moller, Bjorn; Garro, Alfredo; Falcone, Alberto; Crues, Edwin Z.; Dexter, Daniel E.
2016-01-01
Distributed and Real-Time Simulation plays a key role in the Space domain, being exploited for mission and system analysis and engineering as well as for crew training and operational support. One of the most popular standards is the IEEE 1516-2010 Standard for Modeling and Simulation (M&S) High Level Architecture (HLA). HLA supports the implementation of distributed simulations (called Federations) in which a set of simulation entities (called Federates) interact using a Run-Time Infrastructure (RTI). In a given Federation, a Federate can publish and/or subscribe to objects and interactions on the RTI only in accordance with their structures as defined in a FOM (Federation Object Model). Currently, the Space domain is characterized by a set of incompatible FOMs that, although they meet the specific needs of different organizations and projects, increase the long-term cost of interoperability. In this context, the availability of a reference FOM for the Space domain will enable the development of interoperable HLA-based simulators for related joint projects and collaborations among worldwide organizations involved in the Space domain (e.g. NASA, ESA, Roscosmos, and JAXA). The paper presents a first set of results achieved by a SISO standardization effort that aims at providing a Space Reference FOM for international collaboration on Space system simulations.
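The FOM-constrained publish/subscribe pattern described above can be sketched with a toy in-process router. The class name, attribute names, and API below are invented for illustration; a real federation would use an RTI implementation conforming to IEEE 1516-2010, not this sketch.

```python
# Toy sketch of HLA-style publish/subscribe: a minimal "RTI" routes
# attribute updates only for object classes declared in the FOM. All
# names here are invented for illustration.
class ToyRTI:
    def __init__(self, fom_classes):
        self.fom = set(fom_classes)       # object classes declared in the FOM
        self.subscribers = {}             # class -> list of callbacks

    def subscribe(self, obj_class, callback):
        if obj_class not in self.fom:
            raise ValueError(f"{obj_class} not in FOM")
        self.subscribers.setdefault(obj_class, []).append(callback)

    def update_attributes(self, obj_class, attributes):
        if obj_class not in self.fom:
            raise ValueError(f"{obj_class} not in FOM")
        for cb in self.subscribers.get(obj_class, []):
            cb(attributes)

rti = ToyRTI(["Spacecraft"])                  # shared object model
received = []
rti.subscribe("Spacecraft", received.append)  # consumer federate
rti.update_attributes("Spacecraft", {"position_km": (7000, 0, 0)})  # provider
print(received)   # -> [{'position_km': (7000, 0, 0)}]
```

The point of a Space Reference FOM is exactly this shared vocabulary: two federates can only exchange what the FOM declares, so incompatible FOMs mean no routing at all.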
Elysee, Gerald; Herrin, Jeph; Horwitz, Leora I
2017-10-01
Stagnation in hospitals' adoption of data integration functionalities, coupled with a reduction in the number of operational health information exchanges, could become a significant impediment to hospitals' adoption of 3 critical capabilities: electronic health information exchange, interoperability, and medication reconciliation, in which electronic systems are used to assist with resolving medication discrepancies and improving patient safety. Against this backdrop, we assessed the relationships between the 3 capabilities. We conducted an observational study applying the partial least squares-structural equation modeling technique to 27 variables obtained from the 2013 American Hospital Association annual survey Information Technology (IT) supplement, which describes health IT capabilities. We included 1330 hospitals. In confirmatory factor analysis, 15 of the 27 variables achieved loading values greater than 0.548 at P < .001 and as such were validated as the building blocks of the 3 capabilities. Subsequent path analysis showed a significant, positive, and cyclic relationship between the capabilities, in that decreases in hospitals' adoption of one would lead to decreases in the adoption of the others. These results show that the capability for high-quality medication reconciliation may be impeded by lagging adoption of interoperability and health information exchange capabilities. Policies focused on improving one or more of these capabilities may have ancillary benefits.
Towards semantic interoperability for electronic health records.
Garde, Sebastian; Knaup, Petra; Hovenga, Evelyn; Heard, Sam
2007-01-01
In the field of open electronic health records (EHRs), openEHR as an archetype-based approach is being increasingly recognised. The objective of this paper is to briefly describe this approach and to analyse how openEHR archetypes impact health professionals and semantic interoperability. Analysis of current approaches to EHR systems, terminology, and standards developments. In addition to literature reviews, we organised face-to-face and additional telephone interviews and tele-conferences with members of relevant organisations and committees. The openEHR archetypes approach enables syntactic interoperability and semantic interpretability -- both important prerequisites for semantic interoperability. Archetypes enable the formal definition of clinical content by clinicians. To enable comprehensive semantic interoperability, the development and maintenance of archetypes needs to be coordinated internationally and across health professions. Domain knowledge governance comprises a set of processes that enable the creation, development, organisation, sharing, dissemination, use and continuous maintenance of archetypes. It needs to be supported by information technology. To enable EHRs, semantic interoperability is essential. The openEHR archetypes approach enables syntactic interoperability and semantic interpretability. However, without coordinated archetype development and maintenance, 'rank growth' of archetypes would jeopardize semantic interoperability. We therefore believe that openEHR archetypes and domain knowledge governance together create the knowledge environment required to adopt EHRs.
PACS/information systems interoperability using Enterprise Communication Framework.
alSafadi, Y; Lord, W P; Mankovich, N J
1998-06-01
Interoperability among healthcare applications goes beyond connectivity to allow components to exchange structured information and work together in a predictable, coordinated fashion. To facilitate building an interoperability infrastructure, an Enterprise Communication Framework (ECF) was developed by the members of the Andover Working Group for Healthcare Interoperability (AWG-OHI). The ECF consists of four models: 1) Use Case Model, 2) Domain Information Model (DIM), 3) Interaction Model, and 4) Message Model. To realize this framework, a software component called the Enterprise Communicator (EC) is used. In this paper, we will demonstrate the use of the framework in interoperating a picture archiving and communication system (PACS) with a radiology information system (RIS).
The role of architecture and ontology for interoperability.
Blobel, Bernd; González, Carolina; Oemig, Frank; Lopéz, Diego; Nykänen, Pirkko; Ruotsalainen, Pekka
2010-01-01
Turning from organization-centric to process-controlled or even personalized approaches, advanced healthcare settings have to meet special interoperability challenges. eHealth and pHealth solutions must assure interoperability between actors cooperating to achieve common business objectives. The interoperability chain includes not only individually tailored technical systems but also sensors and actuators. For enabling corresponding pervasive computing and even autonomic computing, individualized systems have to be based on an architecture framework covering many domains, scientifically managed by specialized disciplines using their specific ontologies in a formalized way. Therefore, interoperability has to advance from a communication-protocol approach to an architecture-centric approach mastering ontology coordination challenges.
SP-100 GES/NAT radiation shielding systems design and development testing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Disney, R.K.; Kulikowski, H.D.; McGinnis, C.A.
1991-01-10
Advanced Energy Systems (AES) of Westinghouse Electric Corporation is under subcontract to the General Electric Company to supply nuclear radiation shielding components for the SP-100 Ground Engineering System (GES) Nuclear Assembly Test to be conducted at Westinghouse Hanford Company at Richland, Washington. The radiation shielding components are integral to the Nuclear Assembly Test (NAT) assembly and include prototypic and non-prototypic radiation shielding components which provide prototypic test conditions for the SP-100 reactor subsystem and reactor control subsystem components during the GES/NAT operations. W-AES is designing three radiation shield components for the NAT assembly: a prototypic Generic Flight System (GFS) shield, the Lower Internal Facility Shield (LIFS), and the Upper Internal Facility Shield (UIFS). This paper describes the design approach and development testing to support the design, fabrication, and assembly of these three shield components for use within the vacuum vessel of the GES/NAT. The GES/NAT shields must be designed to operate in a high vacuum which simulates space operations. The GFS shield and LIFS must provide prototypic radiation/thermal environments and mechanical interfaces for reactor system components. The NAT shields, in combination with the test facility shielding, must provide adequate radiation attenuation for overall test operations. Special design considerations account for the ground test facility effects on the prototypic GFS shield. Validation of the GFS shield design and performance will be based on detailed Monte Carlo analyses and developmental testing of design features. Full-scale prototype testing of the shield subsystems is not planned.
Jian, Wen-Shan; Hsu, Chien-Yeh; Hao, Te-Hui; Wen, Hsyien-Chia; Hsu, Min-Huei; Lee, Yen-Liang; Li, Yu-Chuan; Chang, Polun
2007-11-01
Traditional electronic health record (EHR) data are produced by various hospital information systems. Until the advent of XML technology, such data could not exist independently of the information systems that produced them. The interoperability of a healthcare system can be divided into two dimensions: functional interoperability and semantic interoperability. Currently, no single EHR standard exists that provides complete EHR interoperability. In order to establish a national EHR standard, we developed a set of local EHR templates. The Taiwan Electronic Medical Record Template (TMT) is a standard that aims to achieve semantic interoperability in EHR exchanges nationally. The TMT architecture is basically composed of forms, components, sections, and elements. Data are stored in the elements, which can be referenced by code set, data type, and narrative block. The TMT was established with the following requirements in mind: (1) transformable to international standards; (2) having a minimal impact on the existing healthcare system; (3) easy to implement and deploy; and (4) compliant with Taiwan's current laws and regulations. The TMT provides a basis for building a portable, interoperable information infrastructure for EHR exchange in Taiwan.
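The form/section/element layering described for the TMT can be sketched with the standard library's XML tools. The tag names, code value, and attributes below are assumptions for illustration, not the actual TMT schema.

```python
import xml.etree.ElementTree as ET

# Illustrative sketch of a form > section > element hierarchy like the one
# the TMT abstract describes. Tags, codes, and attributes are hypothetical.
form = ET.Element("form", id="outpatient-note")
section = ET.SubElement(form, "section", name="vital-signs")
element = ET.SubElement(section, "element",
                        code="8480-6",    # hypothetical code-set reference
                        dataType="PQ")    # hypothetical data-type label
element.text = "120"

xml_text = ET.tostring(form, encoding="unicode")
print(xml_text)
```

Because each element carries its code-set and data-type references in the markup, a record serialized this way can travel between systems without the originating hospital information system, which is the independence the abstract attributes to XML.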
Computational toxicology using the OpenTox application programming interface and Bioclipse
2011-01-01
Background Toxicity is a complex phenomenon involving the potential adverse effect on a range of biological functions. Predicting toxicity involves using a combination of experimental data (endpoints) and computational methods to generate a set of predictive models. Such models rely strongly on being able to integrate information from many sources. The required integration of biological and chemical information sources requires, however, a common language to express our knowledge ontologically, and interoperating services to build reliable predictive toxicology applications. Findings This article describes progress in extending the integrative bio- and cheminformatics platform Bioclipse to interoperate with OpenTox, a semantic web framework which supports open data exchange and toxicology model building. The Bioclipse workbench environment enables functionality from OpenTox web services and easy access to OpenTox resources for evaluating toxicity properties of query molecules. Relevant cases and interfaces based on ten neurotoxins are described to demonstrate the capabilities provided to the user. The integration takes advantage of semantic web technologies, thereby providing an open and simplifying communication standard. Additionally, the use of ontologies ensures proper interoperation and reliable integration of toxicity information from both experimental and computational sources. Conclusions A novel computational toxicity assessment platform was generated from integration of two open science platforms related to toxicology: Bioclipse, that combines a rich scriptable and graphical workbench environment for integration of diverse sets of information sources, and OpenTox, a platform for interoperable toxicology data and computational services. The combination provides improved reliability and operability for handling large data sets by the use of the Open Standards from the OpenTox Application Programming Interface. 
This enables simultaneous access to a variety of distributed predictive toxicology databases, and algorithm and model resources, taking advantage of the Bioclipse workbench handling the technical layers. PMID:22075173
Overview of the experimental tests in prototype
NASA Astrophysics Data System (ADS)
Egusquiza, Eduard; Valentín, David; Presas, Alexandre; Valero, Carme
2017-04-01
Experimental tests on the prototype are necessary to understand the dynamic behaviour of the machine at different operating points. Hydraulic phenomena, as well as their effects on the structure, need to be studied in order to avoid instabilities during operation and to extend the lifetime of the different components. For this purpose, a complete experimental study of a large Francis turbine prototype has been performed by installing several sensors along the machine. Pressure sensors were installed in the penstock, spiral case, runner and draft tube; strain gauges were installed in the runner; vibration sensors were used in the stationary parts; and different electrical and operational parameters were also measured. All these signals were acquired simultaneously for different operating points of the turbine.
NASA Technical Reports Server (NTRS)
Hornstein, Rhoda S.; Wunderlich, Dana A.; Willoughby, John K.
1992-01-01
New and innovative software technology is presented that provides a cost-effective bridge for smoothly transitioning prototype software, in the field of planning and scheduling, into an operational environment. Specifically, this technology mixes the flexibility and human design efficiency of dynamic data typing with the rigor and run-time efficiencies of static data typing. This new technology provides a valuable tool for conducting the extensive, up-front system prototyping that leads to specifying the correct system and producing a reliable, efficient version that will be operationally effective and will be accepted by the intended users.
Information Management Challenges in Achieving Coalition Interoperability
2001-12-01
Contents include a paper by J. Dyer and, in Session I (Architectures and Standards: Fundamental Issues, chaired by Dr I. White, UK), "Planning for Interoperability" by W.M. Gentleman. The proposed framework is a crucial step toward achieving coalition C4I interoperability. Topics covered: (1) maintaining secure interoperability; (2) command system interfaces.
Intelligent Network Flow Optimization (INFLO) prototype : Seattle small-scale demonstration report.
DOT National Transportation Integrated Search
2015-05-01
This report describes the performance and results of the INFLO Prototype Small-Scale Demonstration. The purpose of the Small-Scale Demonstration was to deploy the INFLO Prototype System to demonstrate its functionality and performance in an operation...
The 2nd DBCLS BioHackathon: interoperable bioinformatics Web services for integrated applications
2011-01-01
Background The interaction between biological researchers and the bioinformatics tools they use is still hampered by incomplete interoperability between such tools. To ensure interoperability initiatives are effectively deployed, end-user applications need to be aware of, and support, best practices and standards. Here, we report on an initiative in which software developers and genome biologists came together to explore and raise awareness of these issues: BioHackathon 2009. Results Developers in attendance came from diverse backgrounds, with experts in Web services, workflow tools, text mining and visualization. Genome biologists provided expertise and exemplar data from the domains of sequence and pathway analysis and glyco-informatics. One goal of the meeting was to evaluate the ability to address real world use cases in these domains using the tools that the developers represented. This resulted in i) a workflow to annotate 100,000 sequences from an invertebrate species; ii) an integrated system for analysis of the transcription factor binding sites (TFBSs) enriched based on differential gene expression data obtained from a microarray experiment; iii) a workflow to enumerate putative physical protein interactions among enzymes in a metabolic pathway using protein structure data; iv) a workflow to analyze glyco-gene-related diseases by searching for human homologs of glyco-genes in other species, such as fruit flies, and retrieving their phenotype-annotated SNPs. Conclusions Beyond deriving prototype solutions for each use-case, a second major purpose of the BioHackathon was to highlight areas of insufficiency. 
We discuss the issues raised by our exploration of the problem/solution space, concluding that there are still problems with the way Web services are modeled and annotated, including: i) the absence of several useful data or analysis functions in the Web service "space"; ii) the lack of documentation of methods; iii) lack of compliance with the SOAP/WSDL specification among and between various programming-language libraries; and iv) incompatibility between various bioinformatics data formats. Although it was still difficult to solve real world problems posed to the developers by the biological researchers in attendance because of these problems, we note the promise of addressing these issues within a semantic framework. PMID:21806842
Building AN International Polar Data Coordination Network
NASA Astrophysics Data System (ADS)
Pulsifer, P. L.; Yarmey, L.; Manley, W. F.; Gaylord, A. G.; Tweedie, C. E.
2013-12-01
In the spirit of the World Data Center system developed to manage data resulting from the International Geophysical Year of 1957-58, the International Polar Year 2007-2009 (IPY) resulted in significant progress towards establishing an international polar data management network. However, a sustained international network is still evolving. In this paper we argue that the fundamental building blocks for such a network exist and that the time is right to move forward. We focus on the Arctic component of such a network with linkages to Antarctic network building activities. A review of an important set of Network building blocks is presented: i) the legacy of the IPY data and information service; ii) global data management services with a polar component (e.g. World Data System); iii) regional systems (e.g. Arctic Observing Viewer); iv) nationally focused programs (e.g. Arctic Observing Viewer, Advanced Cooperative Arctic Data and Information Service, Polar Data Catalogue, Inuit Knowledge Centre); v) programs focused on the local (e.g. Exchange for Local Observations and Knowledge of the Arctic, Geomatics and Cartographic Research Centre). We discuss current activities and results with respect to three priority areas needed to establish a strong and effective Network. First, a summary of network building activities reports on a series of productive meetings, including the Arctic Observing Summit and the Polar Data Forum, that have resulted in a core set of Network nodes and participants and a refined vision for the Network. Second, we recognize that interoperability for information sharing fundamentally relies on the creation and adoption of community-based data description standards and data delivery mechanisms. There is a broad range of interoperability frameworks and specifications available; however, these need to be adapted for polar community needs. Progress towards Network interoperability is reviewed, and a prototype distributed data system is demonstrated. 
We discuss remaining challenges. Lastly, to establish a sustainable Arctic Data Coordination Network (ADCN) as part of a broader polar Network will require adequate continued resources. We conclude by outlining proposed business models for the emerging Arctic Data Coordination Network and a broader polar Network.
The 2nd DBCLS BioHackathon: interoperable bioinformatics Web services for integrated applications.
Katayama, Toshiaki; Wilkinson, Mark D; Vos, Rutger; Kawashima, Takeshi; Kawashima, Shuichi; Nakao, Mitsuteru; Yamamoto, Yasunori; Chun, Hong-Woo; Yamaguchi, Atsuko; Kawano, Shin; Aerts, Jan; Aoki-Kinoshita, Kiyoko F; Arakawa, Kazuharu; Aranda, Bruno; Bonnal, Raoul Jp; Fernández, José M; Fujisawa, Takatomo; Gordon, Paul Mk; Goto, Naohisa; Haider, Syed; Harris, Todd; Hatakeyama, Takashi; Ho, Isaac; Itoh, Masumi; Kasprzyk, Arek; Kido, Nobuhiro; Kim, Young-Joo; Kinjo, Akira R; Konishi, Fumikazu; Kovarskaya, Yulia; von Kuster, Greg; Labarga, Alberto; Limviphuvadh, Vachiranee; McCarthy, Luke; Nakamura, Yasukazu; Nam, Yunsun; Nishida, Kozo; Nishimura, Kunihiro; Nishizawa, Tatsuya; Ogishima, Soichi; Oinn, Tom; Okamoto, Shinobu; Okuda, Shujiro; Ono, Keiichiro; Oshita, Kazuki; Park, Keun-Joon; Putnam, Nicholas; Senger, Martin; Severin, Jessica; Shigemoto, Yasumasa; Sugawara, Hideaki; Taylor, James; Trelles, Oswaldo; Yamasaki, Chisato; Yamashita, Riu; Satoh, Noriyuki; Takagi, Toshihisa
2011-08-02
The interaction between biological researchers and the bioinformatics tools they use is still hampered by incomplete interoperability between such tools. To ensure interoperability initiatives are effectively deployed, end-user applications need to be aware of, and support, best practices and standards. Here, we report on an initiative in which software developers and genome biologists came together to explore and raise awareness of these issues: BioHackathon 2009. Developers in attendance came from diverse backgrounds, with experts in Web services, workflow tools, text mining and visualization. Genome biologists provided expertise and exemplar data from the domains of sequence and pathway analysis and glyco-informatics. One goal of the meeting was to evaluate the ability to address real world use cases in these domains using the tools that the developers represented. This resulted in i) a workflow to annotate 100,000 sequences from an invertebrate species; ii) an integrated system for analysis of the transcription factor binding sites (TFBSs) enriched based on differential gene expression data obtained from a microarray experiment; iii) a workflow to enumerate putative physical protein interactions among enzymes in a metabolic pathway using protein structure data; iv) a workflow to analyze glyco-gene-related diseases by searching for human homologs of glyco-genes in other species, such as fruit flies, and retrieving their phenotype-annotated SNPs. Beyond deriving prototype solutions for each use-case, a second major purpose of the BioHackathon was to highlight areas of insufficiency. 
We discuss the issues raised by our exploration of the problem/solution space, concluding that there are still problems with the way Web services are modeled and annotated, including: i) the absence of several useful data or analysis functions in the Web service "space"; ii) the lack of documentation of methods; iii) lack of compliance with the SOAP/WSDL specification among and between various programming-language libraries; and iv) incompatibility between various bioinformatics data formats. Although it was still difficult to solve real world problems posed to the developers by the biological researchers in attendance because of these problems, we note the promise of addressing these issues within a semantic framework.
McCrory, Emma; McGuinness, Niall Jp; Ulhaq, Aman
2018-06-01
To determine the reproducibility of Index of Orthognathic Functional Treatment Need (IOFTN) scores derived from plaster casts and their three-dimensional (3D) digital equivalents. Pilot study, prospective analytical. UK hospital orthodontic department. Thirty casts and their digital equivalents, representing the pre-treatment malocclusions of patients requiring orthodontic-orthognathic surgical treatment, were scored by four clinicians using IOFTN. Casts were scanned using a 3Shape digital scanner and 3D models produced using OrthoAnalyzer™ (3Shape Ltd, Copenhagen, Denmark). Examiners independently determined the IOFTN scores for the casts and digital models, to test their inter- and intra-operator reliability using weighted Kappa scores. Intra-operator agreement with IOFTN major categories (1-5: treatment need) was very good for plaster casts (0.83-0.98) and good to very good for digital models (0.78-0.83). Inter-operator agreement was moderate to very good for casts (0.58-0.82) and good to very good for digital models (0.65-0.92). Intra-operator agreement with IOFTN sub-categories (1-14: feature of malocclusion) was good to very good for casts (0.70-0.97) and digital models (0.80-0.94). Inter-operator agreement was moderate to good for casts (0.53-0.77) and moderate to very good for the digital models (0.58-0.90). Digital models are an acceptable alternative to plaster casts for examining the malocclusion of patients requiring combined orthodontic-orthognathic surgical treatment and determining treatment need.
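The weighted Kappa reliability statistic used above can be sketched in a few lines of pure Python. This is an illustrative implementation assuming quadratic weights; the abstract does not state which weighting scheme the authors used:

```python
def quadratic_weighted_kappa(rater1, rater2, n_cat):
    """Quadratic-weighted Cohen's kappa for two raters' integer category scores (0..n_cat-1)."""
    n = len(rater1)
    # observed agreement matrix, as proportions
    obs = [[0.0] * n_cat for _ in range(n_cat)]
    for a, b in zip(rater1, rater2):
        obs[a][b] += 1.0 / n
    # marginal distributions give the expected matrix under independence
    p1 = [sum(row) for row in obs]
    p2 = [sum(obs[i][j] for i in range(n_cat)) for j in range(n_cat)]
    # quadratic disagreement weights: 0 on the diagonal, 1 at maximum disagreement
    w = [[(i - j) ** 2 / (n_cat - 1) ** 2 for j in range(n_cat)] for i in range(n_cat)]
    observed = sum(w[i][j] * obs[i][j] for i in range(n_cat) for j in range(n_cat))
    expected = sum(w[i][j] * p1[i] * p2[j] for i in range(n_cat) for j in range(n_cat))
    return 1.0 - observed / expected
```

A value of 1.0 indicates perfect agreement, 0 indicates chance-level agreement, and negative values indicate systematic disagreement.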
Wiegers, Thomas C; Davis, Allan Peter; Mattingly, Carolyn J
2014-01-01
The Critical Assessment of Information Extraction systems in Biology (BioCreAtIvE) challenge evaluation tasks collectively represent a community-wide effort to evaluate a variety of text-mining and information extraction systems applied to the biological domain. The BioCreative IV Workshop included five independent subject areas, including Track 3, which focused on named-entity recognition (NER) for the Comparative Toxicogenomics Database (CTD; http://ctdbase.org). Previously, CTD had organized document ranking and NER-related tasks for the BioCreative Workshop 2012; a key finding of that effort was that interoperability and integration complexity were major impediments to the direct application of the systems to CTD's text-mining pipeline. This underscored a prevailing problem with software integration efforts. Major interoperability-related issues included lack of process modularity, operating system incompatibility, tool configuration complexity and lack of standardization of high-level inter-process communications. One approach to potentially mitigate interoperability and general integration issues is the use of Web services to abstract implementation details; rather than integrating NER tools directly, HTTP-based calls from CTD's asynchronous, batch-oriented text-mining pipeline could be made to remote NER Web services for recognition of specific biological terms using BioC (an emerging family of XML formats) for inter-process communications. To test this concept, participating groups developed Representational State Transfer (REST)/BioC-compliant Web services tailored to CTD's NER requirements. Participants were provided with a comprehensive set of training materials. CTD evaluated results obtained from the remote Web service-based URLs against a test data set of 510 manually curated scientific articles. Twelve groups participated in the challenge. Recall, precision, balanced F-scores and response times were calculated. 
Top balanced F-scores for gene, chemical and disease NER were 61, 74 and 51%, respectively. Response times ranged from fractions-of-a-second to over a minute per article. We present a description of the challenge and summary of results, demonstrating how curation groups can effectively use interoperable NER technologies to simplify text-mining pipeline implementation. Database URL: http://ctdbase.org/ © The Author(s) 2014. Published by Oxford University Press.
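The BioC inter-process format described above can be illustrated with a minimal sketch using only the Python standard library. The element names follow the public BioC layout (collection, document, passage, annotation), but the document content and the annotation tuple layout here are hypothetical:

```python
import xml.etree.ElementTree as ET

def make_bioc_document(doc_id, text, annotations):
    """Build a minimal BioC collection with one document and one passage.

    annotations: iterable of (ann_id, offset, length, term, ann_type) tuples
    (a made-up convenience layout, not part of the BioC specification).
    """
    collection = ET.Element("collection")
    ET.SubElement(collection, "source").text = "CTD"  # assumed source label
    doc = ET.SubElement(collection, "document")
    ET.SubElement(doc, "id").text = doc_id
    passage = ET.SubElement(doc, "passage")
    ET.SubElement(passage, "offset").text = "0"
    ET.SubElement(passage, "text").text = text
    for ann_id, offset, length, term, ann_type in annotations:
        ann = ET.SubElement(passage, "annotation", id=ann_id)
        ET.SubElement(ann, "infon", key="type").text = ann_type
        ET.SubElement(ann, "location", offset=str(offset), length=str(length))
        ET.SubElement(ann, "text").text = term
    return ET.tostring(collection, encoding="unicode")
```

A remote NER service in the challenge would accept and return documents shaped like this over HTTP, letting the pipeline stay agnostic to each tool's implementation.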
Barbarito, Fulvio; Pinciroli, Francesco; Mason, John; Marceglia, Sara; Mazzola, Luca; Bonacina, Stefano
2012-08-01
Information technologies (ITs) have now entered the everyday workflow in a variety of healthcare providers with a certain degree of independence. This independence may be the cause of difficulty in interoperability between information systems and it can be overcome through the implementation and adoption of standards. Here we present the case of the Lombardy Region, in Italy, that has been able, in the last 10 years, to set up the Regional Social and Healthcare Information System, connecting all the healthcare providers within the region, and providing full access to clinical and health-related documents independently from the healthcare organization that generated the document itself. This goal, in a region with almost 10 millions citizens, was achieved through a twofold approach: first, the political and operative push towards the adoption of the Health Level 7 (HL7) standard within single hospitals and, second, providing a technological infrastructure for data sharing based on interoperability specifications recognized at the regional level for messages transmitted from healthcare providers to the central domain. The adoption of such regional interoperability specifications enabled the communication among heterogeneous systems placed in different hospitals in Lombardy. Integrating the Healthcare Enterprise (IHE) integration profiles which refer to HL7 standards are adopted within hospitals for message exchange and for the definition of integration scenarios. The IHE patient administration management (PAM) profile with its different workflows is adopted for patient management, whereas the Scheduled Workflow (SWF), the Laboratory Testing Workflow (LTW), and the Ambulatory Testing Workflow (ATW) are adopted for order management. At present, the system manages 4,700,000 pharmacological e-prescriptions, and 1,700,000 e-prescriptions for laboratory exams per month. 
It produces, monthly, 490,000 laboratory medical reports, 180,000 radiology medical reports, 180,000 first aid medical reports, and 58,000 discharge summaries. Hence, despite there being still work in progress, the Lombardy Region healthcare system is a fully interoperable social healthcare system connecting patients, healthcare providers, healthcare organizations, and healthcare professionals in a large and heterogeneous territory through the implementation of international health standards. Copyright © 2012 Elsevier Inc. All rights reserved.
Semantic Integration for Marine Science Interoperability Using Web Technologies
NASA Astrophysics Data System (ADS)
Rueda, C.; Bermudez, L.; Graybeal, J.; Isenor, A. W.
2008-12-01
The Marine Metadata Interoperability Project, MMI (http://marinemetadata.org) promotes the exchange, integration, and use of marine data through enhanced data publishing, discovery, documentation, and accessibility. A key effort is the definition of an Architectural Framework and Operational Concept for Semantic Interoperability (http://marinemetadata.org/sfc), which is complemented with the development of tools that realize critical use cases in semantic interoperability. In this presentation, we describe a set of such Semantic Web tools that allow performing important interoperability tasks, ranging from the creation of controlled vocabularies and the mapping of terms across multiple ontologies, to the online registration, storage, and search services needed to work with the ontologies (http://mmisw.org). This set of services uses Web standards and technologies, including Resource Description Framework (RDF), Web Ontology Language (OWL), Web services, and toolkits for Rich Internet Application development. We will describe the following components: MMI Ontology Registry: The MMI Ontology Registry and Repository provides registry and storage services for ontologies. Entries in the registry are associated with projects defined by the registered users. Also, sophisticated search functions, for example according to metadata items and vocabulary terms, are provided. Client applications can submit search requests using the W3C SPARQL Query Language for RDF. Voc2RDF: This component converts an ASCII comma-delimited set of terms and definitions into an RDF file. Voc2RDF facilitates the creation of controlled vocabularies by using a simple form-based user interface. Created vocabularies and their descriptive metadata can be submitted to the MMI Ontology Registry for versioning and community access. VINE: The Vocabulary Integration Environment component allows the user to map vocabulary terms across multiple ontologies. 
Various relationships can be established, for example exactMatch, narrowerThan, and subClassOf. VINE can compute inferred mappings based on the given associations. Attributes about each mapping, like comments and a confidence level, can also be included. VINE also supports registering and storing resulting mapping files in the Ontology Registry. The presentation will describe the application of semantic technologies in general, and our planned applications in particular, to solve data management problems in the marine and environmental sciences.
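A Voc2RDF-style conversion can be sketched with the Python standard library alone. This is not the MMI tool itself, just an illustration of turning comma-delimited term/definition pairs into SKOS concepts serialized as Turtle; the base URI and the choice of SKOS properties are assumptions:

```python
import csv
import io

def terms_to_skos_turtle(csv_text, base_uri):
    """Convert 'term,definition' rows into SKOS concepts in Turtle (minimal sketch)."""
    lines = ["@prefix skos: <http://www.w3.org/2004/02/skos/core#> ."]
    for term, definition in csv.reader(io.StringIO(csv_text)):
        # derive a concept URI from the term (naive slug; real tools would sanitize more)
        uri = base_uri + term.strip().replace(" ", "_")
        lines.append(
            '<%s> a skos:Concept ;\n'
            '    skos:prefLabel "%s" ;\n'
            '    skos:definition "%s" .' % (uri, term.strip(), definition.strip())
        )
    return "\n".join(lines)
```

For example, `terms_to_skos_turtle("salinity,Concentration of dissolved salts in sea water", "http://example.org/voc/")` yields a one-concept vocabulary ready for submission to an ontology registry.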
Wiegers, Thomas C.; Davis, Allan Peter; Mattingly, Carolyn J.
2014-01-01
The Critical Assessment of Information Extraction systems in Biology (BioCreAtIvE) challenge evaluation tasks collectively represent a community-wide effort to evaluate a variety of text-mining and information extraction systems applied to the biological domain. The BioCreative IV Workshop included five independent subject areas, including Track 3, which focused on named-entity recognition (NER) for the Comparative Toxicogenomics Database (CTD; http://ctdbase.org). Previously, CTD had organized document ranking and NER-related tasks for the BioCreative Workshop 2012; a key finding of that effort was that interoperability and integration complexity were major impediments to the direct application of the systems to CTD's text-mining pipeline. This underscored a prevailing problem with software integration efforts. Major interoperability-related issues included lack of process modularity, operating system incompatibility, tool configuration complexity and lack of standardization of high-level inter-process communications. One approach to potentially mitigate interoperability and general integration issues is the use of Web services to abstract implementation details; rather than integrating NER tools directly, HTTP-based calls from CTD's asynchronous, batch-oriented text-mining pipeline could be made to remote NER Web services for recognition of specific biological terms using BioC (an emerging family of XML formats) for inter-process communications. To test this concept, participating groups developed Representational State Transfer (REST)/BioC-compliant Web services tailored to CTD's NER requirements. Participants were provided with a comprehensive set of training materials. CTD evaluated results obtained from the remote Web service-based URLs against a test data set of 510 manually curated scientific articles. Twelve groups participated in the challenge. Recall, precision, balanced F-scores and response times were calculated. 
Top balanced F-scores for gene, chemical and disease NER were 61, 74 and 51%, respectively. Response times ranged from fractions-of-a-second to over a minute per article. We present a description of the challenge and summary of results, demonstrating how curation groups can effectively use interoperable NER technologies to simplify text-mining pipeline implementation. Database URL: http://ctdbase.org/ PMID:24919658
Interoperability Context-Setting Framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
Widergren, Steven E.; Hardin, Dave; Ambrosio, Ron
2007-01-31
As the deployment of automation technology advances, it touches upon many areas of our corporate and personal lives. A trend is emerging where systems are growing to the extent that integration is taking place with other systems to provide even greater capabilities more efficiently and effectively. GridWise™ provides a vision for this type of integration as it applies to the electric system. Imagine a time in the not too distant future when homeowners can offer the management of their electricity demand to participate in a more efficient and environmentally friendly operation of the electric power grid. They will do this using technology that acts on their behalf in response to information from other components of the electric system. This technology will recognize their preferences for parameters such as comfort and the price of energy to form responses that optimize the local need to a signal that satisfies a higher-level need in the grid. For example, consider a particularly hot day with air stagnation in an area with a significant dependence on wind generation. To manage the forecasted peak electricity demand, the bulk system operator issues a critical peak price warning. Their automation systems alert electric service providers who distribute electricity from the wholesale electricity system to consumers. In response, the electric service providers use their automation systems to inform consumers of impending price increases for electricity. This information is passed to an energy management system at the premises, which acts on the homeowner's behalf, to adjust the electricity usage of the onsite equipment (which might include generation from such sources as a fuel cell). The objective of such a system is to honor the agreement with the electricity service provider and reduce the homeowner's bill while keeping the occupants as comfortable as possible. 
This will include actions such as moving the thermostat on the heating, ventilation, and air-conditioning (HVAC) unit up several degrees. The resulting load reduction becomes part of an aggregated response from the electricity service provider to the bulk system operator who is now in a better position to manage total system load with available generation. Looking across the electric system, from generating plants, to transmission substations, to the distribution system, to factories, office parks, and buildings, automation is growing, and the opportunities for unleashing new value propositions are exciting. How can we facilitate this change and do so in a way that ensures the reliability of electric resources for the wellbeing of our economy and security? The GridWise Architecture Council (GWAC) mission is to enable interoperability among the many entities that interact with the electric power system. A good definition of interoperability is, “The capability of two or more networks, systems, devices, applications, or components to exchange information between them and to use the information so exchanged.” As a step in the direction of enabling interoperability, the GWAC proposes a context-setting framework to organize concepts and terminology so that interoperability issues can be identified and debated, improvements to address issues articulated, and actions prioritized and coordinated across the electric power community.
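The premises-level price response described above can be caricatured in a few lines. This is a toy sketch, not any GridWise specification: the proportional rule, the offset cap, and the cents-per-kWh units are all invented for illustration:

```python
def thermostat_setpoint(base_f, price_cents_kwh, normal_cents_kwh, max_offset_f=4.0):
    """Raise the cooling setpoint as the price rises above normal (toy demand-response rule)."""
    if price_cents_kwh <= normal_cents_kwh:
        return base_f  # no price pressure: keep the homeowner's preferred setpoint
    # offset grows with the relative price increase, capped to bound occupant discomfort
    ratio = price_cents_kwh / normal_cents_kwh - 1.0
    return base_f + min(max_offset_f, 2.0 * ratio)
```

At a normal price the setpoint is untouched; during a critical peak price event the setpoint drifts up a few degrees, trading comfort for load reduction, exactly the kind of local optimization against a grid-level signal the framework aims to standardize.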
Impact of coalition interoperability on PKI
NASA Astrophysics Data System (ADS)
Krall, Edward J.
2003-07-01
This paper examines methods for providing PKI interoperability among units of a coalition of armed forces drawn from different nations. The area in question is tactical identity management, for the purposes of confidentiality, integrity and non-repudiation in such a dynamic coalition. The interoperating applications under consideration range from email and other forms of store-and-forward messaging to TLS and IPSEC-protected real-time communications. Six interoperability architectures are examined with advantages and disadvantages of each described in the paper.
2013-01-01
Background Activity of disease in patients with multiple sclerosis (MS) is monitored by detecting and delineating hyper-intense lesions on MRI scans. The Minimum Area Contour Change (MACC) algorithm has been created with two main goals: a) to improve inter-operator agreement on outlining regions of interest (ROIs) and b) to automatically propagate longitudinal ROIs from the baseline scan to a follow-up scan. Methods The MACC algorithm first identifies an outer bound for the solution path, forms a high number of iso-contour curves based on equally spaced contour values, and then selects the best contour value to outline the lesion. The MACC software was tested on a set of 17 FLAIR MRI images evaluated by a pair of human experts and a longitudinal dataset of 12 pairs of T2-weighted Fluid Attenuated Inversion Recovery (FLAIR) images that had lesion analysis ROIs drawn by a single expert operator. Results In the tests where two human experts evaluated the same MRI images, the MACC program demonstrated that it could markedly reduce inter-operator outline error. In the longitudinal part of the study, the MACC program created ROIs on follow-up scans that were in close agreement to the original expert’s ROIs. Finally, in a post-hoc analysis of 424 follow-up scans, 91% of propagated MACC ROIs were accepted by an expert and only 9% of the final accepted ROIs had to be created or edited by the expert. Conclusion When used with an expert operator's verification of automatically created ROIs, MACC can be used to improve inter-operator agreement and decrease analysis time, which should improve data collected and analyzed in multicenter clinical trials. PMID:24004511
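The contour-value selection step can be loosely sketched in pure Python. The published MACC algorithm works with actual iso-contour curves inside an outer bound; this simplified stand-in counts pixels above each candidate level and picks the level whose enclosed area is most stable between neighbouring levels (the function name and the pixel-counting shortcut are illustrative assumptions, not the authors' implementation):

```python
def select_contour_value(image, levels):
    """Pick the contour level where the enclosed area changes least
    between neighbouring candidate levels (a loose MACC-style heuristic)."""
    # enclosed "area" approximated as the count of pixels at or above each level
    areas = [sum(1 for row in image for v in row if v >= c) for c in levels]
    # area change between neighbouring levels; the smallest change marks
    # the steepest, most stable lesion boundary
    changes = [abs(areas[i + 1] - areas[i]) for i in range(len(areas) - 1)]
    best = min(range(len(changes)), key=changes.__getitem__)
    return levels[best]
```

On a lesion with a sharp intensity edge, many neighbouring levels enclose the same pixels, so the area-change curve flattens exactly where the boundary is unambiguous.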
Cardea: Providing Support for Dynamic Resource Access in a Distributed Computing Environment
NASA Technical Reports Server (NTRS)
Lepro, Rebekah
2003-01-01
The environment framing the modern authorization process spans domains of administration, relies on many different authentication sources, and manages complex attributes as part of the authorization process. Cardea facilitates dynamic access control within this environment as a central function of an inter-operable authorization framework. The system departs from the traditional authorization model by separating the authentication and authorization processes, distributing the responsibility for authorization data and allowing collaborating domains to retain control over their implementation mechanisms. Critical features of the system architecture and its handling of the authorization process differentiate the system from existing authorization components by addressing common needs not adequately addressed by existing systems. Continuing system research seeks to enhance the implementation of the current authorization model employed in Cardea, increase the robustness of current features, further the framework for establishing trust and promote interoperability with existing security mechanisms.
UAS in the NAS Flight Test Series 4 Overview
NASA Technical Reports Server (NTRS)
Murphy, Jim
2016-01-01
Flight Test Series 4 (FT4) provides the researchers with an opportunity to expand on the data collected during the first flight tests. Following Flight Test Series 3, additional scripted encounters with different aircraft performance and sensors will be conducted. FT4 is presently planned for Spring of 2016 to ensure collection of data to support the validation of the final RTCA Phase 1 DAA (Detect and Avoid) Minimum Operational Performance Standards (MOPS). There are three research objectives associated with this goal: (1) evaluate the performance of the DAA system against cooperative and non-cooperative aircraft encounters; (2) evaluate UAS (Unmanned Aircraft Systems) pilot performance in response to DAA maneuver guidance and alerting with live intruder encounters; and (3) evaluate TCAS/DAA (Traffic Alert and Collision Avoidance System/Detect and Avoid) interoperability. This flight test series will focus on only the Scripted Encounters configuration, supporting the collection of data to validate the interoperability of DAA and collision avoidance algorithms.
77 FR 30518 - Support of Deployment of Prototype Small Modular Reactors at the Savannah River Site
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-23
... DEPARTMENT OF ENERGY Support of Deployment of Prototype Small Modular Reactors at the Savannah River Site AGENCY: Savannah River Operations Office, Department of Energy (DOE). ACTION: Notice of availability. SUMMARY: DOE-Savannah River Operations Office (SR), in conjunction with the Savannah River...
Prototype Tool and Focus Group Evaluation for an Advanced Trajectory-Based Operations Concept
NASA Technical Reports Server (NTRS)
Guerreiro, Nelson M.; Jones, Denise R.; Barmore, Bryan E.; Butler, Ricky W.; Hagen, George E.; Maddalon, Jeffrey M.; Ahmad, Nash'at N.
2017-01-01
Trajectory-based operations (TBO) is a key concept in the Next Generation Air Transportation System transformation of the National Airspace System (NAS) that will increase the predictability and stability of traffic flows, support a common operational picture through the use of digital data sharing, facilitate more effective collaborative decision making between airspace users and air navigation service providers, and enable increased levels of integrated automation across the NAS. NASA has been developing trajectory-based systems to improve the efficiency of the NAS during specific phases of flight and is now also exploring Advanced 4-Dimensional Trajectory (4DT) operational concepts that will integrate these technologies and incorporate new technology where needed to create both automation and procedures to support gate-to-gate TBO. A TBO Prototype simulation toolkit has been developed that demonstrates initial functionality of an Advanced 4DT TBO concept. Pilot and controller subject matter experts (SMEs) were brought to the Air Traffic Operations Laboratory at NASA Langley Research Center for discussions on an Advanced 4DT operational concept and were provided an interactive demonstration of the TBO Prototype using four example scenarios. The SMEs provided feedback on potential operational, technological, and procedural opportunities and concerns. This paper describes an Advanced 4DT operational concept, the TBO Prototype, the demonstration scenarios and methods used, and the feedback obtained from the pilot and controller SMEs in this focus group activity.
20th Annual Systems Engineering Conference. Volume 1, Monday-Tuesday
2017-10-26
Environment will follow Mr. Thompson’s presentation with a presentation focusing on how ESOH Risk Management is an integral part of the RIO Management...office successes and failures in implementing the DoDI 5000.02 acquisition ESOH policy. HUMAN SYSTEMS INTEGRATION (HSI) Track Chair: Matthew...practices, process improvements, applications and approaches to program integration. INTEROPERABILITY/NET-CENTRIC OPERATIONS Track Chairs
Endpoint Naming for Space Delay/Disruption Tolerant Networking
NASA Technical Reports Server (NTRS)
Clare, Loren; Burleigh, Scott; Scott, Keith
2010-01-01
Delay/Disruption Tolerant Networking (DTN) provides solutions to space communication challenges such as disconnections when orbiters lose line-of-sight with landers, long propagation delays over interplanetary links, and other operational constraints. DTN is critical to enabling the future space internetworking envisioned by NASA. Interoperability with international partners is essential and standardization is progressing through both the CCSDS and the IETF.
National ESPC Committee Support
2015-09-30
to the physical parameterization driver software at Navy, NOAA, NASA, and AFWA. This interoperability capability will allow for more...core from another system. Under NUOPC funding, ESMF development will be completed, maintained and evolved to address DoD and NOAA requirements. In...operational NWP centers; however, it also involves collaboration with other primary NWP development centers such as NASA, NCAR, and DOE and will
2009-03-01
utilizing a radioisotope, polonium-210, the advent of a practical-use TEG launched the development and array of applications for such devices. (Front-matter residue: table-of-contents and figure-list entries for Seebeck Effect; Principles of Operation; UltraCell XX25 Fuel Cell (From UltraCell Corporation); Effect of CO on PEMFC (From Baschuk and Li 2001).)
Maritime In Situ Sensing Inter-Operable Networks (MISSION)
2013-09-30
creating acoustic communications (acomms) technologies enabling underwater sensor networks and distributed systems. Figure 1. Project MISSION...Marn, S. Ramp, F. Bahr, “Implementation of an Underwater Wireless Sensor Network in San Francisco Bay,” Proc. 10th International Mine Warfare...NILUS – An Underwater Acoustic Sensor Network Demonstrator System,” Proc. 10th International Mine Warfare Technology Symposium, Monterey, CA, May 7
NASA Technical Reports Server (NTRS)
Figueroa, Jorge Fernando
2008-01-01
In February of 2008, NASA Stennis Space Center (SSC), NASA Kennedy Space Center (KSC), and the Applied Research Laboratory at Penn State University demonstrated a pilot implementation of an Integrated System Health Management (ISHM) capability at Launch Complex 20 of KSC. The following significant accomplishments are associated with this development: (1) implementation of an architecture for ground operations ISHM, based on networked intelligent elements; (2) use of standards for management of data, information, and knowledge (DIaK), leading to modular ISHM implementation with interoperable elements communicating according to standards (three standards were used: the IEEE 1451 family of standards for smart sensors and actuators, the Open Systems Architecture for Condition Based Maintenance (OSA-CBM) standard for communicating DIaK describing the condition of elements of a system, and the OPC standard for communicating data); (3) ISHM implementation using interoperable modules addressing health management of subsystems; and (4) use of a physical intelligent sensor node (smart network element, or SNE, capable of providing data and health) along with classic sensors originally installed in the facility. An operational demonstration included detection of anomalies (sensor failures, leaks, etc.), determination of causes and effects, communication among health nodes, and user interfaces.
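The "networked intelligent elements" idea above can be loosely illustrated as a sensor node that publishes both a measurement and a self-assessed health status. The sketch below is purely illustrative: the message fields and range check are invented for this example and are not drawn from IEEE 1451 or OSA-CBM.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class HealthReport:
    """Hypothetical DIaK-style message: a measurement plus the element's
    self-assessed health (field names are invented, not OSA-CBM)."""
    element_id: str
    value: float
    healthy: bool
    anomaly: Optional[str] = None

class SmartNetworkElement:
    """Toy SNE: wraps a raw sensor and self-checks each reading against
    an expected operating range before publishing it to health nodes."""
    def __init__(self, element_id: str, low: float, high: float):
        self.element_id, self.low, self.high = element_id, low, high

    def report(self, raw_value: float) -> HealthReport:
        in_range = self.low <= raw_value <= self.high
        return HealthReport(
            element_id=self.element_id,
            value=raw_value,
            healthy=in_range,
            anomaly=None if in_range else "out-of-range reading",
        )

# A pressure sensor node flags a reading outside its expected range.
sne = SmartNetworkElement("pressure-01", low=0.0, high=500.0)
ok = sne.report(101.3)
bad = sne.report(999.0)
```

The design point is that health assessment travels with the data rather than being inferred downstream, which is what lets anomaly detection and cause/effect reasoning be distributed across interoperable nodes.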
IMMR Phase 1 Prototyping Plan Inputs
NASA Technical Reports Server (NTRS)
Vowell, C. W.; Johnson-Throop, Kathy; Smith, Bryon; Darcy, Jeannette
2006-01-01
This viewgraph presentation reviews the Phase 1 plan for the prototype of the IMMR by the Multilateral Medical Operations Panel (MMOP) Medical Informatics & Technology (MIT) Working Group. It reviews the purpose of IMMR Prototype Phase 1 (IPP1); the IPP1 Plan overview; the IMMR Prototype Phase 1 Plan for PDDs and MIC and MIC-DDs; the Plan for MICs; and the IPP1 objectives.
47 CFR 0.192 - Emergency Response Interoperability Center.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 47 Telecommunication 1 2014-10-01 2014-10-01 false Emergency Response Interoperability Center. 0.192 Section 0.192 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL COMMISSION ORGANIZATION Organization Public Safety and Homeland Security Bureau § 0.192 Emergency Response Interoperability Center. (a...
47 CFR 0.192 - Emergency Response Interoperability Center.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 47 Telecommunication 1 2013-10-01 2013-10-01 false Emergency Response Interoperability Center. 0.192 Section 0.192 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL COMMISSION ORGANIZATION Organization Public Safety and Homeland Security Bureau § 0.192 Emergency Response Interoperability Center. (a...
47 CFR 0.192 - Emergency Response Interoperability Center.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 47 Telecommunication 1 2011-10-01 2011-10-01 false Emergency Response Interoperability Center. 0.192 Section 0.192 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL COMMISSION ORGANIZATION Organization Public Safety and Homeland Security Bureau § 0.192 Emergency Response Interoperability Center. (a...
Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...
Motion Imagery and Robotics Application (MIRA)
NASA Technical Reports Server (NTRS)
Martinez, Lindolfo; Rich, Thomas
2011-01-01
Objectives include: I. Prototype a camera service leveraging the CCSDS integrated protocol stack (MIRA/SM&C/AMS/DTN): a) CCSDS MIRA Service (new). b) Spacecraft Monitor and Control (SM&C). c) Asynchronous Messaging Service (AMS). d) Delay/Disruption Tolerant Networking (DTN). II. Additional MIRA objectives: a) Demo of camera control through ISS using the CCSDS protocol stack (Berlin, May 2011). b) Verify that the CCSDS standards stack can provide end-to-end space camera services across ground and space environments. c) Test interoperability of various CCSDS protocol standards. d) Identify overlaps in the design and implementations of the CCSDS protocol standards. e) Identify software incompatibilities in the CCSDS stack interfaces. f) Provide redlines to the SM&C, AMS, and DTN working groups. g) Enable the CCSDS MIRA service for potential use in ISS Kibo camera commanding. h) Assist in long-term evolution of this entire group of CCSDS standards to TRL 6 or greater.
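The integrated stack named above (MIRA over SM&C over AMS over DTN) is an instance of layered encapsulation: each layer wraps the one above it and is stripped in reverse order at the receiver. The sketch below shows only that general pattern; the bracketed text headers are made up for illustration and bear no resemblance to the actual CCSDS wire formats.

```python
def encapsulate(payload: bytes, layers) -> bytes:
    """Wrap a payload in successive layer headers, innermost first."""
    for name in layers:  # e.g. MIRA applied first, DTN last (outermost)
        header = f"[{name}]".encode()
        payload = header + payload
    return payload

def decapsulate(frame: bytes, layers) -> bytes:
    """Strip headers outermost-first, checking each layer name."""
    for name in reversed(layers):
        header = f"[{name}]".encode()
        assert frame.startswith(header), f"expected {name} header"
        frame = frame[len(header):]
    return frame

# Layer order from the prototype's integrated stack.
stack = ["MIRA", "SM&C", "AMS", "DTN"]
frame = encapsulate(b"camera: pan left", stack)
```

A mismatch anywhere in the stack interfaces shows up exactly where the prototype's objectives d) and e) look for it: a layer that cannot parse what the layer below hands it.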
Towards an Ontology for Reef Islands
NASA Astrophysics Data System (ADS)
Duce, Stephanie
Reef islands are complex, dynamic and vulnerable environments with a diverse range of stakeholders. Communication and data sharing between these different groups of stakeholders is often difficult. An ontology for the reef island domain would improve the understanding of reef island geomorphology and improve communication between stakeholders, as well as forming a platform from which to move towards interoperability and the application of Information Technology to forecast and monitor these environments. This paper develops a small, prototypical reef island domain ontology, based on informal, natural language relations, aligned to the DOLCE upper-level ontology, for 20 fundamental terms within the domain. A subset of these terms and their relations are discussed in detail. This approach reveals and discusses challenges which must be overcome in the creation of a reef island domain ontology and which could be relevant to other ontologies in dynamic geospatial domains.
NASA Astrophysics Data System (ADS)
San Gil, Inigo; White, Marshall; Melendez, Eda; Vanderbilt, Kristin
The thirty-year-old United States Long Term Ecological Research Network has developed extensive metadata to document their scientific data. Standard and interoperable metadata is a core component of the data-driven analytical solutions developed by this research network. Content management systems offer an affordable solution for rapid deployment of metadata-centered information management systems. We developed a customized integrative metadata management system based on the Drupal content management system technology. Building on knowledge and experience with the Sevilleta and Luquillo Long Term Ecological Research sites, we successfully deployed the first two medium-scale customized prototypes. In this paper, we describe the vision behind our Drupal-based information management instances, and list the features offered through these Drupal-based systems. We also outline the plans to expand the information services offered through these metadata-centered management systems. We conclude with the growing list of participants deploying similar instances.
Komatsoulis, George A; Warzel, Denise B; Hartel, Francis W; Shanbhag, Krishnakant; Chilukuri, Ram; Fragoso, Gilberto; Coronado, Sherri de; Reeves, Dianne M; Hadfield, Jillaine B; Ludet, Christophe; Covitz, Peter A
2008-02-01
One of the requirements for a federated information system is interoperability, the ability of one computer system to access and use the resources of another system. This feature is particularly important in biomedical research systems, which need to coordinate a variety of disparate types of data. In order to meet this need, the National Cancer Institute Center for Bioinformatics (NCICB) has created the cancer Common Ontologic Representation Environment (caCORE), an interoperability infrastructure based on Model Driven Architecture. The caCORE infrastructure provides a mechanism to create interoperable biomedical information systems. Systems built using the caCORE paradigm address both aspects of interoperability: the ability to access data (syntactic interoperability) and understand the data once retrieved (semantic interoperability). This infrastructure consists of an integrated set of three major components: a controlled terminology service (Enterprise Vocabulary Services), a standards-based metadata repository (the cancer Data Standards Repository) and an information system with an Application Programming Interface (API) based on Domain Model Driven Architecture. This infrastructure is being leveraged to create a Semantic Service-Oriented Architecture (SSOA) for cancer research by the National Cancer Institute's cancer Biomedical Informatics Grid (caBIG).
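The syntactic/semantic distinction in the caCORE abstract above can be shown in miniature: two systems may parse each other's records (syntactic interoperability) yet still need a shared vocabulary to agree on what the fields mean (semantic interoperability). The field names and concept codes below are invented for illustration; they are not caCORE, caDSR, or EVS identifiers.

```python
import json

# Each system's local field names mapped to a shared concept code,
# a stand-in for lookups against a controlled-terminology service.
SYSTEM_A_TO_CONCEPT = {"dx": "C:DIAGNOSIS", "pt_age": "C:AGE"}
SYSTEM_B_TO_CONCEPT = {"diagnosis": "C:DIAGNOSIS", "age_years": "C:AGE"}

def to_common(record: dict, mapping: dict) -> dict:
    """Rewrite a record's local field names into shared concept codes."""
    return {mapping[field]: value for field, value in record.items()}

# Syntactic interoperability: both sides exchange JSON and can parse it.
wire = json.dumps({"dx": "melanoma", "pt_age": 61})
record_a = json.loads(wire)

# Semantic interoperability: both sides normalise to the shared concepts,
# so records from either system become directly comparable.
common_a = to_common(record_a, SYSTEM_A_TO_CONCEPT)
common_b = to_common({"diagnosis": "melanoma", "age_years": 61}, SYSTEM_B_TO_CONCEPT)
```

The terminology service and metadata repository in the caCORE architecture play the role of the shared mapping here: they make the concept codes, and hence the meaning of retrieved data, common property rather than per-system convention.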
Maturity Model for Advancing Smart Grid Interoperability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Knight, Mark; Widergren, Steven E.; Mater, J.
2013-10-28
Interoperability is about the properties of devices and systems to connect and work properly. Advancing interoperability eases integration and maintenance of the resulting interconnection. This leads to faster integration, lower labor and component costs, predictability of projects and the resulting performance, and evolutionary paths for upgrade. When specifications are shared and standardized, competition and novel solutions can bring new value streams to the community of stakeholders involved. Advancing interoperability involves reaching agreement for how things join at their interfaces. The quality of the agreements and the alignment of parties involved in the agreement present challenges that are best met with process improvement techniques. The GridWise® Architecture Council (GWAC), sponsored by the United States Department of Energy, is supporting an effort to use concepts from capability maturity models used in the software industry to advance interoperability of smart grid technology. An interoperability maturity model has been drafted and experience is being gained through trials on various types of projects and community efforts. This paper describes the value and objectives of maturity models, the nature of the interoperability maturity model and how it compares with other maturity models, and experiences gained with its use.
Laplante-Lévesque, Ariane; Abrams, Harvey; Bülow, Maja; Lunner, Thomas; Nelson, John; Riis, Søren Kamaric; Vanpoucke, Filiep
2016-10-01
This article describes the perspectives of hearing device manufacturers regarding the exciting developments that the Internet makes possible. Specifically, it proposes to join forces toward interoperability and standardization of Internet and audiology. A summary of why such a collaborative effort is required is provided from historical and scientific perspectives. A roadmap toward interoperability and standardization is proposed. Information and communication technologies improve the flow of health care data and pave the way to better health care. However, hearing-related products, features, and services are notoriously heterogeneous and incompatible with other health care systems (no interoperability). Standardization is the process of developing and implementing technical standards (e.g., Noah hearing database). All parties involved in interoperability and standardization realize mutual gains by making mutually consistent decisions. De jure (officially endorsed) standards can be developed in collaboration with large national health care systems as well as spokespeople for hearing care professionals and hearing device users. The roadmap covers mutual collaboration; data privacy, security, and ownership; compliance with current regulations; scalability and modularity; and the scope of interoperability and standards. We propose to join forces to pave the way to the interoperable Internet and audiology products, features, and services that the world needs.
Reflections on the role of open source in health information system interoperability.
Sfakianakis, S; Chronaki, C E; Chiarugi, F; Conforti, F; Katehakis, D G
2007-01-01
This paper reflects on the role of open source in health information system interoperability. Open source is a driving force in computer science research and the development of information systems. It facilitates the sharing of information and ideas, enables evolutionary development and open collaborative testing of code, and broadens the adoption of interoperability standards. In health care, information systems have been developed largely ad hoc following proprietary specifications and customized design. However, the wide deployment of integrated services such as Electronic Health Records (EHRs) over regional health information networks (RHINs) relies on interoperability of the underlying information systems and medical devices. This reflection is built on the experiences of the PICNIC project that developed shared software infrastructure components in open source for RHINs and the OpenECG network that offers open source components to lower the implementation cost of interoperability standards such as SCP-ECG, in electrocardiography. Open source components implementing standards and a community providing feedback from real-world use are key enablers of health care information system interoperability. Investing in open source is investing in interoperability and a vital aspect of a long term strategy towards comprehensive health services and clinical research.
NASA Technical Reports Server (NTRS)
Happell, Nadine; Miksell, Steve; Carlisle, Candace
1989-01-01
A major barrier in taking expert systems from prototype to operational status involves instilling end user confidence in the operational system. Different software life cycle models are examined and the advantages and disadvantages of each when applied to expert system development are explored. The Fault Isolation Expert System for Tracking and Data Relay Satellite System Applications (FIESTA) is presented as a case study of the development of an expert system. The end user confidence necessary for operational use of this system is accentuated by the fact that it will handle real-time data in a secure environment, allowing little tolerance for errors. How FIESTA is dealing with transition problems as it moves from an off-line standalone prototype to an on-line real-time system is discussed.
Developing a Standard Method for Link-Layer Security of CCSDS Space Communications
NASA Technical Reports Server (NTRS)
Biggerstaff, Craig
2009-01-01
Communications security for space systems has been a specialized field generally far removed from considerations of mission interoperability and cross-support; in fact, these considerations often have been viewed as intrinsically opposed to security objectives. The space communications protocols defined by the Consultative Committee for Space Data Systems (CCSDS) have a twenty-five-year history of successful use in over 400 missions. While the CCSDS Telemetry, Telecommand, and Advanced Orbiting Systems protocols for use at OSI Layer 2 are operationally mature, there has been no direct support within these protocols for communications security techniques. Link-layer communications security has been successfully implemented in the past using mission-unique methods, but never before with an objective of facilitating cross-support and interoperability. This paper discusses the design of a standard method for cryptographic authentication, encryption, and replay protection at the data link layer that can be integrated into existing CCSDS protocols without disruption to legacy communications services. Integrating cryptographic operations into existing data structures and processing sequences requires a careful assessment of the potential impediments within spacecraft, ground stations, and operations centers. The objective of this work is to provide a sound method for cryptographic encapsulation of frame data that also facilitates Layer 2 virtual channel switching, such that a mission may procure data transport services as needed without involving third parties in the cryptographic processing, or split independent data streams for separate cryptographic processing.
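A minimal sketch of the encapsulation idea described above, using stdlib primitives only: the virtual-channel ID stays in the clear so Layer 2 switching can work without cryptographic processing, a monotonic sequence counter gives replay protection, and an HMAC authenticates header plus payload. Encryption is omitted for brevity, and the framing is invented for this example; it is not the CCSDS link-layer security format.

```python
import hmac
import hashlib
import struct

def protect_frame(key: bytes, vcid: int, seq: int, payload: bytes) -> bytes:
    """Encapsulate a frame: clear header (VCID + anti-replay counter),
    payload, then an HMAC-SHA256 tag over header and payload."""
    header = struct.pack(">BI", vcid, seq)  # 1-byte VCID, 4-byte counter
    tag = hmac.new(key, header + payload, hashlib.sha256).digest()
    return header + payload + tag

def accept_frame(key: bytes, frame: bytes, last_seq: int):
    """Return (vcid, seq, payload) if the frame is fresh and authentic,
    else None. Toy ordering: real designs verify the MAC first."""
    vcid, seq = struct.unpack(">BI", frame[:5])
    payload, tag = frame[5:-32], frame[-32:]
    if seq <= last_seq:  # replay protection via the monotonic counter
        return None
    expected = hmac.new(key, frame[:5] + payload, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        return None
    return vcid, seq, payload

key = b"0123456789abcdef"  # shared mission key (example value)
frame = protect_frame(key, vcid=3, seq=1, payload=b"TM data")
```

Because the VCID is outside the protected-but-clear boundary only in the sense of being readable, a relay can switch virtual channels without holding the key, which is the cross-support property the paper is after.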
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-14
..., Reliability, and Interoperability Council AGENCY: Federal Communications Commission. ACTION: Notice of public... persons that the Federal Communications Commission's (FCC) Communications Security, Reliability, and... the security, reliability, and interoperability of communications systems. On March 19, 2011, the FCC...
BRODY, DAVID L.; MAC DONALD, CHRISTINE; KESSENS, CHAD C.; YUEDE, CARLA; PARSADANIAN, MAIA; SPINNER, MIKE; KIM, EDDIE; SCHWETYE, KATHERINE E.; HOLTZMAN, DAVID M.; BAYLY, PHILIP V.
2008-01-01
Genetically modified mice represent useful tools for traumatic brain injury (TBI) research and attractive preclinical models for the development of novel therapeutics. Experimental methods that minimize the number of mice needed may increase the pace of discovery. With this in mind, we developed and characterized a prototype electromagnetic (EM) controlled cortical impact device along with refined surgical and behavioral testing techniques. By varying the depth of impact between 1.0 and 3.0 mm, we found that the EM device was capable of producing a broad range of injury severities. Histologically, 2.0-mm impact depth injuries produced by the EM device were similar to 1.0-mm impact depth injuries produced by a commercially available pneumatic device. Behaviorally, 2.0-, 2.5-, and 3.0-mm impacts impaired hidden platform and probe trial water maze performance, whereas 1.5-mm impacts did not. Rotorod and visible platform water maze deficits were also found following 2.5- and 3.0-mm impacts. No impairment of conditioned fear performance was detected. No differences were found between sexes of mice. Inter-operator reliability was very good. Behaviorally, we found that we could statistically distinguish between injury depths differing by 0.5 mm using 12 mice per group and between injury depths differing by 1.0 mm with 7-8 mice per group. Thus, the EM impactor and refined surgical and behavioral testing techniques may offer a reliable and convenient framework for preclinical TBI research involving mice. PMID:17439349
Hallbeck, M Susan; Koneczny, Sonja; Smith, Justine
2009-01-01
Controls for most technologies, including medical devices, are becoming increasingly complex, difficult to intuitively understand, and don't necessarily follow population stereotypes. The resulting delays and errors are unacceptable when seconds can mean the difference between life and death. In this study participants were asked to "control" a system using a paper prototype (color photographs of controls) and then with a higher-fidelity prototype of the same physical controls to determine performance differences among ethnicities and genders. No ethnic or gender differences were found, and the comparison of paper versus higher-fidelity prototypes showed no significant differences. Thus, paper prototypes can be employed as an early device design usability tool to illustrate stereotype violations long before the first physical prototype. This will not only save money in the development and design processes, but also make sure that even the most complex devices are intuitively understandable and operable for their basic functions.
Enriched biodiversity data as a resource and service
Balech, Bachir; Beard, Niall; Blissett, Matthew; Brenninkmeijer, Christian; van Dooren, Tom; Eades, David; Gosline, George; Groom, Quentin John; Hamann, Thomas D.; Hettling, Hannes; Hoehndorf, Robert; Holleman, Ayco; Hovenkamp, Peter; Kelbert, Patricia; King, David; Kirkup, Don; Lammers, Youri; DeMeulemeester, Thibaut; Mietchen, Daniel; Miller, Jeremy A.; Mounce, Ross; Nicolson, Nicola; Page, Rod; Pawlik, Aleksandra; Pereira, Serrano; Penev, Lyubomir; Richards, Kevin; Sautter, Guido; Shorthouse, David Peter; Tähtinen, Marko; Weiland, Claus; Williams, Alan R.; Sierra, Soraya
2014-01-01
Background: Recent years have seen a surge in projects that produce large volumes of structured, machine-readable biodiversity data. To make these data amenable to processing by generic, open source “data enrichment” workflows, they are increasingly being represented in a variety of standards-compliant interchange formats. Here, we report on an initiative in which software developers and taxonomists came together to address the challenges and highlight the opportunities in the enrichment of such biodiversity data by engaging in intensive, collaborative software development: The Biodiversity Data Enrichment Hackathon. Results: The hackathon brought together 37 participants (including developers and taxonomists, i.e. scientific professionals that gather, identify, name and classify species) from 10 countries: Belgium, Bulgaria, Canada, Finland, Germany, Italy, the Netherlands, New Zealand, the UK, and the US. The participants brought expertise in processing structured data, text mining, development of ontologies, digital identification keys, geographic information systems, niche modeling, natural language processing, provenance annotation, semantic integration, taxonomic name resolution, web service interfaces, workflow tools and visualisation. Most use cases and exemplar data were provided by taxonomists. One goal of the meeting was to facilitate re-use and enhancement of biodiversity knowledge by a broad range of stakeholders, such as taxonomists, systematists, ecologists, niche modelers, informaticians and ontologists. The suggested use cases resulted in nine breakout groups addressing three main themes: i) mobilising heritage biodiversity knowledge; ii) formalising and linking concepts; and iii) addressing interoperability between service platforms.
Another goal was to further foster a community of experts in biodiversity informatics and to build human links between research projects and institutions, in response to recent calls to further such integration in this research domain. Conclusions: Beyond deriving prototype solutions for each use case, areas of inadequacy were discussed and are being pursued further. It was striking how many possible applications for biodiversity data there were and how quickly solutions could be put together when the normal constraints to collaboration were broken down for a week. Conversely, mobilising biodiversity knowledge from its silos in heritage literature and natural history collections will continue to require formalisation of the concepts (and the links between them) that define the research domain, as well as increased interoperability between the software platforms that operate on these concepts. PMID:25057255
UAS Integration in the NAS Project: DAA-TCAS Interoperability "mini" HITL Primary Results
NASA Technical Reports Server (NTRS)
Rorie, Conrad; Fern, Lisa; Shively, Jay; Santiago, Confesor
2016-01-01
At the May 2015 SC-228 meeting, requirements for TCAS II interoperability became elevated in priority. A TCAS interoperability workgroup was formed to identify and address key issues and questions. The TCAS workgroup came up with an initial list of questions and a plan to address them. As part of that plan, NASA proposed to run a mini HITL to address display, alerting, and guidance issues. A TCAS Interoperability Workshop was held to determine potential display, alerting, and guidance issues that could be explored in future NASA mini HITLs. The workshop produced consensus on the main functionality of DAA guidance when a TCAS II RA occurs, a prioritized list of independent variables for the experimental design, and a set of use cases to stress TCAS interoperability.
An Ontological Solution to Support Interoperability in the Textile Industry
NASA Astrophysics Data System (ADS)
Duque, Arantxa; Campos, Cristina; Jiménez-Ruiz, Ernesto; Chalmeta, Ricardo
Significant developments in information and communication technologies and challenging market conditions have forced enterprises to adapt their way of doing business. In this context, providing mechanisms to guarantee interoperability among heterogeneous organisations has become a critical issue. Even though prolific research has already been conducted in the area of enterprise interoperability, we have found that enterprises still struggle to introduce fully interoperable solutions, especially, in terms of the development and application of ontologies. Thus, the aim of this paper is to introduce basic ontology concepts in a simple manner and to explain the advantages of the use of ontologies to improve interoperability. We will also present a case study showing the implementation of an application ontology for an enterprise in the textile/clothing sector.
Huang, Po-Hsin; Chiu, Ming-Chuan
2016-01-01
The Digital Accessible Information SYstem (DAISY) player is an assistive reading tool developed for use by persons with visual impairments. Certain problems have persisted in the operating procedure and interface of DAISY players, especially for their Chinese users. Therefore, the aim of this study was to redesign the DAISY player with increased usability features for use by native Chinese speakers. First, a User Centered Design (UCD) process was employed to analyze the development of the prototype. Next, operation procedures were reorganized according to GOMS (Goals, Operators, Methods, and Selection rules) methodology. Then the user interface was redesigned according to specific Universal Design (UD) principles. Following these revisions, an experiment involving four scenarios was conducted to compare the new prototype to other players, and it was tested by twelve visually impaired participants. Results indicate the prototype had the quickest operating times, the fewest operating errors, and the lowest mental workloads of all the compared players, significantly enhancing the prototype's usability. These findings have allowed us to generate suggestions for developing the next generation of DAISY players, especially for Chinese audiences. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Development of a prototype real-time automated filter for operational deep space navigation
NASA Technical Reports Server (NTRS)
Masters, W. C.; Pollmeier, V. M.
1994-01-01
Operational deep space navigation has been in the past, and is currently, performed using systems whose architecture requires constant human supervision and intervention. A prototype for a system which allows relatively automated processing of radio metric data received in near real-time from NASA's Deep Space Network (DSN) without any redesign of the existing operational data flow has been developed. This system can allow for more rapid response as well as much reduced staffing to support mission navigation operations.
What Roles and Missions for Europe’s Military and Security Forces in the 21st Century?
2005-08-01
Dr. John L. Clarke. Executive Summary: Are armies the dinosaurs of the 21st Century, soon to become extinct in the new security environment? What...and 40-year old weapons systems. For many reasons, Europe's armed forces are in a long-term period of decline, and this trend is...in a reasonable period of time and on a high level of interoperability. The capability of carrying out sophisticated operations with a high operational
2003-01-01
dependencies, and conceptual independencies. Taken together, the three views provide a framework to ensure interoperability, regardless of system... products for COP users. It enables a shared situational awareness that significantly improves the ability of commanders at all levels to quickly make... Review, March-April 1998. Eric K. Shinseki, General, U.S. Army, "The Army Transformation: A Historic Opportunity," 2001-02 Army Green Book
Space Mobile Network: A Near Earth Communication and Navigation Architecture
NASA Technical Reports Server (NTRS)
Israel, Dave J.; Heckler, Greg; Menrad, Robert J.
2016-01-01
This paper describes a Space Mobile Network architecture, the result of a recently completed NASA study exploring architectural concepts to produce a vision for the future Near Earth communications and navigation systems. The Space Mobile Network (SMN) incorporates technologies, such as Disruption Tolerant Networking (DTN) and optical communications, and new operations concepts, such as User Initiated Services, to provide user services analogous to a terrestrial smartphone user. The paper will describe the SMN Architecture, envisioned future operations concepts, opportunities for industry and international collaboration and interoperability, and technology development areas and goals.
NASA Astrophysics Data System (ADS)
Glaves, Helen; Schaap, Dick
2017-04-01
In recent years there has been a paradigm shift in marine research moving from the traditional discipline based methodology employed at the national level by one or more organizations, to a multidisciplinary, ecosystem level approach conducted on an international scale. This increasingly holistic approach to marine research is in part being driven by policy and legislation. For example, the European Commission's Blue Growth strategy promotes sustainable growth in the marine environment including the development of sea-basin strategies (European Commission 2014). As well as this policy driven shift to ecosystem level marine research there are also scientific and economic drivers for a basin level approach. Marine monitoring is essential for assessing the health of an ecosystem and determining the impacts of specific factors and activities on it. The availability of large volumes of good quality data is fundamental to this increasingly holistic approach to ocean research but there are significant barriers to its re-use. These are due to the heterogeneity of the data resulting from having been collected by many organizations around the globe using a variety of sensors mounted on a range of different platforms. The data is then delivered and archived in a range of formats, using various spatial coordinate systems and aligned with different standards. This heterogeneity coupled with organizational and national policies on data sharing make access and re-use of marine data problematic. In response to the need for greater sharing of marine data a number of e-infrastructures have been developed but these have different levels of granularity with the majority having been developed at the regional level to address specific requirements for data e.g. SeaDataNet in Europe, the Australian Ocean Data Network (AODN). 
These data infrastructures are also frequently aligned with the priorities of the local funding agencies and have been created in isolation from those developed elsewhere. To add a further layer of complexity there are also global initiatives providing marine data infrastructures e.g. IOC-IODE and POGO, as well as those with a wider remit which includes environmental data e.g. GEOSS, COPERNICUS, etc. Ecosystem level marine research requires a common framework for marine data management that supports the sharing of data across these regional and global data systems, and provides the user with access to the data available from these services via a single point of access. This framework must be based on existing data systems and established by developing interoperability between them. The Ocean Data and Interoperability Platform (ODIP/ODIP II) project brings together those organisations responsible for maintaining selected regional data infrastructures along with other relevant experts in order to identify the common standards and best practice necessary to underpin this framework, and to evaluate the differences and commonalities between the regional data infrastructures in order to establish interoperability between them for the purposes of data sharing. This coordinated approach is being demonstrated and validated through the development of a series of prototype interoperability solutions that demonstrate the mechanisms and standards necessary to facilitate the sharing of marine data across these existing data infrastructures.
77 FR 37001 - Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-20
... of the Interoperability Services Layer, Attn: Ron Chen, 400 Gigling Road, Seaside, CA 93955. Title; Associated Form; and OMB Number: Interoperability Services Layer; OMB Control Number 0704-TBD. Needs and Uses... INFORMATION: Summary of Information Collection IoLS (Interoperability Layer Services) is an application in a...
He, Yongqun; Xiang, Zuoshuang; Zheng, Jie; Lin, Yu; Overton, James A; Ong, Edison
2018-01-12
Ontologies are critical to data/metadata and knowledge standardization, sharing, and analysis. With hundreds of biological and biomedical ontologies developed, it has become critical to ensure ontology interoperability and the usage of interoperable ontologies for standardized data representation and integration. The suite of web-based Ontoanimal tools (e.g., Ontofox, Ontorat, and Ontobee) support different aspects of extensible ontology development. By summarizing the common features of Ontoanimal and other similar tools, we identified and proposed an "eXtensible Ontology Development" (XOD) strategy and its associated four principles. These XOD principles reuse existing terms and semantic relations from reliable ontologies, develop and apply well-established ontology design patterns (ODPs), and involve community efforts to support new ontology development, promoting standardized and interoperable data and knowledge representation and integration. The adoption of the XOD strategy, together with robust XOD tool development, will greatly support ontology interoperability and robust ontology applications to support data to be Findable, Accessible, Interoperable and Reusable (i.e., FAIR).
Achieving Interoperability in GEOSS - How Close Are We?
NASA Astrophysics Data System (ADS)
Arctur, D. K.; Khalsa, S. S.; Browdy, S. F.
2010-12-01
A primary goal of the Global Earth Observing System of System (GEOSS) is improving the interoperability between the observational, modelling, data assimilation, and prediction systems contributed by member countries. The GEOSS Common Infrastructure (GCI) comprises the elements designed to enable discovery and access to these diverse data and information sources. But to what degree can the mechanisms for accessing these data, and the data themselves, be considered interoperable? Will the separate efforts by Communities of Practice within GEO to build their own portals, such as for Energy, Biodiversity, and Air Quality, lead to fragmentation or synergy? What communication and leadership do we need with these communities to improve interoperability both within and across such communities? The Standards and Interoperability Forum (SIF) of GEO's Architecture and Data Committee has assessed progress towards achieving the goal of global interoperability and made recommendations regarding evolution of the architecture and overall data strategy to ensure fulfillment of the GEOSS vision. This presentation will highlight the results of this study, and directions for further work.
Personal Health Records: Is Rapid Adoption Hindering Interoperability?
Studeny, Jana; Coustasse, Alberto
2014-01-01
The establishment of the Meaningful Use criteria has created a critical need for robust interoperability of health records. A universal definition of a personal health record (PHR) has not been agreed upon. Standardized code sets have been built for specific entities, but integration between them has not been supported. The purpose of this research study was to explore the hindrance and promotion of interoperability standards in relationship to PHRs to describe interoperability progress in this area. The study was conducted following the basic principles of a systematic review, with 61 articles used in the study. Lagging interoperability has stemmed from slow adoption by patients, creation of disparate systems due to rapid development to meet requirements for the Meaningful Use stages, and rapid early development of PHRs prior to the mandate for integration among multiple systems. Findings of this study suggest that deadlines for implementation to capture Meaningful Use incentive payments are supporting the creation of PHR data silos, thereby hindering the goal of high-level interoperability. PMID:25214822
Leveraging the Unified Access Framework: A Tale of an Integrated Ocean Data Prototype
NASA Astrophysics Data System (ADS)
O'Brien, K.; Kern, K.; Smith, B.; Schweitzer, R.; Simons, R.; Mendelssohn, R.; Diggs, S. C.; Belbeoch, M.; Hankin, S.
2014-12-01
The Tropical Pacific Observing System (TPOS) has been functioning and capturing measurements since the mid 1990s during the very successful Tropical Ocean Global Atmosphere (TOGA) project. Unfortunately, in the current environment, some 20 years after the end of the TOGA project, sustaining the observing system is proving difficult. With the many advances in methods of observing the ocean, a group of scientists is taking a fresh look at what the Tropical Pacific Observing System requires for sustainability. This includes utilizing a wide variety of observing system platforms, including Argo floats, unmanned drifters, moorings, ships, etc. This variety of platforms measuring ocean data also provides a significant challenge in terms of integrated data management. It is recognized that data and information management is crucial to the success and impact of any observing system. In order to be successful, it is also crucial to avoid building stovepipes for data management. To that end, NOAA's Observing System Monitoring Center (OSMC) has been tasked to create a testbed of integrated real time and delayed mode observations for the Tropical Pacific region in support of the TPOS. The observing networks included in the prototype are: Argo floats, OceanSites moorings, drifting buoys, hydrographic surveys, underway carbon observations and, of course, real time ocean measurements. In this presentation, we will discuss how the OSMC project is building the integrated data prototype using existing free and open source software. We will explore how we are leveraging successful data management frameworks pioneered by efforts such as NOAA's Unified Access Framework project. We will also show examples of how conforming to well known conventions and standards allows for discoverability, usability and interoperability of data.
Opportunities for the Mashup of Heterogenous Data Server via Semantic Web Technology
NASA Astrophysics Data System (ADS)
Ritschel, Bernd; Seelus, Christoph; Neher, Günther; Iyemori, Toshihiko; Koyama, Yukinobu; Yatagai, Akiyo; Murayama, Yasuhiro; King, Todd; Hughes, John; Fung, Shing; Galkin, Ivan; Hapgood, Michael; Belehaki, Anna
2015-04-01
The European Union ESPAS, Japanese IUGONET and GFZ ISDC data servers are developed for the ingestion, archiving and distribution of geo and space science domain data. Main parts of the data managed by these servers are related to near-Earth space and geomagnetic field data. A smart mashup of the data servers would allow seamless browsing of and access to data and related context information. However, achieving a high level of interoperability is a challenge because the servers are based on different data models and software frameworks. This paper is focused on the latest experiments and results for the mashup of the data servers using the semantic Web approach. Besides the mashup of domain and terminological ontologies, it especially addresses the options to connect data managed by relational databases using D2R server and SPARQL technology. A successful realization of the data server mashup will not only have a positive impact on the data users of the specific scientific domain but also on related projects, such as the development of a new interoperable version of NASA's Planetary Data System (PDS) or ICSU's World Data System alliance. ESPAS data server: https://www.espas-fp7.eu/portal/ IUGONET data server: http://search.iugonet.org/iugonet/ GFZ ISDC data server (semantic Web based prototype): http://rz-vm30.gfz-potsdam.de/drupal-7.9/ NASA PDS: http://pds.nasa.gov ICSU-WDS: https://www.icsu-wds.org
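The D2R-plus-SPARQL approach above exposes each relational database as a SPARQL endpoint, so a mashup reduces to a federated query across endpoints. The sketch below builds such a SPARQL 1.1 federated query; the endpoint URLs and the use of `dcterms:subject` are illustrative placeholders, not the actual ESPAS/IUGONET vocabularies.

```python
# Sketch: build a SPARQL 1.1 federated query that joins datasets describing
# the same observed parameter across two remote endpoints via SERVICE clauses.
# Endpoint URLs and the dcterms:subject predicate are hypothetical placeholders.

def build_federated_query(espas_endpoint: str, iugonet_endpoint: str,
                          parameter: str) -> str:
    """Return a SPARQL query string joining two endpoints on a subject term."""
    return f"""
PREFIX dcterms: <http://purl.org/dc/terms/>
SELECT ?espasDataset ?iugonetDataset
WHERE {{
  SERVICE <{espas_endpoint}> {{
    ?espasDataset dcterms:subject "{parameter}" .
  }}
  SERVICE <{iugonet_endpoint}> {{
    ?iugonetDataset dcterms:subject "{parameter}" .
  }}
}}
LIMIT 10
""".strip()

query = build_federated_query(
    "https://example.org/espas/sparql",    # placeholder endpoint
    "https://example.org/iugonet/sparql",  # placeholder endpoint
    "geomagnetic field")
print(query)
```

In a real deployment each `SERVICE` URL would point at a D2R server fronting one of the relational databases, letting a single query span all three archives.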
Solar heating and cooling system design and development
NASA Technical Reports Server (NTRS)
1978-01-01
The development of eight prototype solar heating and combined heating and cooling systems is reported. Manufacture, test, installation, maintenance, problem resolution, and monitoring the operation of prototype systems is included. Heating and cooling equipment for single family residential and commercial applications and eight operational test sites (four heating and four heating and cooling) is described.
Prototype continuous flow ventricular assist device supported on magnetic bearings.
Allaire, P E; Kim, H C; Maslen, E H; Olsen, D B; Bearnson, G B
1996-06-01
This article describes a prototype continuous flow pump (CFVAD2) fully supported in magnetic bearings. The pump performance was measured in a simulated adult human circulation system. The pump delivered 6 L/min of flow at 100 mm Hg of differential pressure head operating at 2,400 rpm in water. The pump is totally supported in 4 magnetic bearings: 2 radial and 2 thrust. Magnetic bearings offer the advantages of no required lubrication and large operating clearances. The geometry and other properties of the bearings are described. Bearing parameters such as load capacity and current gains are discussed. Bearing coil currents were measured during operation in air and water. The rotor was operated in various orientations to determine the actuator current gains. These values were then used to estimate the radial and thrust forces acting on the rotor in both air and water. Much lower levels of force were found than were expected, allowing for a very significant reduction in the size of the next prototype. Hemolysis levels were measured in the prototype pump and found not to indicate damage to the blood cells.
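The reported operating point (6 L/min against 100 mm Hg) implies a hydraulic output power of roughly 1.3 W, a useful back-of-envelope figure when sizing the motor and bearings. The arithmetic below is our own unit conversion, not a value stated in the article.

```python
# Hydraulic output power implied by the reported operating point:
# P = Q * dP, with flow converted to m^3/s and pressure to pascals.

MMHG_TO_PA = 133.322            # pascals per mm Hg
L_PER_MIN_TO_M3_PER_S = 1e-3 / 60

flow = 6 * L_PER_MIN_TO_M3_PER_S    # 6 L/min -> 1e-4 m^3/s
d_pressure = 100 * MMHG_TO_PA       # 100 mm Hg -> ~13332 Pa
power_watts = flow * d_pressure
print(f"{power_watts:.2f} W")  # → 1.33 W
```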
A knowledge-based system for monitoring the electrical power system of the Hubble Space Telescope
NASA Technical Reports Server (NTRS)
Eddy, Pat
1987-01-01
The design and the prototype for the expert system for the Hubble Space Telescope's electrical power system are discussed. This prototype demonstrated the capability to use real time data from a 32k telemetry stream and to perform operational health and safety status monitoring, detect trends such as battery degradation, and detect anomalies such as solar array failures. This prototype, along with the pointing control system and data management system expert systems, forms the initial Telemetry Analysis for Lockheed Operated Spacecraft (TALOS) capability.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-13
..., Reliability, and Interoperability Council AGENCY: Federal Communications Commission. ACTION: Notice of public..., Reliability, and Interoperability Council (CSRIC) will hold its fifth meeting. The CSRIC will vote on... to the FCC regarding best practices and actions the FCC can take to ensure the security, reliability...
Evaluation of Interoperability Protocols in Repositories of Electronic Theses and Dissertations
ERIC Educational Resources Information Center
Hakimjavadi, Hesamedin; Masrek, Mohamad Noorman
2013-01-01
Purpose: The purpose of this study is to evaluate the status of eight interoperability protocols within repositories of electronic theses and dissertations (ETDs) as an introduction to further studies on feasibility of deploying these protocols in upcoming areas of interoperability. Design/methodology/approach: Three surveys of 266 ETD…
Examining the Relationship between Electronic Health Record Interoperability and Quality Management
ERIC Educational Resources Information Center
Purcell, Bernice M.
2013-01-01
A lack of interoperability impairs data quality among health care providers' electronic health record (EHR) systems. The problem is whether the International Organization for Standardization (ISO) 9000 principles relate to the problem of interoperability in implementation of EHR systems. The purpose of the nonexperimental quantitative research…
Interoperability of Demand Response Resources Demonstration in NY
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wellington, Andre
2014-03-31
The Interoperability of Demand Response Resources Demonstration in NY (Interoperability Project) was awarded to Con Edison in 2009. The objective of the project was to develop and demonstrate methodologies to enhance the ability of customer sited Demand Response resources to integrate more effectively with electric delivery companies and regional transmission organizations.
Interoperation of an UHF RFID Reader and a TCP/IP Device via Wired and Wireless Links
Lee, Sang Hoon; Jin, Ik Soo
2011-01-01
A main application in radio frequency identification (RFID) sensor networks is the function that processes real-time tag information after gathering the required data from multiple RFID tags. The component technologies that contain an RFID reader, called the interrogator, which has a tag chip, processors, coupling antenna, and a power management system have advanced significantly over the last decade. This paper presents a system implementation for interoperation between an UHF RFID reader and a TCP/IP device that is used as a gateway. The proposed system consists of an UHF RFID tag, an UHF RFID reader, an RF end-device, an RF coordinator, and a TCP/IP I/F. The UHF RFID reader, operating at 915 MHz, is compatible with EPC Class-0/Gen1, Class-1/Gen1 and 2, and ISO18000-6B. In particular, the UHF RFID reader can be combined with the RF end-device/coordinator for a ZigBee (IEEE 802.15.4) interface, which is a low-power wireless standard. The TCP/IP device communicates with the RFID reader via wired links. On the other hand, it is connected to the ZigBee end-device via wireless links. The web based test results show that the developed system can remotely recognize information of multiple tags through the interoperation between the RFID reader and the TCP/IP device. PMID:22346665
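The gateway role described above amounts to parsing tag-read frames arriving on the reader's wired link and re-emitting each EPC over TCP/IP. The sketch below shows that parse-and-forward step; the frame layout (0xA5 header byte, length, EPC bytes, XOR checksum) is a hypothetical example, not the actual EPC Class-1 Gen-2 reader protocol.

```python
# Sketch of the reader-to-network gateway step: parse a tag-read frame from
# the RFID reader and wrap the EPC as a newline-delimited TCP payload.
# The frame format (0xA5 header, length, EPC, XOR checksum) is hypothetical.
import functools

def parse_reader_frame(frame: bytes) -> str:
    """Extract the EPC (upper-case hex) from a reader frame, verifying checksum."""
    if len(frame) < 3 or frame[0] != 0xA5:
        raise ValueError("bad header")
    length = frame[1]
    if len(frame) != 2 + length + 1:
        raise ValueError("bad length")
    epc, checksum = frame[2:2 + length], frame[-1]
    if functools.reduce(lambda a, b: a ^ b, frame[:-1]) != checksum:
        raise ValueError("bad checksum")
    return epc.hex().upper()

def to_tcp_payload(frame: bytes) -> bytes:
    """Re-emit the parsed EPC as a line-oriented message for the TCP/IP side."""
    return (parse_reader_frame(frame) + "\n").encode("ascii")

# Example frame: header, length 4, a 4-byte EPC, XOR checksum over the rest.
frame = bytes([0xA5, 4, 0x30, 0x08, 0x33, 0xB2])
frame += bytes([functools.reduce(lambda a, b: a ^ b, frame)])
print(to_tcp_payload(frame))  # → b'300833B2\n'
```

On the wireless side, the same payload would instead be handed to the ZigBee end-device; the parsing step is identical.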
The PSML format and library for norm-conserving pseudopotential data curation and interoperability
NASA Astrophysics Data System (ADS)
García, Alberto; Verstraete, Matthieu J.; Pouillon, Yann; Junquera, Javier
2018-06-01
Norm-conserving pseudopotentials are used by a significant number of electronic-structure packages, but the practical differences among codes in the handling of the associated data hinder their interoperability and make it difficult to compare their results. At the same time, existing formats lack provenance data, which makes it difficult to track and document computational workflows. To address these problems, we first propose a file format (PSML) that maps the basic concepts of the norm-conserving pseudopotential domain in a flexible form and supports the inclusion of provenance information and other important metadata. Second, we provide a software library (libPSML) that can be used by electronic structure codes to transparently extract the information in the file and adapt it to their own data structures, or to create converters for other formats. Support for the new file format has been already implemented in several pseudopotential generator programs (including ATOM and ONCVPSP), and the library has been linked with SIESTA and ABINIT, allowing them to work with the same pseudopotential operator (with the same local part and fully non-local projectors) thus easing the comparison of their results for the structural and electronic properties, as shown for several example systems. This methodology can be easily transferred to any other package that uses norm-conserving pseudopotentials, and offers a proof-of-concept for a general approach to interoperability.
Reminiscing about 15 years of interoperability efforts
Van de Sompel, Herbert; Nelson, Michael L.
2015-11-01
Over the past fifteen years, our perspective on tackling information interoperability problems for web-based scholarship has evolved significantly. In this opinion piece, we look back at three efforts that we have been involved in that aptly illustrate this evolution: OAI-PMH, OAI-ORE, and Memento. Understanding that no interoperability specification is neutral, we attempt to characterize the perspectives and technical toolkits that provided the basis for these endeavors. In that regard, we consider repository-centric and web-centric interoperability perspectives, and the use of a Linked Data or a REST/HATEOAS technology stack, respectively. In addition, we lament the lack of interoperability across nodes that play a role in web-based scholarship, but end on a constructive note with some ideas regarding a possible path forward.
The HDF Product Designer - Interoperability in the First Mile
NASA Astrophysics Data System (ADS)
Lee, H.; Jelenak, A.; Habermann, T.
2014-12-01
Interoperable data have been a long-time goal in many scientific communities. The recent growth in analysis, visualization and mash-up applications that expect data stored in a standardized manner has brought the interoperability issue to the fore. On the other hand, producing interoperable data is often regarded as a sideline task in a typical research team for which resources are not readily available. The HDF Group is developing a software tool aimed at lessening the burden of creating data in standards-compliant, interoperable HDF5 files. The tool, named HDF Product Designer, lowers the threshold needed to design such files by providing a user interface that combines the rich HDF5 feature set with applicable metadata conventions. Users can quickly devise new HDF5 files while at the same time seamlessly incorporating the latest best practices and conventions from their community. That is what the term interoperability in the first mile means: enabling generation of interoperable data in HDF5 files from the onset of their production. The tool also incorporates collaborative features, allowing team approach in the file design, as well as easy transfer of best practices as they are being developed. The current state of the tool and the plans for future development will be presented. Constructive input from interested parties is always welcome.
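"Interoperability in the first mile" means the metadata conventions are enforced at design time, before any file is written. A minimal illustration is a checker that validates a planned file layout against a convention's required attributes; the rule set below is a simplified CF-style example, not the HDF Product Designer's actual rules.

```python
# Sketch: validate a planned HDF5 file design against required convention
# attributes before the file is ever written. The attribute lists are a
# simplified CF-style example, not the HDF Product Designer's rule set.

REQUIRED_GLOBAL_ATTRS = {"title", "institution", "Conventions"}
REQUIRED_DATASET_ATTRS = {"units", "long_name", "_FillValue"}

def check_design(design: dict) -> list:
    """Return a list of convention violations for a planned file design."""
    problems = []
    missing = REQUIRED_GLOBAL_ATTRS - design.get("global_attrs", {}).keys()
    problems += [f"global attribute missing: {a}" for a in sorted(missing)]
    for name, attrs in design.get("datasets", {}).items():
        missing = REQUIRED_DATASET_ATTRS - attrs.keys()
        problems += [f"{name}: missing {a}" for a in sorted(missing)]
    return problems

design = {
    "global_attrs": {"title": "SST granule", "Conventions": "CF-1.8"},
    "datasets": {"sst": {"units": "kelvin", "long_name": "sea surface temp"}},
}
for p in check_design(design):
    print(p)
```

Running the check on the sample design flags the missing `institution` global attribute and the missing `_FillValue` on the `sst` dataset, the kind of gap the Designer's interface is meant to catch before production.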
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chandler, K.; Eudy, L.
This report describes operations at Connecticut Transit (CTTRANSIT) in Hartford for one prototype fuel cell bus and three new diesel buses operating from the same location. The prototype fuel cell bus was manufactured by Van Hool and ISE Corp. and features an electric hybrid drive system with a UTC Power PureMotion 120 Fuel Cell Power System and ZEBRA batteries for energy storage. The fuel cell bus started operation in April 2007, and evaluation results through October 2009 are provided in this report.
Multinational Experiment 7: Protecting Access to Space
2013-07-08
access to space cost to the design, engineering, production and operation of the spacecraft. They also have an impact on spacecraft mass, thermal...station and provide engineering support to receive data in the agreed format. Step 5 – Implementing interoperability. Once a framework has been...procedures or using alternative means (for example, high-altitude airships). A7. The results support the view that better mitigation approaches need to
2005-12-14
4.5.1 Knowledge Visualization by Virtual Reality ...interoperability between market participants (players) in a semantic manner are needed; "War avoidance operations such as peace-keeping, peace...and analysis, situation prediction, etc. The current report sums up the obtained results. It is organized in the following way. Section 2 introduces
Interoperability in the e-Government Context
2012-01-01
Carnegie Mellon University for the operation of the Software Engineering Institute, a federally funded research and development center. Any opinions... Hanscom AFB, MA 01731-2125. CMU/SEI-2011-TN-014
ERIC Educational Resources Information Center
Watson, Jason; Ahmed, Pervaiz K.
2004-01-01
This paper briefly introduces the trends towards e-learning and amplifies some examples of state-of-the-art systems, pointing out that all of these are, to date, limited by the adaptability and shareability of content and that it is necessary for industry to develop and use an interoperability standard. Uses SCORM specifications to specify the…
NASA Astrophysics Data System (ADS)
Yang, C.; Wong, D. W.; Phillips, T.; Wright, R. A.; Lindsey, S.; Kafatos, M.
2005-12-01
As a teamed partnership of the Center for Earth Observing and Space Research (CEOSR) at George Mason University (GMU), the Virginia Department of Transportation (VDOT), the Bureau of Transportation Statistics at the Department of Transportation (BTS/DOT), and Intergraph, we established Transportation Framework Data Services using the Open Geospatial Consortium (OGC) Web Feature Service (WFS) Specification to enable the sharing of transportation data at the federal level with data from BTS/DOT, at the state level through VDOT, and with industry through Intergraph. CEOSR develops WFS solutions using Intergraph software, and relevant technical documents are developed and disseminated through the partners. The WFS is integrated with operational geospatial systems at CEOSR and VDOT. The GeoMedia WebMap WFS toolkit is used with software and technical support from Intergraph, and the ESRI ArcIMS WFS connector is used with GMU's campus license of ESRI products. Tested solutions are integrated with framework data service operational systems, including 1) CEOSR's interoperable geospatial information services, FGDC clearinghouse node, Geospatial One Stop (GOS) portal, and WMS services; 2) VDOT's state transportation data and GIS infrastructure; and 3) BTS/DOT's national transportation data.
The project: 1) develops and deploys an operational OGC WFS 1.1 interface at CEOSR for registering with the FGDC/GOS Portal and responding to Web "POST" requests for transportation framework data as listed in Table 1; 2) builds a WFS service that returns data conforming to the drafted ANSI/INCITS L1 Standard (when available) for each identified theme in the format given by OGC Geography Markup Language (GML) Version 3.0 or higher; 3) integrates the OGC WFS with CEOSR's clearinghouse nodes; 4) establishes a formal partnership to develop and share WFS-based geospatial interoperability technology among GMU, VDOT, BTS/DOT, and Intergraph; and 5) develops WFS-based solutions and technical documents using the GeoMedia WebMap WFS toolkit. The Web Feature Service is demonstrated to be more efficient in sharing vector data and supports direct Internet access to transportation data. The developed WFS solutions also enhance the interoperable services provided by CEOSR through the FGDC clearinghouse node and the GOS Portal.
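The WFS interface described above is driven by XML request bodies posted to the service endpoint. As a minimal sketch of what such a request looks like, assuming a hypothetical transportation feature type (`tran:Road`) rather than the actual CEOSR type names, a WFS 1.1.0 GetFeature body can be assembled with the Python standard library:

```python
# Sketch of an OGC WFS 1.1.0 GetFeature POST body. The feature type name
# ("tran:Road") is a hypothetical placeholder, not an actual CEOSR type.
import xml.etree.ElementTree as ET

WFS_NS = "http://www.opengis.net/wfs"

def build_getfeature_request(type_name, max_features=10):
    """Return a WFS 1.1.0 GetFeature request body as an XML string."""
    root = ET.Element(
        "{%s}GetFeature" % WFS_NS,
        attrib={
            "service": "WFS",
            "version": "1.1.0",
            "maxFeatures": str(max_features),
            # GML 3 output, matching the abstract's GML 3.0+ requirement
            "outputFormat": "text/xml; subtype=gml/3.1.1",
        },
    )
    query = ET.SubElement(root, "{%s}Query" % WFS_NS)
    query.set("typeName", type_name)
    return ET.tostring(root, encoding="unicode")

body = build_getfeature_request("tran:Road", max_features=5)
```

A real deployment would POST this body to the WFS endpoint and parse the GML feature collection returned in the response.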
Operational test of the prototype peewee yarder.
Charles N. Mann; Ronald W. Mifflin
1979-01-01
An operational test of a small, prototype running skyline yarder was conducted early in 1978. Test results indicate that this yarder concept promises a low cost, high performance system for harvesting small logs where skyline methods are indicated. Timber harvest by thinning took place on 12 uphill and 2 downhill skyline roads, and clearcut harvesting was performed on...
Prototype operational earthquake prediction system
Spall, Henry
1986-01-01
An objective of the U.S. Earthquake Hazards Reduction Act of 1977 is to introduce systems for predicting earthquakes and assessing earthquake risk into all regions of the country that are subject to large and moderate earthquakes. In 1985, the USGS developed for the Secretary of the Interior a program for implementation of a prototype operational earthquake prediction system in southern California.
Principles of Automation for Patient Safety in Intensive Care: Learning From Aviation.
Dominiczak, Jason; Khansa, Lara
2018-06-01
The transition away from written documentation and analog methods has opened up the possibility of leveraging data science and analytic techniques to improve health care. High-acuity patients in the ICU stand to benefit particularly from the implementation of data science techniques and methodologies. The Principles of Automation for Patient Safety in Intensive Care (PAPSIC) framework draws on Billings's principles of human-centered aviation (HCA) automation and helps in identifying the advantages, pitfalls, and unintended consequences of automation in health care. Billings's HCA principles are based on the premise that human operators must remain "in command," so that they are continuously informed and actively involved in all aspects of system operations. In addition, automated systems need to be predictable; simple to train on, learn, and operate; and able to monitor the human operators, and every intelligent system element must know the intent of the other intelligent system elements. In applying Billings's HCA principles to the ICU setting, PAPSIC has three key characteristics: (1) integration and better interoperability, (2) multidimensional analysis, and (3) enhanced situation awareness. PAPSIC suggests that health care professionals reduce overreliance on automation and implement "cooperative automation," and that vendors reduce mode errors and embrace interoperability. Much can be learned from the aviation industry in automating the ICU. Because it combines "smart" technology with the necessary controls to withstand unintended consequences, PAPSIC could help ensure more informed decision making in the ICU and better patient care.
Uranus: a rapid prototyping tool for FPGA embedded computer vision
NASA Astrophysics Data System (ADS)
Rosales-Hernández, Victor; Castillo-Jimenez, Liz; Viveros-Velez, Gilberto; Zuñiga-Grajeda, Virgilio; Treviño Torres, Abel; Arias-Estrada, M.
2007-01-01
Simulation is the starting point for successful system development: performing high-level simulation of a system can help to identify, isolate, and fix design problems. This work presents Uranus, a software tool for simulation and evaluation of image processing algorithms, with support for migrating them to an FPGA environment for algorithm acceleration and embedded processing. The tool includes an integrated library of previously coded software operators and provides the necessary support to read and display image sequences as well as video files. The user can employ the precompiled soft operators in a high-level processing chain and code custom operators. In addition to the prototyping tool, Uranus offers an FPGA-based hardware architecture with the same organization as the software prototyping part. The hardware architecture contains a library of FPGA IP cores for image processing connected to a PowerPC-based system. The Uranus environment is intended for rapid prototyping of machine vision algorithms and their migration to an FPGA accelerator platform, and it is distributed for academic purposes.
Connected Lighting System Interoperability Study Part 1: Application Programming Interfaces
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gaidon, Clement; Poplawski, Michael
This report, the first in a series of studies focusing on interoperability as realized through Application Programming Interfaces (APIs), explores the diversity of such interfaces in several connected lighting systems; characterizes the extent of interoperability that they provide; and illustrates challenges, limitations, and tradeoffs encountered during this exploration.
Enabling model checking for collaborative process analysis: from BPMN to `Network of Timed Automata'
NASA Astrophysics Data System (ADS)
Mallek, Sihem; Daclin, Nicolas; Chapurlat, Vincent; Vallespir, Bruno
2015-04-01
Interoperability is a prerequisite for partners involved in a collaboration; as a consequence, the lack of interoperability is now considered a major obstacle. The research work presented in this paper aims to develop an approach for specifying and verifying a set of interoperability requirements to be satisfied by each partner in a collaborative process prior to process implementation. To enable the verification of these interoperability requirements, it is necessary first and foremost to generate a model of the targeted collaborative process; for this research effort, the standardised language BPMN 2.0 is used. Afterwards, a verification technique must be introduced, and model checking is the preferred option herein. This paper focuses on application of the model checker UPPAAL to verify interoperability requirements for the given collaborative process model. First, this entails translating the collaborative process model from BPMN into UPPAAL's modelling language, 'Network of Timed Automata'. Second, it becomes necessary to formalise the interoperability requirements into properties in the dedicated UPPAAL language, the temporal logic TCTL.
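At its core, the verification step described above reduces to exhaustive exploration of the model's state space. The following is an illustrative sketch only: the states and transitions are invented, and real UPPAAL verification additionally handles clock zones for the timed automata. It shows how a TCTL reachability property of the form E<> goal can be checked over a finite transition system by breadth-first search:

```python
from collections import deque

# Toy transition system standing in for one partner's process model.
# The state names are invented for illustration; clock constraints from
# the timed-automata formalism are omitted.
transitions = {
    "Idle":        ["Requested"],
    "Requested":   ["Negotiating", "Idle"],
    "Negotiating": ["Agreed", "Requested"],
    "Agreed":      [],
}

def reachable(start, goal):
    """Check the reachability property 'E<> goal' by breadth-first search."""
    seen, frontier = {start}, deque([start])
    while frontier:
        state = frontier.popleft()
        if state == goal:
            return True
        for nxt in transitions.get(state, []):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return False
```

For this toy model, `reachable("Idle", "Agreed")` holds, i.e. the property E<> Agreed is satisfied, while no state is reachable from the terminal state "Agreed".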
NASA Technical Reports Server (NTRS)
Basile, Lisa
1988-01-01
The SLDPF is responsible for the capture, quality monitoring, processing, accounting, and shipment of Spacelab and/or Attached Shuttle Payloads (ASP) telemetry data to various user facilities. Expert systems will aid in the performance of the quality assurance and data accounting functions of the two SLDPF functional elements: the Spacelab Input Processing System (SIPS) and the Spacelab Output Processing System (SOPS). Prototypes were developed for each as independent efforts. The SIPS Knowledge System Prototype (KSP) used the commercial shell OPS5+ on an IBM PC/AT; the SOPS Expert System Prototype used the expert system shell CLIPS implemented on a Macintosh personal computer. Both prototypes emulate the duties of the respective QA/DA analysts based upon analyst input and predetermined mission criteria parameters, and recommend instructions and decisions governing the reprocessing, release, or holding for further analysis of data. These prototypes demonstrated feasibility and high potential for operational systems. Increased productivity, decreased tedium, consistency, concise historical records, and a training tool for new analysts were the principal advantages. An operational configuration, taking advantage of the SLDPF network capabilities, is under development with the expert systems being installed on SUN workstations. This new configuration, in conjunction with the potential of the expert systems, will enhance the efficiency, in both time and quality, of the SLDPF's release of Spacelab/ASP data products.
An open repositories network development for medical teaching resources.
Soula, Gérard; Darmoni, Stefan; Le Beux, Pierre; Renard, Jean-Marie; Dahamna, Badisse; Fieschi, Marius
2010-01-01
The lack of interoperability between repositories of heterogeneous and geographically widespread data is an obstacle to the diffusion, sharing and reutilization of those data. We present the development of an open repositories network taking into account both the syntactic and semantic interoperability of the different repositories and based on international standards in this field. The network is used by the medical community in France for the diffusion and sharing of digital teaching resources. The syntactic interoperability of the repositories is managed using the OAI-PMH protocol for the exchange of metadata describing the resources. Semantic interoperability is based, on one hand, on the LOM standard for the description of resources and on MESH for the indexing of the latter and, on the other hand, on semantic interoperability management designed to optimize compliance with standards and the quality of the metadata.
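An OAI-PMH harvest of the kind described above is driven by simple HTTP requests carrying a verb parameter. As a minimal sketch, using a hypothetical repository base URL (the French network's actual endpoints are not given here), a ListRecords request URL can be assembled with the Python standard library:

```python
from urllib.parse import urlencode

def list_records_url(base_url, metadata_prefix, set_spec=None):
    """Build an OAI-PMH ListRecords request URL for metadata harvesting."""
    params = {"verb": "ListRecords", "metadataPrefix": metadata_prefix}
    if set_spec is not None:
        params["set"] = set_spec   # optional selective harvesting
    return base_url + "?" + urlencode(params)

# Hypothetical endpoint; "oai_dc" (unqualified Dublin Core) is the one
# prefix every OAI-PMH repository must support, whereas a LOM prefix is
# repository-specific.
url = list_records_url("https://repo.example.org/oai", "oai_dc")
```

The harvester then parses the XML response, which carries one metadata record per resource, and follows any resumption token for the next page.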
Johnson, Steven M.; Swanson, Robert B.
1994-01-01
Prototype stream-monitoring sites were operated during part of 1992 in the Central Nebraska Basins (CNBR) and three other study areas of the National Water-Quality Assessment (NAWQA) Program of the U.S. Geological Survey. Results from the prototype project provide information needed to operate a network of intensive fixed-station stream-monitoring sites. This report evaluates operating procedures for two NAWQA prototype sites at Maple Creek near Nickerson and the Platte River at Louisville, eastern Nebraska. Each site was sampled intensively in the spring and late summer of 1992, with less intensive sampling in midsummer. In addition, multiple samples were collected during two high-flow periods at the Maple Creek site--one early and the other late in the growing season. Water-sample analyses included determination of pesticides, nutrients, major ions, and suspended sediment, and measurements of physical properties. Equipment and protocols for the water-quality sampling procedures were evaluated. Operation of the prototype stream-monitoring sites included development and comparison of onsite and laboratory sample-processing procedures. Onsite processing was labor intensive but allowed for immediate preservation of all sampled constituents. Laboratory processing required less field labor and decreased the risk of contamination, but allowed for no immediate preservation of the samples.
Beštek, Mate; Stanimirović, Dalibor
2017-08-09
The main aims of this paper are to characterize and examine potential approaches to interoperability. This includes openEHR, SNOMED, IHE, and Continua as combined interoperability approaches, possibilities for their incorporation into the eHealth environment, and identification of the main success factors in the field that are necessary for achieving the required interoperability and, consequently, for the successful implementation of eHealth projects in general. The paper provides an in-depth analysis of the potential application of the openEHR, SNOMED, IHE, and Continua approaches in the development and implementation of eHealth in Slovenia. The research method used is both exploratory and deductive in nature. The methodological framework is grounded in information retrieval, with a special focus on research and charting of existing experience in the field and on sources, both electronic and written, that cover interoperability concepts and related implementation issues. The paper addresses the following complementary inquiries: 1. Scrutinizing the potential approaches that could alleviate the pertinent interoperability issues in the Slovenian eHealth context. 2. Analyzing the possibilities (requirements) for their inclusion in the construction process for individual eHealth solutions. 3. Identifying and charting the main success factors in the interoperability field that critically influence the efficient development and implementation of eHealth projects. The provided insights and identified success factors could serve as a constituent of the strategic starting points for the continuous integration of interoperability principles into the healthcare domain.
Moreover, the general implementation of the identified success factors could facilitate better penetration of ICT into the healthcare environment and enable the eHealth-based transformation of the health system especially in the countries which are still in an early phase of eHealth planning and development and are often confronted with differing interests, requirements, and contending strategies.
Ensuring Sustainable Data Interoperability Across the Natural and Social Sciences
NASA Astrophysics Data System (ADS)
Downs, R. R.; Chen, R. S.
2015-12-01
Both the natural and social science data communities are attempting to address the long-term sustainability of their data infrastructures in rapidly changing research, technological, and policy environments. Many parts of these communities are also considering how to improve the interoperability and integration of their data and systems across natural, social, health, and other domains. However, these efforts have generally been undertaken in parallel, with little thought about how different sustainability approaches may impact long-term interoperability from scientific, legal, or economic perspectives, or vice versa, i.e., how improved interoperability could enhance—or threaten—infrastructure sustainability. Scientific progress depends substantially on the ability to learn from the legacy of previous work available for current and future scientists to study, often by integrating disparate data not previously assembled. Digital data are less likely than scientific publications to be usable in the future unless they are managed by science-oriented repositories that can support long-term data access with the documentation and services needed for future interoperability. We summarize recent discussions in the social and natural science communities on emerging approaches to sustainability and relevant interoperability activities, including efforts by the Belmont Forum E-Infrastructures project to address global change data infrastructure needs; the Group on Earth Observations to further implement data sharing and improve data management across diverse societal benefit areas; and the Research Data Alliance to develop legal interoperability principles and guidelines and to address challenges faced by domain repositories. 
We also examine emerging needs for data interoperability in the context of the post-2015 development agenda and the expected set of Sustainable Development Goals (SDGs), which set ambitious targets for sustainable development, poverty reduction, and environmental stewardship by 2030. These efforts suggest the need for a holistic approach towards improving and implementing strategies, policies, and practices that will ensure long-term sustainability and interoperability of scientific data repositories and networks across multiple scientific domains.
A Pragmatic Approach to Sustainable Interoperability for the Web 2.0 World
NASA Astrophysics Data System (ADS)
Wright, D. J.; Sankaran, S.
2015-12-01
In the geosciences, interoperability is a fundamental requirement. Members of various standards organizations such as the OGC and ISO-TC 211 have done yeoman's service to promote a standards-centric approach to managing the interoperability challenges that organizations face today. The specific challenges that organizations face when adopting interoperability patterns are numerous. One approach, that of mandating the use of specific standards, has been reasonably successful. But scientific communities, as with all others, ultimately want their solutions to be widely accepted and used, and to this end there is a clear need to explore all possible interoperability patterns without restricting the choices to mandated standards. Standards are created by a slow and deliberative process that can take a long time to come to fruition and can therefore fall short of user expectations. Organizations are thus left with a series of seemingly orthogonal requirements when they pursue interoperability: they want a robust but agile solution, a mature approach that also satisfies the latest technology trends, and so on. Sustainable interoperability patterns need to be forward looking and should choose the patterns and paradigms of the Web 2.0 generation. To this end, the key is to choose platform technologies that embrace multiple interoperability mechanisms, that are built on fundamental "open" principles, and that align with popular mainstream patterns. We seek to explore data-, metadata-, and web-service-related interoperability patterns through the prism of building solutions that encourage strong implementer and end-user engagement, improved usability and scalability, and appealing developer frameworks that can grow the audience.
The path to tread is not new, and the geocommunity only needs to observe and align its end goals with current Web 2.0 patterns to realize all the benefits that today we all take for granted as part of our everyday use of technology.
Airport Simulations Using Distributed Computational Resources
NASA Technical Reports Server (NTRS)
McDermott, William J.; Maluf, David A.; Gawdiak, Yuri; Tran, Peter; Clancy, Daniel (Technical Monitor)
2002-01-01
The Virtual National Airspace Simulation (VNAS) will improve the safety of air transportation. In 2001, using simulation and information management software running over a distributed network of supercomputers, researchers at NASA Ames, Glenn, and Langley Research Centers developed a working prototype of a virtual airspace. This VNAS prototype modeled daily operations of the Atlanta airport by integrating measured operational data and simulation data on up to 2,000 flights a day. The concepts and architecture developed by NASA for this prototype are integral to the National Airspace Simulation, which will support the development of strategies for improving aviation safety and identifying precursors to component failure.
Groundwater data network interoperability
Brodaric, Boyan; Booth, Nathaniel; Boisvert, Eric; Lucido, Jessica M.
2016-01-01
Water data networks are increasingly being integrated to answer complex scientific questions that often span large geographical areas and cross political borders. Data heterogeneity is a major obstacle that impedes interoperability within and between such networks. It is resolved here for groundwater data at five levels of interoperability, within a Spatial Data Infrastructure architecture. The result is a pair of distinct national groundwater data networks for the United States and Canada, and a combined data network in which they are interoperable. This combined data network enables, for the first time, transparent public access to harmonized groundwater data from both sides of the shared international border.
A logical approach to semantic interoperability in healthcare.
Bird, Linda; Brooks, Colleen; Cheong, Yu Chye; Tun, Nwe Ni
2011-01-01
Singapore is in the process of rolling out a number of national e-health initiatives, including the National Electronic Health Record (NEHR). A critical enabler in the journey towards semantic interoperability is a Logical Information Model (LIM) that harmonises the semantics of the information structure with the terminology. The Singapore LIM uses a combination of international standards, including ISO 13606-1 (a reference model for electronic health record communication), ISO 21090 (healthcare datatypes), and SNOMED CT (healthcare terminology). The LIM is accompanied by a logical design approach, used to generate interoperability artifacts, and incorporates mechanisms for achieving unidirectional and bidirectional semantic interoperability.
Prototype Continuous Flow Ventricular Assist Device Supported on Magnetic Bearings.
Allaire, P E; Kim, H C; Maslen, E H; Olsen, D B; Bearnson, G B
1996-05-01
This article describes a prototype continuous flow pump (CFVAD2) fully supported in magnetic bearings. The pump performance was measured in a simulated adult human circulation system. The pump delivered 6 L/min of flow at 100 mm Hg of differential pressure head operating at 2,400 rpm in water. The pump is totally supported in 4 magnetic bearings: 2 radial and 2 thrust. Magnetic bearings offer the advantages of no required lubrication and large operating clearances. The geometry and other properties of the bearings are described. Bearing parameters such as load capacity and current gains are discussed. Bearing coil currents were measured during operation in air and water. The rotor was operated in various orientations to determine the actuator current gains. These values were then used to estimate the radial and thrust forces acting on the rotor in both air and water. Much lower levels of force were found than were expected, allowing for a very significant reduction in the size of the next prototype. Hemolysis levels were measured in the prototype pump and found not to indicate damage to the blood cells. © 1996 International Society for Artificial Organs.
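The reported operating point (6 L/min against 100 mm Hg at 2,400 rpm) implies a hydraulic output power of roughly 1.3 W, since hydraulic power is the product of volumetric flow and pressure head. A quick unit-conversion check, not taken from the article itself:

```python
# Hydraulic output power implied by the CFVAD2's reported operating point:
# 6 L/min of flow against 100 mm Hg of differential pressure.
MMHG_TO_PA = 133.322                 # pascals per mm Hg
L_PER_MIN_TO_M3_PER_S = 1.0 / 60000.0  # m^3/s per L/min

def hydraulic_power_watts(flow_l_min, pressure_mmhg):
    """P = Q * dp, with Q converted to m^3/s and dp to Pa."""
    q = flow_l_min * L_PER_MIN_TO_M3_PER_S
    dp = pressure_mmhg * MMHG_TO_PA
    return q * dp

power = hydraulic_power_watts(6.0, 100.0)   # about 1.33 W
```

The pump's electrical input power would of course be higher, since this figure excludes hydraulic, motor, and bearing losses.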
2016-09-01
training in the decisive action training environment, with rotations routinely featuring several thousand participants from many nations and operating in...teams work with exercise participants before they arrive at the training center. The goal is to ensure all formations understand — and are able to...capabilities, location, and extensive experience working with NATO and partner countries, the JMTC is uniquely positioned to implement NATO training
Defense Acquisitions: Assessments of Selected Weapon Programs
2014-03-01
Frequency (UHF) Follow-On (UFO) satellite system currently in operation and provide interoperability with legacy terminals. MUOS consists of a...failures of two UFO satellites and predicted end-of-life of on-orbit UFO satellites, one of which was taken off-line in November 2012. A...needed because most on-orbit UFO satellites are past their design lives. Two of these unexpectedly failed—one in June 2005 and another in
Improving NATO’s Interoperability Through U.S. Precision Weapons
1998-06-01
They must identify the target and manually fine-tune the laser designator onto the desired impact point. The challenge is keeping the laser designator...aimed at the appropriate impact point, especially while maneuvering the aircraft to avoid threats. Once the LGB is released, the laser seeker...originally configured for low altitude operations. Later in the war, the Tornado aircraft were re-equipped with their Thermal Imaging and Laser-Designating
CNES-NASA Disruption-Tolerant Networking (DTN) Interoperability
NASA Technical Reports Server (NTRS)
Mortensen, Dale; Eddy, Wesley M.; Reinhart, Richard C.; Lassere, Francois
2014-01-01
Future missions requiring robust internetworking services may use Delay/Disruption-Tolerant Networking (DTN) technology. CNES, NASA, and other international space agencies are committed to using CCSDS standards in their space and ground mission communications systems. The experiment described in this presentation evaluates operations concepts and system performance, and advances technology readiness, for the use of DTN protocols in conjunction with CCSDS ground systems, CCSDS data links, and CCSDS file transfer applications.
The Operational Impacts of the Global Network Enterprise Construct
2010-05-14
Board Task Force on Achieving Interoperability in a Net-Centric Environment, xiv. 60 Lolita Baldor, “Military Asserts Right to Return Cyber-Attacks...the commander is aware that applications such as video teleconferencing and large file transfers are often not possible with subordinate units...data packets, but if there is latency along the path, services such as video or large file transfers will fail. Latency is the time delay inherent in
2014-03-01
...2. Mobile Ad Hoc Networks...3. Wireless Ad Hoc Sensor Networks...Figure 32. RENEWS with WiMAX and Wave Relay AP at C-IED Site...Figure 33. RENEWS Wind Turbine and Solar Panels at Hat...worldwide interoperability for microwave access (WiMAX); wireless sensor network (WSN)...We would like to express our sincerest gratitude
War Is Too Important to be Left to the Lawyers
2008-10-29
century battlefield. Specifically, it explores how legal differences between the U.S. and coalition partners have adversely impacted the theater...U.S. and coalition partners have adversely impacted the theater commander’s military operations in Kosovo during ALLIED FORCE and in Iraq and...interoperability issues.”22 For example, differences between the nineteen coalition members over what constituted a legal and legitimate target impacted unity of
An Approach to Information Management for AIR7000 with Metadata and Ontologies
2009-10-01
metadata. We then propose an approach based on Semantic Technologies, including the Resource Description Framework (RDF) and Upper Ontologies, for the...mandating specific metadata schemas can result in interoperability problems. For example, many standards within the ADO mandate the use of XML for metadata...such problems, we propose an architecture in which different metadata schemes can interoperate. By using RDF (Resource Description Framework) as a
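The RDF-based architecture sketched in the fragment above rests on representing each metadata assertion as a subject-predicate-object triple, which any scheme can emit and any consumer can merge. A minimal, standard-library-only illustration (the asset URI is an invented placeholder; the Dublin Core title property is a real, widely used vocabulary term) serializes one such triple in N-Triples syntax:

```python
# Sketch: one RDF triple in N-Triples syntax, of the kind an RDF-based
# metadata architecture exchanges between schemes. The subject URI below
# is an invented placeholder.
def ntriple(subject, predicate, obj_literal):
    """Serialize a single triple with a literal object as N-Triples."""
    escaped = obj_literal.replace("\\", "\\\\").replace('"', '\\"')
    return '<%s> <%s> "%s" .' % (subject, predicate, escaped)

triple = ntriple(
    "http://example.org/asset/42",
    "http://purl.org/dc/terms/title",
    "Surveillance mission data set",
)
```

Triples produced from different metadata schemes can then be concatenated into a single graph, which is the basis of the interoperability claim.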