Ross, Sue; Magee, Laura; Walker, Mark; Wood, Stephen
2012-12-27
Intellectual property is associated with the creative work needed to design clinical trials. Two approaches have developed to protect the intellectual property associated with multicentre trial protocols prior to site initiation. The 'open access' approach involves publishing the protocol, permitting easy access to the complete protocol. The main advantages of the open access approach are that the protocol is freely available to all stakeholders, permitting them to discuss the protocol widely with colleagues, assess the quality and rigour of the protocol, determine the feasibility of conducting the trial at their centre, and after trial completion, to evaluate the reported findings based on a full understanding of the protocol. The main disadvantage of this approach is the potential for plagiarism; however, if plagiarism occurred, it should be easy to identify, because the original trial protocol is openly accessible, and to apply appropriate sanctions. The 'restricted access' approach involves the use of non-disclosure agreements, legal documents that must be signed between the trial lead centre and collaborative sites. Potential sites must guarantee they will not disclose any details of the study before they are permitted to access the protocol. The main advantages of the restricted access approach are for the lead institution and nominated principal investigator, who protect their intellectual property associated with the trial. The main disadvantages are that ownership of the protocol and intellectual property is assigned to the lead institution; defining who 'needs to know' about the study protocol is difficult; and the use of non-disclosure agreements involves review by lawyers and institutional representatives at each site before access is permitted to the protocol, significantly delaying study implementation and adding substantial indirect costs to research institutes.
This extra step may discourage sites from joining a trial. It is possible that the restricted access approach may contribute to the failure of well-designed trials without any significant benefit in protecting intellectual property. Funding agencies should formalize rules around open versus restricted access to the study protocol just as they have around open access to results. PMID:23270486
Interoperability in the Planetary Science Archive (PSA)
NASA Astrophysics Data System (ADS)
Rios Diaz, C.
2017-09-01
The protocols and standards currently supported by the recently released new version of the Planetary Science Archive are the Planetary Data Access Protocol (PDAP), the EuroPlanet-Table Access Protocol (EPN-TAP) and Open Geospatial Consortium (OGC) standards. We explore these protocols in more detail, providing scientifically useful examples of their usage within the PSA.
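EPN-TAP services of this kind are queried through the Table Access Protocol's synchronous endpoint using ADQL. A minimal sketch of building such a query URL; the endpoint path, schema name and use of Python here are illustrative assumptions, not details taken from the abstract:

```python
from urllib.parse import urlencode

def epn_tap_query_url(base_url, schema, target_name, max_rows=100):
    """Build a synchronous TAP query URL for an EPN-TAP service.

    EPN-TAP exposes one `epn_core` table per data service; `target_name`
    is one of the standard EPN-TAP columns.
    """
    adql = (
        f"SELECT TOP {max_rows} * FROM {schema}.epn_core "
        f"WHERE target_name = '{target_name}'"
    )
    params = {"REQUEST": "doQuery", "LANG": "ADQL", "QUERY": adql}
    return f"{base_url}/sync?{urlencode(params)}"

# Hypothetical endpoint and schema, for illustration; the real PSA
# EPN-TAP service URL may differ.
url = epn_tap_query_url("https://archives.esac.esa.int/psa/epn-tap/tap",
                        "psa", "Mars", max_rows=10)
```

A TAP client library would normally hide this URL construction; the sketch only shows which parameters the protocol standardises.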
Quantum Tomography Protocols with Positivity are Compressed Sensing Protocols (Open Access)
2015-12-08
Kalev, Amir; Kosut, Robert L; Deutsch, Ivan H
Characterising complex quantum systems is a vital task in quantum information science. Quantum tomography, the standard tool used for this purpose, uses a well-designed measurement record to reconstruct quantum states and processes. It is, however, notoriously inefficient. Recently, the classical signal…
Schacht Hansen, M; Dørup, J
2001-01-01
The Wireless Application Protocol technology implemented in newer mobile phones has built-in facilities for handling much of the information processing needed in clinical work. To test a practical approach we ported a relational database of the Danish pharmaceutical catalogue to Wireless Application Protocol using open source freeware at all steps. We used Apache 1.3 web software on a Linux server. Data containing the Danish pharmaceutical catalogue were imported from an ASCII file into a MySQL 3.22.32 database using a Practical Extraction and Report Language script for easy update of the database. Data were distributed in 35 interrelated tables. Each pharmaceutical brand name was given its own card with links to general information about the drug, active substances, contraindications etc. Access was available through 1) browsing therapeutic groups and 2) searching for a brand name. The database interface was programmed in the server-side scripting language PHP3. A free, open source Wireless Application Protocol gateway to a pharmaceutical catalogue was established to allow dial-in access independent of commercial Wireless Application Protocol service providers. The application was tested on the Nokia 7110 and Ericsson R320s cellular phones. We have demonstrated that Wireless Application Protocol-based access to a dynamic clinical database can be established using open source freeware. The project opens perspectives for a further integration of Wireless Application Protocol phone functions in clinical information processing: Global System for Mobile communication telephony for bilateral communication, asynchronous unilateral communication via e-mail and Short Message Service, built-in calculator, calendar, personal organizer, phone number catalogue and Dictaphone function via answering machine technology. An independent Wireless Application Protocol gateway may be placed within hospital firewalls, which may be an advantage with respect to security. 
However, if Wireless Application Protocol phones are to become effective tools for physicians, special attention must be paid to the limitations of the devices. Input tools of Wireless Application Protocol phones should be improved, for instance by increased use of speech control. PMID:11720946
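The import step described above, an ASCII catalogue export loaded into interrelated SQL tables by a script, can be sketched as follows. Python and an in-memory SQLite database stand in for the Perl script and MySQL 3.22.32 of the original setup, and the field layout is hypothetical:

```python
import csv
import sqlite3
from io import StringIO

# Hypothetical excerpt of a delimited drug-catalogue export; the real
# Danish catalogue's 35-table layout is not given in the abstract.
CATALOGUE = """\
brand_name;atc_group;active_substance
Panodil;N02BE;paracetamol
Ibumetin;M01AE;ibuprofen
"""

def load_catalogue(conn, text):
    """(Re)create a minimal brand-name table and import the export file,
    so the catalogue can be refreshed by re-running the script."""
    conn.execute("DROP TABLE IF EXISTS brand")
    conn.execute(
        "CREATE TABLE brand (brand_name TEXT, atc_group TEXT, active_substance TEXT)"
    )
    rows = csv.DictReader(StringIO(text), delimiter=";")
    conn.executemany(
        "INSERT INTO brand VALUES (:brand_name, :atc_group, :active_substance)",
        rows,
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
load_catalogue(conn, CATALOGUE)
# Search for a brand name, as the WAP interface's search function did.
hit = conn.execute(
    "SELECT active_substance FROM brand WHERE brand_name = ?", ("Panodil",)
).fetchone()
```

Dropping and recreating the table on each run mirrors the "easy update of the database" goal the abstract mentions.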
Peer-review and publication of research protocols and proposals: a role for open access journals.
Eysenbach, Gunther
2004-09-30
Peer-review and publication of research protocols offer several advantages to all parties involved. Among these are the following opportunities for authors: external expert opinion on the methods, demonstration to funding agencies of prior expert review of the protocol, proof of priority of ideas and methods, and solicitation of potential collaborators. We think that review and publication of protocols is an important role for Open Access journals. Because of their electronic form, openness for readers, and author-pays business model, they are better suited than traditional journals to ensure the sustainability and quality of protocol reviews and publications. In this editorial, we describe the workflow for investigators in eHealth research, from protocol submission to a funding agency, to protocol review and (optionally) publication at JMIR, to registration of trials at the International eHealth Study Registry (IESR), and to publication of the report. One innovation at JMIR is that protocol peer reviewers will be paid an honorarium, which will be drawn partly from a new submission fee for protocol reviews. Separating the article processing fee into a submission and a publishing fee will allow authors to opt for "peer-review only" (without subsequent publication) at reduced costs, if they wish to await a funding decision or for other reasons decide not to make the protocol public. PMID:15471763
[Open access: an opportunity for biomedical research].
Duchange, Nathalie; Autard, Delphine; Pinhas, Nicole
2008-01-01
Open access within the scientific community depends on the scientific context and the practices of the field. In the biomedical domain, the communication of research results is characterised by the importance of the peer reviewing process, the existence of a hierarchy among journals and the transfer of copyright to the editor. Biomedical publishing has become a lucrative market and the growth of electronic journals has not helped lower the costs. Indeed, it is difficult for today's public institutions to gain access to all the scientific literature. Open access is thus imperative, as demonstrated through the positions taken by a growing number of research funding bodies, the development of open access journals and efforts made in promoting open archives. This article describes the setting up of an Inserm portal for publication in the context of the French national protocol for open-access self-archiving and in an international context.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sandoval, M Analisa; Uribe, Eva C; Sandoval, Marisa N
2009-01-01
In 2008 a joint team from Los Alamos National Laboratory (LANL) and Brookhaven National Laboratory (BNL) consisting of specialists in training of IAEA inspectors in the use of complementary access activities formulated a training program to prepare the U.S. DOE laboratories for the entry into force of the Additional Protocol. As a major part of the support of the activity, LANL summer interns provided open source information analysis to the LANL-BNL mock inspection team. They were a part of the Next Generation Safeguards Initiative's (NGSI) summer intern program aimed at producing the next generation of safeguards specialists. This paper describes how they used open source information to 'backstop' the LANL-BNL team's effort to construct meaningful Additional Protocol Complementary Access training scenarios for each of the three DOE laboratories, Lawrence Livermore National Laboratory, Idaho National Laboratory, and Oak Ridge National Laboratory.
Pace: Privacy-Protection for Access Control Enforcement in P2P Networks
NASA Astrophysics Data System (ADS)
Sánchez-Artigas, Marc; García-López, Pedro
In open environments such as peer-to-peer (P2P) systems, the decision to collaborate with multiple users — e.g., by granting access to a resource — is hard to achieve in practice due to extreme decentralization and the lack of trusted third parties. The literature contains a plethora of applications in which a scalable solution for distributed access control is crucial. This fact motivates us to propose a protocol to enforce access control, applicable to networks consisting entirely of untrusted nodes. The main feature of our protocol is that it protects both sensitive permissions and sensitive policies, and does not rely on any centralized authority. We analyze the efficiency (computational effort and communication overhead) as well as the security of our protocol.
Land-mobile satellite communication system
NASA Technical Reports Server (NTRS)
Yan, Tsun-Yee (Inventor); Rafferty, William (Inventor); Dessouky, Khaled I. (Inventor); Wang, Charles C. (Inventor); Cheng, Unjeng (Inventor)
1993-01-01
A satellite communications system includes an orbiting communications satellite for relaying communications to and from a plurality of ground stations, and a network management center for making connections via the satellite between the ground stations in response to connection requests received via the satellite from the ground stations, the network management center being configured to provide both open-end service and closed-end service. The network management center of one embodiment is configured to provide both types of service according to a predefined channel access protocol that enables the ground stations to request the type of service desired. The channel access protocol may be configured to adaptively allocate channels to open-end service and closed-end service according to changes in the traffic pattern and include a free-access tree algorithm that coordinates collision resolution among the ground stations.
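The free-access tree algorithm mentioned above belongs to the family of tree (splitting) collision-resolution protocols. A minimal simulation of the basic binary-splitting idea, as a sketch of the general technique rather than of the patented protocol itself:

```python
import random

def resolve_collisions(stations, rng):
    """Resolve one collision epoch with binary tree splitting.

    All `stations` transmit together and collide (if more than one);
    each colliding station flips a fair coin, the 0-group retransmits
    immediately and the 1-group defers until the 0-group's subtree is
    fully resolved. Returns the total number of channel slots used,
    counting collision, success and idle slots alike.
    """
    slots = 0
    stack = [list(stations)]
    while stack:
        group = stack.pop()
        slots += 1                 # one channel slot per attempt
        if len(group) <= 1:
            continue               # success (1) or idle (0): subtree done
        left = [s for s in group if rng.random() < 0.5]
        right = [s for s in group if s not in left]
        stack.append(right)        # resolved after the left subtree
        stack.append(left)
    return slots

rng = random.Random(0)
slots = resolve_collisions(["gs1", "gs2", "gs3"], rng)
```

With three colliding ground stations the best case is five slots (two collision slots plus three singleton slots); uneven coin flips cost extra idle and collision slots.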
Open access publishing, article downloads, and citations: randomised controlled trial
Lewenstein, Bruce V; Simon, Daniel H; Booth, James G; Connolly, Mathew J L
2008-01-01
Objective To measure the effect of free access to the scientific literature on article downloads and citations. Design Randomised controlled trial. Setting 11 journals published by the American Physiological Society. Participants 1619 research articles and reviews. Main outcome measures Article readership (measured as downloads of full text, PDFs, and abstracts) and number of unique visitors (internet protocol addresses). Citations to articles were gathered from the Institute for Scientific Information after one year. Interventions Random assignment on online publication of articles published in 11 scientific journals to open access (treatment) or subscription access (control). Results Articles assigned to open access were associated with 89% more full text downloads (95% confidence interval 76% to 103%), 42% more PDF downloads (32% to 52%), and 23% more unique visitors (16% to 30%), but 24% fewer abstract downloads (−29% to −19%) than subscription access articles in the first six months after publication. Open access articles were no more likely to be cited than subscription access articles in the first year after publication. Fifty nine per cent of open access articles (146 of 247) were cited nine to 12 months after publication compared with 63% (859 of 1372) of subscription access articles. Logistic and negative binomial regression analysis of article citation counts confirmed no citation advantage for open access articles. Conclusions Open access publishing may reach more readers than subscription access publishing. No evidence was found of a citation advantage for open access articles in the first year after publication. The citation advantage from open access reported widely in the literature may be an artefact of other causes. PMID:18669565
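The citation proportions reported in that abstract can be rechecked from the raw counts it gives:

```python
# Counts from the abstract: articles cited 9-12 months after publication.
open_cited, open_total = 146, 247     # open access arm
sub_cited, sub_total = 859, 1372      # subscription access arm

open_pct = round(100 * open_cited / open_total)
sub_pct = round(100 * sub_cited / sub_total)
# These reproduce the reported 59% and 63%.
```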
User Procedures Standardization for Network Access. NBS Technical Note 799.
ERIC Educational Resources Information Center
Neumann, A. J.
User access procedures to information systems have become of crucial importance with the advent of computer networks, which have opened new types of resources to a broad spectrum of users. This report surveys user access protocols of six representative systems: BASIC, GE MK II, INFONET, MEDLINE, NIC/ARPANET and SPIRES. Functional access…
Review and publication of protocol submissions to Trials - what have we learned in 10 years?
Li, Tianjing; Boutron, Isabelle; Al-Shahi Salman, Rustam; Cobo, Erik; Flemyng, Ella; Grimshaw, Jeremy M; Altman, Douglas G
2016-12-16
Trials has 10 years of experience in providing open access publication of protocols for randomised controlled trials. In this editorial, the senior editors and editors-in-chief of Trials discuss editorial issues regarding managing trial protocol submissions, including the content and format of the protocol, timing of submission, approaches to tracking protocol amendments, and the purpose of peer reviewing a protocol submission. With the clarification and guidance provided, we hope we can make the process of publishing trial protocols more efficient and useful to trial investigators and readers.
Publishing priorities of biomedical research funders
Collins, Ellen
2013-01-01
Objectives To understand the publishing priorities, especially in relation to open access, of 10 UK biomedical research funders. Design Semistructured interviews. Setting 10 UK biomedical research funders. Participants 12 employees with responsibility for research management at 10 UK biomedical research funders; a purposive sample to represent a range of backgrounds and organisation types. Conclusions Publicly funded and large biomedical research funders are committed to open access publishing and are pleased with recent developments which have stimulated growth in this area. Smaller charitable funders are supportive of the aims of open access, but are concerned about the practical implications for their budgets and their funded researchers. Across the board, biomedical research funders are turning their attention to other priorities for sharing research outputs, including data, protocols and negative results. Further work is required to understand how smaller funders, including charitable funders, can support open access. PMID:24154520
NASA Astrophysics Data System (ADS)
Macfarlane, A. J.; Docasal, R.; Rios, C.; Barbarisi, I.; Saiz, J.; Vallejo, F.; Besse, S.; Arviset, C.; Barthelemy, M.; De Marchi, G.; Fraga, D.; Grotheer, E.; Heather, D.; Lim, T.; Martinez, S.; Vallat, C.
2018-01-01
The Planetary Science Archive (PSA) is the European Space Agency's (ESA) repository of science data from all planetary science and exploration missions. The PSA provides access to scientific data sets through various interfaces at http://psa.esa.int. Mostly driven by the evolution of the PDS standards which all new ESA planetary missions shall follow and the need to update the interfaces to the archive, the PSA has undergone an important re-engineering. In order to maximise the scientific exploitation of ESA's planetary data holdings, significant improvements have been made by utilising the latest technologies and implementing widely recognised open standards. To facilitate users in handling and visualising the many products stored in the archive which have spatial data associated, the new PSA supports Geographical Information Systems (GIS) by implementing the standards approved by the Open Geospatial Consortium (OGC). The modernised PSA also attempts to increase interoperability with the international community by implementing recognised planetary science specific protocols such as the PDAP (Planetary Data Access Protocol) and EPN-TAP (EuroPlanet-Table Access Protocol). In this paper we describe some of the methods by which the archive may be accessed and present the challenges that are being faced in consolidating data sets of the older PDS3 version of the standards with the new PDS4 deliveries into a single data model mapping to ensure transparent access to the data for users and services whilst maintaining a high performance.
ERIC Educational Resources Information Center
Feintuch, Howard
2009-01-01
OpenCourseWare (OCW) program, offered at the Massachusetts Institute of Technology (MIT), provides open access to course materials for a large number of MIT classes. From this resource, American Megan Brewster, a recent graduate working in Guatemala, was able to formulate and implement a complete protocol to tackle Guatemala's need for a plastics…
Dynamic federations: storage aggregation using open tools and protocols
NASA Astrophysics Data System (ADS)
Furano, Fabrizio; Brito da Rocha, Ricardo; Devresse, Adrien; Keeble, Oliver; Álvarez Ayllón, Alejandro; Fuhrmann, Patrick
2012-12-01
A number of storage elements now offer standard protocol interfaces like NFS 4.1/pNFS and WebDAV, for access to their data repositories, in line with the standardization effort of the European Middleware Initiative (EMI). Also the LCG FileCatalogue (LFC) can offer such features. Here we report on work that seeks to exploit the federation potential of these protocols and build a system that offers a unique view of the storage and metadata ensemble and the possibility of integration of other compatible resources such as those from cloud providers. The challenge, here undertaken by the providers of dCache and DPM, and pragmatically open to other Grid and Cloud storage solutions, is to build such a system while being able to accommodate name translations from existing catalogues (e.g. LFCs), experiment-based metadata catalogues, or stateless algorithmic name translations, also known as “trivial file catalogues”. Such so-called storage federations of standard protocols-based storage elements give a unique view of their content, thus promoting simplicity in accessing the data they contain and offering new possibilities for resilience and data placement strategies. The goal is to consider HTTP and NFS4.1-based storage elements and metadata catalogues and make them able to cooperate through an architecture that properly feeds the redirection mechanisms that they are based upon, thus giving the functionalities of a “loosely coupled” storage federation. One of the key requirements is to use standard clients (provided by OS'es or open source distributions, e.g. Web browsers) to access an already aggregated system; this approach is quite different from aggregating the repositories at the client side through some wrapper API, like for instance GFAL, or by developing new custom clients. 
Other technical challenges that will determine the success of this initiative include performance, latency and scalability, and the ability to create worldwide storage federations that are able to redirect clients to repositories that they can efficiently access, for instance trying to choose the endpoints that are closer or applying other criteria. We believe that the features of a loosely coupled federation of open-protocols-based storage elements will open many possibilities of evolving the current computing models without disrupting them, and, at the same time, will be able to operate with the existing infrastructures, follow their evolution path and add storage centers that can be acquired as a third-party service.
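A "trivial file catalogue" of the kind mentioned above translates names purely algorithmically, with no database lookup. A minimal sketch, with illustrative site prefixes standing in for a real federation's configuration:

```python
# Illustrative per-site storage prefixes; a real federation would
# configure one per participating endpoint (WebDAV, NFS 4.1, ...).
SITE_PREFIXES = {
    "site-a": "https://webdav.site-a.example/vo/data",
    "site-b": "https://dpm.site-b.example/dpm/example.org/home/vo",
}

def translate(lfn):
    """'Trivial file catalogue' translation: derive every candidate
    replica URL for a logical file name purely algorithmically, by
    prepending each site's storage prefix. No stateful catalogue is
    consulted, which is what makes the scheme 'trivial'."""
    if not lfn.startswith("/"):
        raise ValueError("logical file names are absolute paths")
    return {site: prefix + lfn for site, prefix in SITE_PREFIXES.items()}

replicas = translate("/run2012/events_001.root")
```

A federation frontend would probe or rank these candidates and redirect the client to one it can efficiently access, which is where the latency and proximity criteria discussed above come in.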
The SWITCH-ON Virtual Water-Science Laboratory
NASA Astrophysics Data System (ADS)
Arheimer, Berit; Boot, Gerben; Calero, Joan; Ceola, Serena; Gyllensvärd, Frida; Hrachowitz, Markus; Little, Lorna; Montanari, Alberto; Nijzink, Remko; Parajka, Juraj; Wagener, Thorsten
2017-04-01
The SWITCH-ON Virtual Water-Science Laboratory (VWSL) aims to facilitate collaboration and support reproducible experiments in water research. The goal is to overcome geographical distance for comparative hydrology and increase transparency when using computational tools in hydrological sciences. The VWSL gives access to open data through dedicated software tools for data search and upload, and helps create collaborative protocols for joint experiments in the virtual environment. The VWSL will help scientists with:
• Cooperation around the world - straightforward connections with other scientists in comparative analyses and collaboration, as a means to accelerate scientific advance in hydrology.
• Repeatability of experiments - thorough review of a large variety of numerical experiments, which is a foundational principle in scientific research, and improvement of research standards.
• New forms of scientific research - by using online 'living' protocols, scientists can elaborate ideas incrementally with a large group of colleagues and share data, tools, models, etc. in open science.
The VWSL was developed within the EU project "Sharing Water Information to Tackle Changes in Hydrology - for Operational Needs" (Grant agreement No 603587). Visitors can choose to Define, Participate or Review experiments by clicking the start buttons (http://www.switch-on-vwsl.eu/). Anyone can view protocols without log-in (important for Open Science), but to create, participate in and edit protocols, users must log in for security reasons. During the work process, the protocol is moved from one view to another as the experiment evolves from idea, to on-going, to completed. The users of the Lab also get access to useful tools for running collaborative experiments, for instance: Open Data Search, Data (and metadata) Upload, and Create Protocol tools.
So far, eight collaborative experiments have been completed in the VWSL and have resulted in research papers (published or submitted), and there are currently four on-going experiments, which also involve external participants not paid by the project. The VWSL is now launched and open to everyone, and it will be continuously developed and sustained after the project ends. This presentation will give an on-line demonstration of the major features of the present VWSL and discuss some future visions and major challenges in this e-infrastructure.
NASA World Wind Near Real Time Data for Earth
NASA Astrophysics Data System (ADS)
Hogan, P.
2013-12-01
Innovation requires open standards for data exchange, not to mention access to data, so that value-added information intelligence can be continually created and advanced by the larger community. Likewise, innovation by academia and entrepreneurial enterprises alike benefits greatly from an open platform that provides the basic technology for access and visualization of that data. NASA World Wind Java, and now NASA World Wind iOS for the iPhone and iPad, provides that technology. Whether the interest is weather science or climate science, emergency response or supply chain, seeing spatial data in its native context of Earth accelerates understanding and improves decision-making. NASA World Wind open source technology provides the basic elements for 4D visualization, using Open Geospatial Consortium (OGC) protocols, while allowing for customized access to any data, big or small, including support for NetCDF. NASA World Wind includes access to a suite of US Government WMS servers with near real time data. The larger community can readily capitalize on this technology, building their own value-added applications, either open or proprietary. (Figures: night lights heat map; Glacier National Park.)
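In practice, the OGC-protocol access described above means issuing requests such as WMS GetMap. A sketch of building a standard WMS 1.3.0 GetMap URL; the server address and layer name are hypothetical, not taken from World Wind's configuration:

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, width=512, height=512):
    """Build an OGC WMS 1.3.0 GetMap request URL.

    For WMS 1.3.0 with CRS:84 the bounding box axis order is lon/lat:
    (min_lon, min_lat, max_lon, max_lat).
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",
        "CRS": "CRS:84",
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return f"{base_url}?{urlencode(params)}"

# Hypothetical server and layer name, for illustration only; a bbox
# roughly covering Glacier National Park.
url = wms_getmap_url("https://wms.example.gov/wms", "bluemarble",
                     (-114.5, 48.2, -113.2, 49.0))
```

Because every parameter is standardised, any OGC-compliant client, World Wind included, can fetch tiles from any conforming server the same way.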
Implementation and Evaluation of Four Interoperable Open Standards for the Internet of Things.
Jazayeri, Mohammad Ali; Liang, Steve H L; Huang, Chih-Yuan
2015-09-22
Recently, researchers have been focusing on a new use of the Internet called the Internet of Things (IoT), in which enabled electronic devices can be remotely accessed over the Internet. As the realization of the IoT concept is still in its early stages, manufacturers of Internet-connected devices and IoT web service providers are defining their proprietary protocols based on their targeted applications. Consequently, IoT becomes heterogeneous in terms of hardware capabilities and communication protocols. Addressing these heterogeneities by following open standards is a necessary step to communicate with various IoT devices. In this research, we assess the feasibility of applying existing open standards on resource-constrained IoT devices. The standard protocols developed in this research are OGC PUCK over Bluetooth, TinySOS, SOS over CoAP, and OGC SensorThings API. We believe that by hosting open standard protocols on IoT devices, not only do the devices become self-describable, self-contained, and interoperable, but innovative applications can also be easily developed with standardized interfaces. In addition, we use memory consumption, request message size, response message size, and response latency to benchmark the efficiency of the implemented protocols. In all, this research presents and evaluates standard-based solutions to better understand the feasibility of applying existing standards to the IoT vision.
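Of the four standards evaluated, the OGC SensorThings API defines a REST/JSON interface. A sketch of the request a client would construct to record an Observation into a Datastream, following the SensorThings API Part 1 data model; the datastream id and sensor reading are made up for illustration:

```python
import json

def observation_request(datastream_id, result, phenomenon_time):
    """Build the request path and JSON body for creating an Observation
    under a Datastream, per the OGC SensorThings API Part 1 data model.
    The Datastream is identified in the URL path, not in the body."""
    path = f"/v1.0/Datastreams({datastream_id})/Observations"
    body = json.dumps({
        "phenomenonTime": phenomenon_time,  # ISO 8601 instant
        "result": result,
    })
    return path, body

# Hypothetical datastream id and temperature reading; a client would
# POST `body` to the service root plus `path` as application/json.
path, body = observation_request(1, 21.4, "2015-09-22T12:00:00Z")
```

Request message size is one of the benchmarks the study uses; keeping the body to the two required properties is what makes such payloads viable on constrained devices.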
Jin, Wenquan; Kim, DoHyeun
2018-05-26
The Internet of Things comprises heterogeneous devices, applications, and platforms that use multiple communication technologies to connect to the Internet and provide seamless services ubiquitously. To support the development of Internet of Things products, many protocols, program libraries, frameworks, and standard specifications have been proposed; providing a consistent interface to access services from those environments is therefore difficult. Moreover, bridging existing web services to sensor and actuator networks is also important for providing Internet of Things services in various industry domains. In this paper, an Internet of Things proxy is proposed that is based on virtual resources to bridge heterogeneous web services from the Internet to the Internet of Things network. The proxy gives clients transparent access to Internet of Things devices and web services in the network. It comprises a server and a client that forward messages between different communication environments through the virtual resources, with the server acting for the message sender and the client for the message receiver. We design the proxy for the Open Connectivity Foundation network, where the virtual resources are discovered by clients as Open Connectivity Foundation resources. The virtual resources represent resources that expose services on the Internet through web service providers. Although the services are provided by web service providers on the Internet, a client can access them using the consistent communication protocol of the Open Connectivity Foundation network. To discover the resources that provide these services, the client likewise uses the consistent discovery interface to find Open Connectivity Foundation devices and virtual resources.
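The virtual-resource bridging idea can be illustrated with a minimal sketch: a proxy table that maps discoverable virtual resource URIs to external web-service endpoints (class, URIs and endpoints are hypothetical, not the authors' implementation):

```python
class VirtualResourceProxy:
    """Toy model of a proxy exposing Internet web services as
    discoverable virtual resources inside an IoT network."""

    def __init__(self):
        self._table = {}

    def register(self, virtual_uri, target_url):
        """Expose an external web service as a virtual resource."""
        self._table[virtual_uri] = target_url

    def discover(self):
        """Clients discover virtual resources like ordinary local resources."""
        return sorted(self._table)

    def resolve(self, virtual_uri):
        """Forwarding step: translate a virtual-resource request
        to the target web-service URL it represents."""
        return self._table[virtual_uri]

proxy = VirtualResourceProxy()
proxy.register("/weather/temperature", "https://example.org/api/temp")
proxy.register("/weather/humidity", "https://example.org/api/humidity")
```

A real proxy would additionally translate message formats between the two communication environments; this sketch captures only the discovery/forwarding mapping.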
Multiple Access Schemes for Lunar Missions
NASA Technical Reports Server (NTRS)
Deutsch, Leslie; Hamkins, Jon; Stocklin, Frank J.
2010-01-01
Two years ago, the NASA Coding, Modulation, and Link Protocol (CMLP) study was completed. The study, led by the authors of this paper, recommended codes, modulation schemes, and desired attributes of link protocols for all space communication links in NASA's future space architecture. Portions of the NASA CMLP team were reassembled to resolve one open issue: the use of multiple access (MA) communication from the lunar surface. The CMLP-MA team analyzed and simulated two candidate multiple access schemes that were identified in the original CMLP study: Code Division MA (CDMA) and Frequency Division MA (FDMA) based on a bandwidth-efficient Continuous Phase Modulation (CPM) with a superimposed Pseudo-Noise (PN) ranging signal (CPM/PN). This paper summarizes the results of the analysis and simulation of the CMLP-MA study and describes the final recommendations.
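Pseudo-noise ranging signals of the kind superimposed in the CPM/PN scheme are conventionally generated with linear-feedback shift registers. A generic Fibonacci-LFSR sketch (the taps shown are a textbook maximal-length example, not the actual CMLP ranging sequence):

```python
def lfsr_sequence(taps, state, nbits, length):
    """Generate pseudo-noise bits with a right-shifting Fibonacci LFSR.
    `taps` are the exponents of the feedback polynomial, e.g. (4, 3)
    for x^4 + x^3 + 1; `state` is a non-zero initial register fill."""
    out = []
    for _ in range(length):
        out.append(state & 1)  # output the low-order bit each shift
        fb = 0
        for t in taps:
            fb ^= (state >> (nbits - t)) & 1
        state = (state >> 1) | (fb << (nbits - 1))
    return out

# x^4 + x^3 + 1 is primitive, so a 4-stage register yields a maximal-length
# sequence: period 2**4 - 1 = 15 bits, with 8 ones per period.
seq = lfsr_sequence(taps=(4, 3), state=0b1000, nbits=4, length=30)
```

Operational PN ranging codes are much longer, but the generation principle is the same.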
Collaboration using open standards and open source software (examples of DIAS/CEOS Water Portal)
NASA Astrophysics Data System (ADS)
Miura, S.; Sekioka, S.; Kuroiwa, K.; Kudo, Y.
2015-12-01
The DIAS/CEOS Water Portal is a part of the DIAS (Data Integration and Analysis System, http://www.editoria.u-tokyo.ac.jp/projects/dias/?locale=en_US) systems for data distribution for users including, but not limited to, scientists, decision makers and officers such as river administrators. One of the functions of this portal is to enable one-stop search of, and access to, various water-related data archived at multiple data centers located all over the world. The portal itself does not store data. Instead, according to requests made by users on the web page, it retrieves data from distributed data centers on the fly and lets users download the data and view rendered images/plots. Our system mainly relies on the open source software GI-cat (http://essi-lab.eu/do/view/GIcat) and open standards such as OGC CSW, OpenSearch and the OPeNDAP protocol to enable the above functions. Details on how it works will be introduced during the presentation. Although some data centers have unique metadata formats and/or data search protocols, our portal's brokering function enables users to search across various data centers at one time. This portal is also connected to other data brokering systems, including the GEOSS DAB (Discovery and Access Broker). As a result, users can search over thousands of datasets and millions of files at one time. Users can access the DIAS/CEOS Water Portal system at http://waterportal.ceos.org/.
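Brokered one-stop search of the kind described above typically issues OGC CSW GetRecords queries against each catalogue. A hedged sketch of the KVP request (endpoint and keyword are illustrative):

```python
from urllib.parse import urlencode

def csw_getrecords_url(endpoint, keyword, max_records=10):
    """Build an OGC CSW 2.0.2 GetRecords request (KVP binding) that
    searches catalogue metadata for a keyword via a CQL text constraint."""
    params = {
        "service": "CSW",
        "version": "2.0.2",
        "request": "GetRecords",
        "typeNames": "csw:Record",
        "resultType": "results",
        "constraintLanguage": "CQL_TEXT",
        "constraint_language_version": "1.1.0",
        "constraint": f"AnyText like '%{keyword}%'",
        "maxRecords": max_records,
    }
    return endpoint + "?" + urlencode(params)

url = csw_getrecords_url("https://example.org/csw", "precipitation")
```

A broker such as GI-cat would translate one user query into many such requests, one per federated catalogue, and merge the results.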
Geospatial Brokering - Challenges and Future Directions
NASA Astrophysics Data System (ADS)
White, C. E.
2012-12-01
An important feature of many brokers is to facilitate straightforward human access to scientific data while maintaining programmatic access to it for system solutions. Standards-based protocols are critical for this, and there are a number of protocols to choose from. In this discussion, we will present a web application solution that leverages certain protocols - e.g., OGC CSW, REST, and OpenSearch - to provide programmatic as well as human access to geospatial resources. We will also discuss managing resources to reduce duplication yet increase discoverability, federated search solutions, and architectures that combine human-friendly interfaces with powerful underlying data management. The changing requirements witnessed in brokering solutions over time, our recent experience participating in the EarthCube brokering hack-a-thon, and evolving interoperability standards provide insight into the future technological and philosophical directions planned for geospatial broker solutions. There has been much change over the past decade, but with the unprecedented data collaboration of recent years, in many ways the challenges and opportunities are just beginning.
OpenFDA: an innovative platform providing access to a wealth of FDA's publicly available data.
Kass-Hout, Taha A; Xu, Zhiheng; Mohebbi, Matthew; Nelsen, Hans; Baker, Adam; Levine, Jonathan; Johanson, Elaine; Bright, Roselie A
2016-05-01
The objective of openFDA is to facilitate access and use of big important Food and Drug Administration public datasets by developers, researchers, and the public through harmonization of data across disparate FDA datasets provided via application programming interfaces (APIs). Using cutting-edge technologies deployed on FDA's new public cloud computing infrastructure, openFDA provides open data for easier, faster (over 300 requests per second per process), and better access to FDA datasets; open source code and documentation shared on GitHub for open community contributions of examples, apps and ideas; and infrastructure that can be adopted for other public health big data challenges. Since its launch on June 2, 2014, openFDA has developed four APIs for drug and device adverse events, recall information for all FDA-regulated products, and drug labeling. There have been more than 20 million API calls (more than half from outside the United States), 6000 registered users, 20,000 connected Internet Protocol addresses, and dozens of new software (mobile or web) apps developed. A case study demonstrates a use of openFDA data to understand an apparent association of a drug with an adverse event. With easier and faster access to these datasets, consumers worldwide can learn more about FDA-regulated products. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved.
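A sketch of how a developer might form an openFDA drug adverse-event query; the endpoint and the `search`/`limit` parameters follow the public API documentation, while the helper function itself is illustrative:

```python
from urllib.parse import urlencode

OPENFDA_BASE = "https://api.fda.gov"

def drug_event_query(drug_name, limit=5):
    """Build a query URL for the openFDA drug adverse-event endpoint,
    matching reports whose medicinal-product field contains `drug_name`."""
    params = {
        "search": f'patient.drug.medicinalproduct:"{drug_name}"',
        "limit": limit,
    }
    return f"{OPENFDA_BASE}/drug/event.json?" + urlencode(params)

url = drug_event_query("aspirin")
```

Fetching the URL with any HTTP client returns a JSON document with `meta` (result counts) and `results` (the matching reports).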
Guided Tour of Pythonian Museum
NASA Technical Reports Server (NTRS)
Lee, H. Joe
2017-01-01
At http://hdfeos.org/zoo, we have a large collection of Python examples of dealing with NASA HDF (Hierarchical Data Format) products. During this hands-on Python tutorial session, we'll present a few common hacks to access and visualize local NASA HDF data. We'll also cover how to access remote data served by OPeNDAP (Open-source Project for a Network Data Access Protocol). As a glue language, we will demonstrate how you can use Python for your data workflow - from searching data to analyzing data with machine learning.
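OPeNDAP lets a client request a server-side slice of a remote HDF/netCDF dataset by appending a constraint expression to the dataset URL, so only the slice crosses the network. A sketch of building such a subset URL (dataset URL and variable name are hypothetical):

```python
from urllib.parse import quote

def opendap_subset_url(dataset_url, var, index_ranges):
    """Build a DAP2 data-request URL with a constraint expression
    selecting inclusive index ranges var[start:stop] per dimension."""
    ce = var + "".join(f"[{a}:{b}]" for a, b in index_ranges)
    # keep the hyperslab syntax readable; everything else is percent-encoded
    return dataset_url + ".dods?" + quote(ce, safe="[]:")

# Hypothetical Level-3 dataset: slice one time step over the full grid.
url = opendap_subset_url("https://example.org/dap/airs_l3",
                         "Temperature_A", [(0, 0), (0, 179), (0, 359)])
```

Libraries such as netCDF4-python or pydap accept the bare dataset URL directly and perform this slicing transparently when you index a variable.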
A hash based mutual RFID tag authentication protocol in telecare medicine information system.
Srivastava, Keerti; Awasthi, Amit K; Kaul, Sonam D; Mittal, R C
2015-01-01
Radio Frequency Identification (RFID) is a technology with multidimensional applications that reduce the complexity of daily life. RFID is in widespread use in areas such as access control, transportation, real-time inventory, asset management, and automated payment systems. Recently, the technology has begun to spread into healthcare environments, where potential applications include patient monitoring, object traceability and drug administration systems. In this paper, we propose a secure RFID-based protocol for the medical sector. The protocol is based on a hash operation with a synchronized secret, and is safe against active and passive attacks such as forgery, traceability, replay and de-synchronization attacks.
Gražulis, Saulius; Daškevič, Adriana; Merkys, Andrius; Chateigner, Daniel; Lutterotti, Luca; Quirós, Miguel; Serebryanaya, Nadezhda R.; Moeck, Peter; Downs, Robert T.; Le Bail, Armel
2012-01-01
Using an open-access distribution model, the Crystallography Open Database (COD, http://www.crystallography.net) collects all known ‘small molecule / small to medium sized unit cell’ crystal structures and makes them available freely on the Internet. As of today, the COD has aggregated ∼150 000 structures, offering basic search capabilities and the possibility to download the whole database, or parts thereof using a variety of standard open communication protocols. A newly developed website provides capabilities for all registered users to deposit published and so far unpublished structures as personal communications or pre-publication depositions. Such a setup enables extension of the COD database by many users simultaneously. This increases the possibilities for growth of the COD database, and is the first step towards establishing a world wide Internet-based collaborative platform dedicated to the collection and curation of structural knowledge. PMID:22070882
Freeing data through The Polar Information Commons
NASA Astrophysics Data System (ADS)
de Bruin, Taco; Chen, Robert; Parsons, Mark; Carlson, David
2010-05-01
The polar regions are changing rapidly with dramatic global effect. Wise management of resources, improved decision support, and effective international cooperation on resource and geopolitical issues require deeper understanding and better prediction of these changes. Unfortunately, polar data and information remain scattered, scarce, and sporadic. Inspired by the Antarctic Treaty of 1959 that established the Antarctic as a global commons to be used only for peaceful purposes and scientific research, we assert that data and information about the polar regions are themselves "public goods" that should be shared ethically and with minimal constraint. We therefore envision the Polar Information Commons (PIC) as an open, virtual repository for vital scientific data and information that would provide a shared, community-based cyber-infrastructure fostering innovation, improving scientific efficiency, and encouraging participation in polar research, education, planning, and management. The PIC will build on the legacy of the International Polar Year (IPY), providing a long-term framework for access to and preservation of both existing and future data and information about the polar regions. Rapid change demands rapid data access. The PIC system will enable scientists to quickly expose their data to the world and share them through open protocols on the Internet. A PIC digital label will alert users and data centers to new polar data and ensure that usage rights are clear. The PIC will utilize the Science Commons Protocol for Implementing Open Access Data, which promotes open data access through the public domain coupled with community norms of practice to ensure use of data in a fair and equitable manner. A set of PIC norms is currently being developed in consultation with key polar data organizations and other stakeholders. We welcome inputs from the broad science community as we further develop and refine the PIC approach and move ahead with implementation.
Freeing data through The Polar Information Commons
NASA Astrophysics Data System (ADS)
de Bruin, T.; Chen, R. S.; Parsons, M. A.; Carlson, D. J.
2009-12-01
The polar regions are changing rapidly with dramatic global effect. Wise management of resources, improved decision support, and effective international cooperation on resource and geopolitical issues require deeper understanding and better prediction of these changes. Unfortunately, polar data and information remain scattered, scarce, and sporadic. Inspired by the Antarctic Treaty of 1959 that established the Antarctic as a global commons to be used only for peaceful purposes and scientific research, we assert that data and information about the polar regions are themselves “public goods” that should be shared ethically and with minimal constraint. We therefore envision the Polar Information Commons (PIC) as an open, virtual repository for vital scientific data and information that would provide a shared, community-based cyber-infrastructure fostering innovation, improving scientific efficiency, and encouraging participation in polar research, education, planning, and management. The PIC will build on the legacy of the International Polar Year (IPY), providing a long-term framework for access to and preservation of both existing and future data and information about the polar regions. Rapid change demands rapid data access. The PIC system will enable scientists to quickly expose their data to the world and share them through open protocols on the Internet. A PIC digital label will alert users and data centers to new polar data and ensure that usage rights are clear. The PIC will utilize the Science Commons Protocol for Implementing Open Access Data, which promotes open data access through the public domain coupled with community norms of practice to ensure use of data in a fair and equitable manner. A set of PIC norms is currently being developed in consultation with key polar data organizations and other stakeholders. We welcome inputs from the broad science community as we further develop and refine the PIC approach and move ahead with implementation.
Freeing data through The Polar Information Commons
NASA Astrophysics Data System (ADS)
de Bruin, T.; Chen, R. S.; Parsons, M. A.; Carlson, D. J.; Cass, K.; Finney, K.; Wilbanks, J.; Jochum, K.
2010-12-01
The polar regions are changing rapidly with dramatic global effect. Wise management of resources, improved decision support, and effective international cooperation on resource and geopolitical issues require deeper understanding and better prediction of these changes. Unfortunately, polar data and information remain scattered, scarce, and sporadic. Inspired by the Antarctic Treaty of 1959 that established the Antarctic as a global commons to be used only for peaceful purposes and scientific research, we assert that data and information about the polar regions are themselves “public goods” that should be shared ethically and with minimal constraint. ICSU’s Committee on Data (CODATA) therefore started the Polar Information Commons (PIC) as an open, virtual repository for vital scientific data and information. The PIC provides a shared, community-based cyber-infrastructure fostering innovation, improving scientific efficiency, and encouraging participation in polar research, education, planning, and management. The PIC builds on the legacy of the International Polar Year (IPY), providing a long-term framework for access to and preservation of both existing and future data and information about the polar regions. Rapid change demands rapid data access. The PIC system enables scientists to quickly expose their data to the world and share them through open protocols on the Internet. A PIC digital label will alert users and data centers to new polar data and ensure that usage rights are clear. The PIC utilizes the Science Commons Protocol for Implementing Open Access Data, which promotes open data access through the public domain coupled with community norms of practice to ensure use of data in a fair and equitable manner. A set of PIC norms has been developed in consultation with key polar data organizations and other stakeholders. We welcome inputs from the broad science community as we further develop and refine the PIC approach and move ahead with implementation.
Development of a data entry auditing protocol and quality assurance for a tissue bank database.
Khushi, Matloob; Carpenter, Jane E; Balleine, Rosemary L; Clarke, Christine L
2012-03-01
Human transcription error is an acknowledged risk when extracting information from paper records for entry into a database. For a tissue bank, it is critical that accurate data are provided to researchers with approved access to tissue bank material. The challenges of tissue bank data collection include manual extraction of data from complex medical reports that are accessed from a number of sources and that differ in style and layout. As a quality assurance measure, the Breast Cancer Tissue Bank (http://www.abctb.org.au) has implemented an auditing protocol and in order to efficiently execute the process, has developed an open source database plug-in tool (eAuditor) to assist in auditing of data held in our tissue bank database. Using eAuditor, we have identified that human entry errors range from 0.01% when entering donor's clinical follow-up details, to 0.53% when entering pathological details, highlighting the importance of an audit protocol tool such as eAuditor in a tissue bank database. eAuditor was developed and tested on the Caisis open source clinical-research database; however, it can be integrated in other databases where similar functionality is required.
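The kind of field-by-field audit a tool like eAuditor performs can be sketched as follows; the record structure and field names are invented for illustration, and the reported 0.01-0.53% rates come from the real audit, not this toy data:

```python
def audit_entries(source_records, db_records, fields):
    """Compare database entries against source records field by field
    and return the transcription error rate as a percentage."""
    checked = errors = 0
    for key, src in source_records.items():
        entered = db_records.get(key, {})
        for f in fields:
            checked += 1
            if entered.get(f) != src.get(f):
                errors += 1
    return round(100 * errors / checked, 2)

# Hypothetical source documents vs. database entries (one mis-keyed field).
source = {"D001": {"grade": "2", "er_status": "pos"},
          "D002": {"grade": "3", "er_status": "neg"}}
entered = {"D001": {"grade": "2", "er_status": "pos"},
           "D002": {"grade": "3", "er_status": "pos"}}
rate = audit_entries(source, entered, ["grade", "er_status"])
```

Flagged mismatches would then be reviewed against the original report rather than corrected automatically.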
Organ, Michael G.; Hanson, Paul R.; Rolfe, Alan; Samarakoon, Thiwanka B.; Ullah, Farman
2011-01-01
The generation of stereochemically-rich benzothiaoxazepine-1,1′-dioxides for enrichment of high-throughput screening collections is reported. Utilizing a microwave-assisted, continuous flow organic synthesis platform (MACOS), scale-out of core benzothiaoxazepine-1,1′-dioxide scaffolds has been achieved on multi-gram scale using an epoxide opening/SNAr cyclization protocol. Diversification of these sultam scaffolds was attained via a microwave-assisted intermolecular SNAr reaction with a variety of amines. Overall, a facile, 2-step protocol generated a collection of benzothiaoxazepine-1,1′-dioxides possessing stereochemical complexity in rapid fashion, where all 8 stereoisomers were accessed from commercially available starting materials. PMID:22116791
Protocols for Scholarly Communication
NASA Astrophysics Data System (ADS)
Pepe, A.; Yeomans, J.
2007-10-01
CERN, the European Organization for Nuclear Research, has operated an institutional preprint repository for more than 10 years. The repository contains over 850,000 records, of which more than 450,000 are full-text OA preprints, mostly in the field of particle physics, and it is integrated with the library's holdings of books, conference proceedings, journals and other grey literature. In order to encourage effective propagation of and open access to scholarly material, CERN is integrating a range of innovative library services into its document repository: automatic keywording, reference extraction, collaborative management tools and bibliometric tools. Some of these services, such as user reviewing and automatic metadata extraction, could make up an interesting testbed for future publishing solutions and certainly provide an exciting environment for e-science possibilities. The future protocol for scientific communication should guide authors naturally towards OA publication, and CERN wants to help reach a full open access publishing environment for the particle physics community and related sciences in the next few years.
NASA Astrophysics Data System (ADS)
Wilson, B. D.; Manipon, G.; Hua, H.; Fetzer, E.
2011-12-01
Under several NASA grants, we are generating multi-sensor merged atmospheric datasets to enable the detection of instrument biases and studies of climate trends over decades of data. For example, under a NASA MEASURES grant we are producing a water vapor climatology from the A-Train instruments, stratified by the Cloudsat cloud classification for each geophysical scene. The generation and proper use of such multi-sensor climate data records (CDR's) requires a high level of openness, transparency, and traceability. To make the datasets self-documenting and provide access to full metadata and traceability, we have implemented a set of capabilities and services using known, interoperable protocols. These protocols include OpenSearch, OPeNDAP, Open Provenance Model, service & data casting technologies using Atom feeds, and REST-callable analysis workflows implemented as SciFlo (XML) documents. We advocate that our approach can serve as a blueprint for how to openly "document and serve" complex, multi-sensor CDR's with full traceability. 
The capabilities and services provided include:
- Discovery of the collections by keyword search, exposed using the OpenSearch protocol;
- Space/time query across the CDR's granules and all of the input datasets via OpenSearch;
- User-level configuration of the production workflows, so that scientists can select additional physical variables from the A-Train to add to the next iteration of the merged datasets;
- Efficient data merging using on-the-fly OPeNDAP variable slicing and spatial subsetting of data out of input netCDF and HDF files (without moving the entire files);
- Self-documenting CDR's published in a highly usable netCDF4 format, with groups used to organize the variables, CF-style attributes for each variable, numeric array compression, and links to OPM provenance;
- Recording of processing provenance and data lineage into a queryable provenance trail in Open Provenance Model (OPM) format, auto-captured by the workflow engine;
- Open publishing of all of the workflows used to generate products as machine-callable REST web services, using the capabilities of the SciFlo workflow engine;
- Advertising of the metadata (e.g., physical variables provided, space/time bounding box) for our prepared datasets as "datacasts" using the Atom feed format;
- Publishing of all datasets via our "DataDrop" service, which exploits the WebDAV protocol to enable scientists to access remote data directories as local files on their laptops;
- Rich "web browse" of the CDR's with full metadata and the provenance trail one click away;
- Advertising of all services as Google-discoverable "service casts" using the Atom format.
The presentation will describe our use of the interoperable protocols and demonstrate the capabilities and service GUIs.
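Space/time OpenSearch queries of the kind used for granule discovery rely on URL templates with Geo and Time extension parameters. A hedged sketch of filling such a template (endpoint and parameter values are hypothetical):

```python
import re

def fill_opensearch_template(template, **params):
    """Substitute values into an OpenSearch URL template; unfilled
    optional parameters ({name?}) become empty strings."""
    def repl(match):
        name = match.group(1)
        return str(params.get(name, ""))
    return re.sub(r"\{([\w:]+)\??\}", repl, template)

# Template advertising OpenSearch Geo/Time extension parameters.
template = ("https://example.org/search?q={searchTerms}"
            "&bbox={geo:box?}&start={time:start?}&end={time:end?}")
url = fill_opensearch_template(template, **{"searchTerms": "clouds",
                                            "geo:box": "-180,-90,180,90",
                                            "time:start": "2006-01-01"})
```

In practice, parameter values would also be percent-encoded before substitution; the template itself is published in an OpenSearch description document so clients can build queries mechanically.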
Geoscience Information Network (USGIN) Solutions for Interoperable Open Data Access Requirements
NASA Astrophysics Data System (ADS)
Allison, M. L.; Richard, S. M.; Patten, K.
2014-12-01
The geosciences are leading development of free, interoperable open access to data. US Geoscience Information Network (USGIN) is a freely available data integration framework, jointly developed by the USGS and the Association of American State Geologists (AASG), in compliance with international standards and protocols to provide easy discovery, access, and interoperability for geoscience data. USGIN standards include the geologic exchange language 'GeoSciML' (v 3.2), which enables instant interoperability of geologic formation data and is also the base standard used by the 117-nation OneGeology consortium. The USGIN deployment of NGDS serves as a continent-scale operational demonstration of the expanded OneGeology vision to provide access to all geoscience data worldwide. USGIN is developed to accommodate a variety of applications; for example, the International Renewable Energy Agency streams data live to the Global Atlas of Renewable Energy. Alternatively, users without robust data sharing systems can download and implement a free software packet, "GINstack", to easily deploy web services for exposing data online for discovery and access. The White House Open Data Access Initiative requires all federally funded research projects and federal agencies to make their data publicly accessible in an open source, interoperable format, with metadata. USGIN currently incorporates all aspects of the Initiative, as it emphasizes interoperability. The system is successfully deployed as the National Geothermal Data System (NGDS), officially launched at the White House Energy Datapalooza in May 2014. The USGIN Foundation has been established to ensure this technology continues to be accessible and available.
Interoperability In The New Planetary Science Archive (PSA)
NASA Astrophysics Data System (ADS)
Rios, C.; Barbarisi, I.; Docasal, R.; Macfarlane, A. J.; Gonzalez, J.; Arviset, C.; Grotheer, E.; Besse, S.; Martinez, S.; Heather, D.; De Marchi, G.; Lim, T.; Fraga, D.; Barthelemy, M.
2015-12-01
As the world becomes increasingly interconnected, there is a greater need to provide interoperability with software and applications that are commonly used globally. For this purpose, the development of the new Planetary Science Archive (PSA), by the European Space Astronomy Centre (ESAC) Science Data Centre (ESDC), is focused on building a modern science archive that takes internationally recognised standards into account in order to provide access to the archive through tools from third parties, for example the NASA Planetary Data System (PDS), the VESPA project from the Virtual Observatory of Paris, and other international institutions. The protocols and standards currently supported by the new Planetary Science Archive are the Planetary Data Access Protocol (PDAP), the EuroPlanet-Table Access Protocol (EPN-TAP) and Open Geospatial Consortium (OGC) standards. The architecture of the PSA includes a GeoServer (an open-source map server), the goal of which is to support use cases such as the distribution of search results and the sharing and processing of data through an OGC Web Feature Service (WFS) and a Web Map Service (WMS). This server also allows the retrieval of requested information in several standard output formats, such as Keyhole Markup Language (KML), Geography Markup Language (GML), shapefile, JavaScript Object Notation (JSON) and Comma Separated Values (CSV), among others. The provision of these various output formats enables end-users to transfer retrieved data into popular applications such as Google Mars and NASA World Wind.
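The GeoServer-backed retrieval of results in formats such as JSON can be sketched as a WFS 2.0.0 GetFeature request (endpoint and layer name are hypothetical, not the actual PSA deployment):

```python
from urllib.parse import urlencode

def wfs_getfeature_url(endpoint, type_name,
                       output_format="application/json", max_features=100):
    """Build an OGC WFS 2.0.0 GetFeature request asking the server
    (e.g. GeoServer) for features in a chosen output format
    such as GeoJSON, KML, GML, shapefile or CSV."""
    params = {
        "service": "WFS",
        "version": "2.0.0",
        "request": "GetFeature",
        "typeNames": type_name,
        "outputFormat": output_format,
        "count": max_features,
    }
    return endpoint + "?" + urlencode(params)

url = wfs_getfeature_url("https://example.esa.int/geoserver/wfs",
                         "psa:footprints")
```

Swapping `output_format` for `"application/vnd.google-earth.kml+xml"` or `"csv"` is how the same query feeds different client applications.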
SEnviro: a sensorized platform proposal using open hardware and open standards.
Trilles, Sergio; Luján, Alejandro; Belmonte, Óscar; Montoliu, Raúl; Torres-Sospedra, Joaquín; Huerta, Joaquín
2015-03-06
The need for constant monitoring of environmental conditions has produced an increase in the development of wireless sensor networks (WSN). The drive towards smart cities has produced the need for smart sensors to be able to monitor what is happening in our cities. This, combined with the decrease in hardware component prices and the increase in the popularity of open hardware, has favored the deployment of sensor networks based on open hardware. The new trends in Internet Protocol (IP) communication between sensor nodes allow sensor access via the Internet, turning them into smart objects (Internet of Things and Web of Things). Currently, WSNs provide data in different formats. There is a lack of communication protocol standardization, which turns into interoperability issues when connecting different sensor networks or even when connecting different sensor nodes within the same network. This work presents a sensorized platform proposal that adheres to the principles of the Internet of Things and the Web of Things. Wireless sensor nodes were built using open hardware solutions, and communications rely on the HTTP/IP Internet protocols. The Open Geospatial Consortium (OGC) SensorThings API candidate standard was used as a neutral format to avoid interoperability issues. An environmental WSN developed following the proposed architecture was built as a proof of concept. Details on how to build each node and a study regarding energy concerns are presented.
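A sketch of the kind of request a SensorThings client issues against such a platform: latest observations of one datastream, newest first, using the API's OData-style query options (the base URL and datastream id are hypothetical, and in production the query string would be percent-encoded):

```python
def sensorthings_observations_path(base, datastream_id, top=10):
    """Build an OGC SensorThings API request URL for the most recent
    observations of a single datastream, ordered newest first."""
    return (f"{base}/v1.0/Datastreams({datastream_id})/Observations"
            f"?$top={top}&$orderby=phenomenonTime desc")

path = sensorthings_observations_path("https://example.org/st", 42)
```

The same entity model (`Things`, `Datastreams`, `Observations`, ...) applies regardless of which open-hardware node produced the data, which is what makes the format a neutral interoperability layer.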
Yokohama, Noriya
2003-09-01
The author constructed a medical image network system using open source software, designed with security in mind. The system supported search and browsing with a web browser, with images stored on a DICOM server. To realize this function, software was developed in PHP to bridge the gap between the DICOM protocol and HTTP. Transmission speed was evaluated with respect to the difference between the DICOM and HTTP protocols. Furthermore, an attempt was made to evaluate the convenience of accessing medical images from a personal information terminal via the Internet over a high-speed mobile communication link. The results suggest the feasibility of remote diagnosis and application to emergency care.
Open Access to research data - final perspectives from the RECODE project
NASA Astrophysics Data System (ADS)
Bigagli, Lorenzo; Sondervan, Jeroen
2015-04-01
Many networks, initiatives, and communities are addressing the key barriers to Open Access to data in scientific research. These organizations are typically heterogeneous and fragmented by discipline, location, sector (publishers, academics, data centers, etc.), as well as by other features. Moreover, they often work in isolation, or with limited contact with one another. The Policy RECommendations for Open Access to Research Data in Europe (RECODE) project, which will conclude in the first half of 2015, has scoped and addressed the challenges related to Open Access, dissemination and preservation of scientific data, leveraging the existing networks, initiatives, and communities. The overall objective of RECODE was to identify a series of targeted and over-arching policy recommendations for Open Access to European research data based on existing good practice. RECODE has undertaken a review of the existing state of the art and examined five case studies in different scientific disciplines: particle physics and astrophysics, clinical research, medicine and technical physiology (bioengineering), humanities (archaeology), and environmental sciences (Earth Observation). In particular for the latter discipline, GEOSS has been an optimal test bed for investigating the importance of technical and multidisciplinary interoperability, and what the challenges are in sharing and providing Open Access to research data from a variety of sources, and in a variety of formats.
RECODE has identified five main technological and infrastructural challenges:
• Heterogeneity - relates to interoperability, usability, accessibility, discoverability;
• Sustainability - relates to obsolescence, curation, updates/upgrades, persistence, preservation;
• Volume - also related to Big Data, which is somehow implied by Open Data; in our context, it relates to discoverability, accessibility (indexing), bandwidth, storage, scalability, energy footprint;
• Quality - relates to completeness, description (metadata), usability, data (peer) review;
• Security - relates to the technical aspects of policy enforcement, such as the AAA protocol for authentication, authorization and auditing/accounting, privacy issues, etc.
RECODE has also focused on the identification of stakeholder values relevant to Open Access to research data, as well as on policy, legal, and institutional aspects. All these issues are of immediate relevance for the whole scientific ecosystem, including researchers, as data producers/users, as well as publishers and libraries, as means for data dissemination and management.
What if Finding Data was as Easy as Subscribing to the News?
NASA Astrophysics Data System (ADS)
Duerr, R. E.
2011-12-01
Data are the "common wealth of humanity," the fuel that drives the sciences; but much of the data that exist are inaccessible, buried in one of numerous stove-piped data systems, or entirely hidden unless you have direct knowledge of and contact with the investigator that acquired them. Much of the "wealth" is squandered and overall scientific progress inhibited, a situation that is becoming increasingly untenable with the openness required by data-driven science. What are needed are simple interoperability protocols and advertising mechanisms that allow data from disparate data systems to be easily discovered, explored, and accessed. The tools must be simple enough that individual investigators can use them without IT support. The tools cannot rely on centralized repositories or registries but must enable the development of ad-hoc or special purpose aggregations of data and services tailored to individual community needs. In addition, the protocols must scale to support the discovery of and access to the holdings of the global, interdisciplinary community, be they individual investigators or major data centers. NSIDC, in conjunction with other members of the Federation of Earth Science Information Partners and the Polar Information Commons, are working on just such a suite of tools and protocols. In this talk, I discuss data and service casting, aggregation, data badging, and OpenSearch - a suite of tools and protocols which, when used in conjunction with each other, have the potential of completely changing the way that data and services worldwide are discovered and used.
Stakeholder values and ecosystems in developing open access to research data.
NASA Astrophysics Data System (ADS)
Wessels, Bridgette; Sveinsdottir, Thordis; Smallwood, Rod
2014-05-01
One aspect of understanding how to develop open access to research data is to understand the values of stakeholders in the emerging open data ecosystem. The EU FP7 funded project Policy RECommendations for Open Access to Research Data in Europe (RECODE) (Grant Agreement No: 321463) undertook such research to identify stakeholder values and mapped the emerging ecosystem. In this paper we outline and discuss the findings of this research. We address three key objectives, which are: (a) the identification and mapping of the diverse range of stakeholder values in Open Access data and data dissemination and preservation; (b) mapping stakeholder values on to research ecosystems using case studies from different disciplinary perspectives; and (c) evaluating and identifying good practice in addressing conflicting value chains and stakeholder fragmentation. The research was structured on three related actions: (a) an analysis of policy and related documents and protocols, in order to map the formal expression of values and motivations; (b) conducting five case studies in particle physics, health sciences, bioengineering, environmental research and archaeology. These explored issues of data size; quality control, ethics and data security; replication of large datasets; interoperability; and the preservation of diverse types of data; and (c) undertaking a validation and dissemination workshop that sought to better understand how to match policies with stakeholder drivers and motivations to increase their effectiveness in promoting Open Access to research data. The research findings include that there is clearly an overall drive for Open Data Access within the policy documents, which is part of a wider drive for open science in general. This is underpinned by the view of science as an open enterprise. Although there is a strong argument for publicly funded science to be made open to the public, the details of how to make research data open are as yet unclear.
Our research found that discussions of Open Data tend to refer to science as a single sector, leading to differences between disciplines being ignored in policy making. Each discipline has different methods for gathering and analysing data, some disciplines deal with sensitive data, and others deal with data that may have IPR or legal issues. We recommend that these differences are recognised, as they will inform the debate about subject specific requirements and common infrastructures for Open Data Access.
Interoperable Data Access Services for NOAA IOOS
NASA Astrophysics Data System (ADS)
de La Beaujardiere, J.
2008-12-01
The Integrated Ocean Observing System (IOOS) is intended to enhance our ability to collect, deliver, and use ocean information. The goal is to support research and decision-making by providing data on our open oceans, coastal waters, and Great Lakes in the formats, rates, and scales required by scientists, managers, businesses, governments, and the public. The US National Oceanic and Atmospheric Administration (NOAA) is the lead agency for IOOS. NOAA's IOOS office supports the development of regional coastal observing capability and promotes data management efforts to increase data accessibility. Geospatial web services have been established at NOAA data providers including the National Data Buoy Center (NDBC), the Center for Operational Oceanographic Products and Services (CO-OPS), and CoastWatch, and at regional data provider sites. Services established include Open-source Project for a Network Data Access Protocol (OpenDAP), Open Geospatial Consortium (OGC) Sensor Observation Service (SOS), and OGC Web Coverage Service (WCS). These services provide integrated access to data holdings that have been aggregated at each center from multiple sources. We wish to collaborate with other groups to improve our service offerings to maximize interoperability and enhance cross-provider data integration, and to share common service components such as registries, catalogs, data conversion, and gateways. This paper will discuss the current status of NOAA's IOOS efforts and possible next steps.
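The Sensor Observation Service mentioned above is, like the other OGC services, driven by key-value-pair request URLs. The sketch below assembles an SOS 1.0.0 GetObservation request; the NDBC endpoint, station URN, and observed property are illustrative values that should be checked against the service's actual capabilities document.

```python
from urllib.parse import urlencode

def sos_get_observation_url(base_url, offering, observed_property,
                            response_format='text/xml;subtype="om/1.0.0"'):
    """Build an OGC SOS 1.0.0 GetObservation request in key-value-pair form."""
    params = {
        "service": "SOS",
        "version": "1.0.0",
        "request": "GetObservation",
        "offering": offering,
        "observedProperty": observed_property,
        "responseFormat": response_format,
    }
    return base_url + "?" + urlencode(params)

# Endpoint and identifiers shown for illustration; verify against the
# GetCapabilities response of the target server.
url = sos_get_observation_url(
    "https://sdf.ndbc.noaa.gov/sos/server.php",
    offering="urn:ioos:station:wmo:41012",
    observed_property="sea_water_temperature",
)
```

Issuing an HTTP GET on `url` would return an Observations & Measurements XML document, which is what makes cross-provider aggregation of buoy and tide-gauge data feasible.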
NASA's Big Earth Data Initiative Accomplishments
NASA Technical Reports Server (NTRS)
Klene, Stephan A.; Pauli, Elisheva; Pressley, Natalie N.; Cechini, Matthew F.; McInerney, Mark
2017-01-01
The goal of NASA's effort for BEDI is to improve the usability, discoverability, and accessibility of Earth Observation data in support of societal benefit areas. Accomplishments: In support of BEDI goals, datasets have been entered into the Common Metadata Repository (CMR), made available via the Open-source Project for a Network Data Access Protocol (OPeNDAP), and registered with Digital Object Identifiers (DOIs); to support fast visualization, many layers have been added to the Global Imagery Browse Services (GIBS).
NASA's Big Earth Data Initiative Accomplishments
NASA Astrophysics Data System (ADS)
Klene, S. A.; Pauli, E.; Pressley, N. N.; Cechini, M. F.; McInerney, M.
2017-12-01
The goal of NASA's effort for BEDI is to improve the usability, discoverability, and accessibility of Earth Observation data in support of societal benefit areas. Accomplishments: In support of BEDI goals, datasets have been entered into the Common Metadata Repository (CMR), made available via the Open-source Project for a Network Data Access Protocol (OPeNDAP), and registered with Digital Object Identifiers (DOIs); to support fast visualization, many layers have been added to the Global Imagery Browse Services (GIBS).
Exploring NASA GES DISC Data with Interoperable Services
NASA Technical Reports Server (NTRS)
Zhao, Peisheng; Yang, Wenli; Hegde, Mahabal; Wei, Jennifer C.; Kempler, Steven; Pham, Long; Teng, William; Savtchenko, Andrey
2015-01-01
Overview of NASA GES DISC (NASA Goddard Earth Science Data and Information Services Center) data with interoperable services:
• Open-standard and interoperable services - improve data discoverability, accessibility, and usability with metadata, catalogue and portal standards; achieve data, information and knowledge sharing across applications with standardized interfaces and protocols.
• Open Geospatial Consortium (OGC) data services and specifications - Web Coverage Service (WCS) for data; Web Map Service (WMS) for pictures of data; Web Map Tile Service (WMTS) for pictures of data tiles; Styled Layer Descriptors (SLD) for rendered styles.
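The WMS "pictures of data" interface listed above is driven entirely by key-value-pair request URLs. The sketch below builds a WMS 1.3.0 GetMap URL; the base endpoint and layer name are illustrative placeholders, not actual GES DISC identifiers.

```python
from urllib.parse import urlencode

def wms_get_map_url(base_url, layers, bbox, width=512, height=512,
                    crs="EPSG:4326", fmt="image/png", version="1.3.0"):
    """Build a WMS 1.3.0 GetMap request URL in key-value-pair encoding."""
    params = {
        "SERVICE": "WMS",
        "VERSION": version,
        "REQUEST": "GetMap",
        "LAYERS": ",".join(layers),
        "STYLES": "",                       # empty string selects default styles
        "CRS": crs,                         # WMS 1.3.0 uses CRS; 1.1.1 used SRS
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

# Hypothetical endpoint and layer name, for illustration only.
url = wms_get_map_url("https://example.org/wms", ["AIRS_Surface_Temp"],
                      (-90, -180, 90, 180))
```

Fetching `url` with any HTTP client would return a rendered PNG map rather than the underlying data values, which is exactly the WMS/WCS distinction the overview draws.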
Providing Internet Access to High-Resolution Lunar Images
NASA Technical Reports Server (NTRS)
Plesea, Lucian
2008-01-01
The OnMoon server is a computer program that provides Internet access to high-resolution Lunar images, maps, and elevation data, all suitable for use in geographical information system (GIS) software for generating images, maps, and computational models of the Moon. The OnMoon server implements the Open Geospatial Consortium (OGC) Web Map Service (WMS) server protocol and supports Moon-specific extensions. Unlike other Internet map servers that provide Lunar data using an Earth coordinate system, the OnMoon server supports encoding of data in Moon-specific coordinate systems. The OnMoon server offers access to most of the available high-resolution Lunar image and elevation data. This server can generate image and map files in the tagged image file format (TIFF), Joint Photographic Experts Group (JPEG), 8- or 16-bit Portable Network Graphics (PNG), or Keyhole Markup Language (KML) format. Image control is provided by use of the OGC Styled Layer Descriptor (SLD) protocol. Full-precision spectral arithmetic processing is also available, by use of a custom SLD extension. This server can dynamically add shaded relief based on the Lunar elevation to any image layer. This server also implements the tiled WMS protocol and super-overlay KML for high-performance client application programs.
Mayhew, Alain D; Morton, Rachael L; Greenaway, Christina; Akl, Elie A; Rahman, Prinon; Zenner, Dominik; Pareek, Manish; Tugwell, Peter; Welch, Vivian; Meerpohl, Joerg; Alonso-Coello, Pablo; Hui, Charles; Biggs, Beverley-Ann; Requena-Méndez, Ana; Agbata, Eric; Noori, Teymur; Schünemann, Holger J
2017-01-01
Introduction: The European Centre for Disease Prevention and Control is developing evidence-based guidance for voluntary screening, treatment and vaccine prevention of infectious diseases for newly arriving migrants to the European Union/European Economic Area. The objective of this systematic review protocol is to guide the identification, appraisal and synthesis of the best available evidence on prevention and assessment of the following priority infectious diseases: tuberculosis, HIV, hepatitis B, hepatitis C, measles, mumps, rubella, diphtheria, tetanus, pertussis, poliomyelitis (polio), Haemophilus influenzae disease, strongyloidiasis and schistosomiasis. Methods and analysis: The search strategy will identify evidence from existing systematic reviews and then update the effectiveness and cost-effectiveness evidence using prospective trials, economic evaluations and/or recently published systematic reviews. Interdisciplinary teams have designed logic models to help define study inclusion and exclusion criteria, guiding the search strategy and identifying relevant outcomes. We will assess the certainty of evidence using the Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach. Ethics and dissemination: There are no ethical or safety issues. We anticipate disseminating the findings through open-access publications, conference abstracts and presentations. We plan to publish technical syntheses as GRADEpro evidence summaries and the systematic reviews as part of a special edition open-access publication on refugee health. We are following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses for Protocols reporting guideline. This protocol is registered in PROSPERO: CRD42016045798. PMID:28893741
NASA Astrophysics Data System (ADS)
Hassan, Waleed K.; Al-Assam, Hisham
2017-05-01
The main problem associated with using symmetric/asymmetric keys is how to securely store and exchange the keys between the parties over open networks, particularly in open environments such as cloud computing. Public Key Infrastructure (PKI) has provided a practical solution for session key exchange for many web services. The key limitation of the PKI solution is not only the need for a trusted third party (e.g. a certificate authority) but also the absence of a link between the data owner and the encryption keys. The latter is arguably more important where access to data needs to be linked with the identity of the owner. Currently available key exchange protocols depend on using trusted couriers or secure channels, which can be subject to man-in-the-middle and various other attacks. This paper proposes a new protocol for Key Exchange using Biometric Identity Based Encryption (KE-BIBE) that enables parties to securely exchange cryptographic keys even while an adversary is monitoring the communication channel between them. The proposed protocol combines biometrics with IBE in order to provide a secure way to access symmetric keys based on the identity of the users in an unsecured environment. In the KE-BIBE protocol, the message is first encrypted by the data owner using a traditional symmetric key before migrating it to cloud storage. The symmetric key is then encrypted, based on Fuzzy Identity-Based Encryption, using the public biometrics of the users the data owner has selected to decrypt the message. Only the selected users will be able to decrypt the message by providing a fresh sample of their biometric data. The paper argues that the proposed solution eliminates the need for a key distribution centre in traditional cryptography. It also gives the data owner the power of fine-grained sharing of encrypted data by controlling who can access their data.
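The envelope structure described above (payload encrypted under a fresh symmetric data key, with the data key itself protected separately) can be sketched as follows. This is a toy illustration only: the SHA-256 counter-mode keystream stands in for a real cipher such as AES, and the Fuzzy IBE step that would protect the data key under the recipients' biometric identities is noted but not implemented.

```python
import hashlib
import os

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a keystream by hashing key || nonce || counter.
    Toy stand-in for AES-CTR; not for production use."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def sym_encrypt(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """XOR with the keystream; applying it twice decrypts."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, nonce, len(data))))

# Step 1 of the protocol: the owner encrypts the message with a fresh data key.
plaintext = b"patient record ..."
data_key = os.urandom(32)
nonce = os.urandom(16)
ciphertext = sym_encrypt(data_key, nonce, plaintext)

# Step 2 (out of scope here): data_key would now be encrypted under the
# selected users' biometric identities via Fuzzy IBE before upload.
recovered = sym_encrypt(data_key, nonce, ciphertext)
```

Only the envelope shape is faithful to the abstract; the actual KE-BIBE construction relies on pairing-based Fuzzy IBE, which has no standard-library equivalent.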
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2010-12-10
Open Energy Information (OpenEI) is an open source web platform—similar to the one used by Wikipedia—developed by the US Department of Energy (DOE) and the National Renewable Energy Laboratory (NREL) to make the large amounts of energy-related data and information more easily searched, accessed, and used both by people and automated machine processes. Built utilizing the standards and practices of the Linked Open Data community, the OpenEI platform is much more robust and powerful than typical web sites and databases. As an open platform, all users can search, edit, add, and access data in OpenEI for free. The user community contributes the content and ensures its accuracy and relevance; as the community expands, so does the content's comprehensiveness and quality. The data are structured and tagged with descriptors to enable cross-linking among related data sets, advanced search functionality, and consistent, usable formatting. Data input protocols and quality standards help ensure the content is structured and described properly and derived from a credible source. Although DOE/NREL is developing OpenEI and seeding it with initial data, it is designed to become a true community model with millions of users, a large core of active contributors, and numerous sponsors.
None
2018-02-06
Open Energy Information (OpenEI) is an open source web platform—similar to the one used by Wikipedia—developed by the US Department of Energy (DOE) and the National Renewable Energy Laboratory (NREL) to make the large amounts of energy-related data and information more easily searched, accessed, and used both by people and automated machine processes. Built utilizing the standards and practices of the Linked Open Data community, the OpenEI platform is much more robust and powerful than typical web sites and databases. As an open platform, all users can search, edit, add, and access data in OpenEI for free. The user community contributes the content and ensures its accuracy and relevance; as the community expands, so does the content's comprehensiveness and quality. The data are structured and tagged with descriptors to enable cross-linking among related data sets, advanced search functionality, and consistent, usable formatting. Data input protocols and quality standards help ensure the content is structured and described properly and derived from a credible source. Although DOE/NREL is developing OpenEI and seeding it with initial data, it is designed to become a true community model with millions of users, a large core of active contributors, and numerous sponsors.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Horiike, S.; Okazaki, Y.
This paper describes a performance estimation tool developed for modeling and simulation of open distributed energy management systems to support their design. The approach of discrete event simulation with detailed models is adopted for efficient performance estimation. The tool includes basic models constituting a platform, e.g., Ethernet, communication protocol, operating system, etc. Application software is modeled by specifying CPU time, disk access size, communication data size, etc. Different types of system configurations for various system activities can be easily studied. Simulation examples show how the tool is utilized for the efficient design of open distributed energy management systems.
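The modeling approach described (application tasks characterized by CPU time and message size, contending for shared platform resources) can be sketched as a minimal event-driven simulation. The single-CPU, single-link topology and the task figures below are illustrative assumptions, not the paper's actual models.

```python
def simulate(tasks, bandwidth_bps):
    """Toy performance estimate: each task runs on one shared CPU (FIFO),
    then its result message crosses one shared network link.
    tasks: list of (name, cpu_seconds, message_bytes).
    Returns a dict of task name -> completion time in seconds."""
    cpu_free = 0.0     # time at which the CPU next becomes idle
    link_free = 0.0    # time at which the link next becomes idle
    done = {}
    for name, cpu_s, msg_bytes in tasks:
        start = cpu_free
        cpu_free = start + cpu_s                      # CPU phase
        tx_start = max(cpu_free, link_free)           # queue for the link
        link_free = tx_start + msg_bytes * 8 / bandwidth_bps
        done[name] = link_free
    return done

# Two tasks sharing a 1 Mbit/s link; each message is 125000 bytes = 1 Mbit.
done = simulate([("a", 1.0, 125000), ("b", 0.5, 125000)], 1e6)
```

Task "a" finishes its 1 s of CPU, then spends 1 s transmitting (done at t=2.0); task "b" finishes CPU at t=1.5 but must wait for the link until t=2.0, completing at t=3.0. Swapping in different platform models (protocol overheads, disk access) follows the same pattern.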
Cruz-Piris, Luis; Rivera, Diego; Marsa-Maestre, Ivan; de la Hoz, Enrique; Velasco, Juan R
2018-03-20
Internet growth has generated new types of services where the use of sensors and actuators is especially remarkable. These services compose what is known as the Internet of Things (IoT). One of the biggest current challenges is obtaining a safe and easy access control scheme for the data managed in these services. We propose integrating IoT devices in an access control system designed for Web-based services by modelling certain IoT communication elements as resources. This would allow us to obtain a unified access control scheme between heterogeneous devices (IoT devices, Internet-based services, etc.). To achieve this, we have analysed the most relevant communication protocols for these kinds of environments and then we have proposed a methodology which allows the modelling of communication actions as resources. Then, we can protect these resources using access control mechanisms. The validation of our proposal has been carried out by selecting a communication protocol based on message exchange, specifically Message Queuing Telemetry Transport (MQTT). As an access control scheme, we have selected User-Managed Access (UMA), an existing Open Authorization (OAuth) 2.0 profile originally developed for the protection of Internet services. We have performed tests focused on validating the proposed solution in terms of the correctness of the access control system. Finally, we have evaluated the energy consumption overhead when using our proposal.
2018-01-01
Internet growth has generated new types of services where the use of sensors and actuators is especially remarkable. These services compose what is known as the Internet of Things (IoT). One of the biggest current challenges is obtaining a safe and easy access control scheme for the data managed in these services. We propose integrating IoT devices in an access control system designed for Web-based services by modelling certain IoT communication elements as resources. This would allow us to obtain a unified access control scheme between heterogeneous devices (IoT devices, Internet-based services, etc.). To achieve this, we have analysed the most relevant communication protocols for these kinds of environments and then we have proposed a methodology which allows the modelling of communication actions as resources. Then, we can protect these resources using access control mechanisms. The validation of our proposal has been carried out by selecting a communication protocol based on message exchange, specifically Message Queuing Telemetry Transport (MQTT). As an access control scheme, we have selected User-Managed Access (UMA), an existing Open Authorization (OAuth) 2.0 profile originally developed for the protection of Internet services. We have performed tests focused on validating the proposed solution in terms of the correctness of the access control system. Finally, we have evaluated the energy consumption overhead when using our proposal. PMID:29558406
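The core idea in the abstract above, modelling MQTT communication actions as protected resources guarded by an access control scheme, can be pictured with a toy policy check. The topic patterns, user names, and scope vocabulary below are illustrative, not the UMA/OAuth implementation the authors evaluated.

```python
# Toy policy store: MQTT topic patterns registered as protected resources,
# each with per-user "publish"/"subscribe" scopes (UMA-style resource scopes).
POLICY = {
    "sensors/+/temperature": {
        "alice": {"publish", "subscribe"},
        "bob": {"subscribe"},
    },
}

def topic_matches(pattern: str, topic: str) -> bool:
    """MQTT-style matching: '+' matches one level, '#' the remainder."""
    p, t = pattern.split("/"), topic.split("/")
    for i, seg in enumerate(p):
        if seg == "#":
            return True
        if i >= len(t) or (seg != "+" and seg != t[i]):
            return False
    return len(p) == len(t)

def authorized(user: str, topic: str, scope: str) -> bool:
    """Grant the action iff some registered resource covers the topic
    and the user holds the requested scope on it."""
    return any(scope in perms.get(user, set())
               for pattern, perms in POLICY.items()
               if topic_matches(pattern, topic))

assert authorized("bob", "sensors/node1/temperature", "subscribe")
assert not authorized("bob", "sensors/node1/temperature", "publish")
```

In the paper's actual architecture this decision would be delegated to a UMA authorization server issuing OAuth 2.0 tokens, rather than an in-process table.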
Cheminformatics Research at the Unilever Centre for Molecular Science Informatics Cambridge.
Fuchs, Julian E; Bender, Andreas; Glen, Robert C
2015-09-01
The Centre for Molecular Informatics, formerly Unilever Centre for Molecular Science Informatics (UCMSI), at the University of Cambridge is a world-leading driving force in the field of cheminformatics. Since its opening in 2000 more than 300 scientific articles have fundamentally changed the field of molecular informatics. The Centre has been a key player in promoting open chemical data and semantic access. Though mainly focussing on basic research, close collaborations with industrial partners ensured real world feedback and access to high quality molecular data. A variety of tools and standard protocols have been developed and are ubiquitous in the daily practice of cheminformatics. Here, we present a retrospective of cheminformatics research performed at the UCMSI, thereby highlighting historical and recent trends in the field as well as indicating future directions.
Cheminformatics Research at the Unilever Centre for Molecular Science Informatics Cambridge
Fuchs, Julian E; Bender, Andreas; Glen, Robert C
2015-01-01
The Centre for Molecular Informatics, formerly Unilever Centre for Molecular Science Informatics (UCMSI), at the University of Cambridge is a world-leading driving force in the field of cheminformatics. Since its opening in 2000 more than 300 scientific articles have fundamentally changed the field of molecular informatics. The Centre has been a key player in promoting open chemical data and semantic access. Though mainly focussing on basic research, close collaborations with industrial partners ensured real world feedback and access to high quality molecular data. A variety of tools and standard protocols have been developed and are ubiquitous in the daily practice of cheminformatics. Here, we present a retrospective of cheminformatics research performed at the UCMSI, thereby highlighting historical and recent trends in the field as well as indicating future directions. PMID:26435758
Guo, Rui; Wen, Qiaoyan; Jin, Zhengping; Zhang, Hua
2013-01-01
Sensor networks have opened up new opportunities in healthcare systems, which can transmit a patient's condition to health professionals' hand-held devices in time. A patient's physiological signals are very sensitive, and the networks are extremely vulnerable to many attacks. It must be ensured that the patient's privacy is not exposed to unauthorized entities. Therefore, controlling access to healthcare systems has become a crucial challenge. An efficient and secure authentication protocol is thus needed in wireless medical sensor networks. In this paper, we propose a certificateless authentication scheme without bilinear pairing that provides patient anonymity. Compared with other related protocols, the proposed scheme requires less computation and communication cost and provides stronger security. Our performance evaluations show that this protocol is more practical for healthcare systems in wireless medical sensor networks.
Network time synchronization servers at the US Naval Observatory
NASA Technical Reports Server (NTRS)
Schmidt, Richard E.
1995-01-01
Responding to an increased demand for reliable, accurate time on the Internet and Milnet, the U.S. Naval Observatory Time Service has established the network time servers tick.usno.navy.mil and tock.usno.navy.mil. The system clocks of these HP9000/747i industrial workstations are synchronized to within a few tens of microseconds of USNO Master Clock 2 using VMEbus IRIG-B interfaces. Redundant time code is available from a VMEbus GPS receiver. UTC(USNO) is provided over the network via a number of protocols, including the Network Time Protocol (NTP) (DARPA Network Working Group Report RFC-1305), the Daytime Protocol (RFC-867), and the Time Protocol (RFC-868). Access to USNO network time services is presently open and unrestricted. An overview of USNO time services and results of LAN and WAN time synchronization tests will be presented.
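Querying a server like the ones described is simple at the wire level. The sketch below constructs the 48-byte NTPv3 client request defined in RFC 1305 (the version the abstract cites); it only builds the packet and does not touch the network.

```python
def build_ntp_request() -> bytes:
    """Construct a 48-byte NTPv3 client request (RFC 1305).
    First byte packs LI=0, VN=3, Mode=3 (client); all other fields zero."""
    packet = bytearray(48)
    packet[0] = (0 << 6) | (3 << 3) | 3   # LI | VN | Mode = 0x1B
    return bytes(packet)

pkt = build_ntp_request()
# To actually query tick.usno.navy.mil one would send pkt over UDP port 123
# and parse the 64-bit transmit timestamp (seconds since 1900) from the reply.
```

Production clients use an NTP library rather than raw packets, but the fixed 48-byte layout is why the protocol scales to open, unrestricted public service.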
Guo, Rui; Wen, Qiaoyan; Jin, Zhengping; Zhang, Hua
2013-01-01
Sensor networks have opened up new opportunities in healthcare systems, which can transmit a patient's condition to health professionals' hand-held devices in time. A patient's physiological signals are very sensitive, and the networks are extremely vulnerable to many attacks. It must be ensured that the patient's privacy is not exposed to unauthorized entities. Therefore, controlling access to healthcare systems has become a crucial challenge. An efficient and secure authentication protocol is thus needed in wireless medical sensor networks. In this paper, we propose a certificateless authentication scheme without bilinear pairing that provides patient anonymity. Compared with other related protocols, the proposed scheme requires less computation and communication cost and provides stronger security. Our performance evaluations show that this protocol is more practical for healthcare systems in wireless medical sensor networks. PMID:23710147
Protocol for Detection of Yersinia pestis in Environmental ...
Methods Report: This is the first open-access and detailed protocol available to all government departments and agencies, and their contractors, to detect Yersinia pestis, the pathogen that causes plague, from multiple environmental sample types including water. Each analytical method includes a sample processing procedure for each sample type in a step-by-step manner. The protocol includes real-time PCR, traditional microbiological culture, and Rapid Viability PCR (RV-PCR) analytical methods. For large-volume water samples it also includes an ultrafiltration-based sample concentration procedure. Because this protocol is available without restriction to all government departments and agencies, and their contractors, the nation will now have increased laboratory capacity to analyze a large number of samples during a wide-area plague incident.
Data Integration Support for Data Served in the OPeNDAP and OGC Environments
NASA Technical Reports Server (NTRS)
McDonald, Kenneth R.; Wharton, Stephen W. (Technical Monitor)
2006-01-01
NASA is coordinating a technology development project to construct a gateway between system components built upon the Open-source Project for a Network Data Access Protocol (OPeNDAP) and those made available via interfaces specified by the Open Geospatial Consortium (OGC). This project is funded through the Advanced Collaborative Connections for Earth-Sun System Science (ACCESS) Program and is a NASA contribution to the Committee on Earth Observation Satellites (CEOS) Working Group on Information Systems and Services (WGISS). The motivation for the project is the set of data integration needs expressed by the Coordinated Enhanced Observing Period (CEOP), an international program addressing the study of the global water cycle. CEOP is assembling a large collection of in situ and satellite data and model results from a wide variety of sources covering 35 sites around the globe. The data are provided by systems based on either the OPeNDAP or OGC protocols, but the research community desires access to the full range of data and associated services from a single client. This presentation will discuss the current status of the OPeNDAP/OGC Gateway Project. The project builds upon an early prototype that illustrated the feasibility of such a gateway and was demonstrated to the CEOP science community. In its first year as an ACCESS project, the effort has focused on the design of the catalog and data services that will be provided by the gateway and the mappings between the metadata and services provided in the two environments.
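On the OPeNDAP side of such a gateway, subsetting is expressed directly in the URL: a response suffix selects the output form, and a constraint expression selects the variable and index range. The sketch below builds a DAP2-style ASCII subset URL; the dataset URL and variable name are hypothetical.

```python
def opendap_ascii_url(dataset_url: str, var: str,
                      start: int, stride: int, stop: int) -> str:
    """Build a DAP2 subset request: append a response suffix (.ascii here)
    and a constraint expression var[start:stride:stop] to the dataset URL."""
    return f"{dataset_url}.ascii?{var}[{start}:{stride}:{stop}]"

# Hypothetical dataset and variable, for illustration only.
url = opendap_ascii_url("http://example.org/dap/sst.nc", "sst", 0, 1, 9)
```

A gateway like the one described maps requests of this shape to the equivalent OGC service calls (e.g. a WCS coverage subset), so a single client can reach both worlds.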
Pottie, Kevin; Mayhew, Alain D; Morton, Rachael L; Greenaway, Christina; Akl, Elie A; Rahman, Prinon; Zenner, Dominik; Pareek, Manish; Tugwell, Peter; Welch, Vivian; Meerpohl, Joerg; Alonso-Coello, Pablo; Hui, Charles; Biggs, Beverley-Ann; Requena-Méndez, Ana; Agbata, Eric; Noori, Teymur; Schünemann, Holger J
2017-09-11
The European Centre for Disease Prevention and Control is developing evidence-based guidance for voluntary screening, treatment and vaccine prevention of infectious diseases for newly arriving migrants to the European Union/European Economic Area. The objective of this systematic review protocol is to guide the identification, appraisal and synthesis of the best available evidence on prevention and assessment of the following priority infectious diseases: tuberculosis, HIV, hepatitis B, hepatitis C, measles, mumps, rubella, diphtheria, tetanus, pertussis, poliomyelitis (polio), Haemophilus influenzae disease, strongyloidiasis and schistosomiasis. The search strategy will identify evidence from existing systematic reviews and then update the effectiveness and cost-effectiveness evidence using prospective trials, economic evaluations and/or recently published systematic reviews. Interdisciplinary teams have designed logic models to help define study inclusion and exclusion criteria, guiding the search strategy and identifying relevant outcomes. We will assess the certainty of evidence using the Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach. There are no ethical or safety issues. We anticipate disseminating the findings through open-access publications, conference abstracts and presentations. We plan to publish technical syntheses as GRADEpro evidence summaries and the systematic reviews as part of a special edition open-access publication on refugee health. We are following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses for Protocols reporting guideline. This protocol is registered in PROSPERO: CRD42016045798. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Using Cloud Computing infrastructure with CloudBioLinux, CloudMan and Galaxy
Afgan, Enis; Chapman, Brad; Jadan, Margita; Franke, Vedran; Taylor, James
2012-01-01
Cloud computing has revolutionized the availability of and access to computing and storage resources, making it possible to provision a large computational infrastructure with only a few clicks in a web browser. However, those resources are typically provided in the form of low-level infrastructure components that need to be procured and configured before use. In this protocol, we demonstrate how to utilize cloud computing resources to perform open-ended bioinformatics analyses, with fully automated management of the underlying cloud infrastructure. By combining three projects (CloudBioLinux, CloudMan, and Galaxy) into a cohesive unit, we have enabled researchers to gain access to more than 100 preconfigured bioinformatics tools and gigabytes of reference genomes on top of the flexible cloud computing infrastructure. The protocol demonstrates how to set up the available infrastructure and how to use the tools via a graphical desktop interface, a parallel command line interface, and the web-based Galaxy interface. PMID:22700313
OpenAQ: A Platform to Aggregate and Freely Share Global Air Quality Data
NASA Astrophysics Data System (ADS)
Hasenkopf, C. A.; Flasher, J. C.; Veerman, O.; DeWitt, H. L.
2015-12-01
Thousands of ground-based air quality monitors around the world publicly publish real-time air quality data; however, researchers and the public do not have access to this information in the ways most useful to them. Often, air quality data are posted on obscure websites showing only current values, are programmatically inaccessible, and/or are in inconsistent data formats across sites. Yet, historical and programmatic access to such a global dataset would be transformative to several scientific fields, from epidemiology to low-cost sensor technologies to estimates of ground-level aerosol by satellite retrievals. To increase accessibility and standardize this disparate dataset, we have built OpenAQ, an innovative, open platform created by a group of scientists and open data programmers. The source code for the platform is viewable at github.com/openaq. Currently, we are aggregating, storing, and making publicly available real-time air quality data (PM2.5, PM10, SO2, NO2, and O3) via an Application Program Interface (API). We will present the OpenAQ platform, which currently has the following specific capabilities:
- a continuous ingest mechanism for some of the most polluted cities, generalizable to more sources
- an API providing data querying, including the ability to filter by location, measurement type, value, and date, as well as custom sort options
- a generalized, chart-based visualization tool to explore data accessible via the API
At this stage, we are seeking wider participation and input from multiple research communities in expanding our data retrieval sites, standardizing our protocols, receiving feedback on quality issues, and creating tools that can be built on top of this open platform.
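The kind of filtered, programmatic access described above can be sketched with a small URL builder. The endpoint path and parameter names below are illustrative assumptions, not the authoritative OpenAQ API specification:

```python
from urllib.parse import urlencode

# Hypothetical sketch of querying an OpenAQ-style measurements API.
# Base URL and parameter names are assumptions for illustration only.
BASE = "https://api.openaq.org/v1/measurements"

def build_query(city, parameter, date_from=None, limit=100):
    # Filter by location, measurement type, and (optionally) date,
    # mirroring the query capabilities listed in the abstract.
    params = {"city": city, "parameter": parameter, "limit": limit}
    if date_from:
        params["date_from"] = date_from
    return BASE + "?" + urlencode(params)

url = build_query("Delhi", "pm25", date_from="2015-01-01")
```

A client would issue an HTTP GET against such a URL and receive measurement records in a standard serialization such as JSON.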
OpenClimateGIS - A Web Service Providing Climate Model Data in Commonly Used Geospatial Formats
NASA Astrophysics Data System (ADS)
Erickson, T. A.; Koziol, B. W.; Rood, R. B.
2011-12-01
The goal of the OpenClimateGIS project is to make climate model datasets readily available in commonly used, modern geospatial formats used by GIS software, browser-based mapping tools, and virtual globes. The climate modeling community typically stores climate data in multidimensional gridded formats capable of efficiently storing large volumes of data (such as netCDF, GRIB) while the geospatial community typically uses flexible vector and raster formats that are capable of storing small volumes of data (relative to the multidimensional gridded formats). OpenClimateGIS seeks to address this difference in data formats by clipping climate data to user-specified vector geometries (i.e. areas of interest) and translating the gridded data on-the-fly into multiple vector formats. The OpenClimateGIS system does not store climate data archives locally, but rather works in conjunction with external climate archives that expose climate data via the OPeNDAP protocol. OpenClimateGIS provides a RESTful API web service for accessing climate data resources via HTTP, allowing a wide range of applications to access the climate data. The OpenClimateGIS system has been developed using open source development practices and the source code is publicly available. The project integrates libraries from several other open source projects (including Django, PostGIS, numpy, Shapely, and netcdf4-python). OpenClimateGIS development is supported by a grant from NOAA's Climate Program Office.
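The clipping step at the heart of OpenClimateGIS can be illustrated with a deliberately simplified, pure-Python sketch. Real requests use true vector geometries and OPeNDAP-backed arrays; a bounding box and a toy grid stand in for them here, and all names are hypothetical:

```python
# Minimal sketch of clipping gridded climate data to an area of interest.
# A bounding box stands in for a user-supplied vector geometry.

def clip_to_bbox(cells, bbox):
    """Keep grid cells whose center falls inside bbox = (min_lon, min_lat, max_lon, max_lat)."""
    min_lon, min_lat, max_lon, max_lat = bbox
    return [c for c in cells
            if min_lon <= c["lon"] <= max_lon and min_lat <= c["lat"] <= max_lat]

# Toy 2-degree global grid with one variable value per cell center.
grid = [{"lon": lon, "lat": lat, "tas": 270.0 + lat}
        for lon in range(-180, 180, 2) for lat in range(-90, 90, 2)]

subset = clip_to_bbox(grid, (-10, 40, 10, 50))  # roughly western Europe
```

Each surviving cell could then be emitted on the fly as a vector feature (e.g. a GeoJSON or Shapefile record) rather than as a slice of the original multidimensional array.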
A Multidisciplinary Approach to Open Access Village Telecenter Initiatives: The Case of Akshaya
ERIC Educational Resources Information Center
Pal, Joyojeet; Nedevschi, Sergiu; Patra, Rabin K.; Brewer, Eric A.
2006-01-01
The Akshaya project in the rural Malappuram district of Kerala, India is home to the first and largest district-wide e-literacy project in India, and one of the largest known Internet Protocol-based networks. Through a network of 600 kiosks, the project has been designed to reach computer literacy into over 600,000 households, and bring 3.6…
Hróbjartsson, Asbjørn; Pildal, Julie; Chan, An-Wen; Haahr, Mette T; Altman, Douglas G; Gøtzsche, Peter C
2009-09-01
To compare the reporting on blinding in protocols and articles describing randomized controlled trials. We studied 73 protocols of trials approved by the scientific/ethical committees for Copenhagen and Frederiksberg, 1994 and 1995, and their corresponding publications. Three out of 73 trials (4%) reported blinding in the protocol that contradicted that in the publication (e.g., "open" vs. "double blind"). The proportion of "double-blind" trials with a clear description of the blinding of participants increased from 11 out of 58 (19%) when based on publications alone to 39 (67%) when adding the information in the protocol. The corresponding proportions for the blinding of health care providers were 2 (3%) and 22 (38%); and for the blinding of data collectors, they were 8 (14%) and 14 (24%). In 52 of 58 publications (90%), it was unclear whether all patients, health care providers, and data collectors had been blinded. In 4 of the 52 trials (7%), the protocols clarified that all three key trial persons had been blinded. The reporting on blinding in both trial protocols and publications is often inadequate. We suggest developing international guidelines for the reporting of trial protocols and public access to protocols.
Insecurity of Wireless Networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sheldon, Frederick T; Weber, John Mark; Yoo, Seong-Moo
Wireless is a powerful core technology enabling our global digital infrastructure. Wi-Fi networks are susceptible to attacks on Wired Equivalency Privacy (WEP), Wi-Fi Protected Access (WPA), and WPA2. These attack signatures can be profiled into a system that defends against such attacks on the basis of their inherent characteristics. Wi-Fi is the standard protocol for wireless networks used extensively in US critical infrastructures. Since the WEP security protocol was broken, the WPA protocol has been considered the secure alternative compatible with hardware developed for WEP. However, in November 2008, researchers developed an attack on WPA, allowing forgery of Address Resolution Protocol (ARP) packets. Subsequent enhancements have enabled ARP poisoning, cryptosystem denial of service, and man-in-the-middle attacks. Open source systems and methods (OSSM) have long been used to secure networks against such attacks. This article reviews OSSMs and the results of experimental attacks on WPA. These experiments re-created current attacks in a laboratory setting, recording both wired and wireless traffic. The article discusses methods of intrusion detection and prevention in the context of cyber physical protection of critical Internet infrastructure. The basis for this research is a specialized (and undoubtedly incomplete) taxonomy of Wi-Fi attacks and their adaptations to existing countermeasures and protocol revisions. Ultimately, this article aims to provide a clearer picture of how and why wireless protection protocols and encryption must achieve a more scientific basis for detecting and preventing such attacks.
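The weakness exploited by dictionary attacks on WPA/WPA2-PSK is easy to demonstrate: per IEEE 802.11i, the pairwise master key (PMK) is derived from nothing more than the passphrase and the SSID, so candidate passphrases can be tested offline against a captured handshake. The sketch below shows the standard derivation (the passphrase and SSID are, of course, made up):

```python
import hashlib

def wpa_pmk(passphrase: str, ssid: str) -> bytes:
    # PMK = PBKDF2-HMAC-SHA1(passphrase, ssid, 4096 iterations, 256 bits),
    # as specified for WPA/WPA2 pre-shared-key mode.
    return hashlib.pbkdf2_hmac("sha1", passphrase.encode(), ssid.encode(), 4096, 32)

pmk = wpa_pmk("correct horse battery staple", "HomeNetwork")
```

Because the derivation is deterministic and salted only by the SSID, precomputed tables for common SSIDs are feasible, which is one reason long, high-entropy passphrases matter.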
Sternberg, Cora N; Castellano, Daniel; Daugaard, Gedske; Géczi, Lajos; Hotte, Sebastien J; Mainwaring, Paul N; Saad, Fred; Souza, Ciro; Tay, Miah H; Garrido, José M Tello; Galli, Luca; Londhe, Anil; De Porre, Peter; Goon, Betty; Lee, Emma; McGowan, Tracy; Naini, Vahid; Todd, Mary B; Molina, Arturo; George, Daniel J
2014-10-01
In the final analysis of the phase 3 COU-AA-301 study, abiraterone acetate plus prednisone significantly prolonged overall survival compared with prednisone alone in patients with metastatic castration-resistant prostate cancer progressing after chemotherapy. Here, we present the final analysis of an early-access protocol trial that was initiated after completion of COU-AA-301 to enable worldwide preapproval access to abiraterone acetate in patients with metastatic castration-resistant prostate cancer progressing after chemotherapy. We did a multicentre, open-label, early-access protocol trial in 23 countries. We enrolled patients who had metastatic castration-resistant prostate cancer progressing after taxane chemotherapy. Participants received oral doses of abiraterone acetate (1000 mg daily) and prednisone (5 mg twice a day) in 28-day cycles until disease progression, development of sustained side-effects, or abiraterone acetate becoming available in the respective country. The primary outcome was the number of adverse events arising during study treatment and within 30 days of discontinuation. Efficacy measures (time to prostate-specific antigen [PSA] progression and time to clinical progression) were gathered to guide treatment decisions. We included in our analysis all patients who received at least one dose of abiraterone acetate. This study is registered with ClinicalTrials.gov, number NCT01217697. Between Nov 17, 2010, and Sept 30, 2013, 2314 patients were enrolled into the early-access protocol trial. Median follow-up was 5·7 months (IQR 3·5-10·6). 952 (41%) patients had a grade 3 or 4 treatment-related adverse event, and grade 3 or 4 serious adverse events were recorded in 585 (25%) people. The most common grade 3 and 4 adverse events were hepatotoxicity (188 [8%]), hypertension (99 [4%]), cardiac disorders (52 [2%]), osteoporosis (31 [1%]), hypokalaemia (28 [1%]), and fluid retention or oedema (23 [1%]). 
172 (7%) patients discontinued the study because of adverse events (64 [3%] were drug-related), as assessed by the investigator, and 171 (7%) people died. The funder assessed causes of death, which were due to disease progression (85 [4%]), an unrelated adverse experience (72 [3%]), and unknown reasons (14 [1%]). Of the 86 deaths not attributable to disease progression, 18 (<1%) were caused by a drug-related adverse event, as assessed by the investigator. Median time to PSA progression was 8·5 months (95% CI 8·3-9·7) and median time to clinical progression was 12·7 months (11·8-13·8). No new safety signals or unexpected adverse events were found in this early-access protocol trial to assess abiraterone acetate for patients with metastatic castration-resistant prostate cancer who progressed after chemotherapy. Future work is needed to ascertain the most effective regimen of abiraterone acetate to optimise patients' outcomes. Janssen Research & Development. Copyright © 2014 Elsevier Ltd. All rights reserved.
Multiphoton Intravital Calcium Imaging.
Cheetham, Claire E J
2018-06-26
Multiphoton intravital calcium imaging is a powerful technique that enables high-resolution longitudinal monitoring of cellular and subcellular activity hundreds of microns deep in the living organism. This unit addresses the application of 2-photon microscopy to imaging of genetically encoded calcium indicators (GECIs) in the mouse brain. The protocols in this unit enable real-time intravital imaging of intracellular calcium concentration simultaneously in hundreds of neurons, or at the resolution of single synapses, as mice respond to sensory stimuli or perform behavioral tasks. Protocols are presented for implantation of a cranial imaging window to provide optical access to the brain and for 2-photon image acquisition. Protocols for implantation of both open skull and thinned skull windows for single or multi-session imaging are described. © 2018 John Wiley & Sons, Inc.
Wu, Hui-Qun; Lv, Zheng-Min; Geng, Xing-Yun; Jiang, Kui; Tang, Le-Min; Zhou, Guo-Min; Dong, Jian-Cheng
2013-01-01
To address interoperability issues between different fundus image systems, we proposed a web eye-picture archiving and communication system (PACS) framework in conformance with the Digital Imaging and Communications in Medicine (DICOM) and Health Level 7 (HL7) protocols to enable the sharing and communication of fundus images and reports over the internet. Firstly, a telemedicine-based eye care workflow was established based on the Integrating the Healthcare Enterprise (IHE) Eye Care technical framework. Then, a three-tier browser/server eye-PACS system was established in conformance with the Web Access to DICOM Persistent Objects (WADO) protocol. From any client machine with a web browser, clinicians can log in to the eye-PACS to view fundus images and reports. A structured report, saved as PDF/HTML (its Multipurpose Internet Mail Extensions, or MIME, type) with a reference link to the relevant fundus image using the WADO syntax, can provide sufficient information for clinicians. Functions provided by the open-source viewer Oviyam can be used to query, zoom, move, measure, and view DICOM fundus images. Such a web eye-PACS in compliance with the WADO protocol can be used to store and communicate fundus images and reports, and is therefore of great significance for teleophthalmology.
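The WADO retrieval the system relies on reduces to a single HTTP GET whose query parameters identify the object by UID (DICOM PS3.18, WADO-URI). A sketch, with a placeholder host and made-up UIDs:

```python
from urllib.parse import urlencode

def wado_uri(host, study_uid, series_uid, object_uid,
             content_type="application/dicom"):
    # WADO-URI request: one GET identifies the study, series, and object.
    params = {
        "requestType": "WADO",
        "studyUID": study_uid,
        "seriesUID": series_uid,
        "objectUID": object_uid,
        "contentType": content_type,
    }
    return f"http://{host}/wado?" + urlencode(params)

link = wado_uri("pacs.example.org",
                "1.2.840.113619.2.1",
                "1.2.840.113619.2.1.1",
                "1.2.840.113619.2.1.1.1")
```

Requesting contentType=image/jpeg instead asks the server to render the object for display in a plain browser, which is how a PDF/HTML report can link to its fundus image.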
NASA Update for Unidata Stratcomm
NASA Technical Reports Server (NTRS)
Lynnes, Chris
2017-01-01
The NASA representative to the Unidata Strategic Committee presented a semiannual update on NASA's work with and use of Unidata technologies. The talk updated Unidata on the program of cloud computing prototypes underway for the Earth Observing System Data and Information System (EOSDIS). Also discussed was a trade study on the use of the Open-source Project for a Network Data Access Protocol (OPeNDAP) with Web Object Storage in the cloud.
Data Access Tools And Services At The Goddard Distributed Active Archive Center (GDAAC)
NASA Technical Reports Server (NTRS)
Pham, Long; Eng, Eunice; Sweatman, Paul
2003-01-01
As one of the largest providers of Earth Science data from the Earth Observing System, the GDAAC provides the latest data from the Moderate Resolution Imaging Spectroradiometer (MODIS), Atmospheric Infrared Sounder (AIRS), and Solar Radiation and Climate Experiment (SORCE) data products via the GDAAC's data pool (50 TB of disk cache). In order to make this huge volume of data more accessible to the public and science communities, the GDAAC offers multiple data access tools and services: the Open Source Project for Network Data Access Protocol (OPeNDAP), the Grid Analysis and Display System (GrADS/DODS) server (GDS), the Live Access Server (LAS), the OpenGIS Web Map Server (WMS), and Near Archive Data Mining (NADM). The objective is to assist users in electronically retrieving a smaller, usable portion of data for further analysis. The OPeNDAP server, formerly known as the Distributed Oceanographic Data System (DODS), allows the user to retrieve data without worrying about the data format. OPeNDAP is capable of server-side subsetting of HDF, HDF-EOS, netCDF, JGOFS, ASCII, DSP, FITS, and binary data formats. The GrADS/DODS server is capable of serving the same data formats as OPeNDAP. GDS has an additional feature of server-side analysis: users can analyze the data on the server, thereby decreasing the computational load on their client systems. The LAS is a flexible server that allows users to graphically visualize data on the fly, to request different file formats, and to compare variables from distributed locations. Users of LAS have the option to use other available graphics viewers such as IDL, Matlab, or GrADS. The WMS is based on OPeNDAP and serves geospatial information; it supports the OpenGIS protocol to provide data in GIS-friendly formats for analysis and visualization. NADM is another access point to the GDAAC's data pool, giving users the capability to use a browser to upload their C, FORTRAN, or IDL algorithms, test the algorithms, and mine data in the data pool.
With NADM, the GDAAC provides a computing environment physically close to the data source, benefiting users who mine data or run data-reduction algorithms by reducing large volumes of data before transmission over the network to the user.
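The server-side subsetting that OPeNDAP provides is driven by a constraint expression appended to the dataset URL, with a [start:stride:stop] hyperslab per dimension. A sketch, with an invented dataset URL and variable name:

```python
def opendap_subset(dataset_url, var, ranges, fmt="ascii"):
    # DAP2-style constraint expression, e.g. Temperature_A[0:1:0][100:1:120].
    hyperslab = "".join(f"[{start}:{stride}:{stop}]" for (start, stride, stop) in ranges)
    return f"{dataset_url}.{fmt}?{var}{hyperslab}"

url = opendap_subset("http://example.org/opendap/airs_l3", "Temperature_A",
                     [(0, 1, 0), (100, 1, 120), (200, 1, 240)])
```

Swapping the .ascii suffix for .dods requests the binary DAP response; either way, only the requested hyperslab crosses the network, which is the point of server-side subsetting.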
NASA's Earth Imagery Service as Open Source Software
NASA Astrophysics Data System (ADS)
De Cesare, C.; Alarcon, C.; Huang, T.; Roberts, J. T.; Rodriguez, J.; Cechini, M. F.; Boller, R. A.; Baynes, K.
2016-12-01
The NASA Global Imagery Browse Service (GIBS) is a software system that provides access to an archive of historical and near-real-time Earth imagery from NASA-supported satellite instruments. The imagery itself is open data, and is accessible via standards such as the Open Geospatial Consortium (OGC)'s Web Map Tile Service (WMTS) protocol. GIBS includes three core software projects: The Imagery Exchange (TIE), OnEarth, and the Meta Raster Format (MRF) project. These projects are developed using a variety of open source software, including: Apache HTTPD, GDAL, Mapserver, Grails, Zookeeper, Eclipse, Maven, git, and Apache Commons. TIE has recently been released for open source, and is now available on GitHub. OnEarth, MRF, and their sub-projects have been on GitHub since 2014, and the MRF project in particular receives many external contributions from the community. Our software has been successful beyond the scope of GIBS: the PO.DAAC State of the Ocean and COVERAGE visualization projects reuse components from OnEarth. The MRF source code has recently been incorporated into GDAL, which is a core library in many widely used GIS applications such as QGIS and GeoServer. This presentation will describe the challenges faced in incorporating open software and open data into GIBS, and also showcase GIBS as a platform on which scientists and the general public can build their own applications.
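Because WMTS exposes tiles through a fixed REST pattern, fetching GIBS imagery needs no special client library. The template below follows GIBS's documented RESTful form, but the endpoint, layer name, and tile matrix set should be treated as illustrative assumptions:

```python
# Assumed GIBS WMTS endpoint for the EPSG:4326 "best available" imagery.
GIBS = "https://gibs.earthdata.nasa.gov/wmts/epsg4326/best"

def wmts_tile(layer, date, matrix_set, zoom, row, col, ext="jpg"):
    # WMTS REST pattern: /{layer}/{style}/{time}/{tileMatrixSet}/{z}/{row}/{col}.{ext}
    return f"{GIBS}/{layer}/default/{date}/{matrix_set}/{zoom}/{row}/{col}.{ext}"

tile = wmts_tile("MODIS_Terra_CorrectedReflectance_TrueColor",
                 "2016-08-01", "250m", 3, 2, 5)
```

A browser-based mapping client simply substitutes zoom/row/col as the user pans, which is what makes the protocol cache- and CDN-friendly.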
Kinsinger, Christopher R.; Apffel, James; Baker, Mark; Bian, Xiaopeng; Borchers, Christoph H.; Bradshaw, Ralph; Brusniak, Mi-Youn; Chan, Daniel W.; Deutsch, Eric W.; Domon, Bruno; Gorman, Jeff; Grimm, Rudolf; Hancock, William; Hermjakob, Henning; Horn, David; Hunter, Christie; Kolar, Patrik; Kraus, Hans-Joachim; Langen, Hanno; Linding, Rune; Moritz, Robert L.; Omenn, Gilbert S.; Orlando, Ron; Pandey, Akhilesh; Ping, Peipei; Rahbar, Amir; Rivers, Robert; Seymour, Sean L.; Simpson, Richard J.; Slotta, Douglas; Smith, Richard D.; Stein, Stephen E.; Tabb, David L.; Tagle, Danilo; Yates, John R.; Rodriguez, Henry
2011-01-01
Policies supporting the rapid and open sharing of proteomic data are being implemented by the leading journals in the field. The proteomics community is taking steps to ensure that data are made publicly accessible and are of high quality, a challenging task that requires the development and deployment of methods for measuring and documenting data quality metrics. On September 18, 2010, the U.S. National Cancer Institute (NCI) convened the “International Workshop on Proteomic Data Quality Metrics” in Sydney, Australia, to identify and address issues facing the development and use of such methods for open access proteomics data. The stakeholders at the workshop enumerated the key principles underlying a framework for data quality assessment in mass spectrometry data that will meet the needs of the research community, journals, funding agencies, and data repositories. Attendees discussed and agreed upon two primary needs for the wide use of quality metrics: (1) an evolving list of comprehensive quality metrics and (2) standards accompanied by software analytics. Attendees stressed the importance of increased education and training programs to promote reliable protocols in proteomics. This workshop report explores the historic precedents, key discussions, and necessary next steps to enhance the quality of open access data. By agreement, this article is published simultaneously in the Journal of Proteome Research, Molecular and Cellular Proteomics, Proteomics, and Proteomics Clinical Applications as a public service to the research community. The peer review process was a coordinated effort conducted by a panel of referees selected by the journals. PMID:22053864
Kinsinger, Christopher R.; Apffel, James; Baker, Mark; Bian, Xiaopeng; Borchers, Christoph H.; Bradshaw, Ralph; Brusniak, Mi-Youn; Chan, Daniel W.; Deutsch, Eric W.; Domon, Bruno; Gorman, Jeff; Grimm, Rudolf; Hancock, William; Hermjakob, Henning; Horn, David; Hunter, Christie; Kolar, Patrik; Kraus, Hans-Joachim; Langen, Hanno; Linding, Rune; Moritz, Robert L.; Omenn, Gilbert S.; Orlando, Ron; Pandey, Akhilesh; Ping, Peipei; Rahbar, Amir; Rivers, Robert; Seymour, Sean L.; Simpson, Richard J.; Slotta, Douglas; Smith, Richard D.; Stein, Stephen E.; Tabb, David L.; Tagle, Danilo; Yates, John R.; Rodriguez, Henry
2011-01-01
Policies supporting the rapid and open sharing of proteomic data are being implemented by the leading journals in the field. The proteomics community is taking steps to ensure that data are made publicly accessible and are of high quality, a challenging task that requires the development and deployment of methods for measuring and documenting data quality metrics. On September 18, 2010, the United States National Cancer Institute convened the “International Workshop on Proteomic Data Quality Metrics” in Sydney, Australia, to identify and address issues facing the development and use of such methods for open access proteomics data. The stakeholders at the workshop enumerated the key principles underlying a framework for data quality assessment in mass spectrometry data that will meet the needs of the research community, journals, funding agencies, and data repositories. Attendees discussed and agreed upon two primary needs for the wide use of quality metrics: 1) an evolving list of comprehensive quality metrics and 2) standards accompanied by software analytics. Attendees stressed the importance of increased education and training programs to promote reliable protocols in proteomics. This workshop report explores the historic precedents, key discussions, and necessary next steps to enhance the quality of open access data. By agreement, this article is published simultaneously in the Journal of Proteome Research, Molecular and Cellular Proteomics, Proteomics, and Proteomics Clinical Applications as a public service to the research community. The peer review process was a coordinated effort conducted by a panel of referees selected by the journals. PMID:22052993
Zhang, P; Aungskunsiri, K; Martín-López, E; Wabnig, J; Lobino, M; Nock, R W; Munns, J; Bonneau, D; Jiang, P; Li, H W; Laing, A; Rarity, J G; Niskanen, A O; Thompson, M G; O'Brien, J L
2014-04-04
We demonstrate a client-server quantum key distribution (QKD) scheme. Large resources such as lasers and detectors are situated at the server side, which is accessible via telecom fiber to a client requiring only an on-chip polarization rotator, which may be integrated into a handheld device. The detrimental effects of unstable fiber birefringence are overcome by employing the reference-frame-independent QKD protocol for polarization qubits in polarization maintaining fiber, where standard QKD protocols fail, as we show for comparison. This opens the way for quantum-enhanced secure communications between companies and members of the general public equipped with handheld mobile devices, via telecom-fiber tethering.
2002-09-01
…Protocol; LAN: Local Area Network; LDAP: Lightweight Directory Access Protocol; LLQ: Low Latency Queuing; MAC: Media Access Control; MarCorSysCom: Marine…; …Description Protocol; SIP: Session Initiation Protocol; SMTP: Simple Mail Transfer Protocol; SPAWAR: Space and Naval Warfare Systems Center; SS7: …
…PSTN infrastructure previously required to carry the conversation. The cost of accessing the PSTN is thereby eliminated. In cases where Internet…
dCache, Sync-and-Share for Big Data
NASA Astrophysics Data System (ADS)
Millar, AP; Fuhrmann, P.; Mkrtchyan, T.; Behrmann, G.; Bernardt, C.; Buchholz, Q.; Guelzow, V.; Litvintsev, D.; Schwank, K.; Rossi, A.; van der Reest, P.
2015-12-01
The availability of cheap, easy-to-use sync-and-share cloud services has split the scientific storage world into the traditional big data management systems and the very attractive sync-and-share services. With the former, the location of data is well understood while the latter is mostly operated in the Cloud, resulting in a rather complex legal situation. Besides legal issues, those two worlds have little overlap in user authentication and access protocols. While traditional storage technologies, popular in HEP, are based on X.509, cloud services and sync-and-share software technologies are generally based on username/password authentication or mechanisms like SAML or OpenID Connect. Similarly, data access models offered by both are somewhat different, with sync-and-share services often using proprietary protocols. As both approaches are very attractive, dCache.org developed a hybrid system, providing the best of both worlds. To avoid reinventing the wheel, dCache.org decided to embed another open source project: ownCloud. This offers the required modern access capabilities but does not support the managed data functionality needed for large capacity data storage. With this hybrid system, scientists can share files and synchronize their data with laptops or mobile devices as easily as with any other cloud storage service. On top of this, the same data can be accessed via established mechanisms, like GridFTP to serve the Globus Transfer Service or the WLCG FTS3 tool, or the data can be made available to worker nodes or HPC applications via a mounted filesystem. As dCache provides a flexible authentication module, the same user can access the storage via different authentication mechanisms; e.g., X.509 and SAML. Additionally, users can specify the desired quality of service or trigger media transitions as necessary, thus tuning data access latency to the planned access profile. Such features are a natural consequence of using dCache.
We will describe the design of the hybrid dCache/OwnCloud system, report on several months of operations experience running it at DESY, and elucidate the future road-map.
Libre: Freeing Polar Data in an Information Commons
NASA Astrophysics Data System (ADS)
Duerr, R. E.; Parsons, M. A.
2010-12-01
As noted in the session description “The polar regions are at the forefront of modern environmental change, currently experiencing the largest and fastest changes in climate and environment”. Wise use of resources, astute management of our environment, improved decision support, and effective international cooperation on natural resource and geopolitical issues require a deeper understanding of, and an ability to predict change and its impact. Understanding and knowledge are built on data and information, yet polar information is scattered, scarce, and sporadic. Rapid change demands rapid data access. We envision a system where investigators quickly expose their data to the world and share them, without restriction, through open protocols on the Internet. A single giant, central archive is not practical for all polar data held around the world. Instead, we seek a collaborative, virtual space, where scientific data and information could be shared ethically and with minimal constraints. Inspired by the Antarctic Treaty of 1959 that established the Antarctic as a global commons to generate greater scientific understanding, the International Council of Science leads the Polar Information Commons (PIC). The PIC, engendered by the International Polar Year (IPY) and work on the IPY data policy, serves as an open, virtual repository for vital scientific data and information. An international network of scientific and data management organizations concerned with the scientific quality, integrity, and stewardship of data is developing the PIC. The PIC utilizes the Science Commons Protocol for Implementing Open Access Data, including establishment of community norms to encourage appropriate contributions to and use of PIC content. Data descriptions (metadata) are not necessarily registered in formal repositories or catalogues. They may simply be exposed to search engines or broadcast through syndication services such as RSS or Atom. 
The data are labeled or branded as part of the PIC and are, therefore, open for use without restriction. The PIC label also alerts data centers around the world to new polar data. These data centers then assess and acquire important data for formal archiving, curation, and access through national and global data systems. The intent is to enable rapid data access without qualification, while establishing a process for long-term preservation and stewardship of critical data. This paper will review the ethical and legal basis for sharing polar data and information, as well as the technologies being employed to make the PIC a reality.
OC ToGo: bed site image integration into OpenClinica with mobile devices
NASA Astrophysics Data System (ADS)
Haak, Daniel; Gehlen, Johan; Jonas, Stephan; Deserno, Thomas M.
2014-03-01
Imaging and image-based measurements nowadays play an essential role in controlled clinical trials, but electronic data capture (EDC) systems insufficiently support the integration of images captured by mobile devices (e.g. smartphones and tablets). The web application OpenClinica has established itself as one of the world's leading EDC systems and is used to collect, manage, and store data of clinical trials in electronic case report forms (eCRFs). In this paper, we present a mobile application for instantaneous integration of images into OpenClinica directly during examination at the patient's bedside. The communication between the Android application and OpenClinica is based on the simple object access protocol (SOAP) and representational state transfer (REST) web services for metadata, and the secure file transfer protocol (SFTP) for image transfer. OpenClinica's web services are used to query context information (e.g. existing studies, events, and subjects) and to import data into the eCRF, as well as to export eCRF metadata and structural information. A stable image transfer is ensured and progress information (e.g. remaining time) is visualized to the user. The workflow is demonstrated for a European multi-center registry in which patients with calciphylaxis disease are included. Our approach improves the EDC workflow, saves time, and reduces costs. Furthermore, data privacy is enhanced, since storage of private health data on the imaging devices becomes obsolete.
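The metadata exchange described above is ordinary SOAP messaging: a request like the one sketched here carries the operation inside a standard envelope. The namespace and operation name below are hypothetical placeholders; the real ones are defined by OpenClinica's WSDLs.

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def soap_envelope(operation, ns, **fields):
    # Build a SOAP 1.1 envelope with one operation element in the body.
    ET.register_namespace("soapenv", SOAP_NS)
    env = ET.Element(f"{{{SOAP_NS}}}Envelope")
    ET.SubElement(env, f"{{{SOAP_NS}}}Header")
    body = ET.SubElement(env, f"{{{SOAP_NS}}}Body")
    op = ET.SubElement(body, f"{{{ns}}}{operation}")
    for name, value in fields.items():
        ET.SubElement(op, f"{{{ns}}}{name}").text = value
    return ET.tostring(env, encoding="unicode")

# Hypothetical study-listing request, posted via HTTP to the web service.
msg = soap_envelope("listStudies", "http://example.org/openclinica/study/v1")
```

The image payload itself bypasses SOAP entirely and travels over SFTP, which keeps large binary transfers out of the XML channel.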
The Earth System Grid Federation: An Open Infrastructure for Access to Distributed Geospatial Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ananthakrishnan, Rachana; Bell, Gavin; Cinquini, Luca
2013-01-01
The Earth System Grid Federation (ESGF) is a multi-agency, international collaboration that aims at developing the software infrastructure needed to facilitate and empower the study of climate change on a global scale. The ESGF's architecture employs a system of geographically distributed peer nodes, which are independently administered yet united by the adoption of common federation protocols and application programming interfaces (APIs). The cornerstones of its interoperability are the peer-to-peer messaging that is continuously exchanged among all nodes in the federation; a shared architecture and API for search and discovery; and a security infrastructure based on industry standards (OpenID, SSL, GSI and SAML). The ESGF software is developed collaboratively across institutional boundaries and made available to the community as open source. It has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the entire model output used for the next international assessment report on climate change (IPCC-AR5) and a suite of satellite observations (obs4MIPs) and reanalysis data sets (ANA4MIPs).
The Earth System Grid Federation: An Open Infrastructure for Access to Distributed Geo-Spatial Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cinquini, Luca; Crichton, Daniel; Miller, Neill
2012-01-01
The Earth System Grid Federation (ESGF) is a multi-agency, international collaboration that aims at developing the software infrastructure needed to facilitate and empower the study of climate change on a global scale. The ESGF's architecture employs a system of geographically distributed peer nodes, which are independently administered yet united by the adoption of common federation protocols and application programming interfaces (APIs). The cornerstones of its interoperability are the peer-to-peer messaging that is continuously exchanged among all nodes in the federation; a shared architecture and API for search and discovery; and a security infrastructure based on industry standards (OpenID, SSL, GSI and SAML). The ESGF software is developed collaboratively across institutional boundaries and made available to the community as open source. It has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the entire model output used for the next international assessment report on climate change (IPCC-AR5) and a suite of satellite observations (obs4MIPs) and reanalysis data sets (ANA4MIPs).
The Earth System Grid Federation : an Open Infrastructure for Access to Distributed Geospatial Data
NASA Technical Reports Server (NTRS)
Cinquini, Luca; Crichton, Daniel; Mattmann, Chris; Harney, John; Shipman, Galen; Wang, Feiyi; Ananthakrishnan, Rachana; Miller, Neill; Denvil, Sebastian; Morgan, Mark;
2012-01-01
The Earth System Grid Federation (ESGF) is a multi-agency, international collaboration that aims at developing the software infrastructure needed to facilitate and empower the study of climate change on a global scale. The ESGF's architecture employs a system of geographically distributed peer nodes, which are independently administered yet united by the adoption of common federation protocols and application programming interfaces (APIs). The cornerstones of its interoperability are the peer-to-peer messaging that is continuously exchanged among all nodes in the federation; a shared architecture and API for search and discovery; and a security infrastructure based on industry standards (OpenID, SSL, GSI and SAML). The ESGF software is developed collaboratively across institutional boundaries and made available to the community as open source. It has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the entire model output used for the next international assessment report on climate change (IPCC-AR5) and a suite of satellite observations (obs4MIPs) and reanalysis data sets (ANA4MIPs).
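The shared search API mentioned above is exercised over plain HTTP. The sketch below only composes a facet-style query URL against a hypothetical index node (the hostname is an assumption for illustration, as are the specific parameter names, which follow the general facet convention described for the federation's search interface).

```python
from urllib.parse import urlencode

# Hostname is an assumption; each federated index node exposes the same search path.
ESGF_SEARCH = "https://esgf-node.example.org/esg-search/search"

def build_search_url(project: str, variable: str, limit: int = 10) -> str:
    """Compose a facet-based dataset search URL for a federated index node."""
    params = {
        "project": project,    # e.g. the model archive behind IPCC-AR5
        "variable": variable,  # CF-style variable short name
        "type": "Dataset",
        "format": "application/solr+json",
        "limit": limit,
    }
    return f"{ESGF_SEARCH}?{urlencode(params)}"

url = build_search_url("CMIP5", "tas")
```

Because every peer node speaks the same API, the identical query can be issued against any index node in the federation.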
Khalique, Omar K; Pulerwitz, Todd C; Halliburton, Sandra S; Kodali, Susheel K; Hahn, Rebecca T; Nazif, Tamim M; Vahl, Torsten P; George, Isaac; Leon, Martin B; D'Souza, Belinda; Einstein, Andrew J
2016-01-01
Transcatheter aortic valve replacement (TAVR) is performed frequently in patients with severe, symptomatic aortic stenosis who are at high risk or inoperable for open surgical aortic valve replacement. Computed tomography angiography (CTA) has become the gold-standard imaging modality for pre-TAVR cardiac anatomic and vascular access assessment. Traditionally, cardiac CTA has most frequently been used for assessment of coronary artery stenosis, and scanning protocols have generally been tailored for this purpose. Pre-TAVR CTA has different goals than coronary CTA, and the high prevalence of chronic kidney disease in the TAVR patient population creates a particular need to optimize protocols to reduce iodinated contrast volume. This document reviews details that allow the physician to tailor CTA examinations to maximize image quality and minimize harm, while factoring in the multiple patient and scanner variables that must be considered in customizing a pre-TAVR protocol. Copyright © 2016 Society of Cardiovascular Computed Tomography. Published by Elsevier Inc. All rights reserved.
Access and accounting schemes of wireless broadband
NASA Astrophysics Data System (ADS)
Zhang, Jian; Huang, Benxiong; Wang, Yan; Yu, Xing
2004-04-01
In this paper, two wireless broadband access and accounting schemes are introduced. They differ in their client and access router modules. In one scheme, the Secure Shell (SSH) protocol is used in the access system: the SSH server performs authentication based on public/private key cryptography. The advantages of this scheme are the security of the user's information and sophisticated access control. In the other scheme, the Secure Sockets Layer (SSL) protocol is used in the access system, employing public-key technology. Web browsers today generally combine HTTP with SSL, and we use the SSL protocol to encrypt the data exchanged between the clients and the access router. The schemes are identical in the RADIUS server part. Remote Authentication Dial In User Service (RADIUS), a security protocol in client/server form, is becoming the standard authentication/accounting protocol for Internet access; it is explained in a flow chart. In our scheme, the access router serves as the client to the RADIUS server.
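The SSL-based scheme can be illustrated with Python's standard `ssl` module. This minimal sketch builds a client-side TLS context whose defaults enforce the certificate checks the scheme relies on; the hostname in the comment is an assumption for illustration.

```python
import ssl

# Build a client-side TLS context with secure defaults: the access router's
# certificate must chain to a trusted CA and match the expected hostname
# before any user credentials or accounting data are exchanged.
context = ssl.create_default_context()

# In the actual scheme, the client would then wrap its TCP socket, e.g.:
#   tls_sock = context.wrap_socket(raw_sock, server_hostname="access-router.example")
# so that all traffic to the access router is encrypted.
```

The same context object can be reused across connections, so the handshake cost is paid per session rather than per configuration.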
Providing Internet Access to High-Resolution Mars Images
NASA Technical Reports Server (NTRS)
Plesea, Lucian
2008-01-01
The OnMars server is a computer program that provides Internet access to high-resolution Mars images, maps, and elevation data, all suitable for use in geographical information system (GIS) software for generating images, maps, and computational models of Mars. The OnMars server is an implementation of the Open Geospatial Consortium (OGC) Web Map Service (WMS) server. Unlike other Mars Internet map servers that provide Martian data using an Earth coordinate system, the OnMars WMS server supports encoding of data in Mars-specific coordinate systems. The OnMars server offers access to most of the available high-resolution Martian image and elevation data, including an 8-meter-per-pixel uncontrolled mosaic of most of the Mars Global Surveyor (MGS) Mars Observer Camera Narrow Angle (MOCNA) image collection, which is not available elsewhere. This server can generate image and map files in the tagged image file format (TIFF), Joint Photographic Experts Group (JPEG), 8- or 16-bit Portable Network Graphics (PNG), or Keyhole Markup Language (KML) format. Image control is provided by use of the OGC Style Layer Descriptor (SLD) protocol. The OnMars server also implements tiled WMS protocol and super-overlay KML for high-performance client application programs.
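A WMS GetMap request is just a parameterized URL, so the server described above can be driven from any HTTP client. The sketch below composes such a request; the server URL, layer name, and Mars SRS code are assumptions for illustration, not the OnMars server's documented values.

```python
from urllib.parse import urlencode

# Hypothetical endpoint standing in for the real OnMars server URL.
ONMARS_WMS = "https://onmars.example.nasa.gov/wms"

def get_map_url(layer: str, bbox: tuple, width: int, height: int,
                fmt: str = "image/jpeg") -> str:
    """Build an OGC WMS 1.1.1 GetMap request URL for a Mars image layer."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,                 # hypothetical layer name
        "SRS": "IAU2000:49900",          # a Mars-specific coordinate system code
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return f"{ONMARS_WMS}?{urlencode(params)}"

url = get_map_url("mgs_mocna_mosaic", (-180, -90, 180, 90), 1024, 512)
```

Swapping the FORMAT parameter for PNG or KML selects the other output encodings the abstract lists.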
NASA Astrophysics Data System (ADS)
Carsughi, Flavio; Fonseca, Luis
2017-06-01
NFFA-EUROPE is a European open-access resource for experimental and theoretical nanoscience that provides a platform for comprehensive, multidisciplinary research projects at the nanoscale, extending from synthesis through nanocharacterization to theory and numerical simulation. Advanced infrastructures specialized in growth, nano-lithography, nano-characterization, theory and simulation, and fine analysis with synchrotron, FEL and neutron radiation sources are integrated in a multi-site combination to develop frontier research on methods for reproducible nanoscience and to enable European and international researchers from diverse disciplines to carry out advanced proposals impacting science and innovation. NFFA-EUROPE enables coordinated access to infrastructures covering different aspects of nanoscience research that is not currently available at any single specialized facility, without duplicating their specific scopes. Approved user projects will have access to the best-suited instruments and support competences for performing the research, including access to analytical large-scale facilities, theory and simulation, and high-performance computing facilities. Access is offered free of charge to European users, who will also receive a financial contribution toward their travel, accommodation and subsistence costs. User access will span several "installations" and will be coordinated through a single-entry-point portal that activates an advanced user-infrastructure dialogue to build a personalized access programme with an increasing return on science and innovation production. NFFA-EUROPE's own research activity will address key bottlenecks of nanoscience research: nanostructure traceability, protocol reproducibility, in-operando nano-manipulation and analysis, and open data.
A survey of system architecture requirements for health care-based wireless sensor networks.
Egbogah, Emeka E; Fapojuwo, Abraham O
2011-01-01
Wireless Sensor Networks (WSNs) have emerged as a viable technology for a vast number of applications, including health care applications. To best support these health care applications, WSN technology can be adopted for the design of practical Health Care WSNs (HCWSNs) that support the key system architecture requirements of reliable communication, node mobility support, multicast technology, energy efficiency, and the timely delivery of data. Work in the literature mostly focuses on the physical design of HCWSNs (e.g., wearable sensors, in vivo embedded sensors, et cetera); work towards enhancing the communication layers (i.e., routing, medium access control, et cetera) to improve HCWSN performance is largely lacking. In this paper, the information gleaned from an extensive literature survey is shared in an effort to fortify the knowledge base for the communication aspect of HCWSNs. We highlight the major existing prototype HCWSNs and detail their routing protocol characteristics. We also explore the current state of the art in medium access control (MAC) protocols for WSNs, seeking an energy-efficient solution that is robust to mobility and delivers data in a timely fashion. Furthermore, we review a number of reliable transport layer protocols, including a network-coding-based protocol from the literature, that are potentially suitable for delivering end-to-end reliability of data transmitted in HCWSNs. We identify the advantages and disadvantages of the reviewed MAC, routing, and transport layer protocols as they pertain to the design and implementation of a HCWSN. The findings from this literature survey will serve as a useful foundation for designing a reliable HCWSN and also contribute to the development and evaluation of protocols for improving the performance of future HCWSNs. Open issues that require further investigation are highlighted.
Colby, Kathleen N; Levy, Julie K; Dunn, Kiri F; Michaud, Rachel I
2011-03-22
The high prevalence of heartworm infection in shelter dogs creates a dilemma for shelter managers, who frequently operate with insufficient funding, staffing, and expertise to comply with heartworm guidelines developed for owned pet dogs. The purpose of this study was to survey canine heartworm management protocols used by 504 animal sheltering agencies in the endemic states of Alabama, Florida, Georgia, and Mississippi. Open-admission shelters, which tended to be larger and more likely to perform animal control functions, were less likely (41%) to test all adult dogs than were limited-admission shelters (80%), which tended to be smaller non-profit humane agencies, and foster programs (98%) based out of private residences. Open-admission shelters were more likely to euthanize infected dogs (27%) or to release them without treatment (39%), whereas limited-admission shelters and foster programs were more likely to provide adulticide therapy (82% and 89%, respectively). Of the 319 agencies that treated infections, 44% primarily used a standard two-dose melarsomine protocol, and 35% primarily used a three-dose split-treatment melarsomine protocol. Long-term low-dose ivermectin, used in 22% of agencies, was the most common alternative treatment. Open-admission shelters were less likely (35%) to provide preventive medications for all dogs than were limited-admission shelters (82%) and foster programs (97%). More agencies used preventives labeled for monthly use in dogs (60%) than ivermectin products labeled for livestock (38%). The most common reason diagnostic testing and preventive medication were not provided was cost. These results indicate a lack of protocol uniformity among agencies and insufficient resources to identify, treat, and prevent infection.
Sheltering agencies and companion animal health industries should develop guidelines that are feasible for use in sheltering agencies and provide improved access to preventive and treatment strategies for management of Dirofilaria immitis. Copyright © 2011 Elsevier B.V. All rights reserved.
Medium Access Control Protocols for Cognitive Radio Ad Hoc Networks: A Survey
Islam, A. K. M. Muzahidul; Baharun, Sabariah; Mansoor, Nafees
2017-01-01
New wireless network paradigms will demand higher spectrum use and availability to cope with emerging data-hungry devices. Traditional static spectrum allocation policies cause spectrum scarcity, and new paradigms such as Cognitive Radio (CR) and new protocols and techniques need to be developed in order to achieve efficient spectrum usage. Medium Access Control (MAC) protocols are responsible for recognizing free spectrum, scheduling available resources, and coordinating the coexistence of heterogeneous systems and users. This paper provides a comprehensive review of state-of-the-art MAC protocols, focusing mainly on Cognitive Radio Ad Hoc Networks (CRAHNs). First, a description of fundamental cognitive radio functions is presented. Next, MAC protocols are divided into three groups based on their channel access mechanism: time-slotted protocols, random access protocols, and hybrid protocols. In each group, a detailed and comprehensive explanation of the latest MAC protocols is presented, along with the pros and cons of each protocol. A discussion of future challenges for CRAHN MAC protocols is included, with a comparison of the protocols from a functional perspective. PMID:28926952
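Random-access protocols of the kind grouped above commonly resolve collisions with binary exponential backoff. The sketch below is a generic illustration of that mechanism, not any specific surveyed protocol: after each collision the contention window doubles, and the node waits a random number of slots drawn from it.

```python
import random

def backoff_slots(collisions: int, max_exponent: int = 10, rng=random) -> int:
    """Pick a random wait (in slots) after `collisions` failed attempts.

    The contention window [0, 2^k - 1] doubles with each collision, up to a
    cap, spreading retransmissions out as the channel gets busier.
    """
    exponent = min(collisions, max_exponent)
    window = 2 ** exponent
    return rng.randrange(window)

rng = random.Random(42)  # seeded for reproducibility
waits = [backoff_slots(k, rng=rng) for k in range(1, 6)]
```

Capping the exponent (here at 10) bounds the worst-case wait while still defusing heavy contention.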
Variation in the Breeding System of Prunella vulgaris L.
Qu, Luping; Widrlechner, Mark P
2011-05-01
Prunella vulgaris (Lamiaceae), commonly known as selfheal, is a perennial herb with a long history of use in traditional medicine. Recent studies have found that P. vulgaris possesses anti-inflammatory, antiviral, and antibacterial properties, and it is likely that this will lead to increased commercial demand for this species. To date, research publications on P. vulgaris cultivation and genetics are scarce. Using accessions originally collected from different geographical regions, we investigated the breeding system of this species by observing variation in floral morphology, time of pollen release, and selfed-seed set in bagged flowers and isolated plants. Two types of floral morphology, one with exerted styles, extending past open corollas when viewed from above, and the other with shorter, inserted styles, were found among 30 accessions. Two accessions originally collected from Asia uniformly displayed exerted styles, and 27 accessions had inserted styles. One accession from Oregon displayed variation in this trait among individual plants. Microscopic observation of seven accessions, including ones with both exerted and inserted styles, revealed that they all release pollen to some degree before the flowers open. Using bagged flowers, we found that selfed-seed set varied widely among eight accessions, ranging from 6% to 94%. However, bagging may underestimate seed set for some accessions. The two accessions with the lowest rates when using bagged flowers increased in seed set by 350% and 158%, respectively, when we evaluated single, unbagged plants in isolation cages. The accession with 6% selfed-seed set when bagged also had exerted styles. These findings suggest that mating systems in P. vulgaris may be in the process of evolutionary change and that understanding breeding-system variation should be useful in developing efficient seed-regeneration protocols and breeding and selection strategies for this species.
Capitalizing on Global Demands for Open Data Access and Interoperability - the USGIN Story
NASA Astrophysics Data System (ADS)
Allison, M. L.; Richard, S. M.
2015-12-01
The U.S. National Geothermal Data System (NGDS - www.geothermaldata.org) provides free open access to ~10 million data records, maps, and reports, sharing relevant geoscience and land use data to propel geothermal development and production in the U.S. Because NGDS is built on the U.S. Geoscience Information Network (USGIN - http://usgin.org) data integration framework, the system is compliant with international standards and protocols, scalable, extensible, and deployable throughout the world for a myriad of applications. NGDS currently serves information from hundreds of U.S. Department of Energy-sponsored projects and geologic data feeds from 60+ data providers in all 50 states, using free and open source software, in a federated system where data owners maintain control of their data. This interactive online system is opening new exploration opportunities and shortening project development by making data easily discoverable, accessible, and interoperable at no cost to users. USGIN Foundation, Inc. was established in 2014 as a not-for-profit company to deploy the USGIN data integration framework for other natural resource (energy, water, and minerals), natural hazard, and geoscience investigation applications, nationally and worldwide. The USGIN vision is that as each data node adds to its data repositories, the system-wide USGIN functions become increasingly valuable to it. Each data provider will have created a value-added service that is transportable and scalable to cover all data in its possession; thus, each participant benefits from continuing to add data to the system and maintain it. The long-term goal is for the data network to reach a 'tipping point' at which it becomes a data equivalent of the World Wide Web, where everyone maintains the function because its clientele expect it and it fills critical needs.
Applying this vision to NGDS also opens the door to additional data providers outside geothermal development, increasing the value of the USGIN data integration platform. USGIN meets all the requirements of the White House Open Data Access Initiative that apply to (almost) all federally funded research and all federally maintained data, opening up large opportunities for further deployment.
A simple, effective media access protocol system for integrated, high data rate networks
NASA Technical Reports Server (NTRS)
Foudriat, E. C.; Maly, K.; Overstreet, C. M.; Khanna, S.; Zhang, L.
1992-01-01
The operation and performance of a dual media access protocol for integrated, gigabit networks are described. Unlike other dual protocols, each protocol supports a different class of traffic. The Carrier Sensed Multiple Access-Ring Network (CSMA/RN) protocol and the Circulating Reservation Packet (CRP) protocol support asynchronous and synchronous traffic, respectively. The two protocols operate with minimal impact upon each other. Performance information demonstrates that they support a complete range of integrated traffic loads, do not require call setup/termination or a special node for synchronous traffic control, and provide effective pre-use and recovery. The CRP also provides guaranteed access and fairness control for the asynchronous system. The paper demonstrates that the CSMA-CRP system fulfills many of the requirements for gigabit LAN-MAN networks most effectively and simply. To accomplish this, CSMA-CRP features are compared against similar ring and bus systems, such as Cambridge Fast Ring, Metaring, Cyclic Reservation Multiple Access, and Distributed Dual Queue Data Bus (DQDB).
Synthesizing Existing CSMA and TDMA Based MAC Protocols for VANETs
Huang, Jiawei; Li, Qi; Zhong, Shaohua; Liu, Lianhai; Zhong, Ping; Wang, Jianxin; Ye, Jin
2017-01-01
Many Carrier Sense Multiple Access (CSMA)- and Time Division Multiple Access (TDMA)-based medium access control (MAC) protocols for vehicular ad hoc networks (VANETs) have been proposed recently. Contrary to the common perception that they are competitors, we argue that the underlying strategies used in these MAC protocols are complementary. Based on this insight, we design CTMAC, a MAC protocol that synthesizes the existing strategies, namely random channel access (used in CSMA-style protocols) and arbitrated channel reservation (used in TDMA-based protocols). CTMAC swiftly changes its strategy according to the vehicle density, and its performance is better than that of state-of-the-art protocols. We evaluate CTMAC using at-scale simulations. Our results show that CTMAC reduces the channel completion time and increases the network goodput by 45% for a wide range of application workloads and network settings. PMID:28208590
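The density-driven switching idea can be caricatured as a simple threshold rule. The function below is a hypothetical sketch: the threshold value and the rule itself are invented for illustration and are not CTMAC's actual algorithm, which the abstract does not specify.

```python
def choose_access_mode(vehicle_density: float, threshold: float = 0.5) -> str:
    """Select a channel-access strategy from normalized local vehicle density.

    Sparse traffic favors random access (CSMA-style) for low latency;
    dense traffic favors reserved slots (TDMA-style) to keep collisions
    from escalating.
    """
    return "TDMA" if vehicle_density >= threshold else "CSMA"

modes = [choose_access_mode(d) for d in (0.1, 0.5, 0.9)]
```

A real protocol would add hysteresis around the threshold so nodes do not oscillate between modes as density fluctuates.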
Protocol for Communication Networking for Formation Flying
NASA Technical Reports Server (NTRS)
Jennings, Esther; Okino, Clayton; Gao, Jay; Clare, Loren
2009-01-01
An application-layer protocol and a network architecture have been proposed for data communications among multiple autonomous spacecraft that are required to fly in a precise formation in order to perform scientific observations. The protocol could also be applied to other autonomous vehicles operating in formation, including robotic aircraft, robotic land vehicles, and robotic underwater vehicles. A group of spacecraft or other vehicles to which the protocol applies could be characterized as a precision-formation- flying (PFF) network, and each vehicle could be characterized as a node in the PFF network. In order to support precise formation flying, it would be necessary to establish a corresponding communication network, through which the vehicles could exchange position and orientation data and formation-control commands. The communication network must enable communication during early phases of a mission, when little positional knowledge is available. Particularly during early mission phases, the distances among vehicles may be so large that communication could be achieved only by relaying across multiple links. The large distances and need for omnidirectional coverage would limit communication links to operation at low bandwidth during these mission phases. Once the vehicles were in formation and distances were shorter, the communication network would be required to provide high-bandwidth, low-jitter service to support tight formation-control loops. The proposed protocol and architecture, intended to satisfy the aforementioned and other requirements, are based on a standard layered-reference-model concept. The proposed application protocol would be used in conjunction with conventional network, data-link, and physical-layer protocols. The proposed protocol includes the ubiquitous Institute of Electrical and Electronics Engineers (IEEE) 802.11 medium access control (MAC) protocol to be used in the datalink layer. 
In addition to its widespread and proven use in diverse local-area networks, this protocol offers both (1) a random-access mode needed for the early PFF deployment phase and (2) a time-bounded-services mode needed during PFF-maintenance operations. Switching between these two modes could be controlled by upper-layer entities using standard link-management mechanisms. Because the early deployment phase of a PFF mission can be expected to involve multihop relaying to achieve network connectivity (see figure), the proposed protocol includes the open shortest path first (OSPF) network protocol that is commonly used in the Internet. Each spacecraft in a PFF network would be in one of seven distinct states as the mission evolved from initial deployment, through coarse formation, and into precise formation. Reconfiguration of the formation to perform different scientific observations would also cause state changes among the network nodes. The application protocol provides for recognition and tracking of the seven states for each node and for protocol changes under specified conditions to adapt the network and satisfy communication requirements associated with the current PFF mission phase. Except during early deployment, when peer-to-peer random access discovery methods would be used, the application protocol provides for operation in a centralized manner.
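The seven-state tracking described above amounts to a per-node state machine. The sketch below invents state labels and transitions for illustration, since the abstract names neither; only the count of seven states and the deployment-to-precise-formation progression come from the source.

```python
# Hypothetical labels for the seven per-node states (the paper's names differ).
ALLOWED = {
    "DEPLOY": {"DISCOVERY"},
    "DISCOVERY": {"RELAY", "COARSE_FORMATION"},
    "RELAY": {"COARSE_FORMATION"},
    "COARSE_FORMATION": {"PRECISE_FORMATION", "SAFE"},
    "PRECISE_FORMATION": {"RECONFIGURE", "SAFE"},
    "RECONFIGURE": {"PRECISE_FORMATION"},
    "SAFE": {"DISCOVERY"},
}

class PFFNode:
    """Tracks one spacecraft's protocol state and rejects illegal transitions."""

    def __init__(self):
        self.state = "DEPLOY"  # every node starts in the deployment phase

    def transition(self, new_state: str) -> bool:
        """Apply a state change if the transition table allows it."""
        if new_state in ALLOWED[self.state]:
            self.state = new_state
            return True
        return False

node = PFFNode()
ok = node.transition("DISCOVERY")            # legal: deployment -> discovery
bad = node.transition("PRECISE_FORMATION")   # illegal jump, rejected
```

Tying MAC-mode switches (random access vs. time-bounded services) to these transitions is how the upper layers would adapt the network per mission phase.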
National Geothermal Data System: Open Access to Geoscience Data, Maps, and Documents
NASA Astrophysics Data System (ADS)
Caudill, C. M.; Richard, S. M.; Musil, L.; Sonnenschein, A.; Good, J.
2014-12-01
The U.S. National Geothermal Data System (NGDS) provides free open access to millions of geoscience data records, publications, maps, and reports via distributed web services to propel geothermal research, development, and production. NGDS is built on the US Geoscience Information Network (USGIN) data integration framework, a joint undertaking of the USGS and the Association of American State Geologists (AASG), and is compliant with international standards and protocols. NGDS currently serves geoscience information from 60+ data providers in all 50 states. Free and open source software is used in this federated system, where data owners maintain control of their data. This interactive online system makes geoscience data easily discoverable, accessible, and interoperable at no cost to users. The dynamic project site http://geothermaldata.org serves as the information source and gateway to the system, allowing data and application discovery and providing the system's data feed. It also provides access to NGDS specifications and the free and open source code base (on GitHub), a map-centric, library-style search interface, other software applications utilizing NGDS services, NGDS tutorials (via YouTube and the USGIN site), and user-created tools and scripts. The user-friendly, map-centric web application supports finding, visualizing, mapping, and acquiring data by topic, location, time, provider, or key words. Geographic datasets visualized through the map interface also allow users to inspect the details of individual GIS data points (e.g. wells, geologic units, etc.). In addition, the interface provides the information necessary for users to access the GIS data from third-party software applications such as Google Earth, uDig, and ArcGIS.
A redistributable, free and open source software package called GINstack (USGIN software stack) was also created to give data providers a simple way to release data using interoperable and shareable standards, upload data and documents, and expose those data as a node in the NGDS or any larger data system through a CSW endpoint. The easy-to-use interface is supported by back-end software including Postgres, GeoServer, and custom CKAN extensions among others.
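A CSW endpoint such as the one GINstack exposes is queried with standard OGC CSW 2.0.2 requests. The sketch below composes a keyword GetRecords request in key-value-pair form; the endpoint hostname is an assumption, and real deployments may require additional parameters beyond this minimal set.

```python
from urllib.parse import urlencode

# Hypothetical catalog endpoint standing in for a deployed GINstack node.
CSW_ENDPOINT = "https://catalog.example.org/csw"

def get_records_url(keyword: str, max_records: int = 20) -> str:
    """Compose an OGC CSW 2.0.2 GetRecords request (KVP encoding) that
    searches catalog metadata records for a keyword."""
    params = {
        "service": "CSW",
        "version": "2.0.2",
        "request": "GetRecords",
        "typeNames": "csw:Record",
        "resultType": "results",
        "constraintLanguage": "CQL_TEXT",
        "constraint": f"AnyText LIKE '%{keyword}%'",
        "maxRecords": max_records,
    }
    return f"{CSW_ENDPOINT}?{urlencode(params)}"

url = get_records_url("geothermal")
```

Because every node speaks the same CSW dialect, a harvester can walk the whole federation by iterating this request over each provider's endpoint.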
McClelland, Shearwood; Chernykh, Marina; Dengina, Natalia; Gillespie, Erin F; Likhacheva, Anna; Usychkin, Sergey; Pankratov, Alexandr; Kharitonova, Ekaterina; Egorova, Yulia; Tsimafeyeu, Ilya; Tjulandin, Sergei; Thomas, Charles R; Mitin, Timur
2018-06-25
Radiation oncologists in Russia face a number of unique professional difficulties, including a lack of standardized training and continuing medical education. To combat this, under the auspices of the Russian Society of Clinical Oncology (RUSSCO), our group has developed a series of ongoing in-person interactive contouring workshops held during the major Russian oncology conferences in Moscow, Russia. Since November 2016, during each workshop we have utilized a web-based open-access interactive three-dimensional contouring atlas as part of our didactics. We sought to determine the impact of this resource on radiation oncology practice in Russia. We distributed an IRB-approved web-based survey to 172 practicing radiation oncologists in Russia, inquiring about practice demographics, RUSSCO contouring workshop attendance, and clinical use of the open-access English-language interactive contouring atlas (eContour). The survey remained open for 2 months, until November 2017. Eighty radiation oncologists completed the survey (46.5% response rate). The mean number of years in practice was 13.7. Sixty respondents (75%) attended at least one RUSSCO contouring workshop. Of those who were aware of eContour, 76% were introduced to it during a RUSSCO contouring workshop, and 81% continue to use it in their daily practice. The greatest obstacles to using the program were the language barrier (51%) and internet access (38%). Nearly 90% reported that their contouring practices had changed since they started using the program, particularly for delineation of clinical target volumes (57%) and/or organs at risk (46%). More than 97% found the clinical pearls/links to cooperative group protocols in the software helpful in their daily practice. The majority used the contouring program several times per month (43%) or several times per week (41%).
Face-to-face contouring instruction in combination with open-access web-based interactive contouring resource had a meaningful impact on perceived quality of radiation oncology contours among Russian practitioners and has the potential to have applications worldwide.
O'Reilly, Christian; Gosselin, Nadia; Carrier, Julie; Nielsen, Tore
2014-12-01
Manual processing of sleep recordings is extremely time-consuming. Efforts to automate this process have shown promising results, but automatic systems are generally evaluated on private databases, preventing accurate cross-validation with other systems. Lacking a common benchmark, the relative performances of different systems are not compared easily and advances are compromised. To address this fundamental methodological impediment to sleep study, we propose an open-access database of polysomnographic biosignals. To build this database, whole-night recordings from 200 participants [97 males (aged 42.9 ± 19.8 years) and 103 females (aged 38.3 ± 18.9 years); age range: 18-76 years] were pooled from eight different research protocols performed in three different hospital-based sleep laboratories. All recordings feature a sampling frequency of 256 Hz and an electroencephalography (EEG) montage of 4-20 channels, plus standard electro-oculography (EOG), electromyography (EMG), electrocardiography (ECG) and respiratory signals. Access to the database can be obtained through the Montreal Archive of Sleep Studies (MASS) website (http://www.ceams-carsm.ca/en/MASS), and requires only affiliation with a research institution and prior approval by the applicant's local ethical review board. Providing the research community with access to this free and open sleep database is expected to facilitate the development and cross-validation of sleep analysis automation systems. It is also expected that such a shared resource will be a catalyst for cross-centre collaborations on difficult topics such as improving inter-rater agreement on sleep stage scoring. © 2014 European Sleep Research Society.
Peer-to-Peer Science Data Environment
NASA Astrophysics Data System (ADS)
Byrnes, J. B.; Holland, M. P.
2004-12-01
The goal of P2PSDE is to provide a convenient and extensible Peer-to-Peer (P2P) network architecture that allows distributed science-data services, seamlessly incorporating collaborative value-added services with search-oriented access to remote science data. P2PSDE features the real-time discovery of data-serving peers (plus peer groups and peer-group services), in addition to the searching for and transferring of science data. These features are implemented using "Project JXTA", the first and only standardized set of open, generalized P2P protocols that allow arbitrary network devices to communicate and collaborate as peers. The JXTA protocols standardize the manner in which peers discover each other, self-organize into peer groups, advertise and discover network services, and securely communicate with and monitor each other, even across network firewalls. The key benefits include: potential for dramatic improvements in science-data dissemination; real-time-discoverable, potentially redundant (reliable) science-data services; openness and extensibility; decentralized use of small, inexpensive, readily available desktop machines; and inherent security, with the ability to create variable levels of security by group.
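The advertise-then-discover pattern JXTA standardizes can be sketched without the JXTA stack itself. A minimal in-memory illustration (the class, peer names, and endpoints are hypothetical, not the JXTA API):

```python
class PeerGroup:
    """Toy rendezvous: peers publish service advertisements;
    other peers discover providers by service type."""

    def __init__(self, name):
        self.name = name
        self.adverts = []  # (peer_id, service, endpoint)

    def advertise(self, peer_id, service, endpoint):
        self.adverts.append((peer_id, service, endpoint))

    def discover(self, service):
        """Return (peer_id, endpoint) pairs offering the requested service."""
        return [(p, e) for p, s, e in self.adverts if s == service]

grp = PeerGroup("science-data")
grp.advertise("peer-A", "coverage", "http://peerA.example/wcs")
grp.advertise("peer-B", "catalog", "http://peerB.example/csw")
grp.advertise("peer-C", "coverage", "http://peerC.example/wcs")
print(grp.discover("coverage"))
```

In a real P2P network the advertisement store would itself be distributed and refreshed in real time rather than held in one object.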
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-12
... Connection testing [using current Nasdaq access protocols] during the normal operating hours of the NTF; No Charge--For Idle Connection testing [using current Nasdaq access protocols]; $333/hour--For Active Connection testing [using current Nasdaq access protocols] at all times other than the normal operating hours...
NASA Astrophysics Data System (ADS)
Mohammad, Atif Farid; Straub, Jeremy
2015-05-01
A multi-craft asteroid survey has significant data synchronization needs. Limited communication speeds drive exacting performance requirements. Relational databases use structured tables; DOMBA (Distributed Objects Management Based Articulation), in contrast, deals with data in terms of collections. With this, no read/write roadblocks to the data exist. A master/slave architecture is created by utilizing the Gossip protocol. This facilitates expanding a mission that makes an important discovery via the launch of another spacecraft. The Open Space Box Framework facilitates the foregoing while also providing a virtual caching layer to ensure that continuously accessed data is available in memory and that, upon closing the data file, recharging is applied to the data.
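The gossip-based dissemination mentioned above can be illustrated with a toy push-gossip simulation, in which an update spreads from a master node until every node has seen it (node count, fanout, and seed are illustrative; this is not the DOMBA implementation):

```python
import random

def gossip_rounds(n_nodes, fanout=2, seed=1):
    """Simulate push gossip: each informed node forwards the update to
    `fanout` randomly chosen peers per round, until all nodes are informed.
    Returns the number of rounds needed."""
    rng = random.Random(seed)
    informed = {0}  # the master node starts with the update
    rounds = 0
    while len(informed) < n_nodes:
        rounds += 1
        for _node in list(informed):  # snapshot: new nodes gossip next round
            for peer in rng.sample(range(n_nodes), fanout):
                informed.add(peer)
    return rounds

print(gossip_rounds(50))
```

Gossip converges in a number of rounds that grows roughly logarithmically with the node count, which is why it suits missions that add spacecraft over time.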
FD/DAMA Scheme For Mobile/Satellite Communications
NASA Technical Reports Server (NTRS)
Yan, Tsun-Yee; Wang, Charles C.; Cheng, Unjeng; Rafferty, William; Dessouky, Khaled I.
1992-01-01
Integrated-Adaptive Mobile Access Protocol (I-AMAP) proposed to allocate communication channels to subscribers in first-generation MSAT-X mobile/satellite communication network. Based on concept of frequency-division/demand-assigned multiple access (FD/DAMA) where partition of available spectrum adapted to subscribers' demands for service. Requests processed, and competing requests resolved according to channel-access protocol, or free-access tree algorithm described in "Connection Protocol for Mobile/Satellite Communications" (NPO-17735). Assigned spectrum utilized efficiently.
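The demand-assigned aspect of FD/DAMA, granting free frequency channels to requesting subscribers and queuing the overflow, can be sketched as follows (names are illustrative; the free-access tree arbitration of NPO-17735 is not reproduced here):

```python
def assign_channels(requests, channels):
    """Demand-assigned allocation: grant free frequency channels in
    request order; requests beyond channel capacity are queued."""
    free = list(channels)
    granted, queued = {}, []
    for subscriber in requests:
        if free:
            granted[subscriber] = free.pop(0)
        else:
            queued.append(subscriber)
    return granted, queued

granted, queued = assign_channels(["s1", "s2", "s3", "s4"],
                                  ["f1", "f2", "f3"])
print(granted)  # {'s1': 'f1', 's2': 'f2', 's3': 'f3'}
print(queued)   # ['s4']
```

In the real scheme the spectrum partition itself adapts to demand, so the channel pool handed to such an allocator would change over time.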
Secure Remote Access Issues in a Control Center Environment
NASA Technical Reports Server (NTRS)
Pitts, Lee; McNair, Ann R. (Technical Monitor)
2002-01-01
The ISS has finally reached an operational state and exists for local and remote users. Onboard payload systems are managed by the Huntsville Operations Support Center (HOSC). Users access HOSC systems by internet protocols in support of daily operations, preflight simulation, and test. In support of this diverse user community, a modern security architecture has been implemented. The architecture has evolved over time from an isolated but open system to a system which supports local and remote access to the ISS over broad geographic regions. This has been accomplished through the use of an evolved security strategy, PKI, and custom design. Through this paper, descriptions of the migration process and the lessons learned are presented. This will include product decision criteria, rationale, and the use of commodity products in the end architecture. This paper will also stress the need for interoperability of various products and the effects of seemingly insignificant details.
Secure Payload Access to the International Space Station
NASA Technical Reports Server (NTRS)
Pitts, R. Lee; Reid, Chris
2002-01-01
The ISS has finally reached an operational state and exists for local and remote users. Onboard payload systems are managed by the Huntsville Operations Support Center (HOSC). Users access HOSC systems by internet protocols in support of daily operations, preflight simulation, and test. In support of this diverse user community, a modern security architecture has been implemented. The architecture has evolved over time from an isolated but open system to a system which supports local and remote access to the ISS over broad geographic regions. This has been accomplished through the use of an evolved security strategy, PKI, and custom design. Through this paper, descriptions of the migration process and the lessons learned are presented. This will include product decision criteria, rationale, and the use of commodity products in the end architecture. This paper will also stress the need for interoperability of various products and the effects of seemingly insignificant details.
Rolling Deck to Repository (R2R): Standards and Semantics for Open Access to Research Data
NASA Astrophysics Data System (ADS)
Arko, Robert; Carbotte, Suzanne; Chandler, Cynthia; Smith, Shawn; Stocks, Karen
2015-04-01
In recent years, a growing number of funding agencies and professional societies have issued policies calling for open access to research data. The Rolling Deck to Repository (R2R) program is working to ensure open access to the environmental sensor data routinely acquired by the U.S. academic research fleet. Currently 25 vessels deliver 7 terabytes of data to R2R each year, acquired from a suite of geophysical, oceanographic, meteorological, and navigational sensors on over 400 cruises worldwide. R2R is working to ensure these data are preserved in trusted repositories, discoverable via standard protocols, and adequately documented for reuse. R2R maintains a master catalog of cruises for the U.S. academic research fleet, currently holding essential documentation for over 3,800 expeditions including vessel and cruise identifiers, start/end dates and ports, project titles and funding awards, science parties, dataset inventories with instrument types and file formats, data quality assessments, and links to related content at other repositories. A Digital Object Identifier (DOI) is published for 1) each cruise, 2) each original field sensor dataset, 3) each post-field data product such as quality-controlled shiptrack navigation produced by the R2R program, and 4) each document such as a cruise report submitted by the science party. Scientists are linked to personal identifiers, such as the Open Researcher and Contributor ID (ORCID), where known. Using standard global identifiers such as DOIs and ORCIDs facilitates linking with journal publications and generation of citation metrics. Since its inception, the R2R program has worked in close collaboration with other data repositories in the development of shared semantics for oceanographic research. 
The R2R cruise catalog uses community-standard terms and definitions hosted by the NERC Vocabulary Server, and publishes ISO metadata records for each cruise that use community-standard profiles developed with the NOAA Data Centers and the EU SeaDataNet project. R2R is a partner in the Ocean Data Interoperability Platform (ODIP), working to strengthen links among regional and national data systems, as well as a lead partner in the EarthCube "GeoLink" project, developing a standard set of ontology design patterns for publishing research data using Semantic Web protocols.
Qualitative study of physicians' varied uses of biomedical research in the USA
Maggio, Lauren A; Moorhead, Laura L; Willinsky, John M
2016-01-01
Objective To investigate the nature of physicians' use of research evidence in experimental conditions of open access to inform training and policy. Design This qualitative study was a component of a larger mixed-methods initiative that provided 336 physicians with relatively complete access to research literature via PubMed and UpToDate, for 1 year via an online portal, with their usage recorded in web logs. Using a semistructured interview protocol, a subset of 38 physician participants were interviewed about their use of research articles in general and were probed about their reasons for accessing specific articles as identified through their web logs. Transcripts were analysed using a general inductive approach. Setting Physician participants were recruited from and registered in the USA. Participants 38 physicians from 16 US states, engaged in 22 medical specialties, each with more than 1 year of post-residency experience, participated. Results 26 participants attested to the value of consulting research literature within the context of the study by making reference to their roles as clinicians, educators, researchers, learners, administrators and advocates. The physicians reported previously encountering what they experienced as a prohibitive paywall barrier to the research literature and other frustrations with the nature of information systems, such as the need for passwords. Conclusions The findings, against the backdrop of growing open access to biomedical research, indicate that a minority of physicians, at least initially, is likely to seek out and use research, and to do so in a variety of common roles. Physicians' use of research in these roles has not traditionally been part of their training or part of the considerations for open access policies. The findings have implications for educational and policy initiatives directed towards increasing the effectiveness of this access to and use of research in improving the quality of healthcare. 
PMID:27872121
Odaka, Mizuho; Minakata, Kenji; Toyokuni, Hideaki; Yamazaki, Kazuhiro; Yonezawa, Atsushi; Sakata, Ryuzo; Matsubara, Kazuo
2015-08-01
This study aimed to develop and assess the effectiveness of a protocol for antibiotic prophylaxis based on preoperative kidney function in patients undergoing open heart surgery. We established a protocol for antibiotic prophylaxis based on preoperative kidney function in patients undergoing open heart surgery. This novel protocol was assessed by comparing patients undergoing open heart surgery before (control group; n = 30) and after its implementation (protocol group; n = 31) at Kyoto University Hospital between July 2012 and January 2013. Surgical site infections (SSIs) were observed in 4 control group patients (13.3 %), whereas no SSIs were observed in the protocol group patients (P < 0.05). The total duration of antibiotic use decreased significantly from 80.7 ± 17.6 h (mean ± SD) in the control group to 55.5 ± 14.9 h in the protocol group (P < 0.05). Similarly, introduction of the protocol significantly decreased the total antibiotic dose used in the perioperative period (P < 0.05). Furthermore, antibiotic regimens were changed under suspicion of infection in 5 of 30 control group patients, whereas none of the protocol group patients required this additional change in the antibiotic regimen (P < 0.05). Our novel antibiotic prophylaxis protocol based on preoperative kidney function effectively prevents SSIs in patients undergoing open heart surgery.
Questioning the efficacy of 'gold' open access to published articles.
Fredericks, Suzanne
2015-07-01
To question the efficacy of 'gold' open access to published articles. Open access is unrestricted access to academic, theoretical and research literature that is scholarly and peer-reviewed. Two models of open access exist: 'gold' and 'green'. Gold open access provides everyone with access to articles during all stages of publication, with processing charges paid by the author(s). Green open access involves placing an already published article into a repository to provide unrestricted access, with processing charges incurred by the publisher. This is a discussion paper. An exploration of the relative benefits and drawbacks of the 'gold' and 'green' open access systems. Green open access is a more economic and efficient means of granting open access to scholarly literature but a large number of researchers select gold open access journals as their first choices for manuscript submissions. This paper questions the efficacy of gold open access models and presents an examination of green open access models to encourage nurse researchers to consider this approach. In the current academic environment, with increased pressures to publish and low funding success rates, it is difficult to understand why gold open access still exists. Green open access enhances the visibility of an academic's work, as increased downloads of articles tend to lead to increased citations. Green open access is the cheaper option, as well as the most beneficial choice, for universities that want to provide unrestricted access to all literature at minimal risk.
On Ramps: Options and Issues in Accessing the Internet.
ERIC Educational Resources Information Center
Bocher, Bob
1995-01-01
Outlines the basic options that schools and libraries have for accessing the Internet, focusing on four models: direct connection; dial access using SLIP/PPP (Serial Line Internet Protocol/Point-to-Point Protocol); dial-up using terminal emulation mode; and dial access through commercial online services. Discusses access option issues such as…
Serving Satellite Remote Sensing Data to User Community through the OGC Interoperability Protocols
NASA Astrophysics Data System (ADS)
di, L.; Yang, W.; Bai, Y.
2005-12-01
Remote sensing is one of the major methods for collecting geospatial data. A huge amount of remote sensing data has been collected by space agencies and private companies around the world. For example, NASA's Earth Observing System (EOS) is generating more than 3 TB of remote sensing data per day. The data collected by EOS are processed, distributed, archived, and managed by the EOS Data and Information System (EOSDIS). Currently, EOSDIS is managing several petabytes of data. All of those data are not only valuable for global change research, but also useful for local and regional applications and decision making. How to make the data easily accessible to and usable by the user community is one of the key issues for realizing the full potential of these valuable datasets. In the past several years, the Open Geospatial Consortium (OGC) has developed several interoperability protocols aiming at making geospatial data easily accessible to and usable by the user community through the Internet. The protocols particularly relevant to the discovery, access, and integration of multi-source satellite remote sensing data are the Catalog Service for Web (CS/W) and Web Coverage Service (WCS) Specifications. The OGC CS/W specifies the interfaces, HTTP protocol bindings, and a framework for defining application profiles required to publish and access digital catalogues of metadata for geographic data, services, and related resource information. The OGC WCS specification defines the interfaces between web-based clients and servers for accessing on-line multi-dimensional, multi-temporal geospatial coverages in an interoperable way. Based on definitions by OGC and ISO 19123, coverage data include all remote sensing images as well as gridded model outputs. 
The Laboratory for Advanced Information Technology and Standards (LAITS), George Mason University, has been working on developing and implementing OGC specifications for better serving NASA Earth science data to the user community for many years. We have developed the NWGISS software package that implements multiple OGC specifications, including OGC WMS, WCS, CS/W, and WFS. As part of the NASA REASON GeoBrain project, the NWGISS WCS and CS/W servers have been extended to provide operational access to NASA EOS data at data pools through OGC protocols and to make both services chainable in web-service chains. The extensions in the WCS server include the implementation of WCS 1.0.0 and WCS 1.0.2, and the development of a WSDL description of the WCS services. In order to find the on-line EOS data resources, the CS/W server is extended at the backend to search metadata in NASA ECHO. This presentation reports these extensions and discusses lessons learned from the implementation. It also discusses the advantages, disadvantages, and future improvements of OGC specifications, particularly the WCS.
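A WCS GetCoverage call of the kind such servers answer is, in its key-value binding, an ordinary HTTP request. A hedged sketch of constructing one in Python (the server address, coverage name, and parameter values are placeholders, not actual GeoBrain endpoints):

```python
from urllib.parse import urlencode

def getcoverage_url(base, coverage, bbox, time, fmt="GeoTIFF"):
    """Build an OGC WCS 1.0.0 GetCoverage request as a key-value URL.
    bbox is (minx, miny, maxx, maxy) in the stated CRS."""
    params = {
        "service": "WCS",
        "version": "1.0.0",
        "request": "GetCoverage",
        "coverage": coverage,
        "crs": "EPSG:4326",
        "bbox": ",".join(str(v) for v in bbox),
        "time": time,
        "width": 512,    # requested output grid size
        "height": 256,
        "format": fmt,
    }
    return base + "?" + urlencode(params)

url = getcoverage_url("http://example.gov/wcs", "MOD11A1_LST",
                      (-125.0, 25.0, -65.0, 50.0), "2005-07-01")
print(url)
```

The same parameter set works against any conformant WCS 1.0.0 server, which is the interoperability point the abstract makes.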
Validity of Assessments of Youth Access to Tobacco: The Familiarity Effect
Landrine, Hope; Klonoff, Elizabeth A.
2003-01-01
Objectives. We examined the standard compliance protocol and its validity as a measure of youth access to tobacco. Methods. In Study 1, youth smokers reported buying cigarettes in stores where they are regular customers. In Study 2, youths attempted to purchase cigarettes by using the Standard Protocol, in which they appeared at stores once for cigarettes, and by using the Familiarity Protocol, in which they were rendered regular customers by purchasing nontobacco items 4 times and then requested cigarettes during their fifth visit. Results. Sales to youths aged 17 years in the Familiarity Protocol were significantly higher than sales to the same age group in the Standard Protocols (62.5% vs. 6%, respectively). Conclusions. The Standard Protocol does not match how youths obtain cigarettes. Access is low for stranger youths within compliance studies, but access is high for familiar youths outside of compliance studies. PMID:14600057
The Water SWITCH-ON Spatial Information Platform (SIP)
NASA Astrophysics Data System (ADS)
Sala Calero, J., Sr.; Boot, G., Sr.; Dihé, P., Sr.; Arheimer, B.
2017-12-01
The amount of hydrological open data is continually growing and providing opportunities to the scientific community. Although the existing data portals (GEOSS Portal, INSPIRE community geoportal and others) enable access to open data, many users still find browsing through them difficult. Moreover, the time spent gathering and preparing data is usually greater than the time spent on the experiment itself. Thus, any improvement in searching, understanding, accessing or using open data is greatly beneficial. The Spatial Information Platform (SIP) has been developed to tackle these issues within the SWITCH-ON European Commission funded FP7 project. The SIP has been designed as a set of tools based on open standards that provide the user with all the necessary functionality described in the Publish-Find-Bind (PFB) pattern. In other words, this means that the SIP helps users locate relevant and suitable data for their experiments and analyses, and access and transform it (filtering, extraction, selection, conversion, aggregation). Moreover, the SIP can be used to provide descriptive information about the data and to publish it so others can find and use it. The SIP is based on existing open data protocols such as OGC/CSW, OGC/WMS and OPeNDAP, and open-source components like PostgreSQL/PostGIS, GeoServer and pyCSW. The SIP is divided into three main user interfaces: the BYOD (Browse Your Open Dataset) web interface, the Expert GUI tool and the Upload Data and Metadata web interface. The BYOD HTML5 client is the main entry point for users who want to browse through open data in the SIP. The BYOD has a map interface based on the Leaflet JavaScript libraries so that users can search more efficiently. The web-based Open Data Registration Tool is a user-friendly upload and metadata description interface (geographical extent, license, DOI generation). 
The Expert GUI is a desktop application that provides full metadata-editing capabilities for the project's metadata moderators. In conclusion, the Spatial Information Platform (SIP) provides its community with a set of tools that make hydrological open data easier to understand and use. Moreover, the SIP is based on well-known OGC standards that allow connection to, and data harvesting from, popular open data portals such as the GEOSS system of systems.
A slotted access control protocol for metropolitan WDM ring networks
NASA Astrophysics Data System (ADS)
Baziana, P. A.; Pountourakis, I. E.
2009-03-01
In this study we focus on the serious scalability problems that many access protocols for WDM ring networks introduce due to the use of a dedicated wavelength per access node for either transmission or reception. We propose an efficient slotted MAC protocol suitable for WDM ring metropolitan area networks. The proposed network architecture employs a separate wavelength for control information exchange prior to the data packet transmission. Each access node is equipped with a pair of tunable transceivers for data communication and a pair of fixed-tuned transceivers for control information exchange. Also, each access node includes a set of fixed delay lines for synchronization, holding the data packets while the control information is processed. An efficient access algorithm is applied to avoid both data-wavelength and receiver collisions. In our protocol, each access node is capable of transmitting and receiving over any of the data wavelengths, addressing the scalability issue. Two different slot reuse schemes are assumed: the source and the destination stripping schemes. For both schemes, performance measures evaluation is provided via an analytic model. The analytical results are validated by a discrete event simulation model that uses Poisson traffic sources. Simulation results show that the proposed protocol manages efficient bandwidth utilization, especially under high load. Also, comparative simulation results prove that our protocol achieves significant performance improvement as compared with other WDMA protocols which restrict transmission over a dedicated data wavelength. Finally, performance measures evaluation is explored for diverse numbers of buffer size, access nodes and data wavelengths.
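The control-phase arbitration described above, avoiding both data-wavelength collisions and receiver collisions, can be sketched for a single slot (a simplified illustration of the idea, not the authors' algorithm):

```python
def schedule_slot(requests, n_wavelengths):
    """One control-slot arbitration. Grant each (src, dst) request a free
    data wavelength unless the pool is exhausted (wavelength collision)
    or the destination is already receiving (receiver collision).
    Ungranted requests defer to a later slot."""
    grants, busy_dst = [], set()
    free = list(range(n_wavelengths))
    for src, dst in requests:
        if dst in busy_dst or not free:
            continue  # deferred: would collide at receiver or on channel
        grants.append((src, dst, free.pop(0)))
        busy_dst.add(dst)
    return grants

# Nodes 1 and 2 both target node 4; only one can be granted.
print(schedule_slot([(1, 4), (2, 4), (3, 5)], n_wavelengths=2))
# → [(1, 4, 0), (3, 5, 1)]
```

Because every node can tune to any granted wavelength, no wavelength is permanently tied to a node, which is the scalability gain the protocol claims.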
NASA Astrophysics Data System (ADS)
Liang, J.; Sédillot, S.; Traverson, B.
1997-09-01
This paper addresses federation of a transactional object standard - Object Management Group (OMG) object transaction service (OTS) - with the X/Open distributed transaction processing (DTP) model and International Organization for Standardization (ISO) open systems interconnection (OSI) transaction processing (TP) communication protocol. The two-phase commit propagation rules within a distributed transaction tree are similar in the X/Open, ISO and OMG models. Building an OTS on an OSI TP protocol machine is possible because the two specifications are somewhat complementary. OTS defines a set of external interfaces without a specific internal protocol machine, while OSI TP specifies an internal protocol machine without any application programming interface. Given these observations, and having already implemented an X/Open two-phase commit transaction toolkit based on an OSI TP protocol machine, we analyse the feasibility of using this implementation as a transaction service provider for OMG interfaces. Based on the favourable result of this feasibility study, we are implementing an OTS-compliant system which, by inheriting the extensibility and openness strengths of OSI TP, is able to provide interoperability between the X/Open DTP and OMG OTS models.
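The two-phase commit propagation common to the X/Open, ISO and OMG models can be sketched in miniature: prepare votes are collected from all participants, and the coordinator logs a global decision before the second phase (participant names and the simplified vote callables are hypothetical):

```python
def two_phase_commit(coordinator_log, participants):
    """Phase 1: ask every participant to prepare (vote). Phase 2: commit
    only if all voted yes, otherwise roll back everywhere.
    `participants` maps a name to a callable returning its vote."""
    votes = {name: vote() for name, vote in participants.items()}
    decision = "commit" if all(votes.values()) else "rollback"
    coordinator_log.append(decision)  # decision durably logged before phase 2
    return {name: decision for name in participants}

log = []
outcome = two_phase_commit(log, {
    "resource_manager_a": lambda: True,
    "resource_manager_b": lambda: False,  # votes no: forces global rollback
})
print(outcome, log)
```

A single dissenting vote rolls back the whole transaction tree, which is exactly the propagation rule all three standards share.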
NASA Technical Reports Server (NTRS)
Ingels, Frank; Owens, John; Daniel, Steven
1989-01-01
The protocol definition and terminal hardware for the modified free access (MFA) protocol, a communications protocol similar to Ethernet, are developed. An MFA protocol simulator and a CSMA/CD math model are also developed. The protocol is tailored to communication systems where the total traffic may be divided into scheduled traffic and Poisson traffic. The scheduled traffic should occur on a periodic basis but may occur after a given event such as a request for data from a large number of stations. The Poisson traffic will include alarms and other random traffic. The purpose of the protocol is to guarantee that scheduled packets will be delivered without collision. This is required in many control and data collection systems. The protocol uses standard Ethernet hardware and software, requiring minimal modifications to an existing system. The modification to the protocol only affects the Ethernet transmission privileges and does not affect the Ethernet receiver.
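The protocol's core guarantee, that scheduled packets get reserved, collision-free slots while Poisson traffic contends for the remainder, can be sketched as a frame-building routine (a simplified illustration of the idea, not the MFA implementation):

```python
def build_frame(n_slots, period, random_packets):
    """Reserve every `period`-th slot for scheduled traffic so it can
    never collide; random (Poisson) packets take the spare slots in
    arrival order, and any excess waits for the next frame."""
    frame = ["idle"] * n_slots
    for slot in range(0, n_slots, period):
        frame[slot] = "scheduled"  # reserved, collision-free
    spare = (s for s in range(n_slots) if frame[s] == "idle")
    for pkt in random_packets:
        try:
            frame[next(spare)] = pkt
        except StopIteration:
            break  # excess random traffic deferred to a later frame
    return frame

print(build_frame(8, 4, ["alarm1", "alarm2"]))
# → ['scheduled', 'alarm1', 'alarm2', 'idle', 'scheduled', 'idle', 'idle', 'idle']
```

The scheduled slots never see contention, which is the delivery guarantee control and data-collection systems require.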
NASA Astrophysics Data System (ADS)
Cole, M.; Alameh, N.; Bambacus, M.
2006-05-01
The Applied Sciences Program at NASA focuses on extending the results of NASA's Earth-Sun system science research beyond the science and research communities to contribute to national priority applications with societal benefits. By employing a systems engineering approach, supporting interoperable data discovery and access, and developing partnerships with federal agencies and national organizations, the Applied Sciences Program facilitates the transition from research to operations in national applications. In particular, the Applied Sciences Program identifies twelve national applications, listed at http://science.hq.nasa.gov/earth-sun/applications/, which can be best served by the results of NASA aerospace research and development of science and technologies. The ability to use and integrate NASA data and science results into these national applications results in enhanced decision support and significant socio-economic benefits for each of the applications. This paper focuses on leveraging the power of interoperability and specifically open standard interfaces in providing efficient discovery, retrieval, and integration of NASA's science research results. Interoperability (the ability to access multiple, heterogeneous geoprocessing environments, either local or remote by means of open and standard software interfaces) can significantly increase the value of NASA-related data by increasing the opportunities to discover, access and integrate that data in the twelve identified national applications (particularly in non-traditional settings). Furthermore, access to data, observations, and analytical models from diverse sources can facilitate interdisciplinary and exploratory research and analysis. To streamline this process, the NASA GeoSciences Interoperability Office (GIO) is developing the NASA Earth-Sun System Gateway (ESG) to enable access to remote geospatial data, imagery, models, and visualizations through open, standard web protocols. 
The gateway (online at http://esg.gsfc.nasa.gov) acts as a flexible and searchable registry of NASA-related resources (files, services, models, etc) and allows scientists, decision makers and others to discover and retrieve a wide variety of observations and predictions of natural and human phenomena related to Earth Science from NASA and other sources. To support the goals of the Applied Sciences national applications, GIO staff is also working with the national applications communities to identify opportunities where open standards-based discovery and access to NASA data can enhance the decision support process of the national applications. This paper describes the work performed to-date on that front, and summarizes key findings in terms of identified data sources and benefiting national applications. The paper also highlights the challenges encountered in making NASA-related data accessible in a cross-cutting fashion and identifies areas where interoperable approaches can be leveraged.
Scalable Lunar Surface Networks and Adaptive Orbit Access
NASA Technical Reports Server (NTRS)
Wang, Xudong
2015-01-01
Teranovi Technologies, Inc., has developed innovative network architecture, protocols, and algorithms for both lunar surface and orbit access networks. A key component of the overall architecture is a medium access control (MAC) protocol that includes a novel mechanism of overlaying time division multiple access (TDMA) and carrier sense multiple access with collision avoidance (CSMA/CA), ensuring scalable throughput and quality of service. The new MAC protocol is compatible with legacy Institute of Electrical and Electronics Engineers (IEEE) 802.11 networks. Advanced features include efficient power management, adaptive channel-width adjustment, and error-control capability. A hybrid routing protocol combines the advantages of ad hoc on-demand distance vector (AODV) routing and disruption/delay-tolerant network (DTN) routing. Performance is significantly better than AODV or DTN alone and will be particularly effective for wireless networks with intermittent links, such as lunar and planetary surface networks and orbit access networks.
Experiences with http/WebDAV protocols for data access in high throughput computing
NASA Astrophysics Data System (ADS)
Bernabeu, Gerard; Martinez, Francisco; Acción, Esther; Bria, Arnau; Caubet, Marc; Delfino, Manuel; Espinal, Xavier
2011-12-01
In the past, access to remote storage was considered to be at least one order of magnitude slower than local disk access. Improvements in network technology now make remote disk a viable alternative: such access can today reach throughput levels similar to, or exceeding, those of local disks. Common choices of access protocol in the WLCG collaboration are RFIO, [GSI]DCAP, GRIDFTP, XROOTD and NFS. The HTTP protocol is a promising alternative, as it is simple and lightweight. It also enables the use of standard technologies such as HTTP caching or load balancing, which can be used to improve service resilience and scalability or to boost performance for some use cases seen in HEP, such as "hot files". WebDAV extensions allow writing data, giving HTTP enough functionality to work as a remote access protocol. This paper will show our experiences with the WebDAV door for dCache, in terms of functionality and performance, applied to some of the HEP workflows in the LHC Tier1 at PIC.
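Part of HTTP's appeal as a data-access protocol is that stock mechanisms cover the remote-I/O primitives: a byte-range GET gives a partial read, and a WebDAV PUT gives a write. A sketch of the raw request lines involved (the host and path are placeholders, not actual PIC endpoints):

```python
def range_get(path, host, start, end):
    """Partial remote read: a standard HTTP byte-range GET, which any
    cache or load balancer in the path can also serve."""
    return (f"GET {path} HTTP/1.1\r\n"
            f"Host: {host}\r\n"
            f"Range: bytes={start}-{end}\r\n"
            "\r\n")

def dav_put(path, host, length):
    """Remote write: an HTTP PUT into the WebDAV-exposed namespace."""
    return (f"PUT {path} HTTP/1.1\r\n"
            f"Host: {host}\r\n"
            f"Content-Length: {length}\r\n"
            "\r\n")

print(range_get("/pnfs/data/events.root", "webdav.example.org", 0, 1023))
```

Because these are plain HTTP messages, off-the-shelf proxies can cache the "hot files" the abstract mentions without any protocol-specific support.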
A Survey of MAC Protocols for Cognitive Radio Body Area Networks.
Bhandari, Sabin; Moh, Sangman
2015-04-20
The advancement in electronics, wireless communications and integrated circuits has enabled the development of small low-power sensors and actuators that can be placed on, in or around the human body. A wireless body area network (WBAN) can be effectively used to deliver the sensory data to a central server, where it can be monitored, stored and analyzed. For more than a decade, cognitive radio (CR) technology has been widely adopted in wireless networks, as it utilizes the available spectra of licensed, as well as unlicensed bands. A cognitive radio body area network (CRBAN) is a CR-enabled WBAN. Unlike other wireless networks, CRBANs have specific requirements, such as being able to automatically sense their environments and to utilize unused, licensed spectra without interfering with licensed users, but existing protocols cannot fulfill them. In particular, the medium access control (MAC) layer plays a key role in cognitive radio functions, such as channel sensing, resource allocation, spectrum mobility and spectrum sharing. To address various application-specific requirements in CRBANs, several MAC protocols have been proposed in the literature. In this paper, we survey MAC protocols for CRBANs. We then compare the different MAC protocols with one another and discuss challenging open issues in the relevant research.
A VM-shared desktop virtualization system based on OpenStack
NASA Astrophysics Data System (ADS)
Liu, Xi; Zhu, Mingfa; Xiao, Limin; Jiang, Yuanjie
2018-04-01
With the increasing popularity of cloud computing, desktop virtualization has risen in recent years as a branch of virtualization technology. However, existing desktop virtualization systems are mostly designed in a one-to-one mode, in which a VM can be accessed by only one user. Meanwhile, previous desktop virtualization systems perform poorly in terms of response time and cost saving. This paper proposes a novel VM-shared desktop virtualization system based on the OpenStack platform. We modified the connection process and the display-data transmission process of the remote display protocol SPICE to support the VM-shared function. We also propose a server-push display mode to improve the interactive user experience. The experimental results show that our system performs well in response time and achieves low CPU consumption.
Extending OPeNDAP's Data-Access Protocol to Include Enhanced Pre-Retrieval Operations
NASA Astrophysics Data System (ADS)
Fulker, D. W.
2013-12-01
We describe plans to extend OPeNDAP's Web-services protocol as a Building Block for NSF's EarthCube initiative. Though some data-access services have offered forms of subset-selection for decades, other pre-retrieval operations have been unavailable, in part because their benefits (over equivalent post-retrieval actions) are only now becoming fully evident. This is due in part to rapid growth in the volumes of data that are pertinent to the geosciences, exacerbated by limitations such as Internet speeds and latencies as well as pressures toward data usage on ever-smaller devices. In this context, as recipients of a "Building Blocks" award from the most recent round of EarthCube funding, we are launching the specification and prototype implementation of a new Open Data Services Invocation Protocol (ODSIP), by which clients may invoke a newly rich set of data-acquisition services, ranging from statistical summarization and criteria-driven subsetting to re-gridding/resampling. ODSIP will be an extension to DAP4, the latest version of OPeNDAP's widely used data access protocol, which underpins a number of open-source, multilingual, client-server systems (offering data access as a Web service), including THREDDS, PyDAP, GrADS, ERDDAP and Ferret, as well as OPeNDAP's own Hyrax servers. We are motivated by the idea that key parts of EarthCube can be built effectively around clients and servers that employ a common and conceptually rich protocol for data acquisition. This concept extends 'data provision' to include pre-retrieval operations that, even when invoked by remote clients, exhibit efficiencies of data-proximate computation. Our aim for ODSIP is to embed a largely domain-neutral algebra of server functions that, despite being deliberately compact, can fulfill a broad range of user needs for pre-retrieval operations. 
To that end, our approach builds upon languages and tools that have proven effective in multi-domain contexts, and we will employ a user-centered design process built around three science scenarios: 1) accelerated visualization/analysis of model outputs on non-rectangular meshes (over coastal North Carolina); 2) dynamic downscaling of climate predictions for regional utility (over Hawaii); and 3) feature-oriented retrievals of satellite imagery (focusing on satellite-derived sea-surface-temperature fronts). These scenarios will test important aspects of the server-function algebra:
- The Hawaii climate study requires coping with issues of scale on rectangular grids, placing strong emphasis on statistical functions.
- The east-coast storm-surge study requires irregular grids, thus exploring mathematical challenges that have been addressed in many domains via the GridFields library, which we will employ. We think important classes of geoscience problems in multiple domains (those dealing with discontinuities, for example) are essentially intractable without polygonal meshes.
- The sea-surface fronts study integrates vector-style features with array-style coverages, thus touching on the kinds of mathematics that arise when mixing Eulerian and Lagrangian frameworks.
Our presentation will sketch the context for ODSIP, our process for a user-centered design, and our hopes for how ODSIP, as an emerging cyberinfrastructure concept for the Geosciences, may serve as a fundamental building block for EarthCube.
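The pre-retrieval subsetting that motivates ODSIP can be illustrated with the existing DAP-style constraint-expression syntax, in which a hyperslab var[start:stride:stop] is appended to the dataset URL so the server trims the array before any bytes travel. The dataset URL and variable name below are hypothetical.

```python
# Sketch: building an OPeNDAP-style subset request. The server evaluates the
# constraint expression and returns only the requested hyperslab.
def subset_url(base, var, *dims):
    """dims are (start, stride, stop) triples, one per array dimension."""
    ce = var + "".join("[%d:%d:%d]" % d for d in dims)
    return base + ".ascii?" + ce

url = subset_url("http://example.org/opendap/sst.nc", "sst", (0, 1, 10), (20, 1, 30))
```

ODSIP's server functions generalize this idea from simple index-range subsetting to operations such as statistical summarization and re-gridding.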
Does chlorhexidine prevent dry socket?
Richards, Derek
2012-01-01
The BBO (Bibliografia Brasileira de Odontologia), Biomed Central, Cochrane Library, Directory of Open Access Journals, LILACS, Open-J-Gate, OpenSIGLE, PubMed, Sabinet and Science-Direct databases were searched. Articles were selected for review from the search results on the basis of their compliance with the broad inclusion criteria: relevant to the review question; and prospective two-arm (or more) clinical study. The primary outcome measure was the incidence of AO reported at the patient level. Two reviewers (VY and SM) independently extracted data and assessed the quality of the accepted articles. Individual dichotomous datasets for the control and test group were extracted from each article. Where possible, missing data were calculated from information given in the text or tables. In addition, authors were contacted in order to obtain missing information. Datasets were assessed for their clinical and methodological heterogeneity following Cochrane guidelines. Meta-analysis was conducted with homogeneous datasets. Publication bias was assessed by use of a funnel plot and Egger's regression. Ten randomised trials were included; almost all involved the removal of third molars. Only two of six identified application protocols (single application of chlorhexidine 0.2% gel or multiple application of 0.12% rinse versus placebo) were found to significantly decrease the incidence of AO. Within the limitations of this review, only two of six identified application protocols were found to significantly decrease the incidence of AO. The evidence for both protocols is weak and may be challenged on the grounds of high risk of selection, detection/performance and attrition bias. This systematic review could not identify sufficient evidence supporting the use of chlorhexidine for the prevention of AO. Chlorhexidine seems not to cause any significantly higher adverse reactions than placebo. 
Future high-quality randomised controlled trials are needed to provide conclusive evidence on this topic.
Palomaki, Glenn E; Lee, Jo Ellen S; Canick, Jacob A; McDowell, Geraldine A; Donnenfeld, Alan E
2009-09-01
This statement is intended to augment the current general ACMG Standards and Guidelines for Clinical Genetics Laboratories and to address guidelines specific to first-trimester screening for Down syndrome. The aim is to provide the laboratory the necessary information to ensure accurate and reliable Down syndrome screening results given a screening protocol (e.g., combined first trimester and integrated testing). Information about various test combinations and their expected performance are provided, but other issues such as availability of reagents, patient interest in early test results, access to open neural tube defect screening, and availability of chorionic villus sampling are all contextual factors in deciding which screening protocol(s) will be selected by individual health care providers. Individual laboratories are responsible for meeting the quality assurance standards described by the Clinical Laboratory Improvement Act, the College of American Pathologists, and other regulatory agencies, with respect to appropriate sample documentation, assay validation, general proficiency, and quality control measures. These guidelines address first-trimester screening that includes ultrasound measurement and interpretation of nuchal translucency thickness and protocols that combine markers from both the first and second trimesters. Laboratories can use their professional judgment to make modifications or additions.
NASA Astrophysics Data System (ADS)
Kadlec, J.; Ames, D. P.
2014-12-01
The aim of the presented work is to create a freely accessible, dynamic and re-usable snow cover map of the world by combining snow extent and snow depth datasets from multiple sources. The examined data sources are: remote sensing datasets (MODIS, CryoLand), weather forecasting model outputs (OpenWeatherMap, forecast.io), ground observation networks (CUAHSI HIS, GSOD, GHCN, and selected national networks), and user-contributed snow reports on social networks (cross-country and backcountry skiing trip reports). For each type of dataset, an interface and an adapter are created. Each adapter supports queries by area, by time range, or by a combination of area and time range. The combined dataset is published as an online snow cover mapping service. This web service lowers the learning curve required to view, access, and analyze snow depth maps and snow time-series. All data published by this service are licensed as open data, encouraging re-use of the data in customized applications in climatology, hydrology, sports and other disciplines. The initial version of the interactive snow map is on the website snow.hydrodata.org. The website supports a view by time and a view by site. In the view by time, the spatial distribution of snow for a selected area and time period is shown. In the view by site, time-series charts of snow depth at a selected location are displayed. All snow extent and snow depth map layers and time series are accessible and discoverable through internationally approved protocols including WMS, WFS, WCS, WaterOneFlow and WaterML, and can therefore easily be added to GIS software or third-party web map applications. 
The central hypothesis driving this research is that the integration of user-contributed and/or social-network-derived snow data with other open access data sources will result in more accurate, higher-resolution, and hence more useful snow cover maps than satellite data or government-agency-produced data alone.
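The WMS access mentioned above can be sketched as an OGC GetMap request. The endpoint and layer name below are illustrative assumptions; the parameter names are those defined by the WMS 1.3.0 standard.

```python
# Sketch: assembling a WMS 1.3.0 GetMap request for a snow-depth layer.
# Endpoint and layer name are hypothetical placeholders.
from urllib.parse import urlencode

def wms_getmap(endpoint, layer, bbox, width, height, time):
    params = {
        "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
        "LAYERS": layer, "CRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),  # min lat, min lon, max lat, max lon
        "WIDTH": width, "HEIGHT": height,
        "FORMAT": "image/png", "TIME": time,     # snapshot date for the map
    }
    return endpoint + "?" + urlencode(params)

url = wms_getmap("http://snow.hydrodata.org/wms", "snow_depth",
                 (40.0, -120.0, 45.0, -110.0), 512, 256, "2014-01-15")
```

A GIS client or web map library would issue exactly this kind of request when the layer is added to a map.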
Open Source Hbim for Cultural Heritage: a Project Proposal
NASA Astrophysics Data System (ADS)
Diara, F.; Rinaudo, F.
2018-05-01
Current technologies are changing Cultural Heritage research, analysis, conservation and development, allowing innovative new approaches. The possibility of integrating Cultural Heritage data, like archaeological information, inside a three-dimensional environment system (such as Building Information Modelling) brings huge benefits for its management, monitoring and valorisation. Nowadays there are many commercial BIM solutions. However, these tools are conceived and developed mostly for architecture design or technical installations. A better solution could be a dynamic and open platform that treats Cultural Heritage needs as a priority. Complete data usability and accessibility could be guaranteed by open source protocols. This choice would allow adapting the software to Cultural Heritage needs, and not the opposite, thus avoiding methodological stretches. This work focuses on analysis and experimentation with the specific characteristics of such open source software (DBMS, CAD, servers) applied to a Cultural Heritage example, in order to verify their flexibility and reliability and then create a dynamic open source HBIM prototype. Indeed, it might be a starting point for the future creation of a complete open source HBIM solution that could be adapted to other Cultural Heritage research and analysis.
50 CFR 660.332 - Open access daily trip limit (DTL) fishery for sablefish.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 50 Wildlife and Fisheries 9 2010-10-01 2010-10-01 false Open access daily trip limit (DTL) fishery... COAST STATES West Coast Groundfish-Open Access Fisheries § 660.332 Open access daily trip limit (DTL) fishery for sablefish. (a) Open access DTL fisheries both north and south of 36° N. lat. Open access...
Azagury, Dan; Liu, Rockson C; Morgan, Ashley; Spain, David A
2015-10-01
The initial goal of evaluating a patient with SBO is to immediately identify strangulation and the need for urgent operative intervention, concurrent with rapid resuscitation. This relies on a combination of traditional clinical signs and CT findings. In patients without signs of strangulation, a protocol for administration of Gastrografin immediately in the emergency department efficiently sorts patients into those who will resolve their obstructions and those who will fail nonoperative management. Furthermore, because of the unique ability of Gastrografin to draw water into the bowel lumen, it expedites resolution of partial obstructions, shortening the time to removal of the nasogastric tube, liberalization of diet, and discharge from the hospital. Implementation of such a protocol is a complex, multidisciplinary, and time-consuming endeavor. As such, we cannot overemphasize the importance of clear, open communication with everyone involved. If surgical management is warranted, we encourage an initial laparoscopic approach with open access. Even if this results in immediate conversion to laparotomy after assessment of the intra-abdominal status, we encourage this approach with a goal of a 30% conversion rate or higher. This will attest that patients have been given the highest likelihood of a successful laparoscopic LOA.
Maintaining the momentum of Open Search in Earth Science Data discovery
NASA Astrophysics Data System (ADS)
Newman, D. J.; Lynnes, C.
2013-12-01
Federated search for Earth observation data has been a hallmark of EOSDIS (Earth Observing System Data and Information System) for two decades. Originally, the EOSDIS Version 0 system provided both data-collection-level and granule/file-level search in the mid 1990s with EOSDIS-specific socket protocols and message formats. Since that time, the advent of several standards has helped to simplify EOSDIS federated search, beginning with HTTP as the transfer protocol. Most recently, OpenSearch (www.opensearch.org) was employed for the EOS Clearinghouse (ECHO), based on a set of conventions that had been developed within the Earth Science Information Partners (ESIP) Federation. The ECHO OpenSearch API has evolved to encompass the ESIP RFC and the Open Geospatial Consortium (OGC) OpenSearch standard. Uptake of the ECHO OpenSearch API has been significant and has made ECHO accessible to client developers who found the previous ECHO SOAP API and the current REST API too complex. Client adoption of the OpenSearch API appears to be largely driven by the simplicity of the OpenSearch convention. This simplicity is thus important to retain as the standard and convention evolve. For example, ECHO metrics indicate that the vast majority of ECHO users favor the following search criteria when using the REST API:
- Spatial: bounding box, polygon, line and point
- Temporal: start and end time
- Keywords: free text
Fewer than 10% of searches use additional constraints, particularly those requiring a controlled vocabulary, such as instrument, sensor, etc. This suggests that ongoing standardization efforts around OpenSearch usage for Earth observation data may be more productive if oriented toward improving support for the spatial, temporal and keyword search aspects. 
Areas still requiring improvement include support for:
- Concrete requirements for keyword constraints
- Phrasal search for keyword constraints
- Temporal constraint relations
- Terminological symmetry between search URLs and response documents for both temporal and spatial terms
- Best practices for both servers and clients
Over the past year we have seen several ongoing efforts to further standardize OpenSearch in the earth science domain, such as:
- Federation of Earth Science Information Partners (ESIP)
- Open Geospatial Consortium (OGC)
- Committee on Earth Observation Satellites (CEOS)
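A minimal sketch of the convention's simplicity: an OpenSearch client fills a URL template with the spatial, temporal and keyword criteria that dominate usage. The template and parameter names here are illustrative, not the actual ECHO API.

```python
# Sketch: filling a hypothetical OpenSearch URL template with the three
# criteria highlighted by the metrics (bounding box, time range, free text).
template = ("http://example.org/opensearch/granules.atom"
            "?keyword={searchTerms}&boundingBox={geo:box}"
            "&startTime={time:start}&endTime={time:end}")

def fill(template, values):
    """Substitute each {name} placeholder with its value."""
    url = template
    for key, val in values.items():
        url = url.replace("{%s}" % key, val)
    return url

url = fill(template, {"searchTerms": "MODIS",
                      "geo:box": "-120,40,-110,45",
                      "time:start": "2013-01-01", "time:end": "2013-12-31"})
```

The template itself is advertised in an OpenSearch description document, so a client needs no prior knowledge of the service beyond that document.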
Distributed reservation control protocols for random access broadcasting channels
NASA Technical Reports Server (NTRS)
Greene, E. P.; Ephremides, A.
1981-01-01
Attention is given to a communication network consisting of an arbitrary number of nodes which can communicate with each other via a time-division multiple access (TDMA) broadcast channel. The reported investigation is concerned with the development of efficient distributed multiple access protocols for traffic consisting primarily of single-packet messages in a datagram mode of operation. The motivation for the design of the protocols came from the consideration of efficient multiple-access utilization of moderate to high bandwidth (4-40 Mbit/s capacity) communication satellite channels used for the transmission of short (1000-10,000 bits) fixed-length packets. Under these circumstances, the ratio of roundtrip propagation time to packet transmission time is between 100 and 10,000. It is shown how a TDMA channel can be adaptively shared by datagram traffic and constant-bandwidth users such as in digital voice applications. The distributed reservation control protocols described are a hybrid between contention and reservation protocols.
NASA Astrophysics Data System (ADS)
Palanisamy, G.; Krassovski, M.; Devarakonda, R.; Santhana Vannan, S.
2012-12-01
The current climate debate is highlighting the importance of free, open, and authoritative sources of high quality climate data that are available for peer review and for collaborative purposes. It is increasingly important to allow various organizations around the world to share climate data in an open manner, and to enable them to perform dynamic processing of climate data. This advanced access to data can be enabled via Web-based services, using common "community agreed" standards, without organizations having to change the internal structure used to describe their data. The modern scientific community has become diverse and increasingly complex in nature. To meet the demands of such a diverse user community, the modern data supplier has to provide data and other related information through searchable, data- and process-oriented tools. This can be accomplished by setting up an online, Web-based system with a relational database as a back end. The following common features of web data access/search systems will be outlined in the proposed presentation:
- Flexible data discovery
- Data in commonly used formats (e.g., CSV, NetCDF)
- Metadata prepared in standard formats (FGDC, ISO 19115, EML, DIF, etc.)
- Data subsetting capabilities and the ability to narrow down to individual data elements
- Standards-based data access protocols and mechanisms (SOAP, REST, OPeNDAP, OGC, etc.)
- Integration of services across different data systems (discovery to access, visualization and subsetting)
This presentation will also include specific examples of the integration of various data systems developed by Oak Ridge National Laboratory's Climate Change Science Institute, and their ability to communicate with each other to enable better data interoperability and data integration. References: [1] Devarakonda, Ranjeet, and Harold Shanafield. "Drupal: Collaborative framework for science research." Collaboration Technologies and Systems (CTS), 2011 International Conference on. IEEE, 2011. 
[2] Devarakonda, R., Shrestha, B., Palanisamy, G., Hook, L. A., Killeffer, T. S., Boden, T. A., ... & Lazer, K. (2014). The new online metadata editor for generating structured metadata. Oak Ridge National Laboratory (ORNL).
NASA Astrophysics Data System (ADS)
Jian, Wei; Estevez, Claudio; Chowdhury, Arshad; Jia, Zhensheng; Wang, Jianxin; Yu, Jianguo; Chang, Gee-Kung
2010-12-01
This paper presents an energy-efficient Medium Access Control (MAC) protocol for very-high-throughput millimeter-wave (mm-wave) wireless sensor communication networks (VHT-MSCNs) based on hybrid multiple access techniques of frequency-division multiple access (FDMA) and time-division multiple access (TDMA). An energy-efficient superframe for wireless sensor communication networks employing directional mm-wave wireless access technologies is proposed for systems that require very high throughput, such as high-definition video signals, for sensing, processing, transmitting, and actuating functions. Energy consumption modeling for each network element and comparisons among various multiple-access technologies in terms of power and MAC-layer operations are investigated to evaluate the energy-efficiency improvement of the proposed MAC protocol.
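A toy model (not the paper's own) illustrates why slot-scheduled access saves sensor energy compared with keeping the radio listening for a whole frame; all power and timing numbers below are illustrative assumptions.

```python
# Toy duty-cycle model: a TDMA node powers its radio only during its own slot,
# while contention-based access keeps the radio listening across the frame.
# All numbers are illustrative assumptions, not measurements from the paper.
def frame_energy(p_tx_mw, p_listen_mw, slot_ms, n_slots, scheduled):
    """Energy (in mW*ms) one node spends per frame to send one packet."""
    if scheduled:  # TDMA: transmit in its slot, radio off otherwise
        return p_tx_mw * slot_ms
    # contention: one transmission plus idle listening in the other slots
    return p_tx_mw * slot_ms + p_listen_mw * slot_ms * (n_slots - 1)

e_tdma = frame_energy(60.0, 40.0, 2.0, 10, scheduled=True)
e_csma = frame_energy(60.0, 40.0, 2.0, 10, scheduled=False)
```

Even in this crude model, scheduled access cuts per-frame energy severalfold, which is the intuition behind hybrid FDMA/TDMA superframe designs.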
Performance analysis of a proposed tightly-coupled medical instrument network based on CAN protocol.
Mujumdar, Shantanu; Thongpithoonrat, Pongnarin; Gurkan, D; McKneely, Paul K; Chapman, Frank M; Merchant, Fatima
2010-01-01
Advances in medical devices and health care have been phenomenal in recent years. Although medical device manufacturers have been improving their instruments, network connection of these instruments still relies on proprietary technologies. Even if an interface is provided by the manufacturer (e.g., RS-232, USB, or Ethernet coupled with a proprietary API), there is no widely accepted uniform data model for accessing data from various bedside instruments. There is a need for a common standard that allows internetworking with medical devices from different manufacturers. ISO/IEEE 11073 (X73) is a standard attempting to unify the interfaces of all medical devices. X73 defines a client access mechanism to be implemented in the communication controllers (residing between an instrument and the network) in order to access and network patient data. On the other hand, the MediCAN™ technology suite has been demonstrated with various medical instruments to achieve interfacing and networking with a similar goal in its open standardization approach; however, it provides a more generic definition of medical data to achieve flexibility in networking and client access mechanisms. The instruments are in turn becoming more sophisticated, yet the operation of an instrument is still expected to be done locally by authorized medical personnel. Unfortunately, each medical instrument has its own proprietary API (application programming interface), if any, to provide automated and electronic access to monitoring data. Integration of these APIs requires an agreement with the manufacturers towards the realization of interoperable health care networking. As long as the interoperability of instruments with a network is not possible, ubiquitous access to patient status is limited to manual-entry-based systems. This paper demonstrates an attempt to realize an interoperable medical instrument interface for networking using the MediCAN technology suite as an open standard.
Strasser, Torsten; Peters, Tobias; Jägle, Herbert; Zrenner, Eberhart
2018-02-01
The ISCEV standards and recommendations for electrophysiological recordings in ophthalmology define a set of protocols with stimulus parameters, acquisition settings, and recording conditions, to unify the data and enable comparability of results across centers. Up to now, however, there are no standards to define the storage and exchange of such electrophysiological recordings. The aim of this study was to develop an open standard data format for the exchange and storage of visual electrophysiological data (ElVisML). We first surveyed existing data formats for biomedical signals and examined their suitability for electrophysiological data in ophthalmology. We then compared the suitability of text-based and binary formats, as well as encoding in Extensible Markup Language (XML) and character/comma-separated values. The results of the methodological consideration led to the development of ElVisML with an XML-encoded text-based format. This allows referential integrity, extensibility, the storing of accompanying units, as well as ensuring confidentiality and integrity of the data. A visualization of ElVisML documents (ElVisWeb) has additionally been developed, which facilitates the exchange of recordings on mailing lists and allows open access to data along with published articles. The open data format ElVisML ensures the quality, validity, and integrity of electrophysiological data transmission and storage as well as providing manufacturer-independent access and long-term archiving in a future-proof format. Standardization of the format of such neurophysiology data would promote the development of new techniques and open software for the use of neurophysiological data in both clinic and research.
OnEarth: An Open Source Solution for Efficiently Serving High-Resolution Mapped Image Products
NASA Astrophysics Data System (ADS)
Thompson, C. K.; Plesea, L.; Hall, J. R.; Roberts, J. T.; Cechini, M. F.; Schmaltz, J. E.; Alarcon, C.; Huang, T.; McGann, J. M.; Chang, G.; Boller, R. A.; Ilavajhala, S.; Murphy, K. J.; Bingham, A. W.
2013-12-01
This presentation introduces OnEarth, a server side software package originally developed at the Jet Propulsion Laboratory (JPL), that facilitates network-based, minimum-latency geolocated image access independent of image size or spatial resolution. The key component in this package is the Meta Raster Format (MRF), a specialized raster file extension to the Geospatial Data Abstraction Library (GDAL) consisting of an internal indexed pyramid of image tiles. Imagery to be served is converted to the MRF format and made accessible online via an expandable set of server modules handling requests in several common protocols, including the Open Geospatial Consortium (OGC) compliant Web Map Tile Service (WMTS) as well as Tiled WMS and Keyhole Markup Language (KML). OnEarth has recently transitioned to open source status and is maintained and actively developed as part of GIBS (Global Imagery Browse Services), a collaborative project between JPL and Goddard Space Flight Center (GSFC). The primary function of GIBS is to enhance and streamline the data discovery process and to support near real-time (NRT) applications via the expeditious ingestion and serving of full-resolution imagery representing science products from across the NASA Earth Science spectrum. Open source software solutions are leveraged where possible in order to utilize existing available technologies, reduce development time, and enlist wider community participation. We will discuss some of the factors and decision points in transitioning OnEarth to a suitable open source paradigm, including repository and licensing agreement decision points, institutional hurdles, and perceived benefits. We will also provide examples illustrating how OnEarth is integrated within GIBS and other applications.
Formats and Network Protocols for Browser Access to 2D Raster Data
NASA Astrophysics Data System (ADS)
Plesea, L.
2015-12-01
Tiled web maps in browsers are a major success story, forming the foundation of many current web applications. Enabling tiled data access is the next logical step, and is likely to meet with similar success. Many ad-hoc approaches have already started to appear, and something similar is being explored within the Open Geospatial Consortium. One of the main obstacles to making browser data access a reality is the lack of a well-known data format. This obstacle also represents an opportunity to analyze the requirements and possible candidates, applying lessons learned from web tiled image services and protocols. Like its image counterpart, a web tile raster data format needs good intrinsic compression and must be able to handle high-byte-count data types, including floating point. An overview of a possible solution to the format problem, a 2D raster data compression algorithm called Limited Error Raster Compression (LERC), will be presented. In addition to the format, best practices for high-request-rate HTTP services also need to be followed. In particular, content delivery network (CDN) caching suitability needs to be part of any design, not an afterthought. Last but not least, HTML5 browsers will certainly be part of any solution, since they provide improved access to binary data, as well as more powerful ways to view and interact with the data in the browser. In a simple but relevant application, digital elevation model (DEM) raster data is served as LERC-compressed data tiles, which are used to generate terrain in an HTML5 scene viewer.
50 CFR 660.313 - Open access fishery-recordkeeping and reporting.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 50 Wildlife and Fisheries 11 2011-10-01 2011-10-01 false Open access fishery-recordkeeping and... West Coast Groundfish-Open Access Fisheries § 660.313 Open access fishery—recordkeeping and reporting... to open access fisheries. (b) Declaration reports for vessels using nontrawl gear. Declaration...
50 CFR 660.313 - Open access fishery-recordkeeping and reporting.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 50 Wildlife and Fisheries 9 2010-10-01 2010-10-01 false Open access fishery-recordkeeping and... West Coast Groundfish-Open Access Fisheries § 660.313 Open access fishery—recordkeeping and reporting... to open access fisheries. (b) Declaration reports for vessels using nontrawl gear. Declaration...
50 CFR 660.313 - Open access fishery-recordkeeping and reporting.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 50 Wildlife and Fisheries 13 2014-10-01 2014-10-01 false Open access fishery-recordkeeping and... West Coast Groundfish-Open Access Fisheries § 660.313 Open access fishery—recordkeeping and reporting... to open access fisheries. (b) Declaration reports for vessels using nontrawl gear. Declaration...
50 CFR 660.313 - Open access fishery-recordkeeping and reporting.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 50 Wildlife and Fisheries 13 2012-10-01 2012-10-01 false Open access fishery-recordkeeping and... West Coast Groundfish-Open Access Fisheries § 660.313 Open access fishery—recordkeeping and reporting... to open access fisheries. (b) Declaration reports for vessels using nontrawl gear. Declaration...
50 CFR 660.313 - Open access fishery-recordkeeping and reporting.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 50 Wildlife and Fisheries 13 2013-10-01 2013-10-01 false Open access fishery-recordkeeping and... West Coast Groundfish-Open Access Fisheries § 660.313 Open access fishery—recordkeeping and reporting... to open access fisheries. (b) Declaration reports for vessels using nontrawl gear. Declaration...
Storage quality-of-service in cloud-based scientific environments: a standardization approach
NASA Astrophysics Data System (ADS)
Millar, Paul; Fuhrmann, Patrick; Hardt, Marcus; Ertl, Benjamin; Brzezniak, Maciej
2017-10-01
When preparing the Data Management Plan for larger scientific endeavors, PIs have to balance the most appropriate qualities of storage space over the planned data life-cycle against price and the available funding. Storage properties include the media type (implicitly determining the access latency and durability of stored data), the number and locality of replicas, and the available access protocols or authentication mechanisms. Negotiations between the scientific community and the responsible infrastructures generally happen up front, when the amount of storage space, the media types (disk, tape, SSD) and the foreseeable data life-cycles are negotiated. With the introduction of cloud management platforms, in both computing and storage, resources can be brokered to achieve the best price per unit of a given quality. However, in order to allow a platform orchestrator to programmatically negotiate the most appropriate resources, a standard vocabulary for the different properties of resources, and a commonly agreed protocol to communicate them, have to be available. In order to agree on a basic vocabulary for storage space properties, the storage infrastructure group in INDIGO-DataCloud, together with INDIGO-associated and external scientific groups, created a working group under the umbrella of the Research Data Alliance (RDA). As the communication protocol to query and negotiate storage qualities, the Cloud Data Management Interface (CDMI) has been selected. Necessary extensions to CDMI are defined in regular meetings between INDIGO and the Storage Networking Industry Association (SNIA). Furthermore, INDIGO is contributing to the SNIA CDMI reference implementation as the basis for interfacing the various storage systems in INDIGO to the agreed protocol and to provide an official open-source skeleton for systems not maintained by INDIGO partners.
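How an orchestrator might programmatically check storage qualities can be sketched against a CDMI-style capabilities document. The property names below are illustrative stand-ins for the vocabulary the RDA working group is defining, not the finalized terms.

```python
# Sketch: an orchestrator inspecting a CDMI-style capabilities document to
# decide whether a storage pool meets requested QoS. Property names are
# illustrative assumptions, not the agreed standard vocabulary.
import json

doc = json.loads("""{
  "metadata": {
    "cdmi_capabilities_templates": ["disk", "tape"],
    "cdmi_data_redundancy": "2",
    "cdmi_latency": "100"
  }
}""")

def meets_qos(doc, max_latency_ms, min_copies):
    """True if the advertised storage quality satisfies the request."""
    m = doc["metadata"]
    return (int(m["cdmi_latency"]) <= max_latency_ms
            and int(m["cdmi_data_redundancy"]) >= min_copies)
```

With such machine-readable properties, brokering becomes a matter of filtering candidate pools and picking the cheapest one that satisfies the plan.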
NASA Astrophysics Data System (ADS)
Civera Lorenzo, Tamara
2017-10-01
Brief presentation about the J-PLUS EDR data access web portal (http://archive.cefca.es/catalogues/jplus-edr), where the different services available to retrieve images and catalogue data are presented. The J-PLUS Early Data Release (EDR) archive includes two types of data: images, and dual and single catalogue data, which include parameters measured from the images. The J-PLUS web portal offers catalogue data and images through several online data access tools and services, each suited to a particular need: a coverage map, a sky navigator, object visualization, image search, cone search, object list search, and Virtual Observatory services (Simple Cone Search, Simple Image Access Protocol, Simple Spectral Access Protocol, and Table Access Protocol).
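As a sketch of how such a Virtual Observatory service is queried: an IVOA Simple Cone Search is just an HTTP GET with `RA`, `DEC` (decimal degrees) and `SR` (search radius) parameters. The base URL below is illustrative only; the actual J-PLUS service endpoint is published on the archive portal.

```python
from urllib.parse import urlencode

def cone_search_url(base_url, ra_deg, dec_deg, radius_deg):
    """Compose an IVOA Simple Cone Search query: all catalogue objects
    within radius_deg of (ra_deg, dec_deg), returned as a VOTable."""
    params = {"RA": ra_deg, "DEC": dec_deg, "SR": radius_deg}
    return base_url + "?" + urlencode(params)

# Illustrative endpoint, not the documented J-PLUS service URL.
url = cone_search_url(
    "https://archive.cefca.es/catalogues/vo/cone/jplus-edr",
    ra_deg=150.0, dec_deg=2.2, radius_deg=0.1)
print(url)
```

Because the parameter names are standardized, the same three-parameter query works against any compliant cone-search service, which is the point of offering VO protocols alongside the portal's interactive tools.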
50 CFR 660.330 - Open access fishery-management measures.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 50 Wildlife and Fisheries 13 2014-10-01 2014-10-01 false Open access fishery-management measures... West Coast Groundfish-Open Access Fisheries § 660.330 Open access fishery—management measures. (a) General. Groundfish species taken in open access fisheries will be managed with cumulative trip limits...
50 CFR 660.330 - Open access fishery-management measures.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 50 Wildlife and Fisheries 9 2010-10-01 2010-10-01 false Open access fishery-management measures... West Coast Groundfish-Open Access Fisheries § 660.330 Open access fishery—management measures. (a) General. Groundfish species taken in open access fisheries will be managed with cumulative trip limits...
50 CFR 660.330 - Open access fishery-management measures.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 50 Wildlife and Fisheries 13 2013-10-01 2013-10-01 false Open access fishery-management measures... West Coast Groundfish-Open Access Fisheries § 660.330 Open access fishery—management measures. (a) General. Groundfish species taken in open access fisheries will be managed with cumulative trip limits...
50 CFR 660.320 - Open access fishery-crossover provisions.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 50 Wildlife and Fisheries 9 2010-10-01 2010-10-01 false Open access fishery-crossover provisions... West Coast Groundfish-Open Access Fisheries § 660.320 Open access fishery—crossover provisions. (a) Operating in both limited entry and open access fisheries. See provisions at § 660.60, subpart C. (b...
50 CFR 660.320 - Open access fishery-crossover provisions.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 50 Wildlife and Fisheries 11 2011-10-01 2011-10-01 false Open access fishery-crossover provisions... West Coast Groundfish-Open Access Fisheries § 660.320 Open access fishery—crossover provisions. (a) Operating in both limited entry and open access fisheries. See provisions at § 660.60, subpart C. (b...
50 CFR 660.312 - Open access fishery-prohibitions.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 50 Wildlife and Fisheries 13 2012-10-01 2012-10-01 false Open access fishery-prohibitions. 660.312... Groundfish-Open Access Fisheries § 660.312 Open access fishery—prohibitions. General groundfish prohibitions..., possess, or land groundfish in excess of the landing limit for the open access fishery without having a...
50 CFR 660.330 - Open access fishery-management measures.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 50 Wildlife and Fisheries 13 2012-10-01 2012-10-01 false Open access fishery-management measures... West Coast Groundfish-Open Access Fisheries § 660.330 Open access fishery—management measures. (a) General. Groundfish species taken in open access fisheries will be managed with cumulative trip limits...
50 CFR 660.312 - Open access fishery-prohibitions.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 50 Wildlife and Fisheries 13 2014-10-01 2014-10-01 false Open access fishery-prohibitions. 660.312... Groundfish-Open Access Fisheries § 660.312 Open access fishery—prohibitions. General groundfish prohibitions..., possess, or land groundfish in excess of the landing limit for the open access fishery without having a...
50 CFR 660.312 - Open access fishery-prohibitions.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 50 Wildlife and Fisheries 9 2010-10-01 2010-10-01 false Open access fishery-prohibitions. 660.312... Groundfish-Open Access Fisheries § 660.312 Open access fishery—prohibitions. General groundfish prohibitions..., possess, or land groundfish in excess of the landing limit for the open access fishery without having a...
50 CFR 660.312 - Open access fishery-prohibitions.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 50 Wildlife and Fisheries 13 2013-10-01 2013-10-01 false Open access fishery-prohibitions. 660.312... Groundfish-Open Access Fisheries § 660.312 Open access fishery—prohibitions. General groundfish prohibitions..., possess, or land groundfish in excess of the landing limit for the open access fishery without having a...
50 CFR 660.312 - Open access fishery-prohibitions.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 50 Wildlife and Fisheries 11 2011-10-01 2011-10-01 false Open access fishery-prohibitions. 660.312... Groundfish-Open Access Fisheries § 660.312 Open access fishery—prohibitions. General groundfish prohibitions..., possess, or land groundfish in excess of the landing limit for the open access fishery without having a...
50 CFR 660.330 - Open access fishery-management measures.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 50 Wildlife and Fisheries 11 2011-10-01 2011-10-01 false Open access fishery-management measures... West Coast Groundfish-Open Access Fisheries § 660.330 Open access fishery—management measures. (a) General. Groundfish species taken in open access fisheries will be managed with cumulative trip limits...
Medical education and information literacy in the era of open access.
Brower, Stewart M
2010-01-01
The Open Access movement in scholarly communications poses new issues and concerns for medical education in general and information literacy education specifically. For medical educators, Open Access can affect the availability of new information, instructional materials, and scholarship in medical education. For students, Open Access materials continue to be available post-graduation, regardless of affiliation. Libraries and information literacy librarians are challenged to respond to the Open Access publishing movement: how best to support Open Access endeavors within their own institutions, and how best to educate their user base about Open Access in general.
Publishing in open access era: focus on respiratory journals
Xu, Dingyao; Zhong, Xiyao; Li, Li; Ling, Qibo; Bu, Zhaode
2014-01-01
We have entered an open access publishing era. The impact and significance of open access are still under debate after two decades of evolution. Open access journals benefit researchers and the general public by promoting visibility, sharing, and communication. Non-mainstream journals should turn the challenge of open access into an opportunity to present the best research articles to a global readership. Open access journals need to optimize their business models to promote their healthy and continuous development. PMID:24822120
Publishing in open access era: focus on respiratory journals.
Dai, Ni; Xu, Dingyao; Zhong, Xiyao; Li, Li; Ling, Qibo; Bu, Zhaode
2014-05-01
We have entered an open access publishing era. The impact and significance of open access are still under debate after two decades of evolution. Open access journals benefit researchers and the general public by promoting visibility, sharing, and communication. Non-mainstream journals should turn the challenge of open access into an opportunity to present the best research articles to a global readership. Open access journals need to optimize their business models to promote their healthy and continuous development.
Opening a door to safe abortion: international perspectives on medical abortifacient use.
Pollack, A E; Pine, R N
2000-01-01
International experience compels us to revisit how we define and assess the safety and efficacy of medical abortifacients such as misoprostol. In some countries where safe abortion is neither accessible nor legal, even unsupervised, off-protocol use of misoprostol can provide women with a means to safely terminate pregnancy. This is due primarily to misoprostol-induced uterine contractions that cause bleeding, which in turn provides access to existing reasonable quality health services that would otherwise be unavailable. Several studies have suggested that an increase in the underground use of misoprostol in Brazil has already reduced serious complications from unsafe abortion. Thus, the availability of medical abortifacients combined with strengthened postabortion care services can legitimately be considered a public health success in countries in which safe abortion services do not exist and law reform is unlikely.
Publishing in Open Access Education Journals: The Authors' Perspectives
ERIC Educational Resources Information Center
Coonin, Bryna; Younce, Leigh M.
2010-01-01
Open access publishing is now an accepted method of scholarly communication. However, the greatest traction for open access publishing thus far has been in the sciences. Penetration of open access publishing has been much slower among the social sciences. This study surveys 309 authors from recent issues of open access journals in education to…
50 CFR 660.311 - Open access fishery-definitions.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 50 Wildlife and Fisheries 13 2012-10-01 2012-10-01 false Open access fishery-definitions. 660.311... Groundfish-Open Access Fisheries § 660.311 Open access fishery—definitions. General definitions for the... specific to the open access fishery covered in this subpart and are in addition to those specified at § 660...
50 CFR 660.311 - Open access fishery-definitions.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 50 Wildlife and Fisheries 11 2011-10-01 2011-10-01 false Open access fishery-definitions. 660.311... Groundfish-Open Access Fisheries § 660.311 Open access fishery—definitions. General definitions for the... specific to the open access fishery covered in this subpart and are in addition to those specified at § 660...
50 CFR 660.320 - Open access fishery-crossover provisions.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 50 Wildlife and Fisheries 13 2014-10-01 2014-10-01 false Open access fishery-crossover provisions... West Coast Groundfish-Open Access Fisheries § 660.320 Open access fishery—crossover provisions. The crossover provisions listed at § 660.60(h)(7), apply to vessels fishing in the open access fishery. [76 FR...
50 CFR 660.320 - Open access fishery-crossover provisions.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 50 Wildlife and Fisheries 13 2013-10-01 2013-10-01 false Open access fishery-crossover provisions... West Coast Groundfish-Open Access Fisheries § 660.320 Open access fishery—crossover provisions. The crossover provisions listed at § 660.60(h)(7), apply to vessels fishing in the open access fishery. [76 FR...
50 CFR 660.319 - Open access fishery gear identification and marking.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 50 Wildlife and Fisheries 13 2013-10-01 2013-10-01 false Open access fishery gear identification... COAST STATES West Coast Groundfish-Open Access Fisheries § 660.319 Open access fishery gear identification and marking. (a) Gear identification. (1) Open access fixed gear (longline, trap or pot, set net...
50 CFR 660.311 - Open access fishery-definitions.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 50 Wildlife and Fisheries 9 2010-10-01 2010-10-01 false Open access fishery-definitions. 660.311... Groundfish-Open Access Fisheries § 660.311 Open access fishery—definitions. General definitions for the... specific to the open access fishery covered in this subpart and are in addition to those specified at § 660...
50 CFR 660.320 - Open access fishery-crossover provisions.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 50 Wildlife and Fisheries 13 2012-10-01 2012-10-01 false Open access fishery-crossover provisions... West Coast Groundfish-Open Access Fisheries § 660.320 Open access fishery—crossover provisions. The crossover provisions listed at § 660.60(h)(7), apply to vessels fishing in the open access fishery. [76 FR...
50 CFR 660.311 - Open access fishery-definitions.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 50 Wildlife and Fisheries 13 2013-10-01 2013-10-01 false Open access fishery-definitions. 660.311... Groundfish-Open Access Fisheries § 660.311 Open access fishery—definitions. General definitions for the... specific to the open access fishery covered in this subpart and are in addition to those specified at § 660...
50 CFR 660.311 - Open access fishery-definitions.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 50 Wildlife and Fisheries 13 2014-10-01 2014-10-01 false Open access fishery-definitions. 660.311... Groundfish-Open Access Fisheries § 660.311 Open access fishery—definitions. General definitions for the... specific to the open access fishery covered in this subpart and are in addition to those specified at § 660...
Education Scholars' Perceptions and Practices toward Open Access Publishing
ERIC Educational Resources Information Center
Ellingford, Lori Michelle
2012-01-01
Although open access publishing has been available since 1998, we know little regarding scholars' perceptions and practices toward publishing in open access outlets, especially in the social science community. Open access publishing has been slow to penetrate the field of education, yet the potential impact of open access could make this…
Shamseer, Larissa; Moher, David; Clarke, Mike; Ghersi, Davina; Liberati, Alessandro; Petticrew, Mark; Shekelle, Paul; Stewart, Lesley A
2015-01-02
Protocols of systematic reviews and meta-analyses allow for planning and documentation of review methods, act as a guard against arbitrary decision making during review conduct, enable readers to assess for the presence of selective reporting against completed reviews, and, when made publicly available, reduce duplication of efforts and potentially prompt collaboration. Evidence documenting the existence of selective reporting and excessive duplication of reviews on the same or similar topics is accumulating, and many calls have been made in support of the documentation and public availability of review protocols. Several efforts have emerged in recent years to rectify these problems, including development of an international register for prospective reviews (PROSPERO) and launch of the first open access journal dedicated to the exclusive publication of systematic review products, including protocols (BioMed Central's Systematic Reviews). Furthering these efforts and building on the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-analyses) guidelines, an international group of experts has created a guideline to improve the transparency, accuracy, completeness, and frequency of documented systematic review and meta-analysis protocols: PRISMA-P (for protocols) 2015. The PRISMA-P checklist contains 17 items considered to be essential and minimum components of a systematic review or meta-analysis protocol. This PRISMA-P 2015 Explanation and Elaboration paper provides readers with a full understanding of and evidence about the necessity of each item, as well as a model example from an existing published protocol. This paper should be read together with the PRISMA-P 2015 statement. Systematic review authors and assessors are strongly encouraged to make use of PRISMA-P when drafting and appraising review protocols. © BMJ Publishing Group Ltd 2014.
Goicolea, Isabel; Carson, Dean; San Sebastian, Miguel; Christianson, Monica; Wiklund, Maria; Hurtig, Anna-Karin
2018-01-11
The purpose of this paper is to propose a protocol for researching the impact of rural youth health service strategies on health care access. There has been no published comprehensive assessment of the effectiveness of youth health strategies in rural areas, and there is no clearly articulated model of how such assessments might be conducted. The protocol described here aims to gather information to: i) assess rural youth access to health care according to their needs, ii) identify and understand the strategies developed in rural areas to promote youth access to health care, and iii) propose actions for further improvement. The protocol is described with particular reference to research being undertaken in the four northernmost counties of Sweden, which contain a widely dispersed and diverse youth population. The protocol proposes qualitative and quantitative methodologies sequentially in four phases. First, to map youth access to health care according to their health care needs, including assessing horizontal equity (equal use of health care for equivalent health needs) and vertical equity (people with greater health needs should receive more health care than those with lesser needs). Second, a multiple case study design investigates strategies developed across the region (youth clinics, internet applications, public health programs) to improve youth access to health care. Third, qualitative comparative analysis of the 24 rural municipalities in the region identifies the best combination of conditions leading to high youth access to health care. Fourth, a concept mapping study involving rural stakeholders, care providers and youth provides recommended actions to improve rural youth access to health care. The implementation of this research protocol will contribute to 1) generating knowledge that could help strengthen rural youth access to health care, and 2) advancing the application of mixed methods to explore access to health care.
ERIC Educational Resources Information Center
Krishnamurthy, M.
2008-01-01
Purpose: The purpose of this paper is to describe the open access and open source movement in the digital library world. Design/methodology/approach: A review of key developments in the open access and open source movement is provided. Findings: Open source software and open access to research findings are of great use to scholars in developing…
Capitalizing on global demands for open data access and interoperability - the USGIN story
NASA Astrophysics Data System (ADS)
Richard, Stephen; Allison, Lee
2016-04-01
U.S. Geoscience Information Network (USGIN - http://usgin.org) data integration framework packages data so that it can be accessible through a broad array of open-source software and applications, including GeoServer, QGIS, GrassGIS, uDig, and gvSIG. USGIN data-sharing networks are designed to interact with other data exchange systems and have the ability to connect information on a granular level without jeopardizing data ownership. The system is compliant with international standards and protocols, scalable, extensible, and can be deployed throughout the world for a myriad of applications. Using GeoSciML as its data transfer standard and a collaborative approach to Content Model development and management, much of the architecture is publically available through GitHub. Initially developed by the USGS and Association of American State Geologists as a distributed, self-maintained platform for sharing geoscience information, USGIN meets all the requirements of the White House Open Data Access Initiative that applies to (almost) all federally-funded research and all federally-maintained data, opening up huge opportunities for further deployment. In December 2015, the USGIN Content Model schema was recommended for adoption by the White House-led US Group on Earth Observations (USGEO) "Draft Common Framework for Earth-Observation Data" for all US earth observation (i.e., satellite) data. The largest USGIN node is the U.S. National Geothermal Data System (NGDS - www.geothermaldata.org). NGDS provides free open access to ~ 10 million data records, maps, and reports, sharing relevant geoscience and land use data to propel geothermal development and production in the U.S. NGDS currently serves information from hundreds of the U.S. Department of Energy's sponsored projects and geologic data feeds from 60+ data providers in all 50 states, using free and open source software, in a federated system where data owners maintain control of their data. 
This interactive online system is opening new exploration opportunities and shortening project development by making data easily discoverable, accessible, and interoperable at no cost to users. USGIN Foundation, Inc. was established in 2014 as a not-for-profit company to deploy the USGIN data integration framework for other natural resource (energy, water, and minerals), natural hazard, and geoscience investigation applications, nationally and worldwide. The USGIN vision is that as each data node adds to its repositories, the system-wide USGIN functions become increasingly valuable to every participant. The long-term goal is that the data network reach a 'tipping point' at which it becomes a data equivalent of the World Wide Web: everyone maintains the function because their clientele expects it and it fills critical needs.
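A distributed node in a network like this typically answers standard OGC web-service requests. As a sketch, the query below composes a WFS 2.0 GetFeature call of the kind a data provider would answer with GeoSciML-style feature documents; the endpoint and feature-type name are placeholders, not actual NGDS identifiers.

```python
from urllib.parse import urlencode

def wfs_getfeature_url(base_url, type_name, max_features=10):
    """Compose an OGC WFS 2.0 GetFeature request for up to
    max_features features of the given feature type."""
    params = {
        "service": "WFS",          # OGC service name
        "version": "2.0.0",        # WFS protocol version
        "request": "GetFeature",   # operation
        "typeNames": type_name,    # namespaced feature type
        "count": max_features,     # WFS 2.0 result limit
    }
    return base_url + "?" + urlencode(params)

# Placeholder endpoint and feature type for illustration only.
url = wfs_getfeature_url("https://data.example.org/wfs",
                         "aasg:BoreholeTemperature")
print(url)
```

Because each node speaks the same protocol against the same content models, a client can federate results from dozens of providers without the providers giving up custody of their data.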
Using the ACR/NEMA standard with TCP/IP and Ethernet
NASA Astrophysics Data System (ADS)
Chimiak, William J.; Williams, Rodney C.
1991-07-01
There is a need for a consolidated picture archival and communications system (PACS) in hospitals. At the Bowman Gray School of Medicine of Wake Forest University (BGSM), the authors are enhancing the ACR/NEMA Version 2 protocol using UNIX sockets and TCP/IP to greatly improve connectivity. Initially, nuclear medicine studies using gamma cameras are to be sent to PACS. The ACR/NEMA Version 2 protocol provides the functionality of the upper three layers of the open system interconnection (OSI) model in this implementation. The images, imaging equipment information, and patient information are then sent in ACR/NEMA format to a software socket. From there it is handed to the TCP/IP protocol, which provides the transport and network service. TCP/IP, in turn, uses the services of IEEE 802.3 (Ethernet) to complete the connectivity. The advantage of this implementation is threefold: (1) Only one I/O port is consumed by numerous nuclear medicine cameras, instead of a physical port for each camera. (2) Standard protocols are used, which maximizes interoperability with ACR/NEMA-compliant PACSs. (3) The use of sockets allows a migration path to the transport and networking services of OSI's TP4 and connectionless network service, as well as the high-performance protocol being considered by the American National Standards Institute (ANSI) and the International Standards Organization (ISO): the Xpress Transfer Protocol (XTP). The use of sockets also gives access to ANSI's Fiber Distributed Data Interface (FDDI) as well as other high-speed network standards.
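The hand-off described above, formatted messages written to a software socket and carried by TCP, can be sketched as follows. This is a generic length-prefixed framing over a local socket pair, not the actual ACR/NEMA message layout (real ACR/NEMA messages carry their own length fields and structure); it only illustrates why a byte-stream transport needs explicit message boundaries.

```python
import socket
import struct

def send_message(sock, payload: bytes):
    """Write one length-prefixed message. TCP is a byte stream, so the
    receiver needs the 4-byte big-endian length header to find where
    one message ends and the next begins."""
    sock.sendall(struct.pack(">I", len(payload)) + payload)

def recv_message(sock) -> bytes:
    """Read exactly one length-prefixed message."""
    header = _recv_exact(sock, 4)
    (length,) = struct.unpack(">I", header)
    return _recv_exact(sock, length)

def _recv_exact(sock, n):
    # recv() may return fewer bytes than asked; loop until complete.
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("peer closed mid-message")
        buf += chunk
    return buf

# An in-process socket pair stands in for the camera-to-PACS link.
a, b = socket.socketpair()
send_message(a, b"ACR/NEMA image block")
received = recv_message(b)
a.close(); b.close()
print(received)
```

Multiplexing many cameras onto one listening socket, as the paper describes, follows the same pattern with one accepted connection per camera.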
Toward Synthesis, Analysis, and Certification of Security Protocols
NASA Technical Reports Server (NTRS)
Schumann, Johann
2004-01-01
Implemented security protocols are basically pieces of software which are used to (a) authenticate the other communication partners, (b) establish a secure communication channel between them (using insecure communication media), and (c) transfer data between the communication partners in such a way that these data are only available to the desired receiver, but not to anyone else. Such an implementation usually consists of the following components: the protocol engine, which controls the sequence in which the messages of the protocol are sent over the network and which controls the assembly/disassembly and processing (e.g., decryption) of the data; the cryptographic routines to actually encrypt or decrypt the data (using given keys); and the interface to the operating system and to the application. For a correct working of such a security protocol, all of these components must work flawlessly. Many formal-methods-based techniques for the analysis of security protocols have been developed. They range from using specific logics (e.g., BAN logic [4] or higher-order logics [12]) to model checking [2] approaches. In each approach, the analysis tries to prove that no one (or at least no modeled intruder) can get access to secret data. Otherwise, a scenario illustrating the attack may be produced. Despite the seeming simplicity of security protocols ("only" a few messages are sent between the protocol partners in order to ensure a secure communication), many flaws have been detected. Unfortunately, even a perfect protocol engine does not guarantee flawless working of a security protocol, as incidents show. Many break-ins and security vulnerabilities are caused by exploiting errors in the implementation of the protocol engine or the underlying operating system. Attacks using buffer overflows are a very common class of such attacks. Errors in the implementation of exception or error handling can open up additional vulnerabilities.
For example, on a website with a log-in screen, multiple tries with invalid passwords caused the expected error message (too many retries) but nevertheless let the user pass. Finally, security can be compromised by silly implementation bugs or design decisions. In a commercial VPN software, all calls to the encryption routines were accidentally replaced by stubs, probably during factory testing. The product worked nicely, and the error (an open VPN) would have gone undetected if a team member had not inspected the low-level traffic out of curiosity. Also, the use of secret proprietary encryption routines can backfire, because such algorithms often exhibit weaknesses which can be exploited easily (see, e.g., DVD encoding). Summarizing, there is a large number of possibilities for errors which can compromise the security of a protocol. In today's world, with short time-to-market and the use of security protocols in open and hostile networks for safety-critical applications (e.g., power or air-traffic control), such slips could lead to catastrophic situations. Thus, formal methods and automatic reasoning techniques should not be used just for the formal proof of the absence of an attack; they ought to be used to provide an end-to-end tool-supported framework for security software. With such an approach, all required artifacts (code, documentation, test cases), formal analyses, and reliable certification will be generated automatically, given a single high-level specification. By a combination of program synthesis, formal protocol analysis, certification, and proof-carrying code, this goal is within practical reach, since all the important technologies for such an approach actually exist and only need to be assembled in the right way.
Open access for operational research publications from low- and middle-income countries: who pays?
Kumar, A. M. V.; Reid, A. J.; Van den Bergh, R.; Isaakidis, P.; Draguez, B.; Delaunois, P.; Nagaraja, S. B.; Ramsay, A.; Reeder, J. C.; Denisiuk, O.; Ali, E.; Khogali, M.; Hinderaker, S. G.; Kosgei, R. J.; van Griensven, J.; Quaglio, G. L.; Maher, D.; Billo, N. E.; Terry, R. F.; Harries, A. D.
2014-01-01
Open-access journal publications aim to ensure that new knowledge is widely disseminated and made freely accessible in a timely manner so that it can be used to improve people's health, particularly those in low- and middle-income countries. In this paper, we briefly explain the differences between closed- and open-access journals, including the evolving idea of the ‘open-access spectrum’. We highlight the potential benefits of supporting open access for operational research, and discuss the conundrum and ways forward as regards who pays for open access. PMID:26400799
The continued movement for open access to peer-reviewed literature.
Liesegang, Thomas J
2013-09-01
To provide a current overview of the movement for open access to the peer-reviewed literature. Perspective. Literature review of recent advances in the open access movement, with a personal viewpoint on the nuances of the movement. The open access movement is complex, with many different constituents. The idealists of the open access movement are seeking open access not only to the literature but also to the data that constitute the research within the manuscript. The business model of the traditional subscription journal is being scrutinized in relation to the surge in the number of open access journals. Within this environment, authors should beware of predatory practices. More government and funding agencies are mandating open access to their funded research. This open access movement will continue to be disruptive until a business model ensures continuity of the scientific record. A flood of open access articles that might enrich, but also might pollute or confuse, the medical literature has altered the filtering mechanism provided by the traditional peer review system. At some point there may be a shake-out, with some literature being lost in cyberspace. The open access movement is maturing and must be embraced in some format. The challenge is to establish a sustainable financial business model that will permit the use of digital technology but not endanger the decades-old traditional publication model and peer review system. Authors seem to be slower in adopting open access than the idealists in the movement. Copyright © 2013 Elsevier Inc. All rights reserved.
Structural barriers in access to medical marijuana in the USA-a systematic review protocol.
Valencia, Celina I; Asaolu, Ibitola O; Ehiri, John E; Rosales, Cecilia
2017-08-07
There are 43 state medical marijuana programs in the USA, yet limited evidence is available on the demographic characteristics of the patient population accessing these programs. Moreover, insights into the social and structural barriers that inform patients' success in accessing medical marijuana are limited. A current gap in the scientific literature exists regarding generalizable data on the social, cultural, and structural mechanisms that hinder access to medical marijuana among qualifying patients. The goal of this systematic review, therefore, is to identify the aforementioned mechanisms that inform disparities in access to medical marijuana in the USA. This scoping review protocol outlines the proposed study design for the systematic review and evaluation of peer-reviewed scientific literature on structural barriers to medical marijuana access. The protocol follows the guidelines set forth by the Preferred Reporting Items for Systematic review and Meta-Analysis Protocols (PRISMA-P) checklist. The overarching goal of this study is to rigorously evaluate the existing peer-reviewed data on access to medical marijuana in the USA. Income, ethnic background, stigma, and physician preferences have been posited as the primary structural barriers influencing medical marijuana patient population demographics in the USA. Identification of structural barriers to accessing medical marijuana provides a framework for future policies and programs. Evidence-based policies and programs for increasing medical marijuana access help minimize the disparity of access among qualifying patients.
Advanced teleprocessing systems
NASA Astrophysics Data System (ADS)
Kleinrock, L.; Gerla, M.
1982-09-01
This Annual Technical Report covers research performed during the period from October 1, 1981 to September 30, 1982. This contract has three primary designated research areas: packet radio systems, resource sharing and allocation, and distributed processing and control. This report contains abstracts of publications which summarize research results in these areas, followed by the main body of the report, which is devoted to a study of channel access protocols that are executed by the nodes of a network to schedule their transmissions on a multi-access broadcast channel. In particular, the main body consists of a Ph.D. dissertation, Channel Access Protocols for Multi-Hop Broadcast Packet Radio Networks. This work discusses some new channel access protocols useful for mobile radio networks. Included are an analysis of slotted ALOHA and some tight bounds on the performance of all possible protocols in a mobile environment.
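For context on the slotted ALOHA analysis mentioned above, the classic single-hop result can be sketched numerically: with Poisson arrivals and an offered load of G frames per slot, the expected throughput is S = G * e^(-G), which peaks at G = 1 with S = 1/e (about 0.368 frames per slot). This is the textbook formula, not a reconstruction of the dissertation's multi-hop bounds.

```python
import math

def slotted_aloha_throughput(G):
    """Expected successful transmissions per slot for slotted ALOHA
    with Poisson arrivals at offered load G frames/slot: S = G*e^{-G}.
    A slot succeeds only if exactly one node transmits in it."""
    return G * math.exp(-G)

# Scan offered loads from 0.01 to 3.00; the maximum lands at G = 1,
# giving the well-known 1/e channel utilization ceiling.
peak = max(slotted_aloha_throughput(g / 100) for g in range(1, 301))
print(round(peak, 3))
```

The 1/e ceiling is what motivates the search for better channel access protocols, such as those studied in the dissertation, since most of the channel capacity is lost to collisions and idle slots.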
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-12
... month.[the following charges: $285/hour--For Active Connection testing using current Exchange access... using current Exchange access protocols; $333/hour--For Active Connection testing using current Exchange... a fee of $285 per hour for active connection testing using current BX access protocols during the...
Data visualization in interactive maps and time series
NASA Astrophysics Data System (ADS)
Maigne, Vanessa; Evano, Pascal; Brockmann, Patrick; Peylin, Philippe; Ciais, Philippe
2014-05-01
State-of-the-art data visualization has little in common with the plots and maps we used a few years ago. Many open-source tools are now available to provide access to scientific data and to implement accessible, interactive, and flexible web applications. Here we present a website, opened in November 2013, for creating custom global and regional maps and time series from research models and datasets. For maps, we explore and access data sources from a THREDDS Data Server (TDS) with the OGC WMS protocol (using the ncWMS implementation), then create interactive maps with the OpenLayers javascript library and extra information layers from a GeoServer. Maps become dynamic, zoomable, synchronously connected to each other, and exportable to Google Earth. For time series, we extract data from a TDS with the Netcdf Subset Service (NCSS), then display interactive graphs with a custom library based on the Data Driven Documents javascript library (D3.js). This time series application provides dynamic functionality such as interpolation, interactive zoom on different axes, display of point values, and export to different formats. These tools were implemented for the Global Carbon Atlas (http://www.globalcarbonatlas.org): a web portal to explore, visualize, and interpret global and regional carbon fluxes from various model simulations arising from both human activities and natural processes, a work led by the Global Carbon Project.
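The NCSS extraction step described above is a plain HTTP query against the TDS. A hedged sketch of composing such a request follows; the server URL, dataset path, and variable name are placeholders, not the Atlas's actual endpoints:

```python
from urllib.parse import urlencode

# Hypothetical TDS endpoint and dataset path; substitute a real
# NCSS URL before use.
NCSS_BASE = "https://tds.example.org/thredds/ncss/grid/carbon/fluxes.nc"

def ncss_point_url(lat, lon, var, start, end):
    """Build a Netcdf Subset Service query for a time series at one point."""
    params = {
        "var": var,
        "latitude": lat,
        "longitude": lon,
        "time_start": start,
        "time_end": end,
        "accept": "csv",  # NCSS can also return netcdf or xml
    }
    return f"{NCSS_BASE}?{urlencode(params)}"

url = ncss_point_url(48.85, 2.35, "co2_flux", "2010-01-01", "2012-12-31")
```

The returned CSV is then easy to feed to a client-side charting library such as the D3.js-based one described above.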
NASA Technical Reports Server (NTRS)
Wallett, Thomas M.
2009-01-01
This paper surveys and describes some of the existing media access control and data link layer technologies for possible application in lunar surface communications and the advanced wideband Direct Sequence Code Division Multiple Access (DSCDMA) conceptual systems utilizing phased-array technology that will evolve in the next decade. Time Division Multiple Access (TDMA) and Code Division Multiple Access (CDMA) are standard Media Access Control (MAC) techniques that can be incorporated into lunar surface communications architectures. Another novel hybrid technique, recently developed for use with smart antenna technology, combines the advantages of CDMA with those of TDMA. The relatively new and varied wireless LAN data link layer protocols that are continually under development offer distinct advantages for lunar surface applications over the legacy protocols, which are not wireless. Several communication transport and routing protocols can also be chosen with characteristics commensurate with smart antenna systems to provide spacecraft communications for links exhibiting high capacity on the surface of the Moon. The proper choices depend on the specific communication requirements.
NASA Astrophysics Data System (ADS)
Bambacus, M.; Alameh, N.; Cole, M.
2006-12-01
The Applied Sciences Program at NASA focuses on extending the results of NASA's Earth-Sun system science research beyond the science and research communities to contribute to national priority applications with societal benefits. By employing a systems engineering approach, supporting interoperable data discovery and access, and developing partnerships with federal agencies and national organizations, the Applied Sciences Program facilitates the transition from research to operations in national applications. In particular, the Applied Sciences Program identifies twelve national applications, listed at http://science.hq.nasa.gov/earth-sun/applications/, which can be best served by the results of NASA aerospace research and development of science and technologies. The ability to use and integrate NASA data and science results into these national applications results in enhanced decision support and significant socio-economic benefits for each of the applications. This paper focuses on leveraging the power of interoperability and specifically open standard interfaces in providing efficient discovery, retrieval, and integration of NASA's science research results. Interoperability (the ability to access multiple, heterogeneous geoprocessing environments, either local or remote by means of open and standard software interfaces) can significantly increase the value of NASA-related data by increasing the opportunities to discover, access and integrate that data in the twelve identified national applications (particularly in non-traditional settings). Furthermore, access to data, observations, and analytical models from diverse sources can facilitate interdisciplinary and exploratory research and analysis. To streamline this process, the NASA GeoSciences Interoperability Office (GIO) is developing the NASA Earth-Sun System Gateway (ESG) to enable access to remote geospatial data, imagery, models, and visualizations through open, standard web protocols. 
The gateway (online at http://esg.gsfc.nasa.gov) acts as a flexible and searchable registry of NASA-related resources (files, services, models, etc) and allows scientists, decision makers and others to discover and retrieve a wide variety of observations and predictions of natural and human phenomena related to Earth Science from NASA and other sources. To support the goals of the Applied Sciences national applications, GIO staff is also working with the national applications communities to identify opportunities where open standards-based discovery and access to NASA data can enhance the decision support process of the national applications. This paper describes the work performed to-date on that front, and summarizes key findings in terms of identified data sources and benefiting national applications. The paper also highlights the challenges encountered in making NASA-related data accessible in a cross-cutting fashion and identifies areas where interoperable approaches can be leveraged.
Next-Generation Search Engines for Information Retrieval
DOE Office of Scientific and Technical Information (OSTI.GOV)
Devarakonda, Ranjeet; Hook, Leslie A; Palanisamy, Giri
In recent years, there have been significant advancements in the areas of scientific data management and retrieval techniques, particularly in terms of standards and protocols for archiving data and metadata. Scientific data is rich, and spread across different places. In order to integrate these pieces together, a data archive and associated metadata should be generated. Data should be stored in a format that can be retrieved and, more importantly, in a format that will continue to be accessible as technology changes, such as XML. While general-purpose search engines (such as Google or Bing) are useful for finding many things on the Internet, they are often of limited usefulness for locating Earth Science data relevant (for example) to a specific spatiotemporal extent. By contrast, tools that search repositories of structured metadata can locate relevant datasets with fairly high precision, but the search is limited to that particular repository. Federated searches (such as Z39.50) have been used, but can be slow, and their comprehensiveness can be limited by downtime in any search partner. An alternative approach to improve comprehensiveness is for a repository to harvest metadata from other repositories, possibly with limits based on subject matter or access permissions. Searches through harvested metadata can be extremely responsive, and the search tool can be customized with semantic augmentation appropriate to the community of practice being served. One such system is Mercury, a metadata harvesting, data discovery, and access system built for researchers to search for, share, and obtain spatiotemporal data used across a range of climate and ecological sciences. Mercury is an open-source toolset; its backend is built on Java, and its search capability is supported by popular open-source search libraries such as SOLR and LUCENE.
Mercury harvests the structured metadata and key data from several data-providing servers around the world and builds a centralized index. The harvested files are indexed consistently against the SOLR search API, so that Mercury can offer search capabilities such as simple, fielded, spatial, and temporal searches across a span of projects ranging over land, atmosphere, and ocean ecology. Mercury also provides data sharing capabilities using the Open Archives Initiative Protocol for Metadata Harvesting (OAI-PMH). In this paper we discuss best practices for archiving data and metadata, new searching techniques, efficient ways of data retrieval, and information display.
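OAI-PMH exchanges like the one Mercury uses for sharing are simple XML over HTTP. As a hedged illustration (the record identifiers below are invented, and a real harvester would also follow resumption tokens across pages), parsing a ListIdentifiers response might look like:

```python
import xml.etree.ElementTree as ET

OAI_NS = "{http://www.openarchives.org/OAI/2.0/}"

def parse_identifiers(response_xml: str):
    """Extract record identifiers from an OAI-PMH ListIdentifiers response."""
    root = ET.fromstring(response_xml)
    return [h.findtext(f"{OAI_NS}identifier")
            for h in root.iter(f"{OAI_NS}header")]

# A minimal, hand-written response standing in for a real provider's reply.
sample = """<OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/">
  <ListIdentifiers>
    <header><identifier>oai:daac.ornl.gov:dataset_1</identifier></header>
    <header><identifier>oai:daac.ornl.gov:dataset_2</identifier></header>
  </ListIdentifiers>
</OAI-PMH>"""
ids = parse_identifiers(sample)
```

Each identifier would then be fetched with a GetRecord request and fed into the central SOLR index.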
An operational open-end file transfer protocol for mobile satellite communications
NASA Technical Reports Server (NTRS)
Wang, Charles; Cheng, Unjeng; Yan, Tsun-Yee
1988-01-01
This paper describes an operational open-end file transfer protocol which includes the connecting procedure, data transfer, and relinquishment procedure for mobile satellite communications. The protocol makes use of the frame level and packet level formats of the X.25 standard for the data link layer and network layer, respectively. The structure of a testbed for experimental simulation of this protocol over a mobile fading channel is also introduced.
Open Access Journal Policies: A Systematic Analysis of Radiology Journals.
Narayan, Anand; Lobner, Katie; Fritz, Jan
2018-02-01
The open access movement has pushed for greater access to scientific knowledge by expanding access to scientific journal articles. There is limited information about the extent to which open access policies have been adopted by radiology journals. We performed a systematic analysis to ascertain the proportion of radiology journals with open access options. A search was performed with the assistance of a clinical informationist. Full and mixed English-language diagnostic and interventional radiology Web of Science journals (impact factors > 1.0) were included. Nuclear medicine, radiation oncology, physics, and solicitation-only journals were excluded. Primary outcome was open access option (yes or no) with additional outcomes including presence or absence of embargo, complete or partial copyright transfer, publication fees, and self-archiving policies. Secondary outcomes included journal citations, journal impact factors, immediacy, Eigenfactor, and article influence scores. Independent double readings were performed with differences resolved by consensus, supplemented by contacting editorial staff at each journal. In all, 125 journals were identified; review yielded 49 journals (39%, mean impact factor of 2.61). Thirty-six of the journals had open access options (73.4%), and four journals were exclusively open access (8.2%). Twelve-month embargoes were most commonly cited (90.6%) with 28.6% of journals stating that they did not require a complete transfer of copyright. Prices for open access options ranged from $750 to $4,000 (median $3,000). No statistically significant differences were found in journal impact measures comparing journals with open access options to journals without open access options. Diagnostic and interventional radiology journals have widely adopted open access options with a few radiology journals being exclusively open access. Copyright © 2017 American College of Radiology. Published by Elsevier Inc. All rights reserved.
Policies for implementing network firewalls
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, C.D.
1994-05-01
Corporate networks are frequently protected by "firewalls" or gateway systems that control access to/from other networks, e.g., the Internet, in order to reduce the network's vulnerability to hackers and other unauthorized access. Firewalls typically limit access to particular network nodes and application protocols, and they often perform special authentication and authorization functions. One of the difficult issues associated with network firewalls is determining which applications should be permitted through the firewall. For example, many networks permit the exchange of electronic mail with the outside but do not permit file access to be initiated by outside users, as this might allow outside users to access sensitive data or to surreptitiously modify data or programs (e.g., to install Trojan Horse software). However, if access through firewalls is severely restricted, legitimate network users may find it difficult or impossible to collaborate with outside users and to share data. Some of the most serious issues regarding firewalls involve setting policies for firewalls with the goal of achieving an acceptable balance between the need for greater functionality and the associated risks. Two common firewall implementation techniques, screening routers and application gateways, are discussed below, followed by some common policies implemented by network firewalls.
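The mail-in, no-file-access-in policy in the example above maps naturally onto an ordered, first-match rule list, which is how screening routers are typically configured. A toy sketch of that evaluation scheme (the ports and actions are illustrative, not a recommended policy):

```python
# First-match rule evaluation, as a screening router applies it.
# None acts as a wildcard matching any value.
RULES = [
    # (action, protocol, direction, dest_port)
    ("allow", "tcp", "inbound", 25),    # SMTP: accept mail from outside
    ("deny",  "tcp", "inbound", 21),    # FTP: block outside-initiated file access
    ("allow", "tcp", "outbound", None), # let inside users initiate connections
    ("deny",  None,  None,      None),  # default deny
]

def evaluate(protocol, direction, dest_port):
    """Return the action of the first rule matching the packet."""
    for action, p, d, port in RULES:
        if p not in (None, protocol):
            continue
        if d not in (None, direction):
            continue
        if port not in (None, dest_port):
            continue
        return action
    return "deny"  # fail closed if no rule matches
```

The trailing default-deny rule embodies the conservative policy stance the paper goes on to discuss.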
Research on a Queue Scheduling Algorithm in Wireless Communications Network
NASA Astrophysics Data System (ADS)
Yang, Wenchuan; Hu, Yuanmei; Zhou, Qiancai
This paper proposes QS-CT, a queue scheduling mechanism based on multiple access in ad hoc networks, which adds queue scheduling to the RTS-CTS-DATA multiple access protocol. By endowing different queues with different scheduling mechanisms, it makes network access to the channel much fairer and more effective, and greatly enhances performance. In order to observe the final performance of a network running the QS-CT protocol, we simulate it and compare it with MACA/C-T without QS-CT. The simulation results show that, compared to MACA/C-T, QS-CT greatly improves throughput, delay, packet loss rate, and other key indicators.
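The abstract does not specify QS-CT's scheduling discipline beyond giving different queues different mechanisms, so as a hedged stand-in, a weighted round-robin over per-class queues illustrates the general idea of differentiated channel access:

```python
from collections import deque

class WeightedScheduler:
    """Serve several queues with different weights: a simple stand-in
    for the per-queue scheduling QS-CT layers on top of RTS-CTS-DATA."""

    def __init__(self, weights):
        self.weights = weights  # queue name -> packets served per round
        self.queues = {name: deque() for name in weights}

    def enqueue(self, name, packet):
        self.queues[name].append(packet)

    def next_round(self):
        """Dequeue up to `weight` packets from each queue, in order."""
        served = []
        for name, weight in self.weights.items():
            q = self.queues[name]
            for _ in range(weight):
                if not q:
                    break
                served.append(q.popleft())
        return served

# Voice gets twice the channel opportunities of bulk data.
s = WeightedScheduler({"voice": 2, "data": 1})
for p in ("v1", "v2", "v3"):
    s.enqueue("voice", p)
for p in ("d1", "d2"):
    s.enqueue("data", p)
first = s.next_round()
```

Giving delay-sensitive traffic a larger weight is one way such a mechanism can improve fairness without starving lower-priority queues.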
Review of access, licenses and understandability of open datasets used in hydrology research
NASA Astrophysics Data System (ADS)
Falkenroth, Esa; Arheimer, Berit; Lagerbäck Adolphi, Emma
2015-04-01
The amount of open data available for hydrology research is continually growing. In the EU-funded project SWITCH-ON (Sharing Water-related Information to Tackle Changes in the Hydrosphere - for Operational Needs), we are addressing water concerns by exploring and exploiting the untapped potential of these new open data. This work is enabled by many ongoing efforts to facilitate the use of open data. For instance, a number of portals (such as the GEOSS Portal and the INSPIRE community geoportal) provide the means to search for such open data sets and open spatial data services. However, in general, the systematic use of available open data is still fairly uncommon in hydrology research. Factors that limit the (re)usability of a data set include: (1) accessibility, (2) understandability, and (3) licences. If you cannot access the data set, you cannot use it for research. If you cannot understand the data set, you cannot use it for research. Finally, if you are not permitted to use the data, you cannot use it for research. Early in the project, we sent out a questionnaire to our research partners (SMHI, Universita di Bologna, University of Bristol, Technische Universiteit Delft and Technische Universitaet Wien) to find out which data sets they were planning to use in their experiments. The result was a comprehensive list of useful open data sets. Later, this list was extended with additional information on data sets for planned commercial water-information products and services. With the list of 50 common data sets as a starting point, we reviewed issues related to access, understandability, and licence conditions. Regarding access, a majority of data sets were available through direct internet download via a well-known transfer protocol such as ftp or http. However, several data sets were found to be inaccessible due to server downtime, incorrect links, or problems with the host database management system.
One possible explanation is that many data sets have been assembled by research projects that are no longer funded; hence, their server infrastructure is less well maintained than large-scale operational services. Regarding understandability, the issues encountered were mainly due to incomplete documentation or metadata and problems with decoding binary formats. Ideally, open data sets should be represented in well-known formats and accompanied by sufficient documentation for the data set to be understood. Furthermore, machine-readable formats would be preferable. Here, the development efforts on WaterML, NetCDF, and other standards should improve the understandability of data sets over time, but in this review only a few data sets were provided in these well-known formats. Instead, the majority of data sets were stored in various text-based or binary formats, or even document-oriented formats such as PDF. For some binary formats, we could not find information on what software was necessary to decipher the files. Other domains such as meteorology have long-standing traditions of operational data exchange formats, whereas hydrology research is still quite fragmented and data exchange is usually done on a case-by-case basis. With the increased sharing of open data there is a good chance the situation will improve for data sets used in hydrology research. Finally, regarding licence issues, a high number of data sets did not have a clear statement on terms of use and limitations on access. In most cases the provider could be contacted regarding licensing issues.
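The access failures catalogued above (server downtime, incorrect links, broken back-end databases) can be detected mechanically. A minimal sketch of such a probe, assuming plain HTTP(S) download links:

```python
from urllib.error import HTTPError, URLError
from urllib.request import Request, urlopen

def check_access(url: str, timeout: float = 10.0) -> str:
    """Classify a data-set link as 'ok', 'http_error', or 'unreachable'.

    A HEAD probe is enough to distinguish a working link from a broken
    one (HTTP error such as 404) or an unreachable host (downtime,
    DNS failure), without downloading the data set itself.
    """
    try:
        req = Request(url, method="HEAD")
        with urlopen(req, timeout=timeout):
            return "ok"
    except HTTPError:          # server answered with 4xx/5xx
        return "http_error"
    except URLError:           # DNS failure, refused connection, timeout
        return "unreachable"
```

Running such a check periodically over a data-set inventory would catch the link rot this review found by hand.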
Shahzad, Aamir; Landry, René; Lee, Malrey; Xiong, Naixue; Lee, Jongho; Lee, Changhoon
2016-01-01
Substantial changes have occurred in the Information Technology (IT) sectors and, with these changes, the demand for remote access to field sensor information has increased. This allows visualization, monitoring, and control through various electronic devices, such as laptops, tablets, iPads, PCs, and cellular phones. The smart phone is considered a more reliable, faster, and more efficient device for accessing and monitoring industrial systems and their corresponding information interfaces anywhere and anytime. This study describes the deployment of a protocol whereby industrial system information can be securely accessed by cellular phones via a Supervisory Control And Data Acquisition (SCADA) server. To achieve the study goals, proprietary protocol interconnectivity with non-proprietary protocols and the usage of interconnectivity services are considered in detail. They support the visualization of SCADA system information and the related operations through smart phones. The intelligent sensors are configured and designated to process real information via cellular phones by employing information exchange services between the proprietary protocol and non-proprietary protocols. SCADA cellular access raises the issue of security flaws. For these challenges, a cryptography-based security method is considered and deployed, which could be considered part of a proprietary protocol. Subsequently, transmission flows from the smart phones through a cellular network. PMID:27314351
ERDDAP: Reducing Data Friction with an Open Source Data Platform
NASA Astrophysics Data System (ADS)
O'Brien, K.
2017-12-01
Data friction is not just an issue facing interdisciplinary research. Oftentimes, significant data friction exists even within disciplines. Issues of differing formats, limited metadata, and non-existent machine-to-machine data access all exist within disciplines and make successful interdisciplinary cooperation that much harder. Reducing data friction within disciplines is therefore a crucial first step toward better overall collaboration. ERDDAP, an open source data platform developed at NOAA's Southwest Fisheries Science Center, is well poised to improve data usability and understanding and to reduce data friction, in both single- and multi-disciplinary research. By virtue of its ability to integrate data of varying formats and provide RESTful user access to data and metadata, use of ERDDAP has grown substantially throughout the ocean data community. ERDDAP also supports standards such as the DAP data protocol, the Climate and Forecast (CF) metadata conventions, and the BagIt document standard for data archival. In this presentation, we discuss the advantages of using ERDDAP as a data platform. We also show specific use cases where utilizing ERDDAP has reduced friction within a single discipline (physical oceanography) and improved interdisciplinary collaboration as well.
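ERDDAP's RESTful access follows a fixed URL grammar: protocol (e.g. tabledap), dataset ID, requested file type, then a query listing variables and constraints. A sketch using a hypothetical server and dataset (the grammar is ERDDAP's; the server and names are placeholders):

```python
# Hypothetical ERDDAP installation and dataset ID; any real server
# follows the same grammar: /erddap/<protocol>/<datasetID>.<fileType>?<query>
BASE = "https://erddap.example.gov/erddap"

def tabledap_csv_url(dataset, variables, constraints):
    """Compose a tabledap request: comma-joined variables,
    then &-joined constraints."""
    query = ",".join(variables)
    for c in constraints:
        query += "&" + c
    return f"{BASE}/tabledap/{dataset}.csv?{query}"

url = tabledap_csv_url(
    "sst_buoys",
    ["time", "latitude", "longitude", "sst"],
    ["time>=2017-01-01", "sst!=NaN"],
)
```

Because only the file extension changes (.csv, .json, .nc, .png, and so on), the same query serves spreadsheets, scripts, and plots alike, which is much of what reduces friction in practice.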
Metadata management for high content screening in OMERO
Li, Simon; Besson, Sébastien; Blackburn, Colin; Carroll, Mark; Ferguson, Richard K.; Flynn, Helen; Gillen, Kenneth; Leigh, Roger; Lindner, Dominik; Linkert, Melissa; Moore, William J.; Ramalingam, Balaji; Rozbicki, Emil; Rustici, Gabriella; Tarkowska, Aleksandra; Walczysko, Petr; Williams, Eleanor; Allan, Chris; Burel, Jean-Marie; Moore, Josh; Swedlow, Jason R.
2016-01-01
High content screening (HCS) experiments create a classic data management challenge—multiple, large sets of heterogeneous structured and unstructured data, that must be integrated and linked to produce a set of “final” results. These different data include images, reagents, protocols, analytic output, and phenotypes, all of which must be stored, linked and made accessible for users, scientists, collaborators and where appropriate the wider community. The OME Consortium has built several open source tools for managing, linking and sharing these different types of data. The OME Data Model is a metadata specification that supports the image data and metadata recorded in HCS experiments. Bio-Formats is a Java library that reads recorded image data and metadata and includes support for several HCS screening systems. OMERO is an enterprise data management application that integrates image data, experimental and analytic metadata and makes them accessible for visualization, mining, sharing and downstream analysis. We discuss how Bio-Formats and OMERO handle these different data types, and how they can be used to integrate, link and share HCS experiments in facilities and public data repositories. OME specifications and software are open source and are available at https://www.openmicroscopy.org. PMID:26476368
Security Implications of OPC, OLE, DCOM, and RPC in Control Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2006-01-01
OPC is a collection of software programming standards and interfaces used in the process control industry. It is intended to provide open connectivity and vendor equipment interoperability. The use of OPC technology simplifies the development of control systems that integrate components from multiple vendors and support multiple control protocols. OPC-compliant products are available from most control system vendors and are widely used in the process control industry. OPC was originally known as OLE for Process Control; the first standards for OPC were based on underlying services in the Microsoft Windows computing environment. These underlying services (OLE [Object Linking and Embedding], DCOM [Distributed Component Object Model], and RPC [Remote Procedure Call]) have been the source of many severe security vulnerabilities. It is not feasible to automatically apply vendor patches and service packs to mitigate these vulnerabilities in a control systems environment. Control systems using the original OPC data access technology can thus inherit the vulnerabilities associated with these services. Current OPC standardization efforts are moving away from the original focus on Microsoft protocols, with a distinct trend toward web-based protocols that are independent of any particular operating system. However, the installed base of OPC equipment consists mainly of legacy implementations of the OLE for Process Control protocols.
An Outline of Data Aggregation Security in Heterogeneous Wireless Sensor Networks.
Boubiche, Sabrina; Boubiche, Djallel Eddine; Bilami, Azzedine; Toral-Cruz, Homero
2016-04-12
Data aggregation processes aim to reduce the amount of exchanged data in wireless sensor networks and consequently minimize the packet overhead and optimize energy efficiency. Securing the data aggregation process is a real challenge, since the aggregation nodes must access the relayed data to apply the aggregation functions. The data aggregation security problem has been widely addressed in classical homogeneous wireless sensor networks; however, most of the proposed security protocols cannot guarantee a high level of security since the sensor nodes' resources are limited. Heterogeneous wireless sensor networks have recently emerged as a new wireless sensor network category which expands the sensor nodes' resources and capabilities. These new kinds of WSNs have opened new research opportunities, where security represents a most attractive area. Indeed, robust and high-security-level algorithms can be used to secure the data aggregation at the heterogeneous aggregation nodes, which is impossible in classical homogeneous WSNs. Contrary to homogeneous sensor networks, the data aggregation security problem is still not sufficiently covered, and the proposed data aggregation security protocols are few. To address this recent research area, this paper describes the data aggregation security problem in heterogeneous wireless sensor networks and surveys a few proposed security protocols. A classification and evaluation of the existing protocols is also introduced based on the adopted data aggregation security approach.
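To make the core tension concrete (an aggregator should be able to compute the sum without learning individual readings), here is a toy sketch of pairwise additive masking, one well-known secure-aggregation idea; it illustrates the problem rather than reproducing any protocol from the survey:

```python
import random

def masked_reports(readings, modulus=2**16):
    """Mask each node's reading so that only the sum is recoverable.

    Each pair of nodes (i, j) shares a random mask m: node i adds m
    and node j subtracts it, so all masks cancel in the total while
    every individual report looks uniformly random to the aggregator.
    """
    n = len(readings)
    masks = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            m = random.randrange(modulus)
            masks[i][j] = m      # node i adds the shared mask
            masks[j][i] = -m     # node j subtracts it
    return [(readings[i] + sum(masks[i])) % modulus for i in range(n)]

readings = [17, 40, 5]
total = sum(masked_reports(readings)) % 2**16
```

In heterogeneous WSNs, the resource-rich aggregation nodes can afford heavier cryptographic machinery than this toy masking, which is precisely the opportunity the paper highlights.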
OpenDrop: An Integrated Do-It-Yourself Platform for Personal Use of Biochips
Alistar, Mirela; Gaudenz, Urs
2017-01-01
Biochips, or digital labs-on-chip, are developed with the purpose of being used by laboratory technicians or biologists in laboratories or clinics. In this article, we expand this vision with the goal of enabling everyone, regardless of their expertise, to use biochips for their own personal purposes. We developed OpenDrop, an integrated electromicrofluidic platform that allows users to develop and program their own bio-applications. We address the main challenges that users may encounter: accessibility, bio-protocol design and interaction with microfluidics. OpenDrop consists of a do-it-yourself biochip, an automated software tool with visual interface and a detailed technique for at-home operations of microfluidics. We report on two years of use of OpenDrop, released as an open-source platform. Our platform attracted a highly diverse user base with participants originating from maker communities, academia and industry. Our findings show that 47% of attempts to replicate OpenDrop were successful, the main challenge remaining the assembly of the device. In terms of usability, the users managed to operate their platforms at home and are working on designing their own bio-applications. Our work provides a step towards a future in which everyone will be able to create microfluidic devices for their personal applications, thereby democratizing parts of health care. PMID:28952524
ERIC Educational Resources Information Center
Armbruster, Chris
2008-01-01
Online, open access is the superior model for scholarly communication. A variety of scientific communities in physics, the life sciences and economics have gone furthest in innovating their scholarly communication through open access, enhancing accessibility for scientists, students and the interested public. Open access enjoys a comparative…
Method-centered digital communities on protocols.io for fast-paced scientific innovation.
Kindler, Lori; Stoliartchouk, Alexei; Teytelman, Leonid; Hurwitz, Bonnie L
2016-01-01
The Internet has enabled online social interaction for scientists beyond physical meetings and conferences. Yet despite these innovations in communication, dissemination of methods is often relegated to just academic publishing. Further, these methods remain static, with subsequent advances published elsewhere and unlinked. For communities undergoing fast-paced innovation, researchers need new capabilities to share, obtain feedback, and publish methods at the forefront of scientific development. For example, a renaissance in virology is now underway given the new metagenomic methods to sequence viral DNA directly from an environment. Metagenomics makes it possible to "see" natural viral communities that could not be previously studied through culturing methods. Yet, the knowledge of specialized techniques for the production and analysis of viral metagenomes remains in a subset of labs. This problem is common to any community using and developing emerging technologies and techniques. We developed new capabilities to create virtual communities in protocols.io, an open access platform, for disseminating protocols and knowledge at the forefront of scientific development. To demonstrate these capabilities, we present a virology community forum called VERVENet. These new features allow virology researchers to share protocols and their annotations and optimizations, connect with the broader virtual community to share knowledge, job postings, conference announcements through a common online forum, and discover the current literature through personalized recommendations to promote discussion of cutting edge research. Virtual communities in protocols.io enhance a researcher's ability to: discuss and share protocols, connect with fellow community members, and learn about new and innovative research in the field. The web-based software for developing virtual communities is free to use on protocols.io. Data are available through public APIs at protocols.io.
An improved ATAC-seq protocol reduces background and enables interrogation of frozen tissues.
Corces, M Ryan; Trevino, Alexandro E; Hamilton, Emily G; Greenside, Peyton G; Sinnott-Armstrong, Nicholas A; Vesuna, Sam; Satpathy, Ansuman T; Rubin, Adam J; Montine, Kathleen S; Wu, Beijing; Kathiria, Arwa; Cho, Seung Woo; Mumbach, Maxwell R; Carter, Ava C; Kasowski, Maya; Orloff, Lisa A; Risca, Viviana I; Kundaje, Anshul; Khavari, Paul A; Montine, Thomas J; Greenleaf, William J; Chang, Howard Y
2017-10-01
We present Omni-ATAC, an improved ATAC-seq protocol for chromatin accessibility profiling that works across multiple applications with substantial improvement of signal-to-background ratio and information content. The Omni-ATAC protocol generates chromatin accessibility profiles from archival frozen tissue samples and 50-μm sections, revealing the activities of disease-associated DNA elements in distinct human brain structures. The Omni-ATAC protocol enables the interrogation of personal regulomes in tissue context and translational studies.
Abbreviated MRI Protocols: Wave of the Future for Breast Cancer Screening.
Chhor, Chloe M; Mercado, Cecilia L
2017-02-01
The purpose of this article is to describe the use of abbreviated breast MRI protocols for improving access to screening for women at intermediate risk. Breast MRI is not a cost-effective modality for screening women at intermediate risk, including those with dense breast tissue as the only risk. Abbreviated breast MRI protocols have been proposed as a way of achieving efficiency and rapid throughput. Use of these abbreviated protocols may increase availability and provide women with greater access to breast MRI.
Xrootd in dCache - design and experiences
NASA Astrophysics Data System (ADS)
Behrmann, Gerd; Ozerov, Dmitry; Zangerl, Thomas
2011-12-01
dCache is a well established distributed storage solution used in both high energy physics computing and other disciplines. An overview of the implementation of the xrootd data access protocol within dCache is presented. The performance of various access mechanisms is studied and compared, and it is concluded that our implementation is as performant as other protocols. This makes dCache a compelling alternative to the Scalla software suite implementation of xrootd, with added value from broad protocol support, including the IETF approved NFS 4.1 protocol.
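As a rough sketch of what dCache's broad protocol support means in practice, the toy code below composes access URLs for the same stored file via different doors. Hostnames, port numbers, and paths are invented examples, not a real deployment; real sites publish their own door endpoints.

```python
# Map of protocol name -> (URL scheme, example door port). The ports are
# conventional defaults used here purely for illustration.
DOORS = {
    "xrootd": ("root", 1094),
    "dcap": ("dcap", 22125),
    "nfs4.1": (None, None),  # NFS 4.1 is mounted locally, not addressed by URL
}

def access_url(protocol, host, path):
    """Build an access URL (or local path, for NFS) for a stored file."""
    scheme, port = DOORS[protocol]
    if scheme is None:
        return path  # e.g. a path under a local NFS 4.1 mount point
    return f"{scheme}://{host}:{port}/{path.lstrip('/')}"

print(access_url("xrootd", "dcache.example.org", "/pnfs/example.org/data/file.root"))
```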
30 CFR 291.113 - What actions may MMS take to remedy denial of open and nondiscriminatory access?
Code of Federal Regulations, 2010 CFR
2010-07-01
... open and nondiscriminatory access? 291.113 Section 291.113 Mineral Resources MINERALS MANAGEMENT SERVICE, DEPARTMENT OF THE INTERIOR APPEALS OPEN AND NONDISCRIMINATORY ACCESS TO OIL AND GAS PIPELINES... grantee or transporter has not provided open access or nondiscriminatory access, then the decision will...
Giglia, E
2010-09-01
This contribution is aimed at presenting a sort of "state of the art" of Open Access on the occasion of the 2010 international Open Access Week, to be held from October 18 to October 24. We shall see facts and figures about open archives and the mandates to deposit; about Open Access journals; about impact and citation advantages for the researchers, and about economic sustainability.
Open access: changing global science publishing.
Gasparyan, Armen Yuri; Ayvazyan, Lilit; Kitas, George D
2013-08-01
The article reflects on open access as a strategy of changing the quality of science communication globally. Successful examples of open-access journals are presented to highlight implications of archiving in open digital repositories for the quality and citability of research output. Advantages and downsides of gold, green, and hybrid models of open access operating in diverse scientific environments are described. It is assumed that open access is a global trend which influences the workflow in scholarly journals, changing their quality, credibility, and indexability.
Design of the frame structure for a multiservice interactive system using ATM-PON
NASA Astrophysics Data System (ADS)
Nam, Jae-Hyun; Jang, Jongwook; Lee, Jung-Tae
1998-10-01
The MAC (Medium Access Control) protocol controls the B-NT1s' (Optical Network Units') access to the shared capacity on the PON. This protocol is particularly important when TDMA (Time Division Multiple Access) multiplexing is used on the upstream, since some kind of access protocol has to be implemented to control the upstream traffic. There are roughly two different approaches to using request cells: a collision-free scheme, or one in which collisions in a request slot are allowed. The objective of this paper is to describe a MAC-protocol structure that supports both approaches and hybrids of the two. In our paper we guarantee the QoS (Quality of Service) of each B-NT1 through the LOC, LOV, and LOA fields, which are the length fields of the cells transmitted by each B-NT1. Each B-NT1 transmits its request status in a request cell.
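The difference between the two request-cell strategies can be sketched with a toy simulation: dedicated request slots can never collide, while shared (contention) slots can. All frame parameters below are arbitrary illustrative values, not taken from the paper.

```python
import random

def contention_collisions(n_units, n_slots, rng):
    """Count request slots hit by two or more units in one frame
    when every unit picks a shared slot at random (contention mode)."""
    hits = {}
    for _ in range(n_units):
        slot = rng.randrange(n_slots)
        hits[slot] = hits.get(slot, 0) + 1
    return sum(1 for count in hits.values() if count >= 2)

def dedicated_collisions(n_units):
    """Collision-free mode: each unit owns its own request slot."""
    return 0

rng = random.Random(1)
print("contention collisions:", contention_collisions(16, 8, rng))
print("dedicated collisions:", dedicated_collisions(16))
```

The contention scheme trades occasional collisions (and retransmissions) for fewer reserved slots per frame, which is the efficiency/complexity trade-off a hybrid MAC structure tries to balance.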
NASA Astrophysics Data System (ADS)
Zheng, Jun; Ansari, Nirwan
2005-06-01
Call for Papers: Optical Access Networks With the wide deployment of fiber-optic technology over the past two decades, we have witnessed a tremendous growth of bandwidth capacity in the backbone networks of today's telecommunications infrastructure. However, access networks, which cover the "last-mile" areas and serve numerous residential and small business users, have not been scaled up commensurately. The local subscriber lines for telephone and cable television are still using twisted pairs and coaxial cables. Most residential connections to the Internet are still through dial-up modems operating at a low speed on twisted pairs. As the demand for access bandwidth increases with emerging high-bandwidth applications, such as distance learning, high-definition television (HDTV), and video on demand (VoD), the last-mile access networks have become a bandwidth bottleneck in today's telecommunications infrastructure. To ease this bottleneck, it is imperative to provide sufficient bandwidth capacity in the access networks to open the bottleneck and thus present more opportunities for the provisioning of multiservices. Optical access solutions promise huge bandwidth to service providers and low-cost high-bandwidth services to end users and are therefore widely considered the technology of choice for next-generation access networks. To realize the vision of optical access networks, however, many key issues still need to be addressed, such as network architectures, signaling protocols, and implementation standards. The major challenges lie in the fact that an optical solution must be not only robust, scalable, and flexible, but also implemented at a low cost comparable to that of existing access solutions in order to increase the economic viability of many potential high-bandwidth applications. In recent years, optical access networks have been receiving tremendous attention from both academia and industry. 
A large number of research activities have been carried out or are now underway in this hot area. The purpose of this feature issue is to expose the networking community to the latest research breakthroughs and progress in the area of optical access networks. This feature issue aims to present a collection of papers that focus on the state-of-the-art research in various networking aspects of optical access networks. Original papers are solicited from all researchers involved in the area of optical access networks. Topics of interest include, but are not limited to: Optical access network architectures and protocols; Passive optical networks (BPON, EPON, GPON, etc.); Active optical networks; Multiple access control; Multiservices and QoS provisioning; Network survivability; Field trials and standards; Performance modeling and analysis.
Collaborative development of predictive toxicology applications
2010-01-01
OpenTox provides an interoperable, standards-based Framework for the support of predictive toxicology data management, algorithms, modelling, validation and reporting. It is relevant to satisfying the chemical safety assessment requirements of the REACH legislation as it supports access to experimental data, (Quantitative) Structure-Activity Relationship models, and toxicological information through an integrating platform that adheres to regulatory requirements and OECD validation principles. Initial research defined the essential components of the Framework including the approach to data access, schema and management, use of controlled vocabularies and ontologies, architecture, web service and communications protocols, and selection and integration of algorithms for predictive modelling. OpenTox provides end-user oriented tools to non-computational specialists, risk assessors, and toxicological experts in addition to Application Programming Interfaces (APIs) for developers of new applications. OpenTox actively supports public standards for data representation, interfaces, vocabularies and ontologies, Open Source approaches to core platform components, and community-based collaboration approaches, so as to progress system interoperability goals. The OpenTox Framework includes APIs and services for compounds, datasets, features, algorithms, models, ontologies, tasks, validation, and reporting which may be combined into multiple applications satisfying a variety of different user needs. OpenTox applications are based on a set of distributed, interoperable OpenTox API-compliant REST web services. The OpenTox approach to ontology allows for efficient mapping of complementary data coming from different datasets into a unifying structure having a shared terminology and representation. 
Two initial OpenTox applications are presented as an illustration of the potential impact of OpenTox for high-quality and consistent structure-activity relationship modelling of REACH-relevant endpoints: ToxPredict which predicts and reports on toxicities for endpoints for an input chemical structure, and ToxCreate which builds and validates a predictive toxicity model based on an input toxicology dataset. Because of the extensible nature of the standardised Framework design, barriers of interoperability between applications and content are removed, as the user may combine data, models and validation from multiple sources in a dependable and time-effective way. PMID:20807436
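As a hedged illustration of the REST style described above, the snippet below composes OpenTox-like resource URIs for compounds, datasets, algorithms, models, and tasks. The host and the identifier are placeholders; consult the OpenTox API documentation for the actual resource layout.

```python
from urllib.parse import quote

HOST = "https://opentox.example.org"  # placeholder service root, not a live server

def resource_uri(kind, identifier):
    """Build a web-resource URI such as <host>/compound/<id> or <host>/model/<id>."""
    assert kind in {"compound", "dataset", "algorithm", "model", "task"}
    return f"{HOST}/{kind}/{quote(identifier)}"

print(resource_uri("compound", "50-00-0"))  # a compound addressed by an example ID
```

Because every entity is addressable this way, applications like ToxPredict can be composed by chaining plain HTTP calls across distributed services.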
Collaborative development of predictive toxicology applications.
Hardy, Barry; Douglas, Nicki; Helma, Christoph; Rautenberg, Micha; Jeliazkova, Nina; Jeliazkov, Vedrin; Nikolova, Ivelina; Benigni, Romualdo; Tcheremenskaia, Olga; Kramer, Stefan; Girschick, Tobias; Buchwald, Fabian; Wicker, Joerg; Karwath, Andreas; Gütlein, Martin; Maunz, Andreas; Sarimveis, Haralambos; Melagraki, Georgia; Afantitis, Antreas; Sopasakis, Pantelis; Gallagher, David; Poroikov, Vladimir; Filimonov, Dmitry; Zakharov, Alexey; Lagunin, Alexey; Gloriozova, Tatyana; Novikov, Sergey; Skvortsova, Natalia; Druzhilovsky, Dmitry; Chawla, Sunil; Ghosh, Indira; Ray, Surajit; Patel, Hitesh; Escher, Sylvia
2010-08-31
OpenTox provides an interoperable, standards-based Framework for the support of predictive toxicology data management, algorithms, modelling, validation and reporting. It is relevant to satisfying the chemical safety assessment requirements of the REACH legislation as it supports access to experimental data, (Quantitative) Structure-Activity Relationship models, and toxicological information through an integrating platform that adheres to regulatory requirements and OECD validation principles. Initial research defined the essential components of the Framework including the approach to data access, schema and management, use of controlled vocabularies and ontologies, architecture, web service and communications protocols, and selection and integration of algorithms for predictive modelling. OpenTox provides end-user oriented tools to non-computational specialists, risk assessors, and toxicological experts in addition to Application Programming Interfaces (APIs) for developers of new applications. OpenTox actively supports public standards for data representation, interfaces, vocabularies and ontologies, Open Source approaches to core platform components, and community-based collaboration approaches, so as to progress system interoperability goals. The OpenTox Framework includes APIs and services for compounds, datasets, features, algorithms, models, ontologies, tasks, validation, and reporting which may be combined into multiple applications satisfying a variety of different user needs. OpenTox applications are based on a set of distributed, interoperable OpenTox API-compliant REST web services.
The OpenTox approach to ontology allows for efficient mapping of complementary data coming from different datasets into a unifying structure having a shared terminology and representation. Two initial OpenTox applications are presented as an illustration of the potential impact of OpenTox for high-quality and consistent structure-activity relationship modelling of REACH-relevant endpoints: ToxPredict which predicts and reports on toxicities for endpoints for an input chemical structure, and ToxCreate which builds and validates a predictive toxicity model based on an input toxicology dataset. Because of the extensible nature of the standardised Framework design, barriers of interoperability between applications and content are removed, as the user may combine data, models and validation from multiple sources in a dependable and time-effective way.
Making geospatial data in ASF archive readily accessible
NASA Astrophysics Data System (ADS)
Gens, R.; Hogenson, K.; Wolf, V. G.; Drew, L.; Stern, T.; Stoner, M.; Shapran, M.
2015-12-01
The way geospatial data is searched, managed, processed and used has changed significantly in recent years. A data archive such as the one at the Alaska Satellite Facility (ASF), one of NASA's twelve interlinked Distributed Active Archive Centers (DAACs), used to be searched solely via user interfaces that were specifically developed for its particular archive and data sets. ASF then moved to using an application programming interface (API) that defined a set of routines, protocols, and tools for distributing the geospatial information stored in the database in real time. This provided more flexible access to the geospatial data. Yet, it was up to the user to develop the tools to get more tailored access to the data they needed. We present two new approaches for serving data to users. In response to the recent Nepal earthquake we developed a data feed for distributing ESA's Sentinel data. Users can subscribe to the data feed and are provided with the relevant metadata the moment a new data set is available for download. The second approach was an Open Geospatial Consortium (OGC) web feature service (WFS). The WFS hosts the metadata along with a direct link from which the data can be downloaded. It uses the open-source GeoServer software (Youngblood and Iacovella, 2013) and provides an interface to include the geospatial information in the archive directly into the user's geographic information system (GIS) as an additional data layer. Both services run on top of a geospatial PostGIS database, an open-source geographic extension for the PostgreSQL object-relational database (Marquez, 2015). Marquez, A., 2015. PostGIS essentials. Packt Publishing, 198 p. Youngblood, B. and Iacovella, S., 2013. GeoServer Beginner's Guide, Packt Publishing, 350 p.
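A WFS request of the kind served by GeoServer can be illustrated as follows; the endpoint URL and feature-type name are placeholders, while the key-value parameters follow the standard OGC WFS convention.

```python
from urllib.parse import urlencode

WFS_ENDPOINT = "https://geoserver.example.org/wfs"  # placeholder, not ASF's real endpoint

def getfeature_url(type_name, max_features=50):
    """Build a WFS 1.1.0 GetFeature request URL for a named feature type."""
    params = {
        "service": "WFS",
        "version": "1.1.0",
        "request": "GetFeature",
        "typeName": type_name,          # e.g. a granule-metadata layer (assumed name)
        "maxFeatures": max_features,
        "outputFormat": "application/json",
    }
    return f"{WFS_ENDPOINT}?{urlencode(params)}"

print(getfeature_url("asf:granule_metadata"))
```

A GIS client pointed at such a URL receives the metadata (with download links) as an ordinary vector layer.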
An integrated content and metadata based retrieval system for art.
Lewis, Paul H; Martinez, Kirk; Abas, Fazly Salleh; Fauzi, Mohammad Faizal Ahmad; Chan, Stephen C Y; Addis, Matthew J; Boniface, Mike J; Grimwood, Paul; Stevenson, Alison; Lahanier, Christian; Stevenson, James
2004-03-01
A new approach to image retrieval is presented in the domain of museum and gallery image collections. Specialist algorithms, developed to address specific retrieval tasks, are combined with more conventional content and metadata retrieval approaches, and implemented within a distributed architecture to provide cross-collection searching and navigation in a seamless way. External systems can access the different collections using interoperability protocols and open standards, which were extended to accommodate content based as well as text based retrieval paradigms. After a brief overview of the complete system, we describe the novel design and evaluation of some of the specialist image analysis algorithms including a method for image retrieval based on sub-image queries, retrievals based on very low quality images and retrieval using canvas crack patterns. We show how effective retrieval results can be achieved by real end-users consisting of major museums and galleries, accessing the distributed but integrated digital collections.
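The compare-and-rank loop at the core of content-based retrieval can be sketched in a few lines; this toy uses simple grey-level histogram intersection and is not one of the specialist algorithms described in the paper. "Images" here are flat lists of 0-255 grey levels.

```python
def histogram(pixels, bins=8):
    """Normalized grey-level histogram of a flat pixel list."""
    hist = [0] * bins
    for p in pixels:
        hist[p * bins // 256] += 1
    total = len(pixels)
    return [h / total for h in hist]

def intersection(h1, h2):
    """Histogram intersection similarity in [0, 1]; 1.0 means identical."""
    return sum(min(a, b) for a, b in zip(h1, h2))

def rank(query, collection):
    """Rank (name, pixels) pairs by similarity to the query image."""
    q = histogram(query)
    scored = [(name, intersection(q, histogram(img))) for name, img in collection]
    return sorted(scored, key=lambda s: -s[1])

dark = [10, 20, 30, 40] * 4
bright = [200, 210, 220, 230] * 4
results = rank(dark, [("dark_copy", list(dark)), ("bright", bright)])
print(results[0][0])  # the most similar image
```

Real systems replace the histogram with task-specific features (sub-image matches, crack patterns, etc.) but keep the same query-score-rank structure.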
Using cloud computing infrastructure with CloudBioLinux, CloudMan, and Galaxy.
Afgan, Enis; Chapman, Brad; Jadan, Margita; Franke, Vedran; Taylor, James
2012-06-01
Cloud computing has revolutionized availability and access to computing and storage resources, making it possible to provision a large computational infrastructure with only a few clicks in a Web browser. However, those resources are typically provided in the form of low-level infrastructure components that need to be procured and configured before use. In this unit, we demonstrate how to utilize cloud computing resources to perform open-ended bioinformatic analyses, with fully automated management of the underlying cloud infrastructure. By combining three projects, CloudBioLinux, CloudMan, and Galaxy, into a cohesive unit, we have enabled researchers to gain access to more than 100 preconfigured bioinformatics tools and gigabytes of reference genomes on top of the flexible cloud computing infrastructure. The protocol demonstrates how to set up the available infrastructure and how to use the tools via a graphical desktop interface, a parallel command-line interface, and the Web-based Galaxy interface.
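Beyond the graphical interfaces mentioned above, a Galaxy server can also be scripted against its REST API. The server URL and key below are placeholders, and the endpoint names, while modeled on Galaxy's /api convention, should be checked against your server's API documentation.

```python
from urllib.parse import urlencode

GALAXY = "https://galaxy.example.org"   # placeholder cloud-launched Galaxy server
API_KEY = "REPLACE_WITH_YOUR_KEY"       # per-user key from the Galaxy UI

def api_url(endpoint, **params):
    """Build an authenticated GET URL against the server's /api tree."""
    params["key"] = API_KEY
    return f"{GALAXY}/api/{endpoint}?{urlencode(params)}"

print(api_url("histories"))        # list the user's analysis histories
print(api_url("tools", q="bwa"))   # search installed tools, e.g. for "bwa"
```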
Stereoselective Luche reduction of deoxynivalenol and three of its acetylated derivatives at C8.
Fruhmann, Philipp; Hametner, Christian; Mikula, Hannes; Adam, Gerhard; Krska, Rudolf; Fröhlich, Johannes
2014-01-10
The trichothecene mycotoxin deoxynivalenol (DON) is a well-known and common contaminant in food and feed. Acetylated derivatives and other biosynthetic precursors can occur together with the main toxin. A key biosynthetic step towards DON involves an oxidation of the 8-OH group of 7,8-dihydroxycalonectrin. Since analytical standards for the intermediates are not available and these intermediates are therefore rarely studied, we aimed for a synthetic method to invert this reaction, making a series of calonectrin-derived precursors accessible. We did this by developing an efficient protocol for stereoselective Luche reduction at C8. This method was used to access 3,7,8,15-tetrahydroxyscirpene, 3-deacetyl-7,8-dihydroxycalonectrin, 15-deacetyl-7,8-dihydroxycalonectrin and 7,8-dihydroxycalonectrin, which were characterized using several NMR techniques. Besides developing a method that could in principle be used for all type B trichothecenes, we opened a synthetic route towards different acetylated calonectrins.
Cool Apps: Building Cryospheric Data Applications With Standards-Based Service Oriented Architecture
NASA Astrophysics Data System (ADS)
Collins, J. A.; Truslove, I.; Billingsley, B. W.; Oldenburg, J.; Brodzik, M.; Lewis, S.; Liu, M.
2012-12-01
The National Snow and Ice Data Center (NSIDC) holds a large collection of cryospheric data, and is involved in a number of informatics research and development projects aimed at improving the discoverability and accessibility of these data. To develop high-quality software in a timely manner, we have adopted a Service-Oriented Architecture (SOA) approach for our core technical infrastructure development. Data services at NSIDC are internally exposed to other tools and applications through standards-based service interfaces. These standards include OAI-PMH (Open Archives Initiative Protocol for Metadata Harvesting), various OGC (Open Geospatial Consortium) standards including WMS (Web Map Service) and WFS (Web Feature Service), ESIP (Federation of Earth Sciences Information Partners) OpenSearch, and NSIDC-specific RESTful services. By taking a standards-based approach, we are able to use off-the-shelf tools and libraries to consume, translate and broker these data services, and thus develop applications faster. Additionally, by exposing public interfaces to these services we provide valuable data services to technical collaborators; for example, NASA Reverb (http://reverb.echo.nasa.gov) uses NSIDC's WMS services. Our latest generation of web applications consume these data services directly. The most complete example of this is the Operation IceBridge Data Portal (http://nsidc.org/icebridge/portal) which depends on many of the aforementioned services, and clearly exhibits many of the advantages of building applications atop a service-oriented architecture. This presentation outlines the architectural approach and components and open standards and protocols adopted at NSIDC, demonstrates the interactions and uses of public and internal service interfaces currently powering applications including the IceBridge Data Portal, and outlines the benefits and challenges of this approach.
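One of the standards listed above, OAI-PMH, can be illustrated with a small request builder; the repository base URL is a placeholder, while the verb and metadataPrefix parameters come from the OAI-PMH 2.0 specification.

```python
from urllib.parse import urlencode

OAI_BASE = "https://oai.example.org/oai"  # placeholder repository endpoint

def list_records_url(metadata_prefix="oai_dc", set_spec=None):
    """Build an OAI-PMH ListRecords harvesting request URL."""
    params = {"verb": "ListRecords", "metadataPrefix": metadata_prefix}
    if set_spec is not None:
        params["set"] = set_spec  # optional selective-harvesting set
    return f"{OAI_BASE}?{urlencode(params)}"

print(list_records_url())
```

A harvester fetches this URL, parses the XML response, and follows resumption tokens to page through the full metadata catalogue, which is how external systems can aggregate a data centre's holdings without bespoke integration.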
NASA Astrophysics Data System (ADS)
Vines, Aleksander; Hansen, Morten W.; Korosov, Anton
2017-04-01
Existing international and Norwegian infrastructure projects, e.g., NorDataNet, NMDC and NORMAP, provide open data access through the OPeNDAP protocol following the conventions for CF (Climate and Forecast) metadata, designed to promote the processing and sharing of files created with the NetCDF application programming interface (API). This approach is now also being implemented in the Norwegian Sentinel Data Hub (satellittdata.no) to provide satellite EO data to the user community. Simultaneously with providing simplified and unified data access, these projects also seek to use and establish common standards for use and discovery metadata. This then allows development of standardized tools for data search and (subset) streaming over the internet to perform actual scientific analysis. A combination of software tools, which we call a Scientific Platform as a Service (SPaaS), will take advantage of these opportunities to harmonize and streamline the search, retrieval and analysis of integrated satellite and auxiliary observations of the oceans in a seamless system. The SPaaS is a cloud solution for integration of analysis tools with scientific datasets via an API. The core part of the SPaaS is a distributed metadata catalog to store granular metadata describing the structure, location and content of available satellite, model, and in situ datasets. The analysis tools include software for visualization (also online), interactive in-depth analysis, and server-based processing chains. The API conveys search requests between system nodes (i.e., interactive and server tools) and provides easy access to the metadata catalog, data repositories, and the tools. The SPaaS components are integrated in virtual machines, of which provisioning and deployment are automated using existing state-of-the-art open-source tools (e.g., Vagrant, Ansible, Docker).
The open-source code for scientific tools and virtual machine configurations is under version control at https://github.com/nansencenter/, and is coupled to an online continuous integration system (e.g., Travis CI).
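The OPeNDAP subsetting pattern underlying this data access can be sketched as follows; the dataset URL and variable name are placeholders, and the bracketed constraint expression is the standard OPeNDAP index-range syntax.

```python
from urllib.parse import quote

DATASET = "https://opendap.example.org/data/sst_daily.nc"  # placeholder dataset URL

def subset_url(variable, i0, i1, j0, j1):
    """Request variable[i0:i1][j0:j1] via an OPeNDAP constraint expression,
    so only the requested slab is streamed rather than the whole file."""
    constraint = f"{variable}[{i0}:{i1}][{j0}:{j1}]"
    return f"{DATASET}.dods?{quote(constraint, safe='[]:')}"

print(subset_url("sst", 0, 10, 0, 20))
```

Tools such as netCDF clients accept URLs of this form directly, which is what lets analysis code treat a remote archive as if it were a local file.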
Riera, M; Aibar, E
2013-05-01
Some studies suggest that open access articles are more often cited than non-open access articles. However, the relationship between open access and citation counts in a discipline such as intensive care medicine has not been studied to date. The present article analyzes the effect of open access publishing of scientific articles in intensive care medicine journals in terms of citation counts. We evaluated a total of 161 articles (76% being non-open access articles) published in Intensive Care Medicine in the year 2008. Citation data were compared between the two groups up until April 30, 2011. Potentially confounding variables for citation counts were adjusted for in a linear multiple regression model. The median number (interquartile range) of citations of non-open access articles was 8 (4-12) versus 9 (6-18) in the case of open access articles (p=0.084). In the highest citation range (>8), the citation count was 13 (10-16) and 18 (13-21) (p=0.008), respectively. The mean follow-up was 37.5 ± 3 months in both groups. In the 30-35 months after publication, the average number (mean ± standard deviation) of citations per article per month of non-open access articles was 0.28 ± 0.6 versus 0.38 ± 0.7 in the case of open access articles (p=0.043). Independent factors for citation advantage were the Hirsch index of the first signing author (β=0.207; p=0.015) and open access status (β=3.618; p=0.006). Open access publishing and the Hirsch index of the first signing author increase the impact of scientific articles. The open access advantage is greater for the more highly cited articles, and appears in the 30-35 months after publication. Copyright © 2012 Elsevier España, S.L. and SEMICYUC. All rights reserved.
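The median and interquartile-range summaries reported above can be reproduced on made-up data with the standard library; the citation counts below are illustrative, not the study's data.

```python
import statistics

# Invented per-article citation counts, standing in for one of the study groups.
citations = [2, 4, 5, 6, 8, 8, 9, 12, 15, 18, 25]

median = statistics.median(citations)
q1, _, q3 = statistics.quantiles(citations, n=4)  # quartile cut points
print(f"median {median} (IQR {q1}-{q3})")
```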
Moretti, Rocco; Lyskov, Sergey; Das, Rhiju; Meiler, Jens; Gray, Jeffrey J
2018-01-01
The Rosetta molecular modeling software package provides a large number of experimentally validated tools for modeling and designing proteins, nucleic acids, and other biopolymers, with new protocols being added continually. While freely available to academic users, external usage is limited by the need for expertise in the Unix command line environment. To make Rosetta protocols available to a wider audience, we previously created a web server called Rosetta Online Server that Includes Everyone (ROSIE), which provides a common environment for hosting web-accessible Rosetta protocols. Here we describe a simplification of the ROSIE protocol specification format, one that permits easier implementation of Rosetta protocols. Whereas the previous format required creating multiple separate files in different locations, the new format allows specification of the protocol in a single file. This new, simplified protocol specification has more than doubled the number of Rosetta protocols available under ROSIE. These new applications include pKa determination, lipid accessibility calculation, ribonucleic acid redesign, protein-protein docking, protein-small molecule docking, symmetric docking, antibody docking, cyclic toxin docking, critical binding peptide determination, and mapping small molecule binding sites. ROSIE is freely available to academic users at http://rosie.rosettacommons.org. © 2017 The Protein Society.
Raspberry Pi-powered imaging for plant phenotyping.
Tovar, Jose C; Hoyer, J Steen; Lin, Andy; Tielking, Allison; Callen, Steven T; Elizabeth Castillo, S; Miller, Michael; Tessman, Monica; Fahlgren, Noah; Carrington, James C; Nusinow, Dmitri A; Gehan, Malia A
2018-03-01
Image-based phenomics is a powerful approach to capture and quantify plant diversity. However, commercial platforms that make consistent image acquisition easy are often cost-prohibitive. To make high-throughput phenotyping methods more accessible, low-cost microcomputers and cameras can be used to acquire plant image data. We used low-cost Raspberry Pi computers and cameras to manage and capture plant image data. Detailed here are three different applications of Raspberry Pi-controlled imaging platforms for seed and shoot imaging. Images obtained from each platform were suitable for extracting quantifiable plant traits (e.g., shape, area, height, color) en masse using open-source image processing software such as PlantCV. This protocol describes three low-cost platforms for image acquisition that are useful for quantifying plant diversity. When coupled with open-source image processing tools, these imaging platforms provide viable low-cost solutions for incorporating high-throughput phenomics into a wide range of research programs.
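The kind of fixed-interval acquisition these platforms perform can be sketched as a capture schedule. This is a minimal sketch under stated assumptions: the filename pattern, the plate ID, and the `raspistill` invocation mentioned in the comment are illustrative, not details taken from the published protocol.

```python
from datetime import datetime, timedelta

def capture_schedule(start, hours, interval_min):
    """Return timestamped filenames for a fixed-interval time-lapse.

    The plate-ID_timestamp naming pattern is hypothetical; the actual
    platforms may name image files differently.
    """
    times = []
    t = start
    end = start + timedelta(hours=hours)
    while t < end:
        times.append(t.strftime("plant01_%Y%m%d-%H%M%S.jpg"))
        t += timedelta(minutes=interval_min)
    return times

# On a Raspberry Pi one would then invoke the camera for each slot, e.g. via
# subprocess.run(["raspistill", "-o", filename]); here we only build the plan.
schedule = capture_schedule(datetime(2018, 3, 1, 8, 0), hours=2, interval_min=30)
print(len(schedule))  # 4 captures: 08:00, 08:30, 09:00, 09:30
```

Images named this way sort chronologically, which simplifies downstream batch processing with tools such as PlantCV.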
The DIAS/CEOS Water Portal, distributed system using brokering architecture
NASA Astrophysics Data System (ADS)
Miura, Satoko; Sekioka, Shinichi; Kuroiwa, Kaori; Kudo, Yoshiyuki
2015-04-01
The DIAS/CEOS Water Portal is one of the DIAS (Data Integration and Analysis System, http://www.editoria.u-tokyo.ac.jp/projects/dias/?locale=en_US) systems for data distribution to users including, but not limited to, scientists, decision makers and officers such as river administrators. The portal has two main functions: one is to search and access data, and the other is to register and share use cases that draw on datasets provided via the portal. This presentation focuses on the first function, searching and accessing data. The portal system is distributed in the sense that, while the portal itself is located in Tokyo, the data reside in archive centers that are globally distributed. For example, some in-situ data are archived at the National Center for Atmospheric Research (NCAR) Earth Observing Laboratory in Boulder, Colorado, USA. The NWP station time series and global gridded model output data are archived at the Max Planck Institute for Meteorology (MPIM) in cooperation with the World Data Center for Climate in Hamburg, Germany. Part of the satellite data is archived in DIAS storage at the University of Tokyo, Japan. The portal itself does not store data. Instead, according to requests made by users on the web page, it retrieves data from the distributed data centers on the fly and lets users download the data and view rendered images/plots. Although some data centers have their own metadata formats and/or data search protocols, the portal's brokering function enables users to search across various data centers at once, like one-stop shopping. The portal is also connected to other data brokering systems, including the GEOSS DAB (Discovery and Access Broker). As a result, users can search over thousands of datasets and millions of files at one time. Our system mainly relies on the open source software GI-cat (http://essi-lab.eu/do/view/GIcat), the OpenSearch protocol and the OPeNDAP protocol to enable the above functions.
Details on how it works will be introduced during the presentation. Users can access the DIAS/CEOS Water Portal system at http://waterportal.ceos.org/.
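The one-stop search described above rests on the OpenSearch convention of parameterized query URLs. A minimal sketch of building such a query follows; the endpoint URL and exact parameter set are illustrative assumptions (the portal's real broker interface may differ), with `bbox` following the OpenSearch Geo extension convention (west,south,east,north in decimal degrees).

```python
from urllib.parse import urlencode

# Hypothetical endpoint for illustration; not the portal's actual broker URL.
TEMPLATE = "http://waterportal.example.org/opensearch?{params}"

def build_opensearch_query(terms, start_index=1, count=10, bbox=None):
    """Build an OpenSearch-style request URL for a dataset search."""
    params = {"searchTerms": terms, "startIndex": start_index, "count": count}
    if bbox:
        # OpenSearch Geo extension: west,south,east,north
        params["bbox"] = ",".join(str(v) for v in bbox)
    return TEMPLATE.format(params=urlencode(params))

url = build_opensearch_query("precipitation", bbox=(135.0, 30.0, 145.0, 40.0))
```

A broker such as GI-cat would accept a request of this shape, fan it out to the registered catalogs (each possibly speaking its own native protocol), and merge the results into one response.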
Redactions in protocols for drug trials: what industry sponsors concealed.
Marquardsen, Mikkel; Ogden, Michelle; Gøtzsche, Peter C
2018-04-01
Objective To describe the redactions in contemporary protocols for industry-sponsored randomised drug trials with patient relevant outcomes and to evaluate whether there was a legitimate rationale for the redactions. Design Cohort study. Under the Freedom of Information Act, we requested access to trial protocols approved by a research ethics committee in Denmark from October 2012 to March 2013. We received 17 consecutive protocols, which had been redacted before we got them, and nine protocols without redactions. In five additional cases, the companies refused to let the committees give us access, and in three other cases, documents were missing. Participants Not applicable. Setting Not applicable. Main outcome measure Amount and nature of redactions in 22 predefined key protocol variables. Results The redactions were most widespread in those sections of the protocol where there is empirical evidence of substantial problems with the trustworthiness of published drug trials: data analysis, handling of missing data, detection and analysis of adverse events, definition of the outcomes, interim analyses and premature termination of the study, sponsor's access to incoming data while the study is running, ownership of the data and investigators' publication rights. The parts of the text that were redacted differed widely, both between companies and within the same company. Conclusions We could not identify any legitimate rationale for the redactions. The current mistrust in industry-sponsored drug trials can only change if the industry offers unconditional access to its trial protocols and other relevant documents and data.
The academic, economic and societal impacts of Open Access: an evidence-based review.
Tennant, Jonathan P; Waldner, François; Jacques, Damien C; Masuzzo, Paola; Collister, Lauren B; Hartgerink, Chris H J
2016-01-01
Ongoing debates surrounding Open Access to the scholarly literature are multifaceted and complicated by disparate and often polarised viewpoints from engaged stakeholders. At the current stage, Open Access has become such a global issue that it is critical for all involved in scholarly publishing, including policymakers, publishers, research funders, governments, learned societies, librarians, and academic communities, to be well-informed on the history, benefits, and pitfalls of Open Access. In spite of this, there is a general lack of consensus regarding the potential pros and cons of Open Access at multiple levels. This review aims to be a resource for current knowledge on the impacts of Open Access by synthesizing important research in three major areas: academic, economic and societal. While there is clearly much scope for additional research, several key trends are identified, including a broad citation advantage for researchers who publish openly, as well as additional benefits to the non-academic dissemination of their work. The economic impact of Open Access is less well-understood, although it is clear that access to the research literature is key for innovative enterprises, and a range of governmental and non-governmental services. Furthermore, Open Access has the potential to save both publishers and research funders considerable amounts of financial resources, and can provide some economic benefits to traditionally subscription-based journals. The societal impact of Open Access is strong, in particular for advancing citizen science initiatives, and leveling the playing field for researchers in developing countries. Open Access supersedes all potential alternative modes of access to the scholarly literature through enabling unrestricted re-use, and long-term stability independent of financial constraints of traditional publishers that impede knowledge sharing. 
However, Open Access has the potential to become unsustainable for research communities if high-cost options are allowed to continue to prevail in a widely unregulated scholarly publishing market. Open Access remains only one of the multiple challenges that the scholarly publishing system is currently facing. Yet, it provides one foundation for increasing engagement with researchers regarding ethical standards of publishing and the broader implications of 'Open Research'.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-31
... protocol in the existing regulations to contact the bridge tender for bridge openings at the Captree State... removed many years ago. This action will update the present protocol to contact the bridge for openings... contact method will be changed to calling the number posted at the bridge for bridge openings. Formerly...
Reasons to temper enthusiasm about open access nursing journals.
de Jong, Gideon
2017-04-01
Open access is a relatively new phenomenon within nursing science. Several papers in various nursing journals have recently discussed the disadvantages of the traditional model of purchasing proprietary fee-based databases to access scholarly information. Only a few nursing scholars are less optimistic about the possible benefits of open access nursing journals. This paper offers a critical reflection on the merits and pitfalls of open access journals, drawing on insights from the literature and personal opinion. Two arguments are discussed that provide justification for tempering enthusiasm about open access journals. First, only research groups with sufficient financial resources can publish in open access journals. Second, open access has conflicting incentives, where the aim is to expand production at the expense of publishing quality articles; a business model that fits well into a neoliberal discourse. There are valid reasons to criticise the traditional publishers for the excessive cost of a single article, which prevents the dissemination of scholarly nursing information. Conversely, the business model of open access publishers is no less imbued with the neoliberal tendency of lining one's pockets.
National Airspace System (NAS) open system architecture and protocols
DOT National Transportation Integrated Search
2003-08-14
This standard establishes the open systems data communications architecture and authorized protocol standards for the National Airspace System (NAS). The NAS will consist of various types of processors and communications networks procured from a vari...
Public Access and Open Access: Is There a Difference?
By Robin Meckley, Contributing Writer, and Tracie Frederick, Guest Writer Open access and public access—are they different concepts or are they the same? What do they mean for the researchers at NCI at Frederick? “Open-access (OA) literature is digital, online, free of charge, and free of most copyright and licensing restrictions. What makes it possible is the Internet and the consent of the author or copyright-holder,” according to an open access website maintained by Peter Suber, director, Harvard Open Access Project.
Deployment of Directory Service for IEEE N Bus Test System Information
NASA Astrophysics Data System (ADS)
Barman, Amal; Sil, Jaya
2008-10-01
Exchanging information over the Internet and intranets has become a de facto standard in computer applications, among various users and organizations. Distributed system studies, e-governance and similar applications require transparent information exchange between applications, constituencies, manufacturers, and vendors. To serve these purposes, a database system is needed for storing system data and other relevant information. A directory service, which is a specialized database with an associated access protocol, could be a single solution since it runs over TCP/IP, is supported by all POSIX-compliant platforms and is based on an open standard. This paper describes a way to deploy a directory service to store IEEE n-bus test system data and to integrate a load-flow program with it.
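As a rough illustration of how bus records might be stored in such a directory, the sketch below renders one test-system bus as an LDIF entry (the standard textual format for LDAP data). The attribute names and the `powerSystemBus` object class are hypothetical; a real deployment would define them in a custom LDAP schema.

```python
def bus_to_ldif(bus_id, voltage_kv, load_mw, base_dn="ou=buses,dc=example,dc=org"):
    """Render one test-system bus as an LDIF entry.

    The attributes (busId, nominalVoltage, loadMW) and the object class are
    illustrative, not taken from the paper's actual schema.
    """
    dn = f"busId={bus_id},{base_dn}"
    lines = [
        f"dn: {dn}",
        "objectClass: top",
        "objectClass: powerSystemBus",   # hypothetical object class
        f"busId: {bus_id}",
        f"nominalVoltage: {voltage_kv}",
        f"loadMW: {load_mw}",
    ]
    return "\n".join(lines)

entry = bus_to_ldif(1, 132, 21.7)
```

An entry in this form could be loaded with standard tools (e.g. `ldapadd`), after which a load-flow program could query bus data over the directory's access protocol instead of a private file format.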
Addiction Science: Uncovering Neurobiological Complexity
Volkow, N. D.; Baler, R. D.
2013-01-01
Until very recently, addiction research was limited by existing tools and strategies that were inadequate for studying the inherent complexity at each of the different phenomenological levels. However, powerful new tools (e.g., optogenetics and designer drug receptors) and high-throughput protocols are starting to give researchers the potential to systematically interrogate “all” genes, epigenetic marks, and neuronal circuits. These advances, combined with imaging technologies (for both preclinical and clinical studies) and a paradigm shift towards open access, have spurred an unlimited growth of datasets transforming the way we investigate the neurobiology of substance use disorders (SUD) and the factors that modulate risk and resilience. PMID:23688927
Kacmarek, Robert M; Villar, Jesús; Sulemanji, Demet; Montiel, Raquel; Ferrando, Carlos; Blanco, Jesús; Koh, Younsuck; Soler, Juan Alfonso; Martínez, Domingo; Hernández, Marianela; Tucci, Mauro; Borges, Joao Batista; Lubillo, Santiago; Santos, Arnoldo; Araujo, Juan B; Amato, Marcelo B P; Suárez-Sipmann, Fernando
2016-01-01
The open lung approach is a mechanical ventilation strategy involving lung recruitment and a decremental positive end-expiratory pressure trial. We compared the Acute Respiratory Distress Syndrome network protocol using low levels of positive end-expiratory pressure with the open lung approach resulting in moderate to high levels of positive end-expiratory pressure for the management of established moderate/severe acute respiratory distress syndrome. A prospective, multicenter, pilot, randomized controlled trial. A network of 20 multidisciplinary ICUs. Patients meeting the American-European Consensus Conference definition for acute respiratory distress syndrome were considered for the study. At 12-36 hours after acute respiratory distress syndrome onset, patients were assessed under standardized ventilator settings (FIO2≥0.5, positive end-expiratory pressure ≥10 cm H2O). If the PaO2/FIO2 ratio remained less than or equal to 200 mm Hg, patients were randomized to the open lung approach or the Acute Respiratory Distress Syndrome network protocol. All patients were ventilated with a tidal volume of 4 to 8 ml/kg predicted body weight. From 1,874 screened patients with acute respiratory distress syndrome, 200 were randomized: 99 to the open lung approach and 101 to the Acute Respiratory Distress Syndrome network protocol. Main outcome measures were 60-day and ICU mortalities, and ventilator-free days. Mortality at day 60 (29% open lung approach vs. 33% Acute Respiratory Distress Syndrome network protocol, p = 0.18, log rank test), ICU mortality (25% open lung approach vs. 30% Acute Respiratory Distress Syndrome network protocol, p = 0.53, Fisher's exact test), and ventilator-free days (8 [0-20] days open lung approach vs. 7 [0-20] days Acute Respiratory Distress Syndrome network protocol, p = 0.53, Wilcoxon rank test) were not significantly different.
Airway driving pressure (plateau pressure - positive end-expiratory pressure) and PaO2/FIO2 improved significantly at 24, 48 and 72 hours in patients in open lung approach compared with patients in Acute Respiratory Distress Syndrome network protocol. Barotrauma rate was similar in both groups. In patients with established acute respiratory distress syndrome, open lung approach improved oxygenation and driving pressure, without detrimental effects on mortality, ventilator-free days, or barotrauma. This pilot study supports the need for a large, multicenter trial using recruitment maneuvers and a decremental positive end-expiratory pressure trial in persistent acute respiratory distress syndrome.
Open Access Publishing - Strengths and Strategies
NASA Astrophysics Data System (ADS)
Rasmussen, Martin
2010-05-01
The journal crisis and the demand for free access to the results of publicly funded research were the main drivers of the Open Access movement since the late 1990s. Besides many academic institutions that support the different ways of Open Access publishing, there is a growing number of publishing houses that are specialized in this new access and business model of scholarly literature. The lecture provides an overview of the different kinds of Open Access publishing, discusses the variety of underlying business models, names the advantages and potential for researchers and the public, and addresses some objections to Open Access. Besides the increased visibility and information supply, the topic of copyrights and exploitation rights will be discussed. Furthermore, it is a central aim of the presentation to show that Open Access does not only support full peer review, but also provides the potential for even enhanced quality assurance. The financing of business models based on openly accessible literature is another important part to be outlined in the lecture.
50 CFR 660.316 - Open access fishery-observer requirements.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 50 Wildlife and Fisheries 11 2011-10-01 2011-10-01 false Open access fishery-observer requirements. 660.316 Section 660.316 Wildlife and Fisheries FISHERY CONSERVATION AND MANAGEMENT, NATIONAL OCEANIC... West Coast Groundfish-Open Access Fisheries § 660.316 Open access fishery—observer requirements. (a...
50 CFR 660.316 - Open access fishery-observer requirements.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 50 Wildlife and Fisheries 9 2010-10-01 2010-10-01 false Open access fishery-observer requirements. 660.316 Section 660.316 Wildlife and Fisheries FISHERY CONSERVATION AND MANAGEMENT, NATIONAL OCEANIC... West Coast Groundfish-Open Access Fisheries § 660.316 Open access fishery—observer requirements. (a...
50 CFR 660.316 - Open access fishery-observer requirements.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 50 Wildlife and Fisheries 13 2013-10-01 2013-10-01 false Open access fishery-observer requirements. 660.316 Section 660.316 Wildlife and Fisheries FISHERY CONSERVATION AND MANAGEMENT, NATIONAL OCEANIC... West Coast Groundfish-Open Access Fisheries § 660.316 Open access fishery—observer requirements. (a...
50 CFR 660.316 - Open access fishery-observer requirements.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 50 Wildlife and Fisheries 13 2012-10-01 2012-10-01 false Open access fishery-observer requirements. 660.316 Section 660.316 Wildlife and Fisheries FISHERY CONSERVATION AND MANAGEMENT, NATIONAL OCEANIC... West Coast Groundfish-Open Access Fisheries § 660.316 Open access fishery—observer requirements. (a...
50 CFR 660.316 - Open access fishery-observer requirements.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 50 Wildlife and Fisheries 13 2014-10-01 2014-10-01 false Open access fishery-observer requirements. 660.316 Section 660.316 Wildlife and Fisheries FISHERY CONSERVATION AND MANAGEMENT, NATIONAL OCEANIC... West Coast Groundfish-Open Access Fisheries § 660.316 Open access fishery—observer requirements. (a...
An evaluation of emergency medicine investigators' views on open access to medical literature.
Rodriguez, R M; Wong, J; Hardy, J; Frankel, E
2006-12-01
Scientists and governmental agencies have called for free universal access to research publications via the internet: open access. To examine the current medical literature reading practices of emergency medicine investigators (EMIs) and their views towards open access. Surveys were mailed to the 212 corresponding authors of all original research articles published in 2002 and 2003 in the Annals of Emergency Medicine, Academic Emergency Medicine and The Journal of Emergency Medicine. The most commonly read forms of medical literature reported by the 129 (61%) EMI respondents were hard-copy medical journals and online literature review services. 59% of EMIs were in favour of open access; 58% stated they would read a wider variety of medical literature; 21% believed open access would improve the quality of publications and 39% thought it would decrease the quality. When asked how a US$1500 fee for open access would affect their ability to publish research, 69% said it would greatly impede and 19% said it would slightly impede their research. Despite concerns that open access may impede their ability to publish research and decrease the quality of publications, most EMIs surveyed favoured open access. They believed open access would increase and broaden their medical literature reading.
ERIC Educational Resources Information Center
Tenopir, Carol
2004-01-01
Open access publishing is a hot topic today. But open access publishing can have many different definitions, and pros and cons vary with the definitions. Open access publishing is especially attractive to companies and small colleges or universities that are likely to have many more readers than authors. A downside is that a membership fee sounds…
50 CFR 648.88 - Multispecies open access permit restrictions.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 50 Wildlife and Fisheries 10 2011-10-01 2011-10-01 false Multispecies open access permit... Management Measures for the NE Multispecies and Monkfish Fisheries § 648.88 Multispecies open access permit restrictions. (a) Handgear permit. A vessel issued a valid open access NE multispecies Handgear permit is...
50 CFR 648.88 - Multispecies open access permit restrictions.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 50 Wildlife and Fisheries 8 2010-10-01 2010-10-01 false Multispecies open access permit... Management Measures for the NE Multispecies and Monkfish Fisheries § 648.88 Multispecies open access permit restrictions. (a) Handgear permit. A vessel issued a valid open access NE multispecies Handgear permit is...
50 CFR 660.383 - Open access fishery management measures.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 50 Wildlife and Fisheries 9 2010-10-01 2010-10-01 false Open access fishery management measures... West Coast Groundfish Fisheries § 660.383 Open access fishery management measures. (a) General. Groundfish species taken in open access fisheries will be managed with cumulative trip limits (see trip...
50 CFR 648.88 - Multispecies open access permit restrictions.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 50 Wildlife and Fisheries 12 2013-10-01 2013-10-01 false Multispecies open access permit... Management Measures for the NE Multispecies and Monkfish Fisheries § 648.88 Multispecies open access permit restrictions. (a) Handgear permit. A vessel issued a valid open access NE multispecies Handgear permit is...
50 CFR 648.88 - Multispecies open access permit restrictions.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 50 Wildlife and Fisheries 12 2014-10-01 2014-10-01 false Multispecies open access permit... Management Measures for the NE Multispecies and Monkfish Fisheries § 648.88 Multispecies open access permit restrictions. (a) Handgear permit. A vessel issued a valid open access NE multispecies Handgear permit is...
50 CFR 648.88 - Multispecies open access permit restrictions.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 50 Wildlife and Fisheries 12 2012-10-01 2012-10-01 false Multispecies open access permit... Management Measures for the NE Multispecies and Monkfish Fisheries § 648.88 Multispecies open access permit restrictions. (a) Handgear permit. A vessel issued a valid open access NE multispecies Handgear permit is...
50 CFR 648.15 - Facilitation of enforcement.
Code of Federal Regulations, 2011 CFR
2011-10-01
... surfclam and ocean quahog vessel owners and operators. (1) Surfclam and ocean quahog open access permitted vessels. Vessel owners or operators issued an open access surfclam or ocean quahog open access permit for.../or an Open Access Herring Permit that fished with midwater trawl gear pursuant to § 648.80(d). Such...
UPM: unified policy-based network management
NASA Astrophysics Data System (ADS)
Law, Eddie; Saxena, Achint
2001-07-01
Besides managing the network itself, it has become essential to offer differentiated Quality of Service (QoS) to Internet users. Policy-based management provides control over network routers to achieve this goal. The Internet Engineering Task Force (IETF) has proposed a two-tier architecture whose implementation is based on the Common Open Policy Service (COPS) protocol and the Lightweight Directory Access Protocol (LDAP). However, there are several limitations to this design, such as scalability and cross-vendor hardware compatibility. To address these issues, we present a functionally enhanced multi-tier policy management architecture in this paper. Several extensions are introduced, thereby adding flexibility and scalability. In particular, an intermediate entity between the policy server and the policy rule database, called the Policy Enforcement Agent (PEA), is introduced. By keeping internal data in a common format, using a standard protocol, and interpreting and translating request and decision messages from multi-vendor hardware, this agent enables a dynamic Unified Information Model throughout the architecture. We have tailored this information system to store policy rules in the directory server and to allow execution of policy rules while new equipment is added dynamically at run-time.
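The PEA's translation role can be illustrated with a small sketch: vendor-specific rule formats are mapped into one vendor-neutral representation, so the rest of the architecture never sees vendor quirks. The field names and the vendor format below are invented for illustration; the paper does not specify its actual schema.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PolicyRule:
    """Unified (vendor-neutral) QoS policy rule, as a PEA might hold it."""
    match_dscp: int      # DSCP value the rule matches on
    action: str          # normalized verdict: "permit" or "drop"
    bandwidth_kbps: int  # guaranteed bandwidth, always in kbit/s

def translate_vendor_a(raw: dict) -> PolicyRule:
    """Map one hypothetical vendor's rule format into the unified model."""
    return PolicyRule(
        match_dscp=raw["dscp"],
        action={"allow": "permit", "deny": "drop"}[raw["verdict"]],
        bandwidth_kbps=raw["bw"] * 1000,  # this vendor reports Mbit/s
    )

rule = translate_vendor_a({"dscp": 46, "verdict": "allow", "bw": 2})
```

One such translator per vendor is the whole cost of onboarding new equipment; policy storage (LDAP) and policy decisions (COPS) operate only on the unified form.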
Hüttner, Felix J; Bruckner, Tom; Alldinger, Ingo; Hennes, Roland; Ulrich, Alexis; Büchler, Markus W; Diener, Markus K; Knebel, Phillip
2015-03-31
The insertion of central venous access devices, such as totally implantable venous access ports (TIVAPs), is routine in patients who need a safe and permanent venous access. The number of port implantations is increasing due to the development of innovative adjuvant and neo-adjuvant therapies. Currently, two different strategies are being routinely used: surgical cut-down of the cephalic vein (vena section) and direct puncture of the subclavian vein. The aim of this trial is to identify the strategy for the implantation of TIVAPs with the lowest risk of pneumothorax and haemothorax. The PORTAS-3 trial is designed as a multicentre, randomised controlled trial to compare two implantation strategies. A total of 1,154 patients will be randomised after giving written informed consent. Patients must be over 18 years of age and scheduled for primary implantation of a TIVAP on the designated side. The primary endpoint will be the frequency of pneumothorax and haemothorax after insertion of a TIVAP by one of two different strategies. The experimental intervention is as follows: open strategy, defined as surgical cut-down of the cephalic vein, supported by a rescue technique if necessary, and in the case of failure, direct puncture of the subclavian vein. The control intervention is as follows: direct puncture of the subclavian vein using the Seldinger technique guided by sonography, fluoroscopy or landmark technique. The trial duration is approximately 36 months, with a recruitment period of 18 months and a follow-up period of 30 days. The PORTAS-3 trial will compare two different TIVAP implantation strategies with regard to their individual risk of postoperative pneumothorax and haemothorax. Since TIVAP implantation is one of the most common procedures in general surgery, the results will be of interest for a large community of surgeons as well as oncologists and general practitioners. 
The pragmatic trial design ensures that the results will be generalizable to a wide range of patients. The trial protocol was registered on 28 August 2014 with the German Clinical Trials Register (DRKS00004900). The World Health Organization's Universal Trial Number is U1111-1142-4420.
Real-time Electrophysiology: Using Closed-loop Protocols to Probe Neuronal Dynamics and Beyond
Linaro, Daniele; Couto, João; Giugliano, Michele
2015-01-01
Experimental neuroscience is witnessing an increased interest in the development and application of novel and often complex, closed-loop protocols, where the stimulus applied depends in real-time on the response of the system. Recent applications range from the implementation of virtual reality systems for studying motor responses both in mice [1] and in zebrafish [2], to control of seizures following cortical stroke using optogenetics [3]. A key advantage of closed-loop techniques resides in the capability of probing higher dimensional properties that are not directly accessible or that depend on multiple variables, such as neuronal excitability [4] and reliability, while at the same time maximizing the experimental throughput. In this contribution and in the context of cellular electrophysiology, we describe how to apply a variety of closed-loop protocols to the study of the response properties of pyramidal cortical neurons, recorded intracellularly with the patch clamp technique in acute brain slices from the somatosensory cortex of juvenile rats. As no commercially available or open source software provides all the features required for efficiently performing the experiments described here, a new software toolbox called LCG [5] was developed, whose modular structure maximizes reuse of computer code and facilitates the implementation of novel experimental paradigms. Stimulation waveforms are specified using a compact meta-description and full experimental protocols are described in text-based configuration files. Additionally, LCG has a command-line interface that is suited for repetition of trials and automation of experimental protocols. PMID:26132434
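The defining property of a closed-loop protocol, that the next stimulus depends on the measured response, can be shown with a toy sketch. The "neuron" here is just a linear plant and the controller a simple proportional update; this is only an illustration of the feedback structure, not LCG's implementation, which runs such loops in hard real time against a patched cell rather than a model.

```python
def closed_loop(target, gain=0.1, steps=200):
    """Minimal closed-loop stimulation sketch.

    A proportional controller adjusts the stimulus so that a toy 'neuron'
    (response = 2 * stimulus) reaches a target response; each update uses
    the response just measured, which is what makes the loop closed.
    """
    stimulus = 0.0
    response = 0.0
    for _ in range(steps):
        response = 2.0 * stimulus      # toy plant standing in for the neuron
        error = target - response      # measured deviation from the goal
        stimulus += gain * error       # next stimulus depends on the response
    return stimulus, response

stim, resp = closed_loop(target=10.0)
```

Replacing the toy plant with analog acquisition and the update with a clamp-specific rule (e.g. holding a firing rate) gives the general shape of the protocols described in the abstract.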
Kasschau, Margaret; Sherman, Kathleen; Haider, Lamia; Frontario, Ariana; Shaw, Michael; Datta, Abhishek; Bikson, Marom; Charvet, Leigh
2015-12-26
Transcranial direct current stimulation (tDCS) is a noninvasive brain stimulation technique that uses low-amplitude direct currents to alter cortical excitability. With well-established safety and tolerability, tDCS has been found to have the potential to ameliorate symptoms such as depression and pain in a range of conditions, as well as to enhance outcomes of cognitive and physical training. However, effects are cumulative, requiring treatments that can span weeks or months and frequent, repeated visits to the clinic. The cost in terms of time and travel is often prohibitive for many participants, and ultimately limits real-world access. Following guidelines for remote tDCS application, we propose a protocol that allows remote (in-home) participation, using specially designed devices for supervised use, materials modified for patient use, and real-time monitoring through a telemedicine video-conferencing platform. We have developed structured training procedures and clear, detailed instructional materials to allow for self- or proxy-administration while supervised remotely in real time. The protocol is designed to have a series of checkpoints, addressing attendance and tolerability of the session, that must be met in order to continue to the next step. The feasibility of this protocol was piloted for clinical use in an open-label study of remotely supervised tDCS in multiple sclerosis (MS). This protocol can be widely used for clinical study of tDCS.
Perilous terra incognita--open-access journals.
Balon, Richard
2014-04-01
The author focuses on a new rapidly spreading practice of publication in open-access journals. The pros and cons of open-access journals are discussed. Publishing in these journals may be cost prohibitive for educators and junior faculty members. Some authors may be lured by the ease of publishing in open-access journals (and their, at times, inflated self-description, e.g., "international", "scientific"), and their possibly valuable contributions will escape the attention of Academic Psychiatry readership in the vast sea of open-access journals. The readership may be flooded with a large number of low-quality articles (maybe not even properly peer-reviewed) from open-access journals. It may take some time to sort out what is and what is not relevant and useful. Open-access publishing represents a problematic and controversial practice and may be associated with a conflict of interest for the editors and publishers of these journals.
Mir, Fatima; Nisar, Imran; Tikmani, Shiyam S; Baloch, Benazir; Shakoor, Sadia; Jehan, Fyezah; Ahmed, Imran; Cousens, Simon; Zaidi, Anita K M
2017-02-01
Parenteral antibiotic therapy for young infants (aged 0-59 days) with suspected sepsis is sometimes not available or feasible in countries with high neonatal mortality. Outpatient treatment could save lives in such settings. We aimed to assess the equivalence of two simplified antibiotic regimens, comprising fewer injections and oral rather than parenteral administration, compared with a reference treatment for young infants with clinical severe infection. We undertook the Simplified Antibiotic Therapy Trial (SATT), a three-arm, randomised, open-label, equivalence trial in five communities in Karachi, Pakistan. We enrolled young infants (aged 0-59 days) who either presented at a primary health-care clinic or were identified by a community health worker with signs of clinical severe infection. We included infants who were not critically ill and whose family refused admission. We randomly assigned infants to either intramuscular procaine benzylpenicillin and gentamicin once a day for 7 days (reference); oral amoxicillin twice daily and intramuscular gentamicin once a day for 7 days; or intramuscular procaine benzylpenicillin and gentamicin once a day for 2 days followed by oral amoxicillin twice daily for 5 days. The primary outcome was treatment failure within 7 days of enrolment and the primary analysis was per protocol. We judged experimental treatments as efficacious as the reference if the upper bound of the 95% CI for the difference in treatment failure was less than 5·0. This trial is registered at ClinicalTrials.gov, number NCT01027429. Between Jan 1, 2010, and Dec 26, 2013, 2780 infants were deemed eligible for the trial, of whom 2453 (88%) were enrolled. Because of inadequate clinical follow-up or treatment adherence, 2251 infants were included in the per-protocol analysis. 
820 infants (747 per protocol) were assigned the reference treatment of procaine benzylpenicillin and gentamicin, 816 (751 per protocol) were allocated amoxicillin and gentamicin, and 817 (753 per protocol) were assigned procaine benzylpenicillin, gentamicin, and amoxicillin. Treatment failure within 7 days of enrolment was reported in 90 (12%) infants who received procaine benzylpenicillin and gentamicin (reference), 76 (10%) of those given amoxicillin and gentamicin (risk difference with reference -1·9, 95% CI -5·1 to 1·3), and 99 (13%) of those treated with procaine benzylpenicillin, gentamicin, and amoxicillin (risk difference with reference 1·1, -2·3 to 4·5). Two simplified antibiotic regimens requiring fewer injections are equivalent to a reference treatment for young infants with signs of clinical severe infection but without signs of critical illness. The use of these simplified regimens has the potential to increase access to treatment for sick young infants who cannot be referred to hospital. Funded by the Saving Newborn Lives initiative of Save the Children, through support from the Bill & Melinda Gates Foundation, and by WHO and USAID. Copyright © 2017 The Author(s). This is an Open Access article under the CC BY license. Published by Elsevier Ltd. All rights reserved.
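The equivalence criterion used in this trial (the upper bound of the 95% CI for the risk difference must be below 5·0 percentage points) can be reproduced from the reported per-protocol counts with a standard Wald approximation; the trial's exact CI method may differ slightly in the last decimal place:

```python
import math

def risk_difference_ci(fail_exp, n_exp, fail_ref, n_ref, z=1.96):
    """Wald 95% CI for the difference in failure risk (experimental - reference),
    returned in percentage points."""
    p_exp, p_ref = fail_exp / n_exp, fail_ref / n_ref
    rd = p_exp - p_ref
    se = math.sqrt(p_exp * (1 - p_exp) / n_exp + p_ref * (1 - p_ref) / n_ref)
    return 100 * rd, 100 * (rd - z * se), 100 * (rd + z * se)

# Amoxicillin + gentamicin arm vs reference, per-protocol counts from the trial:
# 76 failures of 751 vs 90 failures of 747.
rd, lo, hi = risk_difference_ci(76, 751, 90, 747)
# rd is about -1.9 and lo about -5.1, as reported; hi < 5.0, so the
# prespecified equivalence margin is met.
```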
ERIC Educational Resources Information Center
Stanton, Kate Valentine; Liew, Chern Li
2011-01-01
Introduction: We examine doctoral students' awareness of and attitudes to open access forms of publication. Levels of awareness of open access and the concept of institutional repositories, publishing behaviour and perceptions of benefits and risks of open access publishing were explored. Method: Qualitative and quantitative data were collected…
Open Access Publishing: What Authors Want
ERIC Educational Resources Information Center
Nariani, Rajiv; Fernandez, Leila
2012-01-01
Campus-based open access author funds are being considered by many academic libraries as a way to support authors publishing in open access journals. Article processing fees for open access have been introduced recently by publishers and have not yet been widely accepted by authors. Few studies have surveyed authors on their reasons for publishing…
Almost Halfway There: An Analysis of the Open Access Behaviors of Academic Librarians
ERIC Educational Resources Information Center
Mercer, Holly
2011-01-01
Academic librarians are increasingly expected to advocate for scholarly communications reforms such as open access to scholarly publications, yet librarians do not always practice what they preach. Previous research examined librarian attitudes toward open access, whereas this article presents results of a study of open access publishing and…
30 CFR 291.113 - What actions may MMS take to remedy denial of open and nondiscriminatory access?
Code of Federal Regulations, 2011 CFR
2011-07-01
... open and nondiscriminatory access? 291.113 Section 291.113 Mineral Resources BUREAU OF OCEAN ENERGY MANAGEMENT, REGULATION, AND ENFORCEMENT, DEPARTMENT OF THE INTERIOR APPEALS OPEN AND NONDISCRIMINATORY ACCESS... take to remedy denial of open and nondiscriminatory access? If the MMS Director's decision under § 291...
Digital Scholarship and Open Access
ERIC Educational Resources Information Center
Losoff, Barbara; Pence, Harry E.
2010-01-01
Open access publications provide scholars with unrestricted access to the "conversation" that is the basis for the advancement of knowledge. The large number of open access journals, archives, and depositories already in existence demonstrates the technical and economic viability of providing unrestricted access to the literature that is the…
Open Access Publishing in the Field of Medical Informatics.
Kuballa, Stefanie
2017-05-01
The open access paradigm has become an important approach in today's information and communication society. Funders and governments in different countries stipulate open access publication of funded research results. Medical informatics, as part of the science, technology and medicine disciplines, benefits from many research funders, such as the National Institutes of Health in the US, the Wellcome Trust in the UK, the German Research Foundation in Germany and many more. In this study an overview of the current open access programs and conditions of major journals in the field of medical informatics is presented. It was investigated whether there are suitable options and how they are shaped. Therefore all journals in Thomson Reuters Web of Science that were listed in the subject category "Medical Informatics" in 2014 were examined. An Internet search was conducted by investigating the journals' websites. It was reviewed whether journals offer an open access option, with a subsequent check of conditions such as the type of open access, the fees and the licensing. As a result, all journals in the field of medical informatics that had an impact factor in 2014 offer an open access option. A predominantly consistent pricing range was determined, with an average fee of €2,248 and a median fee of €2,207. The size of a journal's open access fee did not correlate with its Impact Factor. Hence, medical informatics journals have recognized the trend of open access publishing, though the vast majority of them are working with the hybrid method. Hybrid open access may, however, lead to problems of double dipping and falls short of the often stipulated gold open access.
Schroter, Sara; Tite, Leanne
2006-01-01
Objectives: We aimed to assess journal authors' current knowledge and perceptions of open access and author-pays publishing. Design: An electronic survey. Setting: Authors of research papers submitted to BMJ, Archives of Disease in Childhood, and Journal of Medical Genetics in 2004. Main outcome measures: Familiarity with and perceptions of open access and author-pays publishing. Results: 468/1113 (42%) responded. Prior to definitions being provided, 47% (222/468) and 38% (176/468) reported they were familiar with the terms 'open access' and 'author-pays' publishing, respectively. Some who did not at first recognize the terms did claim to recognize them when they were defined. Only 10% (49/468) had submitted to an author-pays journal. Compared with non-open access subscription-based journals, 35% agreed that open access author-pays journals have a greater capacity to publish more content, making it easier to get published; 27% thought they had lower impact factors; 31% thought they had faster and more timely publication; and 46% agreed that people will think anyone can pay to get published. 55% (256/468) thought they would not continue to submit to their respective journal if it became open access and charged, largely because of the reputation of the journals. Half (54%, 255/468) said open access has 'no impact' or was 'low priority' in their submission decisions. Two-thirds (66%, 308/468) said they would prefer to submit to a non-open access subscription-based journal than an open access author-pays journal. Over half thought they would have to make a contribution or pay the full cost of an author charge (56%, 262/468). Conclusions: The survey yielded useful information about respondents' knowledge and perceptions of these publishing models. Authors have limited familiarity with the concept of open-access publishing and surrounding issues. Currently, open access policies have little impact on authors' decision of where to submit papers. PMID:16508053
OpenFlow arbitrated programmable network channels for managing quantum metadata
Dasari, Venkat R.; Humble, Travis S.
2016-10-10
Quantum networks must classically exchange complex metadata between devices in order to carry out information for protocols such as teleportation, super-dense coding, and quantum key distribution. Demonstrating the integration of these new communication methods with existing network protocols, channels, and data forwarding mechanisms remains an open challenge. Software-defined networking (SDN) offers robust and flexible strategies for managing diverse network devices and uses. We adapt the principles of SDN to the deployment of quantum networks, which are composed from unique devices that operate according to the laws of quantum mechanics. We show how quantum metadata can be managed within a software-defined network using the OpenFlow protocol, and we describe how OpenFlow management of classical optical channels is compatible with emerging quantum communication protocols. We next give an example specification of the metadata needed to manage and control quantum physical layer (QPHY) behavior and we extend the OpenFlow interface to accommodate this quantum metadata. We conclude by discussing near-term experimental efforts that can realize SDN's principles for quantum communication.
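The SDN pattern this abstract relies on can be sketched in a few lines: a controller installs prioritized flow rules, and packets carrying session metadata are forwarded by the first rule they match. The "qkd_session" field is invented for illustration and is not an actual OpenFlow match field:

```python
# Minimal software-defined forwarding sketch: a flow table maps header
# matches to actions, in priority order, the way an OpenFlow-style
# controller would install them. Field names are illustrative only.

flow_table = []  # list of (match_dict, action), highest priority first

def install_flow(match, action):
    flow_table.append((match, action))

def forward(packet):
    for match, action in flow_table:
        if all(packet.get(k) == v for k, v in match.items()):
            return action
    return "flood"  # table miss: default behaviour

# Steer classical packets belonging to quantum session 42 out port 3;
# all other IPv4 traffic goes out port 1.
install_flow({"eth_type": 0x0800, "qkd_session": 42}, "output:3")
install_flow({"eth_type": 0x0800}, "output:1")

action = forward({"eth_type": 0x0800, "qkd_session": 42})  # -> "output:3"
```

The point of the paper is that the metadata needed by quantum protocols can ride in such classical match fields, so existing SDN management machinery applies unchanged.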
CILogon: An Integrated Identity and Access Management Platform for Science
NASA Astrophysics Data System (ADS)
Basney, J.
2016-12-01
When scientists work together, they use web sites and other software to share their ideas and data. To ensure the integrity of their work, these systems require the scientists to log in and verify that they are part of the team working on a particular science problem. Too often, the identity and access verification process is a stumbling block for the scientists. Scientific research projects are forced to invest time and effort into developing and supporting Identity and Access Management (IAM) services, distracting them from the core goals of their research collaboration. CILogon provides an IAM platform that enables scientists to work together to meet their IAM needs more effectively so they can allocate more time and effort to their core mission of scientific research. The CILogon platform enables federated identity management and collaborative organization management. Federated identity management enables researchers to use their home organization identities to access cyberinfrastructure, rather than requiring yet another username and password to log on. Collaborative organization management enables research projects to define user groups for authorization to collaboration platforms (e.g., wikis, mailing lists, and domain applications). CILogon's IAM platform serves the unique needs of research collaborations, namely the need to dynamically form collaboration groups across organizations and countries, sharing access to data, instruments, compute clusters, and other resources to enable scientific discovery. CILogon provides a software-as-a-service platform to ease integration with cyberinfrastructure, while making all software components publicly available under open source licenses to enable re-use. Figure 1 illustrates the components and interfaces of this platform. 
CILogon has been operational since 2010 and has been used by over 7,000 researchers from more than 170 identity providers to access cyberinfrastructure including Globus, LIGO, Open Science Grid, SeedMe, and XSEDE. The "CILogon 2.0" platform, launched in 2016, adds support for virtual organization (VO) membership management, identity linking, international collaborations, and standard integration protocols, through integration with the Internet2 COmanage collaboration software.
Achieving open access to conservation science.
Fuller, Richard A; Lee, Jasmine R; Watson, James E M
2014-12-01
Conservation science is a crisis discipline in which the results of scientific enquiry must be made available quickly to those implementing management. We assessed the extent to which scientific research published since the year 2000 in 20 conservation science journals is publicly available. Of the 19,207 papers published, 1,667 (8.68%) are freely downloadable from an official repository. Moreover, only 938 papers (4.88%) meet the standard definition of open access in which material can be freely reused providing attribution to the authors is given. This compares poorly with a comparable set of 20 evolutionary biology journals, where 31.93% of papers are freely downloadable and 7.49% are open access. Seventeen of the 20 conservation journals offer an open access option, but fewer than 5% of the papers are available through open access. The cost of accessing the full body of conservation science runs into tens of thousands of dollars per year for institutional subscribers, and many conservation practitioners cannot access pay-per-view science through their workplace. However, important initiatives such as Research4Life are making science available to organizations in developing countries. We urge authors of conservation science to pay for open access on a per-article basis or to choose publication in open access journals, taking care to ensure the license allows reuse for any purpose providing attribution is given. Currently, it would cost $51 million to make all conservation science published since 2000 freely available by paying the open access fees currently levied to authors. Publishers of conservation journals might consider more cost effective models for open access and conservation-oriented organizations running journals could consider a broader range of options for open access to nonmembers such as sponsorship of open access via membership fees. © 2014 The Authors. 
Conservation Biology published by Wiley Periodicals, Inc., on behalf of the Society for Conservation Biology.
A Robust Open Ascending-price Multi-unit Auction Protocol against False-name Bids
NASA Astrophysics Data System (ADS)
Iwasaki, Atsushi; Yokoo, Makoto; Terada, Kenji
This paper develops a new ascending-price multi-unit auction protocol with the following characteristics: (i) it has an open format, and (ii) sincere bidding is an equilibrium strategy even if the marginal utilities of each agent can increase and agents can submit false-name bids. False-name bids are bids submitted under fictitious names, such as multiple e-mail addresses, which can be done easily on the Internet. This is the first protocol that has both characteristics. We show that our new protocol outperforms an existing protocol, which satisfies (ii), with respect to the social surplus and the seller's revenue.
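The open ascending format mentioned in (i) can be sketched as a simple clock auction for identical units: the price rises until aggregate demand no longer exceeds supply. This sketch shows the ascending mechanism only; the paper's false-name-proof allocation and payment rules are considerably richer:

```python
# Ascending "clock" auction sketch for `units` identical items.
# valuations: bidder -> list of marginal values (non-increasing).
# The price rises by `increment` until total demand fits the supply.

def clock_auction(valuations, units, increment=1):
    price = 0
    while True:
        demand = {b: sum(1 for v in vals if v >= price)
                  for b, vals in valuations.items()}
        if sum(demand.values()) <= units:
            return price, demand
        price += increment

price, allocation = clock_auction(
    {"a": [10, 8, 3], "b": [9, 4], "c": [6]}, units=3)
# Demand shrinks as the price climbs; at price 7 only three units
# are demanded (two by "a", one by "b"), so the clock stops.
```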
Development of EPA Protocol Information Enquiry Service System Based on Embedded ARM Linux
NASA Astrophysics Data System (ADS)
Peng, Daogang; Zhang, Hao; Weng, Jiannian; Li, Hui; Xia, Fei
Industrial Ethernet is a new technology for industrial network communications developed in recent years. In the field of industrial automation in China, EPA is the first standard accepted and published by ISO, and has been included as Type 14 in the fourth edition of the IEC 61158 fieldbus standard. According to the EPA standard, field devices such as industrial field controllers, actuators and other instruments are all able to communicate over standard Ethernet. In this paper, the Atmel AT91RM9200 embedded development board and open source embedded Linux are used to develop an EPA protocol information inquiry service system based on embedded ARM Linux. The system implements an EPA server program for EPA data acquisition; the EPA information inquiry service is available to programs on local or remote hosts through a socket interface. An EPA client can access data and information from other EPA devices on the EPA network by establishing a connection to the monitoring port of the server.
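The request/response pattern described above (a server answering tag inquiries over a socket from local or remote clients) can be sketched with the standard library. The tag names and the newline-delimited wire format are invented for illustration and are not the EPA wire protocol:

```python
# Sketch of a socket-based inquiry service: a TCP server answers
# tag-lookup queries, one tag per line, with the latest value.
# Tags and wire format are illustrative, not the EPA protocol itself.
import socket
import socketserver
import threading

FIELD_DATA = {"FT101": "12.5", "TT202": "88.0"}  # tag -> latest value

class InquiryHandler(socketserver.StreamRequestHandler):
    def handle(self):
        tag = self.rfile.readline().strip().decode()
        self.wfile.write((FIELD_DATA.get(tag, "ERR") + "\n").encode())

server = socketserver.TCPServer(("127.0.0.1", 0), InquiryHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

def inquire(tag):
    with socket.create_connection(server.server_address) as s:
        s.sendall(tag.encode() + b"\n")
        return s.makefile().readline().strip()

value = inquire("FT101")     # known tag -> its value
missing = inquire("XX999")   # unknown tag -> error marker
server.shutdown()
```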
Applying Content Management to Automated Provenance Capture
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schuchardt, Karen L.; Gibson, Tara D.; Stephan, Eric G.
2008-04-10
Workflows and data pipelines are becoming increasingly valuable in both computational and experimental sciences. These automated systems are capable of generating significantly more data within the same amount of time than their manual counterparts. Automatically capturing and recording data provenance and annotation as part of these workflows is critical for data management, verification, and dissemination. Our goal in addressing the provenance challenge was to develop an end-to-end system that demonstrates real-time capture, persistent content management, and ad-hoc searches of both provenance and metadata using open source software and standard protocols. We describe our prototype, which extends the Kepler workflow tools for the execution environment, the Scientific Annotation Middleware (SAM) content management software for data services, and an existing HTTP-based query protocol. Our implementation offers several unique capabilities, and through the use of standards, is able to provide access to the provenance record to a variety of commonly available client tools.
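The automatic-capture idea can be sketched as a wrapper around each workflow step that records inputs, outputs, and a timestamp into a provenance log. This is a generic illustration of the pattern, not the Kepler/SAM implementation; persistence and querying are omitted:

```python
# Sketch of automated provenance capture: a decorator records, for each
# workflow step, its inputs, output and timestamp into an in-memory log
# that a real system would persist and index for ad-hoc search.
import functools
import time

provenance_log = []

def capture_provenance(step):
    @functools.wraps(step)
    def wrapper(*args, **kwargs):
        result = step(*args, **kwargs)
        provenance_log.append({
            "step": step.__name__,
            "inputs": {"args": args, "kwargs": kwargs},
            "output": result,
            "time": time.time(),
        })
        return result
    return wrapper

@capture_provenance
def normalize(values):
    total = sum(values)
    return [v / total for v in values]

result = normalize([1, 1, 2])  # -> [0.25, 0.25, 0.5], and one log entry
```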
ERIC Educational Resources Information Center
Armbruster, Chris
2008-01-01
Open source, open content and open access are set to fundamentally alter the conditions of knowledge production and distribution. Open source, open content and open access are also the most tangible result of the shift towards e-science and digital networking. Yet, widespread misperceptions exist about the impact of this shift on knowledge…
Global Ocean Currents Database
NASA Astrophysics Data System (ADS)
Boyer, T.; Sun, L.
2016-02-01
NOAA's National Centers for Environmental Information has released an ocean currents database portal that aims 1) to integrate global ocean currents observations from a variety of instruments with different resolution, accuracy and response to spatial and temporal variability into a uniform network common data form (NetCDF) format and 2) to provide dedicated online data discovery and access to NCEI-hosted and distributed sources of ocean currents data. The portal provides a tailored web application that allows users to search for ocean currents data by platform type and by spatial/temporal ranges of interest. The dedicated web application is available at http://www.nodc.noaa.gov/gocd/index.html. The NetCDF format supports widely-used data access protocols and catalog services such as OPeNDAP (Open-source Project for a Network Data Access Protocol) and THREDDS (Thematic Real-time Environmental Distributed Data Services), which allow GOCD users to work with data files in their favorite analysis and visualization client software without downloading them to their local machines. The potential users of the ocean currents database include, but are not limited to: 1) ocean modelers assessing model skill; 2) scientists and researchers studying the impact of ocean circulation on climate variability; 3) the ocean shipping industry, for safe navigation and finding optimal routes for ship fuel efficiency; 4) ocean resources managers planning optimal sites for waste and sewage dumping and for renewable hydro-kinetic energy; and 5) state and federal governments, which can use historical (analyzed) ocean circulation as an aid for search and rescue.
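The reason OPeNDAP access avoids whole-file downloads is that the client appends a constraint expression to the dataset URL, so the server returns only the requested hyperslab. A minimal sketch of building such a request follows; the base URL, variable, and dimension names are hypothetical, not actual GOCD identifiers:

```python
# OPeNDAP-style subsetting sketch: a constraint expression of index
# ranges is appended to the dataset URL, so only that hyperslab is
# transferred. The URL and variable names here are hypothetical.

def opendap_url(base, var, **index_ranges):
    """index_ranges: dimension -> (start, stop) inclusive index bounds,
    given in the variable's dimension order."""
    slices = "".join(f"[{a}:{b}]" for a, b in index_ranges.values())
    return f"{base}.dods?{var}{slices}"

url = opendap_url(
    "https://example.org/thredds/dodsC/gocd/currents.nc",
    "u_velocity", time=(0, 10), depth=(0, 0), lat=(100, 120), lon=(200, 240))
# -> ".../currents.nc.dods?u_velocity[0:10][0:0][100:120][200:240]"
```

Client libraries hide this URL construction, but the wire-level idea is the same: the subsetting happens server-side.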
A universal data access and protocol integration mechanism for smart home
NASA Astrophysics Data System (ADS)
Shao, Pengfei; Yang, Qi; Zhang, Xuan
2013-03-01
Because communication interfaces in home electronics are either non-standardized or missing entirely, no single solution based on existing protocols and technologies addresses every aspect of a smart home. In addition, a central control unit (CCU) that works point-to-point between the multiple application interfaces and the underlying hardware interfaces has a complicated architecture and poor performance. A flexible data access and protocol integration mechanism is therefore required. The current paper offers a universal, comprehensive data access and protocol integration mechanism for a smart home. The universal mechanism works as a middleware adapter with unified agreements on the communication interfaces and protocols, abstracting the application level from hardware specifics and decoupling the hardware interface modules from the application level. Further abstraction of the application interfaces and the underlying hardware interfaces is performed in an adaptation layer, providing unified interfaces for more flexible user applications and hardware protocol integration. This new universal mechanism fundamentally simplifies the architecture of the smart home and better meets the practical requirements of smart homes for flexibility.
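The middleware-adapter idea is the classic adapter pattern: heterogeneous device protocols are wrapped behind one uniform interface so application code never touches hardware-specific calls. The device classes and their methods below are invented examples:

```python
# Adapter sketch: two devices with incompatible native interfaces are
# wrapped so the CCU-side application uses a single set() call for both.
# Device classes and method names are illustrative inventions.

class ZigbeeLamp:                      # hardware-specific interface A
    def zb_send(self, cmd):
        self.state = cmd

class ModbusThermostat:                # hardware-specific interface B
    def write_register(self, reg, val):
        self.reg = (reg, val)

class DeviceAdapter:                   # unified interface for applications
    def __init__(self, device):
        self.device = device

    def set(self, value):
        if isinstance(self.device, ZigbeeLamp):
            self.device.zb_send(value)
        elif isinstance(self.device, ModbusThermostat):
            self.device.write_register(0x10, value)

lamp = DeviceAdapter(ZigbeeLamp())
thermo = DeviceAdapter(ModbusThermostat())
lamp.set("ON")     # application code is identical for both devices
thermo.set(21)
```

A production middleware would dispatch through registered protocol plugins rather than isinstance checks, but the decoupling benefit is the same: adding a new hardware protocol never touches application code.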
A Matter of Discipline: Open Access, the Humanities, and Art History
ERIC Educational Resources Information Center
Tomlin, Patrick
2009-01-01
Recent events suggest that open access has gained new momentum in the humanities, but the slow and uneven development of open-access initiatives in humanist fields continues to hinder the consolidation of efforts across the university. Although various studies have traced the general origins of the humanities' reticence to embrace open access, few…
Research of Ad Hoc Networks Access Algorithm
NASA Astrophysics Data System (ADS)
Xiang, Ma
With the continuous development of mobile communication technology, ad hoc access networks have become a hot research topic. Ad hoc access network nodes can be used to expand the capacity and multi-hop communication range of mobile communication systems, and even to improve edge data rates in adjacent cells. When an ad hoc network serves as an access network to the Internet, the gateway discovery protocol is very important for choosing the most appropriate gateway to guarantee connectivity between the ad hoc network and IP-based fixed networks. This paper proposes a QoS gateway discovery protocol that uses time delay and route stability as the gateway selection conditions. Based on this gateway discovery protocol, it also proposes a fast handover scheme that decreases handover time and improves handover efficiency.
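The selection rule described above (route stability as a constraint, delay as the optimization criterion) can be sketched directly; the threshold value and record fields are illustrative, not the paper's metrics:

```python
# Gateway-selection sketch: among discovered gateways, keep those whose
# route stability clears a threshold, then pick the lowest-delay one.
# Field names and the 0.8 threshold are illustrative assumptions.

def select_gateway(gateways, min_stability=0.8):
    stable = [g for g in gateways if g["stability"] >= min_stability]
    if not stable:
        return None
    return min(stable, key=lambda g: g["delay_ms"])

gateways = [
    {"id": "gw1", "delay_ms": 12, "stability": 0.95},
    {"id": "gw2", "delay_ms": 8,  "stability": 0.60},  # fast but unstable
    {"id": "gw3", "delay_ms": 15, "stability": 0.90},
]
best = select_gateway(gateways)   # gw2 is fastest but filtered out
```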
NASA Astrophysics Data System (ADS)
Lahinta, A.; Haris, I.; Abdillah, T.
2017-03-01
The aim of this paper is to describe a developed application of the Simple Object Access Protocol (SOAP) as a model for improving the findability of libraries' digital content on the library web. The study applies XML text-based protocol tools to collect data about libraries' visibility in book search results. Models from the integrated Web Services Description Language (WSDL) and Universal Description, Discovery and Integration (UDDI) are applied to analyse SOAP as an element within the system. The results showed that the developed SOAP application with a multi-tier architecture can help people easily access the library server website of Gorontalo Province, and supports access to digital collections, subscription databases, and library catalogs in each Regency or City library in Gorontalo Province.
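SOAP is an XML envelope carried over a transport such as HTTP, so a request can be built with the standard library alone. The Envelope/Body structure below is standard SOAP 1.1; the service namespace and the FindBook operation are hypothetical, not the Gorontalo system's actual interface:

```python
# Minimal SOAP 1.1 request envelope built with the standard library.
# The service namespace "urn:library" and operation FindBook are
# hypothetical; the Envelope/Body nesting is what SOAP prescribes.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def soap_request(operation, params, service_ns="urn:library"):
    ET.register_namespace("soap", SOAP_NS)
    env = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(env, f"{{{SOAP_NS}}}Body")
    op = ET.SubElement(body, f"{{{service_ns}}}{operation}")
    for name, value in params.items():
        ET.SubElement(op, f"{{{service_ns}}}{name}").text = str(value)
    return ET.tostring(env, encoding="unicode")

xml = soap_request("FindBook", {"title": "Hikayat", "city": "Gorontalo"})
```

A WSDL document would describe the operation and message types so that clients can generate such envelopes automatically; UDDI would let them discover the endpoint.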
Browsing for the Best Internet Access Provider?
ERIC Educational Resources Information Center
Weil, Marty
1996-01-01
Highlights points to consider when choosing an Internet Service Provider. Serial Line Internet Protocol (SLIP) and Point to Point Protocol (PPP) are compared regarding price, performance, bandwidth, speed, and technical support. Obtaining access via local, national, consumer online, and telephone-company providers is discussed. A pricing chart and…
An Outline of Data Aggregation Security in Heterogeneous Wireless Sensor Networks
Boubiche, Sabrina; Boubiche, Djallel Eddine; Bilami, Azzedine; Toral-Cruz, Homero
2016-01-01
Data aggregation processes aim to reduce the amount of exchanged data in wireless sensor networks and consequently minimize the packet overhead and optimize energy efficiency. Securing the data aggregation process is a real challenge since the aggregation nodes must access the relayed data to apply the aggregation functions. The data aggregation security problem has been widely addressed in classical homogeneous wireless sensor networks; however, most of the proposed security protocols cannot guarantee a high level of security since sensor node resources are limited. Heterogeneous wireless sensor networks have recently emerged as a new wireless sensor network category which expands the sensor nodes' resources and capabilities. These new kinds of WSNs have opened new research opportunities where security represents a most attractive area. Indeed, robust and high security level algorithms can be used to secure the data aggregation at the heterogeneous aggregation nodes, which is impossible in classical homogeneous WSNs. Contrary to homogeneous sensor networks, the data aggregation security problem is still not sufficiently covered and the proposed data aggregation security protocols are few in number. To address this recent research area, this paper describes the data aggregation security problem in heterogeneous wireless sensor networks and surveys a few proposed security protocols. A classification and evaluation of the existing protocols is also introduced based on the adopted data aggregation security approach. PMID:27077866
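One classic way to reconcile the tension described above (the aggregator needs the data to aggregate, but should not learn individual readings) is additive masking: sensors add pairwise random masks that cancel in the sum, so the aggregator recovers only the total. This is a generic illustration of the idea, not one of the surveyed protocols:

```python
# Illustrative secure-aggregation scheme with pairwise additive masks:
# node i adds r_ij for each j > i and subtracts r_ji for each j < i,
# so all masks cancel in the sum. The aggregator learns the total,
# never an individual reading. (Key agreement is omitted here.)
import random

MOD = 2**16

def mask_readings(readings):
    n = len(readings)
    r = {(i, j): random.randrange(MOD)
         for i in range(n) for j in range(i + 1, n)}
    masked = []
    for i, x in enumerate(readings):
        m = x
        for j in range(n):
            if i < j:
                m += r[(i, j)]
            elif j < i:
                m -= r[(j, i)]
        masked.append(m % MOD)
    return masked

readings = [17, 23, 5]
masked = mask_readings(readings)
total = sum(masked) % MOD   # masks cancel: equals sum(readings)
```

In a heterogeneous WSN the resource-rich aggregation nodes can instead run genuinely strong cryptography (e.g. homomorphic encryption), which is the opportunity the survey highlights.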
Direct data access protocols benchmarking on DPM
NASA Astrophysics Data System (ADS)
Furano, Fabrizio; Devresse, Adrien; Keeble, Oliver; Mancinelli, Valentina
2015-12-01
The Disk Pool Manager is an example of a multi-protocol, multi-VO system for data access on the Grid that went through a considerable technical evolution in the last years. Among other features, its architecture offers the opportunity of testing its different data access frontends under exactly the same conditions, including hardware and backend software. This characteristic inspired the idea of collecting monitoring information from various testbeds in order to benchmark the behaviour of the HTTP and Xrootd protocols for the use case of data analysis, batch or interactive. A source of information is the set of continuous tests that are run towards the worldwide endpoints belonging to the DPM Collaboration, which accumulated relevant statistics in its first year of activity. On top of that, the DPM releases are based on multiple levels of automated testing that include performance benchmarks of various kinds, executed regularly every day. At the same time, the recent releases of DPM can report monitoring information about any data access protocol to the same monitoring infrastructure that is used to monitor the Xrootd deployments. Our goal is to evaluate under which circumstances the HTTP-based protocols can be good enough for batch or interactive data access. In this contribution we show and discuss the results that our test systems have collected under the circumstances that include ROOT analyses using TTreeCache and stress tests on the metadata performance.
Web tools for large-scale 3D biological images and atlases
2012-01-01
Background Large-scale volumetric biomedical image data of three or more dimensions are a significant challenge for distributed browsing and visualisation. Many images now exceed 10GB, which for most users is too large to handle in terms of computer RAM and network bandwidth. This is aggravated when users need to access tens or hundreds of such images from an archive. Here we solve the problem for 2D section views through archive data, delivering compressed tiled images that enable users to browse through very large volume data in the context of a standard web browser. The system provides an interactive visualisation for grey-level and colour 3D images including multiple image layers and spatial-data overlay. Results The standard Internet Imaging Protocol (IIP) has been extended to enable arbitrary 2D sectioning of 3D data as well as multi-layered images and indexed overlays. The extended protocol is termed IIP3D and we have implemented a matching server to deliver the protocol and a series of Ajax/JavaScript client codes that will run in an Internet browser. We have tested the server software on a low-cost Linux-based server for image volumes up to 135GB and 64 simultaneous users. The section views are delivered with response times independent of scale and orientation. The exemplar client provided multi-layer image views with user-controlled colour-filtering and overlays. Conclusions Interactive browsing of arbitrary sections through large biomedical-image volumes is made possible by use of an extended internet protocol and efficient server-based image tiling. The tools open the possibility of enabling fast access to large image archives without the requirement of whole image download and client computers with very large memory configurations. The system was demonstrated using a range of medical and biomedical image data extending up to 135GB for a single image volume. PMID:22676296
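The mechanism that keeps client memory small is that each request names one tile of one 2D section through the volume, encoded as query parameters on the server URL. The sketch below shows that request shape; the parameter names are illustrative and are not the exact IIP3D grammar:

```python
# Sketch of a tiled section request in the spirit of IIP3D: the client
# asks the server for one tile of an arbitrary 2D section through a 3D
# volume, so the full image volume is never downloaded. Parameter names
# here are illustrative, not the published IIP3D syntax.
from urllib.parse import urlencode

def section_tile_url(server, image, plane_normal, distance, tile, scale):
    params = {
        "FIF": image,                              # which volume to section
        "PLN": ",".join(map(str, plane_normal)),   # sectioning plane normal
        "DST": distance,                           # plane offset in the volume
        "WID": scale,                              # resolution level
        "TIL": tile,                               # tile index in the section
    }
    return f"{server}?{urlencode(params)}"

url = section_tile_url("https://example.org/fcgi-bin/iipsrv.fcgi",
                       "embryo.wlz", (0, 0, 1), 120, 42, 0.5)
```

Because each response is one compressed tile, response time depends on tile size rather than volume size, which matches the scale-independent response times reported above.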
The SciELO Open Access: A Gold Way from the South
ERIC Educational Resources Information Center
Packer, Abel L.
2009-01-01
Open access has long emphasized access to scholarly materials. However, open access can also mean access to the means of producing visible and recognized journals. This issue is particularly important in developing and emergent countries. The SciELO (Scientific Electronic Library On-line) project, first started in Brazil and, shortly afterward, in…
Databases and Electronic Resources - Betty Petersen Memorial Library
Listings of NOAA-wide and open access databases are available on the NOAA Central Library website. Open Science Directory contains collections of open access journals (e.g. the Directory of Open Access Journals) and journals in the special programs (Hinari
ERIC Educational Resources Information Center
Grandbois, Jennifer; Beheshti, Jamshid
2014-01-01
Introduction: This study aims to gain a greater understanding of the development of open access practices amongst library and information science authors, since their role is integral to the success of the broader open access movement. Method: Data were collected from scholarly articles about open access by library and information science authors…
Performance comparison of token ring protocols for hard-real-time communication
NASA Technical Reports Server (NTRS)
Kamat, Sanjay; Zhao, Wei
1992-01-01
The ability to guarantee the deadlines of synchronous messages while maintaining a good aggregate throughput is an important consideration in the design of distributed real-time systems. In this paper, we study two token ring protocols, the priority driven protocol and the timed token protocol, for their suitability for hard real-time systems. Both these protocols use a token to control access to the transmission medium. In a priority driven protocol, messages are assigned priorities and the protocol ensures that messages are transmitted in the order of their priorities. Timed token protocols do not provide for priority arbitration but ensure that the maximum access delay for a station is bounded. For both protocols, we first derive the schedulability conditions under which the transmission deadlines of a given set of synchronous messages can be guaranteed. Subsequently, we use these schedulability conditions to quantitatively compare the average case behavior of the protocols. This comparison demonstrates that each of the protocols has its domain of superior performance and neither dominates the other for the entire range of operating conditions.
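The timed token analysis described above can be sketched in a deliberately simplified form: the protocol constraint bounds the total synchronous bandwidth allocation plus overhead by the TTRT, and each message must fit into the token visits guaranteed before its deadline. The exact conditions derived in the literature are more refined; this is an illustrative approximation only.

```python
def timed_token_schedulable(msgs, ttrt, tau):
    """Simplified schedulability check for the timed token protocol:
    with synchronous bandwidth allocations H_i, the protocol constraint
    requires sum(H_i) + tau <= TTRT (tau = token-passing overhead), and
    each message (C_i, P_i, H_i) must be transmittable within the token
    visits guaranteed before its deadline, roughly
    H_i * (floor(P_i / TTRT) - 1) >= C_i.
    msgs is a list of (C_i, P_i, H_i) tuples; all times in one unit."""
    if sum(h for _, _, h in msgs) + tau > ttrt:
        return False  # protocol constraint violated
    return all(h * (p // ttrt - 1) >= c for c, p, h in msgs)
```

The second condition captures why a small TTRT helps short-deadline messages: it increases the number of guaranteed token visits per period.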
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-03
... Collection; Comment Request; Protocol for Access to Tissue Specimen Samples From the National Marine Mammal Tissue Bank AGENCY: National Oceanic and Atmospheric Administration (NOAA), Commerce. ACTION: Notice... National Marine Mammal Tissue Bank (NMMTB) was established by the National Marine Fisheries Service (NMFS...
Securing TCP/IP and Dial-up Access to Administrative Data.
ERIC Educational Resources Information Center
Conrad, L. Dean
1992-01-01
This article describes Arizona State University's solution to security risk inherent in general access systems such as TCP/IP (Transmission Control Protocol/INTERNET Protocol). Advantages and disadvantages of various options are compared, and the process of selecting a log-on authentication approach involving generation of a different password at…
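The truncated sentence alludes to generating a different password for each log-on. A standard way to realize that idea today is a counter-based one-time password in the style of RFC 4226 (HOTP); the sketch below illustrates the mechanism and is not a description of Arizona State University's actual system.

```python
import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226-style one-time password: each log-on uses a fresh counter,
    so a password sniffed off the wire is useless for replay."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                               # dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

With the RFC 4226 test secret, successive counters yield the published vectors (755224, 287082, ...), and the server simply tracks the expected counter per user.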
Newman, Jonathan P.; Zeller-Townson, Riley; Fong, Ming-Fai; Arcot Desai, Sharanya; Gross, Robert E.; Potter, Steve M.
2013-01-01
Single neuron feedback control techniques, such as voltage clamp and dynamic clamp, have enabled numerous advances in our understanding of ion channels, electrochemical signaling, and neural dynamics. Although commercially available multichannel recording and stimulation systems are commonly used for studying neural processing at the network level, they provide little native support for real-time feedback. We developed the open-source NeuroRighter multichannel electrophysiology hardware and software platform for closed-loop multichannel control with a focus on accessibility and low cost. NeuroRighter allows 64 channels of stimulation and recording for around US $10,000, along with the ability to integrate with other software and hardware. Here, we present substantial enhancements to the NeuroRighter platform, including a redesigned desktop application, a new stimulation subsystem allowing arbitrary stimulation patterns, low-latency data servers for accessing data streams, and a new application programming interface (API) for creating closed-loop protocols that can be inserted into NeuroRighter as plugin programs. This greatly simplifies the design of sophisticated real-time experiments without sacrificing the power and speed of a compiled programming language. Here we present a detailed description of NeuroRighter as a stand-alone application, its plugin API, and an extensive set of case studies that highlight the system’s abilities for conducting closed-loop, multichannel interfacing experiments. PMID:23346047
Torgerson, Carinna M; Quinn, Catherine; Dinov, Ivo; Liu, Zhizhong; Petrosyan, Petros; Pelphrey, Kevin; Haselgrove, Christian; Kennedy, David N; Toga, Arthur W; Van Horn, John Darrell
2015-03-01
Under the umbrella of the National Database for Clinical Trials (NDCT) related to mental illnesses, the National Database for Autism Research (NDAR) seeks to gather, curate, and make openly available neuroimaging data from NIH-funded studies of autism spectrum disorder (ASD). NDAR has recently made its database accessible through the LONI Pipeline workflow design and execution environment to enable large-scale analyses of cortical architecture and function via local, cluster, or "cloud"-based computing resources. This presents a unique opportunity to overcome many of the customary limitations to fostering biomedical neuroimaging as a science of discovery. Providing open access to primary neuroimaging data, workflow methods, and high-performance computing will increase uniformity in data collection protocols, encourage greater reliability of published data, results replication, and broaden the range of researchers now able to perform larger studies than ever before. To illustrate the use of NDAR and LONI Pipeline for performing several commonly performed neuroimaging processing steps and analyses, this paper presents example workflows useful for ASD neuroimaging researchers seeking to begin using this valuable combination of online data and computational resources. We discuss the utility of such database and workflow processing interactivity as a motivation for the sharing of additional primary data in ASD research and elsewhere.
Open Core Data: Connecting scientific drilling data to scientists and community data resources
NASA Astrophysics Data System (ADS)
Fils, D.; Noren, A. J.; Lehnert, K.; Diver, P.
2016-12-01
Open Core Data (OCD) is an innovative, efficient, and scalable infrastructure for data generated by scientific drilling and coring to improve discoverability, accessibility, citability, and preservation of data from the oceans and continents. OCD is building on existing community data resources that manage, store, publish, and preserve scientific drilling data, filling a critical void that currently prevents linkages between these and other data systems and tools to realize the full potential of data generated through drilling and coring. We are developing this functionality through Linked Open Data (LOD) and semantic patterns that enable data access through the use of community ontologies such as GeoLink (geolink.org, an EarthCube Building Block), a collection of protocols, formats and vocabularies from a set of participating geoscience repositories. Common shared concepts of classes such as cruise, dataset, person and others allow easier resolution of common references through shared resource IDs. These graphs are then made available via SPARQL as well as incorporated into web pages following schema.org approaches. Additionally the W3C PROV vocabulary is under evaluation for use for documentation of provenance. Further, the application of persistent identifiers for samples (IGSNs); datasets, expeditions, and projects (DOIs); and people (ORCIDs), combined with LOD approaches, provides methods to resolve and incorporate metadata and datasets. Application Program Interfaces (APIs) complement these semantic approaches to the OCD data holdings. APIs are exposed following the Swagger guidelines (swagger.io) and will be evolved into the OpenAPI (openapis.org) approach. Currently APIs are in development for the NSF funded Flyover Country mobile geoscience app (fc.umn.edu), the Neotoma Paleoecology Database (neotomadb.org), Magnetics Information Consortium (MagIC; earthref.org/MagIC), and other community tools and data systems, as well as for internal OCD use.
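As an illustration of the Linked Open Data access pattern described, a client might retrieve datasets linked to a cruise with a SPARQL query against the OCD endpoint. The prefix URL, class, and property names below are invented stand-ins, not the actual GeoLink ontology terms.

```python
def dataset_query(cruise_label, limit=10):
    """Build a hypothetical GeoLink-flavoured SPARQL query: find datasets
    linked to a cruise by its label. All ontology terms here (gl:Cruise,
    gl:hasLabel, gl:isPartOf) are illustrative placeholders."""
    return f"""
PREFIX gl: <http://schema.geolink.org/1.0/base/main#>
SELECT ?dataset WHERE {{
  ?cruise a gl:Cruise ; gl:hasLabel "{cruise_label}" .
  ?dataset gl:isPartOf ?cruise .
}} LIMIT {limit}
""".strip()
```

The shared-concept approach in the abstract means the same query shape works across participating repositories once they expose the common classes via SPARQL.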
Cuschieri, Sarah
2018-06-01
Academics have a duty towards peers and scholars alike to engage in research work and to publish their findings. This also assists in establishing personal academic success as well as the attainment of research grants. In the past, authors published their research articles for free, but access to those articles was restricted to subscription users only. Recently, open access publishing has gained momentum, whereby such articles are made freely accessible online. However, open access publishing comes with a price tag for the author through article processing charges. Open access may also call a journal's credibility within the academic world into question if improperly implemented. This is particularly so following the unsolicited bombardment of researchers' email accounts with invitations to submit to predatory open access journals. For these reasons, authors need to rigorously weigh the pros and cons of choosing a subscription-based or an open access journal for publication. Copyright © 2018 Elsevier B.V. All rights reserved.
Wagener, Johannes; Spjuth, Ola; Willighagen, Egon L; Wikberg, Jarl ES
2009-01-01
Background The life sciences make heavy use of the web for both data provision and analysis. However, the increasing amount of available data and the diversity of analysis tools call for machine-accessible interfaces in order to be effective. HTTP-based Web service technologies, like the Simple Object Access Protocol (SOAP) and REpresentational State Transfer (REST) services, are today the most common technologies for this in bioinformatics. However, these methods have severe drawbacks, including lack of discoverability and the inability for services to send status notifications. Several complementary workarounds have been proposed, but the results are ad-hoc solutions of varying quality that can be difficult to use. Results We present a novel approach based on the open standard Extensible Messaging and Presence Protocol (XMPP), consisting of an extension (IO Data) that covers discovery, asynchronous invocation, and definition of data types in the service. Because XMPP cloud services are capable of asynchronous communication, clients do not have to poll repetitively for status; the service sends the results back to the client upon completion. Implementations for Bioclipse and Taverna are presented, as are various XMPP cloud services in bio- and cheminformatics. Conclusion XMPP with its extensions is a powerful protocol for cloud services that demonstrates several advantages over traditional HTTP-based Web services: 1) services are discoverable without the need for an external registry, 2) asynchronous invocation eliminates the need for ad-hoc solutions like polling, and 3) input and output types defined in the service allow for generation of clients on the fly without the need for an external semantics description. These many advantages over existing technologies make XMPP a highly interesting candidate for next-generation online services in bioinformatics. PMID:19732427
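The polling-versus-push contrast at the heart of this abstract can be reduced to a toy sketch: a REST/SOAP-style client must ask repeatedly whether the job is done, while an XMPP-style client is notified once on completion. No real SOAP, REST, or XMPP library is used here; all class names are illustrative.

```python
class PollingClient:
    """HTTP-style pattern: the client repeatedly asks the service for status."""
    def __init__(self, service):
        self.service, self.polls = service, 0

    def wait(self):
        while True:
            self.polls += 1                 # every poll costs a round trip
            done, result = self.service.status()
            if done:
                return result

class PushClient:
    """XMPP-style pattern: the service calls back when the job completes."""
    def __init__(self):
        self.result = None

    def on_complete(self, result):          # invoked once by the service
        self.result = result

class SlowService:
    """Toy service that finishes after a fixed number of status checks."""
    def __init__(self, ticks):
        self.ticks = ticks

    def status(self):
        self.ticks -= 1
        return (self.ticks <= 0, "done" if self.ticks <= 0 else None)
```

A job that completes after five checks costs the polling client five round trips, whereas the push client incurs exactly one notification regardless of job duration.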
Usage Trends of Open Access and Local Journals: A Korean Case Study.
Seo, Jeong-Wook; Chung, Hosik; Yun, Jungmin; Park, Jin Young; Park, Eunsun; Ahn, Yuri
2016-01-01
Articles from open access and local journals are important resources for research in Korea, and their usage trends are important indicators for assessing current research practice. We analyzed an institutional collection of papers published from 1998 to 2014 by researchers from Seoul National University, and the references of papers published between 1998 and 2011. The published papers were collected from Web of Science or Scopus and analyzed for the proportion of articles from open access journals. Their cited references from papers in Web of Science were analyzed for the proportion from local (South Korean) or open access journals. The proportion of open access papers was relatively stable until 2006 (2.5 ~ 5.2% in Web of Science and 2.7 ~ 4.2% in Scopus), but then increased to 15.9% (Web of Science) or 18.5% (Scopus) in 2014. We analyzed 2,750,485 cited references from 52,295 published papers. The overall proportion of cited articles from local journals was 1.8%, and that from open access journals was 3.0%. Citations of open access articles have increased since 2006, to 4.1% in 2011, although this increase was smaller than the increase in open access publications. The proportion of citations from local journals was even lower. We use the term publishing/citing mismatch to describe this difference, which is an issue at Seoul National University, where the number of papers published in open access or local journals is increasing but the number of citations is not. The cause of this discrepancy is multifactorial: governmental and institutional policies, social and cultural issues, and authors' citing behaviors all contribute. Additional measures are also necessary, such as the development of an institutional citation database and improved search capabilities for local and open access documents.
Karayanidis, Frini; Keuken, Max C; Wong, Aaron; Rennie, Jaime L; de Hollander, Gilles; Cooper, Patrick S; Ross Fulham, W; Lenroot, Rhoshel; Parsons, Mark; Phillips, Natalie; Michie, Patricia T; Forstmann, Birte U
2016-01-01
Our understanding of the complex interplay between structural and functional organisation of brain networks is being advanced by the development of novel multi-modal analysis approaches. The Age-ility Project (Phase 1) data repository offers open access to structural MRI, diffusion MRI, and resting-state fMRI scans, as well as resting-state EEG recorded from the same community participants (n=131, 15-35 y, 66 male). Raw imaging and electrophysiological data as well as essential demographics are made available via the NITRC website. All data have been reviewed for artifacts using a rigorous quality control protocol, and detailed case notes are provided. Copyright © 2015. Published by Elsevier Inc.
18 CFR 35.28 - Non-discriminatory open access transmission tariff.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Non-discriminatory open... AND TARIFFS Other Filing Requirements § 35.28 Non-discriminatory open access transmission tariff. (a... concerns regarding unnecessary market inefficiencies. (c) Non-discriminatory open access transmission...
Groeber, F; Schober, L; Schmid, F F; Traube, A; Kolbus-Hernandez, S; Daton, K; Hoffmann, S; Petersohn, D; Schäfer-Korting, M; Walles, H; Mewes, K R
2016-10-01
To replace the Draize skin irritation assay (OECD guideline 404), several test methods based on reconstructed human epidermis (RHE) have been developed and adopted in OECD test guideline 439. However, all validated test methods in the guideline are linked to RHE provided by only three companies, so the availability of these test models depends on the commercial interest of the producers. To overcome this limitation and thus increase the accessibility of in vitro skin irritation testing, an open source reconstructed epidermis (OS-REp) was introduced. To demonstrate the capacity of the OS-REp in regulatory risk assessment, a catch-up validation study was performed. The participating laboratories used in-house generated OS-REp to assess the set of 20 reference substances according to the performance standards amending OECD test guideline 439. Testing was performed under blinded conditions. The within-laboratory reproducibility of 87% and the inter-laboratory reproducibility of 85% demonstrate the high reliability of irritancy testing using the OS-REp protocol. In addition, the prediction capacity, with an accuracy of 80%, was comparable to previously published RHE-based test protocols. Taken together, the results indicate that the OS-REp test method can be used as a standalone alternative skin irritation test replacing OECD test guideline 404. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
Zhang, Sheng; Liang, Fei; Li, Wenfeng
2017-11-01
The decision by leading journals to make protocols of phase III randomized controlled trials (RCTs) publicly accessible was a landmark event in clinical trial reporting. Here, we compared primary outcomes defined in protocols with those in publications describing the trials and in trial registration. We identified phase III RCTs published between January 1, 2012, and June 30, 2015, in The New England Journal of Medicine, The Lancet, The Journal of the American Medical Association, and The BMJ with available protocols. Consistency in primary outcomes between protocols and registries (articles) was evaluated. We identified 299 phase III RCTs with available protocols in this analysis. Of these, 25 trials (8.4%) had some discrepancy in primary outcomes between publications and protocols. Types of discrepancies included a protocol-defined primary outcome reported as a nonprimary outcome in the publication (11 trials, 3.7%), a protocol-defined primary outcome omitted in the publication (10 trials, 3.3%), a new primary outcome introduced in the publication (8 trials, 2.7%), a protocol-defined nonprimary outcome reported as a primary outcome in the publication (4 trials, 1.3%), and different timing of assessment of the primary outcome (4 trials, 1.3%). Of the trials with discrepancies in primary outcome, 15 (60.0%) had discrepancies that favored statistically significant results. Registration could be seen as a valid surrogate of the protocol in 237 of 299 trials (79.3%) with regard to the primary outcome. Despite unrestricted public access to protocols, selective outcome reporting persists in a small fraction of phase III RCTs. Only studies from four leading journals were included, which may cause selection bias and limit the generalizability of this finding. Copyright © 2017 Elsevier Inc. All rights reserved.
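The headline figures in this abstract reduce to simple ratios, which a few lines of arithmetic confirm:

```python
# Quick check that the reported percentages follow from the reported counts.

def pct(n, total):
    """Percentage rounded to one decimal place, as reported in the abstract."""
    return round(100 * n / total, 1)

discrepant = pct(25, 299)   # trials with any primary-outcome discrepancy -> 8.4
favoring = pct(15, 25)      # of those, discrepancies favouring significance -> 60.0
surrogate = pct(237, 299)   # registration consistent with protocol -> 79.3
```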
Koush, Yury; Ashburner, John; Prilepin, Evgeny; Sladky, Ronald; Zeidman, Peter; Bibikov, Sergei; Scharnowski, Frank; Nikonorov, Artem; De Ville, Dimitri Van
2017-08-01
Neurofeedback based on real-time functional magnetic resonance imaging (rt-fMRI) is a novel and rapidly developing research field. It allows for training of voluntary control over localized brain activity and connectivity and has demonstrated promising clinical applications. Because of the rapid technical developments of MRI techniques and the availability of high-performance computing, new methodological advances in rt-fMRI neurofeedback become possible. Here we outline the core components of a novel open-source neurofeedback framework, termed Open NeuroFeedback Training (OpenNFT), which efficiently integrates these new developments. This framework is implemented using Python and Matlab source code to allow for diverse functionality, high modularity, and rapid extendibility of the software depending on the user's needs. In addition, it provides an easy interface to the functionality of Statistical Parametric Mapping (SPM) that is also open-source and one of the most widely used fMRI data analysis software. We demonstrate the functionality of our new framework by describing case studies that include neurofeedback protocols based on brain activity levels, effective connectivity models, and pattern classification approaches. This open-source initiative provides a suitable framework to actively engage in the development of novel neurofeedback approaches, so that local methodological developments can be easily made accessible to a wider range of users. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nicklaus, Dennis J.
2013-10-13
We have developed an Erlang language implementation of the Channel Access protocol. Included are low-level functions for encoding and decoding Channel Access protocol network packets, as well as higher-level functions for monitoring or setting EPICS process variables. This provides access to EPICS process variables for the Fermilab Acnet control system via our Erlang-based front-end architecture without having to interface to C/C++ programs and libraries. Erlang is a functional programming language originally developed for real-time telecommunications applications. Its network programming features and list management functions make it particularly well-suited to the task of managing multiple Channel Access circuits and PV monitors.
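The low-level encoding functions described can be sketched using the published Channel Access header layout (16 bytes, big-endian: command, payload size, data type, element count, and two 32-bit parameters); the Erlang implementation would perform the equivalent binary pattern matching. This is a sketch against the public CA protocol description, not the Fermilab code.

```python
import struct

# Channel Access header: command, payload size, data type, element count,
# parameter 1, parameter 2 -- all big-endian, 16 bytes total.
CA_HEADER = struct.Struct(">HHHHII")

def encode_header(command, payload_size=0, data_type=0, count=0, p1=0, p2=0):
    """Pack a Channel Access packet header."""
    return CA_HEADER.pack(command, payload_size, data_type, count, p1, p2)

def decode_header(data):
    """Unpack the first 16 bytes of a packet into the six header fields."""
    return CA_HEADER.unpack(data[:CA_HEADER.size])
```

A round trip through encode/decode recovers the fields exactly, which is the invariant any protocol codec (Erlang or otherwise) must satisfy.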
NASA STI Program Coordinating Council Twelfth Meeting: Standards
NASA Technical Reports Server (NTRS)
1994-01-01
The theme of this NASA Scientific and Technical Information Program Coordinating Council Meeting was standards and their formation and application. Topics covered included scientific and technical information architecture, the Open Systems Interconnection Transmission Control Protocol/Internet Protocol, Machine-Readable Cataloging (MARC) open system environment procurement, and the Government Information Locator Service.
Using Open-Book Exams to Enhance Student Learning, Performance, and Motivation
ERIC Educational Resources Information Center
Green, Steve G.; Ferrante, Claudia J.; Heppard, Kurt A.
2016-01-01
This study investigated an alternative testing protocol used in an undergraduate managerial accounting course. Specifically, we assert that consistent open-book testing approaches will enhance learning and better prepare students for the real-world decision-making they will encounter. A semester-long testing protocol was executed incorporating a…
75 FR 32937 - Combined Notice of Filings #1
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-10
... proposed amendments to its Open Access Transmission, Energy and Operating Reserve Markets Tariff. Filed... Interconnection, LLC submits the revised Open Access Tariff. Filed Date: 05/27/2010. Accession Number: 20100527... proposed revisions to its FERC Open Access Transmission Tariff to be effective 6/1/10. Filed Date: 05/27...
50 CFR 660.310 - Purpose and scope.
Code of Federal Regulations, 2011 CFR
2011-10-01
... Access Fisheries § 660.310 Purpose and scope. This subpart covers the Pacific Coast Groundfish open access fishery. The open access fishery, as defined at § 660.11, Subpart C, is the fishery composed of commercial vessels using open access gear fished pursuant to the harvest guidelines, quotas, and other...
50 CFR 648.15 - Facilitation of enforcement.
Code of Federal Regulations, 2010 CFR
2010-10-01
... ocean quahog open access permitted vessels. Vessel owners or operators issued an open access surfclam or ocean quahog open access permit for fishing in the ITQ Program, as specified at § 648.70, are required... limited access permitted vessels. Beginning January 1, 2009, vessel owners or operators issued a limited...
50 CFR 660.24 - Limited entry and open access fisheries.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 50 Wildlife and Fisheries 13 2013-10-01 2013-10-01 false Limited entry and open access fisheries... Groundfish Fisheries § 660.24 Limited entry and open access fisheries. (a) General. All commercial fishing for groundfish must be conducted in accordance with the regulations governing limited entry and open...
The Intersystem - Internetworking for space systems
NASA Astrophysics Data System (ADS)
Landauer, C.
This paper is a description of the Intersystem, which is a mechanism for internetworking among existing and planned military satellite communication systems. The communication systems interconnected with this mechanism are called member systems, and the interconnected set of communication systems is called the Intersystem. The Intersystem is implemented with higher layer protocols that impose a common organization on the different signaling conventions, so that end users of different systems can communicate with each other. The Intersystem provides its coordination of member system access and resource requests with Intersystem Resource Controllers (IRCs), which are processors that implement the Intersystem protocols and have interfaces to the member systems' own access and resource control mechanisms. The IRCs are connected to each other to form the IRC Subnetwork. Terminals request services from the IRC Subnetwork using the Intersystem Access Control Protocols, and the IRC Subnetwork responses to the requests are coordinated using the RCRC (Resource Controller to Resource Controller) Protocols.
On the designing of a tamper resistant prescription RFID access control system.
Safkhani, Masoumeh; Bagheri, Nasour; Naderi, Majid
2012-12-01
Recently, Chen et al. proposed a novel tamper resistant prescription RFID access control system, published in the Journal of Medical Systems. In this paper we consider the security of the proposed protocol and identify several weaknesses. The main attack is a reader impersonation attack which allows an active adversary to impersonate a legitimate doctor, e.g. the patient's doctor, to access the patient's tag and change the patient's prescription. The presented attack is quite efficient: to impersonate a doctor, the adversary need only eavesdrop on one session between the doctor and the patient's tag, after which she can impersonate the doctor with success probability 1. In addition, we present efficient reader-tag to back-end database impersonation, de-synchronization and traceability attacks against the protocol. Finally, we propose an improved version of the protocol which is more efficient than the original while providing the desired security against the presented attacks.
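The essence of the reported impersonation attack, replaying an eavesdropped response, can be shown with a toy challenge-response protocol that lacks freshness. This is an illustrative construction of the attack class, not the Chen et al. protocol itself:

```python
import hashlib

def respond(secret, challenge):
    """Toy reader-authentication response: hash of shared secret and
    challenge. Purely illustrative; not the actual protocol."""
    return hashlib.sha256(secret + challenge).hexdigest()

class Tag:
    """Flawed by design for illustration: the tag reuses a fixed challenge,
    so any eavesdropped response remains valid forever."""
    def __init__(self, secret, challenge):
        self.secret, self.challenge = secret, challenge

    def verify(self, response):
        return response == respond(self.secret, self.challenge)
```

An adversary who records one legitimate session can replay the captured response indefinitely without ever learning the secret; per-session random nonces are the standard countermeasure.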
Due diligence in the open-access explosion era: choosing a reputable journal for publication.
Masten, Yondell; Ashcraft, Alyce
2017-11-15
Faculty are required to publish. Naïve and "in-a-hurry-to-publish" authors seek to publish in journals where manuscripts are rapidly accepted. Others may innocently submit to one of an increasing number of questionable or predatory journals, where a predatory journal is defined as one that exploits the author-pays, open-access publication model by charging authors publication fees for publisher profit without providing the expected services (expert peer review, editing, archiving, and indexing of published manuscripts) while promising almost instant publication. Authors may also intentionally submit manuscripts to predatory journals for rapid publication without concern for journal quality. The purpose of this article is to alert junior and seasoned faculty to predatory publishers included among available open access journal listings. A brief review of open access publication and the characteristics of predatory or questionable journals is provided, with suggestions for selecting reputable open access journals and avoiding predatory publishers. Due diligence in open access journal selection, based on publisher and journal quality, takes time and must be performed intentionally prior to manuscript submission; otherwise, authors must be able to successfully withdraw manuscripts when submission to a questionable or predatory journal is discovered. © FEMS 2017.
A Fair Contention Access Scheme for Low-Priority Traffic in Wireless Body Area Networks
Sajeel, Muhammad; Bashir, Faisal; Asfand-e-yar, Muhammad; Tauqir, Muhammad
2017-01-01
Recently, wireless body area networks (WBANs) have attracted significant attention in ubiquitous healthcare. A number of medium access control (MAC) protocols, primarily derived from the superframe structure of IEEE 802.15.4, have been proposed in the literature. These MAC protocols aim to provide quality of service (QoS) by prioritizing different traffic types in WBANs. A contention access period (CAP) with high contention in priority-based MAC protocols can result in a higher number of collisions and retransmissions. During the CAP, traffic classes with higher priority dominate low-priority traffic, leading to starvation of low-priority traffic and adversely affecting WBAN throughput, delay, and energy consumption. Hence, this paper proposes a traffic-adaptive priority-based superframe structure that reduces contention in the CAP and provides a fair chance for low-priority traffic. Simulation results in ns-3 demonstrate that the proposed MAC protocol, called traffic-adaptive priority-based MAC (TAP-MAC), achieves low energy consumption, high throughput, and low latency compared to the IEEE 802.15.4 standard and the most recent priority-based MAC protocol, called priority-based MAC (PA-MAC). PMID:28832495
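The idea of a contention period that adapts to traffic while never starving the lowest class can be sketched as a simple slot-allocation rule. This is an illustrative sketch, not the TAP-MAC algorithm itself: the reserved minimum share and the proportional split are assumptions.

```python
def allocate_cap_slots(total_slots: int, offered_load: dict, min_share: float = 0.1) -> dict:
    """Split contention-access-period slots across priority classes in
    proportion to offered traffic, reserving a minimum share per class so
    low-priority traffic is never starved.
    offered_load maps class name -> queued packets."""
    classes = list(offered_load)
    reserved = max(1, int(min_share * total_slots))  # guaranteed floor
    alloc = {c: reserved for c in classes}
    remaining = total_slots - reserved * len(classes)
    total = sum(offered_load.values()) or 1
    for c in classes:
        alloc[c] += int(remaining * offered_load[c] / total)
    return alloc

# Even with 9:1 skew toward high-priority traffic, the low class keeps
# its reserved floor of slots.
alloc = allocate_cap_slots(100, {"high": 90, "low": 10})
```

The `min_share` floor is the fairness knob: raising it trades aggregate throughput for low-priority latency.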
An Adaptive OFDMA-Based MAC Protocol for Underwater Acoustic Wireless Sensor Networks
Khalil, Issa M.; Gadallah, Yasser; Hayajneh, Mohammad; Khreishah, Abdallah
2012-01-01
Underwater acoustic wireless sensor networks (UAWSNs) have many applications across various civilian and military domains. However, they suffer from the limited available bandwidth of acoustic signals and harsh underwater conditions. In this work, we present an Orthogonal Frequency Division Multiple Access (OFDMA)-based Media Access Control (MAC) protocol that is configurable to suit the operating requirements of the underwater sensor network. The protocol has three modes of operation, namely random, equal opportunity and energy-conscious modes of operation. Our MAC design approach exploits the multi-path characteristics of a fading acoustic channel to convert it into parallel independent acoustic sub-channels that undergo flat fading. Communication between node pairs within the network is done using subsets of these sub-channels, depending on the configurations of the active mode of operation. Thus, the available limited bandwidth gets fully utilized while completely avoiding interference. We derive the mathematical model for optimal power loading and subcarrier selection, which is used as basis for all modes of operation of the protocol. We also conduct many simulation experiments to evaluate and compare our protocol with other Code Division Multiple Access (CDMA)-based MAC protocols. PMID:23012517
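The interference-free use of parallel flat-fading sub-channels can be illustrated with a greedy assignment in which each link takes its best remaining sub-channels, so no sub-channel is ever shared. This is a simplified stand-in for the paper's optimal power-loading and subcarrier-selection model; the greedy rule and the data shapes are assumptions.

```python
def assign_subchannels(gains: dict, per_link: int) -> dict:
    """Greedy interference-free OFDMA assignment.
    gains: {link_id: [channel gain for each sub-channel]}
    Each link receives its `per_link` best-gain sub-channels among those
    not already taken, so assignments are pairwise disjoint."""
    taken = set()
    assignment = {}
    for link, g in gains.items():
        ranked = sorted(range(len(g)), key=lambda i: g[i], reverse=True)
        chosen = [i for i in ranked if i not in taken][:per_link]
        taken.update(chosen)
        assignment[link] = chosen
    return assignment

gains = {"A": [0.9, 0.1, 0.5, 0.3], "B": [0.8, 0.7, 0.2, 0.6]}
assignment = assign_subchannels(gains, per_link=2)
```

Because the chosen sets are disjoint, concurrent transmissions never collide, which is the sense in which the bandwidth is "fully utilized while completely avoiding interference."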
Access control mechanism of wireless gateway based on open flow
NASA Astrophysics Data System (ADS)
Peng, Rong; Ding, Lei
2017-08-01
To realize and improve access control for wireless gateway devices, an access control mechanism based on an SDN architecture using Open vSwitch is proposed. The mechanism exploits two key features of the controller: centralized control and programmability. The controller issues access-control flow table entries based on the business logic, and Open vSwitch enforces the specific access control strategy according to the flow table.
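The controller-side logic amounts to translating an access decision into a flow-table entry that the Open vSwitch gateway then enforces. The sketch below uses plain data structures rather than any specific controller framework; the whitelist, match-field names, and priority value are illustrative, not taken from the paper.

```python
# Hypothetical controller logic: on seeing a new host, install either a
# forwarding rule or a drop rule in the gateway's flow table.
ALLOWED_MACS = {"aa:bb:cc:dd:ee:01", "aa:bb:cc:dd:ee:02"}

def build_flow_entry(src_mac: str) -> dict:
    """Forward normally if the source MAC is whitelisted; otherwise
    install a drop rule (an empty action list drops the packet)."""
    permitted = src_mac in ALLOWED_MACS
    return {
        "match": {"dl_src": src_mac},
        "actions": ["NORMAL"] if permitted else [],
        "priority": 100,
    }

flow_table = [build_flow_entry(m)
              for m in ("aa:bb:cc:dd:ee:01", "11:22:33:44:55:66")]
```

Centralizing this decision in the controller is what makes the policy programmable: changing `ALLOWED_MACS` reconfigures every gateway without touching the switches themselves.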
A Free and Open Source Web-based Data Catalog Evaluation Tool
NASA Astrophysics Data System (ADS)
O'Brien, K.; Schweitzer, R.; Burger, E. F.
2015-12-01
For many years, the Unified Access Framework (UAF) project has worked to provide improved access to scientific data by leveraging widely used data standards and conventions. These standards include the Climate and Forecast (CF) metadata conventions, the Data Access Protocol (DAP) and various Open Geospatial Consortium (OGC) standards such as WMS and WCS. The UAF has also worked to create a unified access point for scientific data through THREDDS and ERDDAP catalogs. A significant effort was made by the UAF project to build a catalog-crawling tool designed to crawl remote catalogs, analyze their content and then build a clean catalog that 1) represents only CF-compliant data; 2) provides a uniform set of access services; and 3) where possible, aggregates data in time. That catalog is available at http://ferret.pmel.noaa.gov/geoide/geoIDECleanCatalog.html. Although this tool has proved immensely valuable in allowing the UAF project to create a high-quality data catalog, a catalog evaluation service or tool operating at a more local level is also needed. Many programs that generate data of interest to the public are recognizing the utility and power of using the THREDDS Data Server (TDS) to serve that data. However, for groups that lack the resources to maintain dedicated IT personnel, it can be difficult to set up a properly configured TDS. The TDS catalog evaluation service that is under development, and will be discussed in this presentation, is an effort through the UAF project to bridge that gap. Based upon the original UAF catalog cleaner, the web evaluator will be able to scan and crawl a local TDS catalog, evaluate its contents for compliance with CF standards, analyze the services offered, and identify datasets where temporal aggregation would benefit data access. The results of the catalog evaluator will guide the configuration of datasets in the TDS to ensure that they meet the standards promoted by the UAF framework.
The maxillary molar endodontic access opening: A microscope-based approach
Mamoun, John Sami
2016-01-01
This article reviews the basic clinical techniques of performing a maxillary molar endodontic access opening, starting from the initial access opening into the pulp chamber, to the point where a size #10 file has been advanced to the apices of all three or four (or more) canals. The article explains how the use of the dental surgical operating microscope or microscope-level loupes magnification of ×6–8 or greater, combined with head-mounted or coaxial illumination, improve the ability of a dentist to identify microscopic root canal orifices, which facilitates the efficient creation of conservative access openings with adequate straight-line access in maxillary molars. Magnified photos illustrate various microscopic anatomical structures or landmarks of the initial access opening. Techniques are explored for implementing an access opening for teeth with vital versus necrotic pulpal tissues. The article also explores the use of piezoelectric or ultrasonic instruments for revealing root canal orifices and for removing pulp stones or calcified pulpal tissue inside the pulp chamber. PMID:27403069
Open versus Controlled-Access Data | Office of Cancer Genomics
OCG employs stringent human subjects protection and data access policies to protect the privacy and confidentiality of research participants. Depending on the risk of patient identification, data from OCG programs are available to the scientific community in two tiers: open or controlled access. Both types of data can be accessed through the corresponding OCG program-specific data matrix or portal.
Supporting Access to Open Online Courses for Learners of Developing Countries
ERIC Educational Resources Information Center
Nti, Kwame
2015-01-01
This paper examines how access to, and use of, open online courses may be enhanced for learners of developing countries from a learner perspective. Using analysis of the open education concept, factors that affect access to open educational resources content, and universal standards for delivering online learning, the author demonstrates that the…
Code of Federal Regulations, 2010 CFR
2010-07-01
... resolve an allegation that open and nondiscriminatory access was denied? 291.102 Section 291.102 Mineral Resources MINERALS MANAGEMENT SERVICE, DEPARTMENT OF THE INTERIOR APPEALS OPEN AND NONDISCRIMINATORY ACCESS... Hotline to informally resolve an allegation that open and nondiscriminatory access was denied? Before...
Development of a Web-Based Visualization Platform for Climate Research Using Google Earth
NASA Technical Reports Server (NTRS)
Sun, Xiaojuan; Shen, Suhung; Leptoukh, Gregory G.; Wang, Panxing; Di, Liping; Lu, Mingyue
2011-01-01
Recently, it has become easier to access climate data from satellites, ground measurements, and models at various data centers. However, searching, accessing, and processing heterogeneous data from different sources are very time-consuming tasks. There is a lack of a comprehensive visual platform for acquiring distributed and heterogeneous scientific data and rendering processed images from a single access point for climate studies. This paper documents the design and implementation of a Web-based, visual, interoperable, and scalable platform that is able to access climatological fields from models, satellites, and ground stations at a number of data sources, using Google Earth (GE) as a common graphical interface. The development is based on the TCP/IP protocol and various open-source data-sharing technologies, such as OPeNDAP, GDS, Web Processing Service (WPS), and Web Mapping Service (WMS). The capability of visualizing various measurements in GE dramatically extends the awareness and visibility of scientific results. Using the geographic information embedded in GE, the designed system improves our understanding of the relationships among different elements in a four-dimensional domain. The system enables easy and convenient synergistic research on a virtual platform for professionals and the general public, greatly advancing global data sharing and scientific research collaboration.
Channel MAC Protocol for Opportunistic Communication in Ad Hoc Wireless Networks
NASA Astrophysics Data System (ADS)
Ashraf, Manzur; Jayasuriya, Aruna; Perreau, Sylvie
2008-12-01
Despite significant research effort, the performance of distributed medium access control methods has failed to meet theoretical expectations. This paper proposes a protocol named "Channel MAC" that performs fully distributed medium access control based on opportunistic communication principles. In this protocol, nodes access the channel when the channel quality rises above a threshold while neighbouring nodes are silent. Once a node starts transmitting, it keeps transmitting until the channel becomes "bad." We derive an analytical throughput limit for Channel MAC in a shared multiple access environment. Furthermore, three performance metrics of Channel MAC (throughput, fairness, and delay) are analysed in single-hop and multihop scenarios using NS2 simulations. The simulation results show throughput improvements of up to 130% with Channel MAC over IEEE 802.11. We also show that the severe resource starvation problem (unfairness) of IEEE 802.11 in some network scenarios is reduced by the Channel MAC mechanism.
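The threshold rule at the heart of Channel MAC is easy to simulate. The sketch below draws i.i.d. unit-mean exponential channel gains per slot (an assumed toy fading model, not the paper's channel) and counts the fraction of slots in which a node would be permitted to transmit; for a threshold t this fraction approaches e^-t.

```python
import random

def channel_mac_airtime(threshold: float, slots: int = 100_000, seed: int = 1) -> float:
    """Fraction of slots a node transmits when it accesses the channel
    only while the channel gain exceeds `threshold`.
    Gains are i.i.d. unit-mean exponential draws, a toy stand-in for
    the fading process assumed by the protocol."""
    rng = random.Random(seed)
    on = sum(1 for _ in range(slots) if rng.expovariate(1.0) > threshold)
    return on / slots
```

A higher threshold means each transmission rides a better channel (higher rate) but the node transmits less often, which is exactly the trade-off an opportunistic MAC must balance.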
Traffic Adaptive Energy Efficient and Low Latency Medium Access Control for Wireless Sensor Networks
NASA Astrophysics Data System (ADS)
Yadav, Rajesh; Varma, Shirshu; Malaviya, N.
2008-05-01
Medium access control for wireless sensor networks has been a very active research area in recent years. Traditional wireless medium access control protocols such as IEEE 802.11 are not suitable for sensor network applications because the nodes are battery powered; recharging these sensor nodes is expensive and often not possible. Most of the literature on medium access for sensor networks focuses on energy efficiency. The proposed MAC protocol addresses the energy inefficiency caused by idle listening, control packet overhead and overhearing, while taking node latency into consideration based on the network traffic. Simulation experiments have been performed to demonstrate the effectiveness of the proposed approach. The simulation results of the proposed MAC have been validated by comparison with an analytical model. The protocol has been simulated in Network Simulator ns-2.
Simple Spectral Lines Data Model Version 1.0
NASA Astrophysics Data System (ADS)
Osuna, Pedro; Salgado, Jesus; Guainazzi, Matteo; Dubernet, Marie-Lise; Roueff, Evelyne; Osuna, Pedro; Salgado, Jesus
2010-12-01
This document presents a data model for describing spectral line transitions in the context of the Simple Line Access Protocol defined by the IVOA (cf. Ref[13], IVOA Simple Line Access Protocol). The main objective of the model is to integrate with and support the Simple Line Access Protocol, with which it forms a compact unit. This integration allows seamless access to spectral line transitions available worldwide in the VO context. The model does not provide a complete description of atomic and molecular physics, whose scope is outside this document. In the astrophysical sense, a line is considered the result of a transition between two energy levels. On the basis of this assumption, a set of objects and attributes has been derived to properly define the information necessary to describe lines appearing in astrophysical contexts. The document has been written taking into account available information from many different line data providers (see the acknowledgments section).
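On the client side, a Simple Line Access Protocol request is an HTTP GET carrying a wavelength range. The helper below builds such a query URL; the base URL is hypothetical, and the exact parameter spelling should be checked against the IVOA SLAP specification rather than taken from this sketch.

```python
from urllib.parse import urlencode

def slap_query(base_url: str, wl_min_m: float, wl_max_m: float) -> str:
    """Build a SLAP-style query URL for line transitions whose wavelength
    (in metres) falls in [wl_min_m, wl_max_m]. Parameter names follow the
    REQUEST/WAVELENGTH convention; verify against the SLAP spec."""
    params = {
        "REQUEST": "queryData",
        "WAVELENGTH": f"{wl_min_m}/{wl_max_m}",  # range as min/max
    }
    return f"{base_url}?{urlencode(params, safe='/')}"

# Hypothetical service endpoint; lines around 510-520 nm.
url = slap_query("http://example.org/slap", 5.1e-7, 5.2e-7)
```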
Embedded systems for supporting computer accessibility.
Mulfari, Davide; Celesti, Antonio; Fazio, Maria; Villari, Massimo; Puliafito, Antonio
2015-01-01
Nowadays, customized assistive technology (AT) software solutions allow their users to interact with various kinds of computer systems. Such tools are generally available on the personal devices (e.g., smartphones, laptops and so on) commonly used by a person with a disability. In this paper, we investigate a way of using such AT equipment to access many different devices that lack assistive preferences. The solution takes advantage of open-source hardware, and its core component is an affordable Linux embedded system: it grabs data coming from the assistive software running on the user's personal device and, after processing, generates native keyboard and mouse HID commands for the target computing device controlled by the end user. This process supports any operating system on the target machine and requires no specialized software installation; the user with a disability can therefore rely on a single assistive tool to control a wide range of computing platforms, including conventional computers and many kinds of mobile devices, which receive input commands through the USB HID protocol.
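The embedded system's output is ordinary USB HID traffic. A boot-protocol keyboard input report is 8 bytes: one modifier byte, one reserved byte, and up to six concurrent key usage codes. The helper below builds such reports; the gadget device path in the comment is illustrative of a Linux USB-gadget setup, not a detail from the paper.

```python
# On a Linux USB-gadget board, each 8-byte report would be written to a
# device node such as /dev/hidg0 (path illustrative).
KEY_A = 0x04       # HID usage ID for the 'a' key
MOD_LSHIFT = 0x02  # left-shift modifier bit

def keyboard_report(modifiers: int = 0, *keys: int) -> bytes:
    """Build an 8-byte HID boot-keyboard input report:
    [modifiers, reserved, key1..key6]."""
    if len(keys) > 6:
        raise ValueError("boot keyboard reports carry at most 6 keys")
    return bytes([modifiers, 0, *keys]) + bytes(6 - len(keys))

press_shift_a = keyboard_report(MOD_LSHIFT, KEY_A)  # types 'A'
release_all = keyboard_report()                      # all keys up
```

Because the target machine sees only standard HID reports, no driver or software is needed on its side, which is what makes the approach OS-agnostic.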
3D-Lab: a collaborative web-based platform for molecular modeling.
Grebner, Christoph; Norrby, Magnus; Enström, Jonatan; Nilsson, Ingemar; Hogner, Anders; Henriksson, Jonas; Westin, Johan; Faramarzi, Farzad; Werner, Philip; Boström, Jonas
2016-09-01
The use of 3D information has shown impact in numerous applications in drug design. However, it is often under-utilized and traditionally limited to specialists. We want to change that, and present an approach making 3D information and molecular modeling accessible and easy-to-use 'for the people'. A user-friendly and collaborative web-based platform (3D-Lab) for 3D modeling, including a blazingly fast virtual screening capability, was developed. 3D-Lab provides an interface to automatic molecular modeling, like conformer generation, ligand alignments, molecular dockings and simple quantum chemistry protocols. 3D-Lab is designed to be modular, and to facilitate sharing of 3D-information to promote interactions between drug designers. Recent enhancements to our open-source virtual reality tool Molecular Rift are described. The integrated drug-design platform allows drug designers to instantaneously access 3D information and readily apply advanced and automated 3D molecular modeling tasks, with the aim to improve decision-making in drug design projects.
2012-01-01
Copyright and licensing of scientific data, internationally, are complex and present legal barriers to data sharing, integration and reuse, and therefore restrict the most efficient transfer and discovery of scientific knowledge. Much data are included within scientific journal articles, their published tables, additional files (supplementary material) and reference lists. However, these data are usually published under licenses which are not appropriate for data. Creative Commons CC0 is an appropriate and increasingly accepted method for dedicating data to the public domain, to enable data reuse with the minimum of restrictions. BioMed Central is committed to working towards implementation of open data-compliant licensing in its publications. Here we detail a protocol for implementing a combined Creative Commons Attribution license (for copyrightable material) and Creative Commons CC0 waiver (for data) agreement for content published in peer-reviewed open access journals. We explain the differences between legal requirements for attribution in copyright, and cultural requirements in scholarship for giving individuals credit for their work through citation. We argue that publishing data in scientific journals under CC0 will have numerous benefits for individuals and society, and yet will have minimal implications for authors and minimal impact on current publishing and research workflows. We provide practical examples and definitions of data types, such as XML and tabular data, and specific secondary use cases for published data, including text mining, reproducible research, and open bibliography. We believe this proposed change to the current copyright and licensing structure in science publishing will help clarify what users – people and machines – of the published literature can do, legally, with journal articles and make research using the published literature more efficient. 
We further believe this model could be adopted across multiple publishers, and invite comment on this article from all stakeholders in scientific research. PMID:22958225
Oliveira, Leandro Benetti de; Gabrielli, Marisa Aparecida Cabrini; Gabrielli, Mario Francisco Real; Pereira-Filho, Valfrido Antonio Pereira
2015-12-01
The objective of this article is to present options for rehabilitation with dental implants in two cases of severely atrophic mandibles (<10 mm) after rigid internal fixation of fractures. Two patients who sustained fractures in severely atrophic mandibles with less than 10 mm of bone height were treated by open reduction and internal fixation through a transcervical access. Internal fixation was obtained with 2.4-mm locking reconstruction plates. The first patient presented satisfactory bone height in the area between the mental foramina and, after 2 years, received flapless guided implants in the anterior mandible and an immediate protocol prosthesis. The second patient received a tent-pole iliac crest autogenous graft 2 years after fracture treatment, together with immediate implants. After 5 months, a protocol prosthesis was installed in the second patient. In both cases, internal fixation followed AO principles for load-bearing osteosynthesis. Both prostheses were Brånemark protocol prostheses. The mandibular reconstruction plates were not removed. Both patients are rehabilitated without complications and satisfied with the esthetic and functional results. With current techniques of internal fixation, grafting, and guided implants, the treatment of atrophic mandible fractures can achieve very good results, which were previously not possible.
Open Governance in Higher Education: Extending the Past to the Future
ERIC Educational Resources Information Center
Masson, Patrick
2011-01-01
Open educational resources, open content, open access, open research, open courseware--all of these open initiatives share, and benefit from, a vision of access and a collaborative framework that often result in improved outcomes. Many of these open initiatives have gained adoption within higher education and are now serving in mission-critical…
A proposal for an SDN-based SIEPON architecture
NASA Astrophysics Data System (ADS)
Khalili, Hamzeh; Sallent, Sebastià; Piney, José Ramón; Rincón, David
2017-11-01
Passive Optical Network (PON) elements such as the Optical Line Terminal (OLT) and Optical Network Units (ONUs) are currently managed by inflexible legacy network management systems. Software-Defined Networking (SDN) is a new networking paradigm that improves the operation and management of networks. In this paper, we propose a novel SDN-based architecture for Ethernet Passive Optical Networks (EPON) that includes the Service Interoperability standard (SIEPON). In our proposal, the OLT is partially virtualized and some of its functionalities are moved to the core network management system, while the OLT itself is replaced by an OpenFlow (OF) switch. A new MultiPoint MAC Control (MPMC) sublayer extension based on the OpenFlow protocol is presented. This allows the SDN controller to manage and enhance resource utilization, flow monitoring, bandwidth assignment, quality-of-service (QoS) guarantees, and energy management of the optical access network, to name a few possibilities. The OpenFlow switch is extended with synchronous ports to retain the time-critical nature of the EPON network. OpenFlow messages are also extended with new functionalities to implement the concept of EPON Service Paths (ESPs). Our simulation-based results demonstrate the effectiveness of the new architecture, which retains similar or improved performance in terms of delay and throughput when compared to legacy PONs.
Optimizing the NASA Technical Report Server
NASA Technical Reports Server (NTRS)
Nelson, Michael L.; Maa, Ming-Hokng
1996-01-01
The NASA Technical Report Server (NTRS), a World Wide Web distribution service for NASA technical publications, has been modified for performance enhancement, greater protocol support, and human interface optimization. Results include: parallel database queries, which decrease user access times by an average factor of 2.3; access from clients behind firewalls and/or proxies that truncate excessively long Uniform Resource Locators (URLs); access to non-Wide Area Information Server (WAIS) databases and compatibility with the Z39.50 protocol; and a streamlined user interface.
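The parallel-query speedup reported above follows from issuing the per-database searches concurrently, so total latency tracks the slowest back end rather than the sum of all of them. A minimal sketch, with sleeps standing in for real searches (the database names and delays are invented):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def query_db(name: str, delay: float) -> str:
    time.sleep(delay)  # stand-in for one back-end database search
    return f"hits from {name}"

# Hypothetical back ends; each "search" takes 50 ms.
DATABASES = {"ntrs-a": 0.05, "ntrs-b": 0.05, "ntrs-c": 0.05}

def parallel_search() -> list:
    """Fan the query out to every database at once and merge results."""
    with ThreadPoolExecutor(max_workers=len(DATABASES)) as pool:
        futures = [pool.submit(query_db, n, d) for n, d in DATABASES.items()]
        return sorted(f.result() for f in futures)

start = time.perf_counter()
results = parallel_search()
elapsed = time.perf_counter() - start  # ~max(delays), not sum(delays)
```

Sequentially these three searches would take about 150 ms; in parallel the wall-clock time is close to the slowest single search, which is the source of the factor-of-2.3 average improvement the abstract reports.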
Open Access: "à consommer avec modération"
NASA Astrophysics Data System (ADS)
Mahoney, Terence J.
There is increasing pressure on academics and researchers to publish the results of their investigations in open access journals. Indeed, some funding agencies make open access publishing a basic requirement for funding projects, and the EU is considering taking firm steps in this direction. I argue that astronomy is already one of the most open of disciplines, and that access - both to the general public (in terms of a significantly growing outreach effort) and to developing countries (through efforts to provide computing facilities and Internet access, as well as schemes to provide research centres of limited resources with journals) - is becoming more and more open in a genuine and lasting way. I further argue that sudden switches to more formal kinds of open access schemes could cause irreparable harm to astronomical publishing. Several of the most prestigious astronomical research journals (e.g. MN, ApJ, AJ) have for more than a century met the publishing needs of the research community and continue to adapt successfully to changing demands on the part of that community. The after-effects of abrupt changes in publishing practices - implemented through primarily political concerns - are hard to predict and could be severely damaging. I conclude that open access, in its current acceptation, should be studied with great care and with sufficient time before any consideration is given to its implementation. If forced on the publishing and research communities, open access could well result in much more limited access to properly vetted research results.
Farret, Milton Meri Benitez; Farret, Marcel Marchiori; Farret, Alessandro Marchiori
2012-09-01
The treatment of skeletal class III and anterior open bite can be unstable and orthodontists frequently observe relapse. Here, we report on the management of three patients with skeletal class III profiles and open bites treated by orthodontic camouflage. Each received a retention protocol involving the use of two separate appliances during the night and day accompanied by myofunctional therapy. Long-term follow-up revealed a stable outcome.
USDA-ARS?s Scientific Manuscript database
The US Culture Collection Network held a meeting to share information about how collections are responding to the requirements of the recently enacted Nagoya Protocol on Access to Genetic Resources and the Fair and Equitable Sharing of Benefits Arising from their Utilization to the Convention on Bio...
A Combined Fabrication and Instrumentation Platform for Sample Preparation.
Guckenberger, David J; Thomas, Peter C; Rothbauer, Jacob; LaVanway, Alex J; Anderson, Meghan; Gilson, Dan; Fawcett, Kevin; Berto, Tristan; Barrett, Kevin; Beebe, David J; Berry, Scott M
2014-06-01
While potentially powerful, access to molecular diagnostics is substantially limited in the developing world. Here we present an approach to reduced-cost molecular diagnostic instrumentation that has the potential to empower developing-world communities by streamlining the sample preparation process. In addition, the instrument is capable of producing its own consumable devices on demand, reducing reliance on assay suppliers. Furthermore, it is designed with an "open" architecture, allowing users to visually observe the assay process and make modifications as necessary (as opposed to traditional "black box" systems). This open environment enables the integration of microfluidic fabrication and viral RNA purification onto an easy-to-use modular system via interchangeable trays. Here we employ this system to develop a protocol for fabricating microfluidic devices and then use these devices to isolate viral RNA from serum for the measurement of human immunodeficiency virus (HIV) viral load. Results obtained with this method show significantly reduced error compared with similar nonautomated sample preparation processes. © 2014 Society for Laboratory Automation and Screening.
KeyWare: an open wireless distributed computing environment
NASA Astrophysics Data System (ADS)
Shpantzer, Isaac; Schoenfeld, Larry; Grindahl, Merv; Kelman, Vladimir
1995-12-01
Deployment of distributed applications in the wireless domain lacks the equivalent tools, methodologies, architectures, and network management that exist for LAN-based applications. A wireless distributed computing environment (KeyWare™) based on intelligent agents within a multiple-client, multiple-server scheme was developed to resolve this problem. KeyWare renders concurrent application services to wireline and wireless client nodes encapsulated in multiple paradigms such as message delivery, database access, e-mail, and file transfer. These services and paradigms are optimized to cope with temporal and spatial radio coverage, high latency, limited throughput and transmission costs. A unified network management paradigm for both wireless and wireline nodes facilitates seamless extension of LAN-based management tools to include wireless nodes. A set of object-oriented tools and methodologies enables direct asynchronous invocation of agent-based services, supplemented by tool-sets matched to the supported KeyWare paradigms. The open architecture of KeyWare enables a wide selection of client node computing platforms, operating systems, transport protocols, radio modems and infrastructures while maintaining application portability.
NASA Astrophysics Data System (ADS)
Kravitz, David William
This paper presents an insider's view of the rationale and the cryptographic mechanics of some principal elements of the Open Mobile Alliance (OMA) Secure Content Exchange (SCE) Technical Specifications. A primary goal is to enable implementation of a configurable methodology that quarantines the effects that unknown-compromised entities have on still-compliant entities in the system, while allowing import from upstream protection systems and multi-client reuse of Rights Objects that grant access to plaintext content. This has to be done without breaking compatibility with the underlying legacy OMA DRM v2.0/v2.1 Technical Specifications. It is also required that legacy devices can take at least partial advantage of the new import functionality, and can request the creation of SCE-compatible Rights Objects and utilize Rights Objects created upon request of SCE-conformant devices. This must be done in a way that the roles played by newly defined entities unrecognizable by legacy devices remain hidden.
Duregger, Katharina; Hayn, Dieter; Nitzlnader, Michael; Kropf, Martin; Falgenhauer, Markus; Ladenstein, Ruth; Schreier, Günter
2016-01-01
Electronic Patient Reported Outcomes (ePRO) gathered using telemonitoring solutions might be a valuable source of information in rare cancer research. The objective of this paper was to develop a concept and implement a prototype for introducing ePRO into the existing neuroblastoma research network by applying Near Field Communication and mobile technology. For physicians, an application was developed for registering patients within the research network and providing patients with an ID card and a PIN for authentication when transmitting telemonitoring data to the Electronic Data Capture system OpenClinica. For patients, a previously developed telemonitoring system was extended with a Simple Object Access Protocol (SOAP) interface for transmitting nine different health parameters and toxicities. The concept was fully implemented on the front-end side: the application for physicians was implemented as a prototype, and the mobile application of the telemonitoring system was successfully connected to OpenClinica. Future work will focus on the implementation of the back-end features.
VISIBIOweb: visualization and layout services for BioPAX pathway models
Dilek, Alptug; Belviranli, Mehmet E.; Dogrusoz, Ugur
2010-01-01
With recent advancements in techniques for cellular data acquisition, information on cellular processes has been increasing at a dramatic rate. Visualization is critical to analyzing and interpreting complex information; representing cellular processes or pathways is no exception. VISIBIOweb is a free, open-source, web-based pathway visualization and layout service for pathway models in BioPAX format. With VISIBIOweb, one can obtain well-laid-out views of pathway models using the standard notation of the Systems Biology Graphical Notation (SBGN), and can embed such views within one's web pages as desired. Pathway views may be navigated using zoom and scroll tools; pathway object properties, including any external database references available in the data, may be inspected interactively. The automatic layout component of VISIBIOweb may also be accessed programmatically from other tools using Hypertext Transfer Protocol (HTTP). The web site is free and open to all users and there is no login requirement. It is available at: http://visibioweb.patika.org. PMID:20460470
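The programmatic HTTP access mentioned in the abstract might be exercised along the following lines. This is a minimal Python sketch, assuming a hypothetical /layout endpoint path and a "graph" form field (neither is documented here); the request is only constructed, not sent.

```python
# Sketch: invoking a pathway-layout service over HTTP, as the abstract says
# other tools can do programmatically. The endpoint path and the "graph"
# form field are illustrative assumptions; the request is built but not sent.
from urllib.parse import urlencode
from urllib.request import Request

biopax_xml = "<rdf:RDF>...</rdf:RDF>"  # a BioPAX model, abbreviated here

data = urlencode({"graph": biopax_xml}).encode("utf-8")
req = Request("http://visibioweb.patika.org/layout",  # hypothetical endpoint
              data=data,
              headers={"Content-Type": "application/x-www-form-urlencoded"})
```

A tool would then submit `req` with `urllib.request.urlopen` and parse the laid-out model from the response.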
Measuring, Rating, Supporting, and Strengthening Open Access Scholarly Publishing in Brazil
ERIC Educational Resources Information Center
Neto, Silvio Carvalho; Willinsky, John; Alperin, Juan Pablo
2016-01-01
This study assesses the extent and nature of open access scholarly publishing in Brazil, one of the world's leaders in providing universal access to its research and scholarship. It utilizes Brazil's Qualis journal evaluation system, along with other relevant data bases to address the association between scholarly quality and open access in the…
ERIC Educational Resources Information Center
Haggerty, Kevin D.
2008-01-01
Introduction: Presents a personal account of the transfer to open access of the leading Canadian journal of sociology. Background: The Canadian Journal of Sociology had established a strong position, internationally, among sociology journals. However, subscriptions were falling as readers increasingly accessed the resource through libraries and a…
A Comparison of Student Confidence Levels in Open Access and Undergraduate University Courses
ERIC Educational Resources Information Center
Atherton, Mirella
2017-01-01
Confidence levels of students enrolled in open access programs and undergraduate courses were measured at the University of Newcastle. The open access science students aimed to gain access to undergraduate studies in various disciplines at University. The undergraduate students were enrolled in a variety of degrees and were surveyed during their…
Ricci, William M.; Collinge, Cory; Streubel, Philipp N.; McAndrew, Christopher M.; Gardner, Michael J.
2014-01-01
Objectives: This study compared results of aggressive and nonaggressive debridement protocols for the treatment of high energy open supracondylar femur fractures after the primary procedure, with respect to the requirement for secondary bone grafting procedures and deep infection. Design: Retrospective review. Setting: Level I and Level II trauma centers. Patients/Participants: Twenty-nine consecutive patients with high grade open (Gustilo Types II and III) supracondylar femur fractures (OTA/AO 33A and C) treated with debridement and locked plating. Intervention: Surgeons at two different Level I trauma centers had different debridement protocols for open supracondylar femur fractures. One center used a More Aggressive (MA) protocol in their patients (n=17) that included removal of all devitalized bone and placement of antibiotic cement spacers to fill large segmental defects. The other center used a Less Aggressive (LA) protocol in their patients (n=12) that included debridement of grossly contaminated bone with retention of other bone fragments and no use of antibiotic cement spacers. All other aspects of the treatment protocol at the two centers were similar: definitive fixation with locked plates in all cases; IV antibiotics were used until definitive wound closure; and weight bearing was advanced upon clinical and radiographic evidence of fracture healing. Main Outcome Measurements: Healing after the primary procedure, requirement for secondary bone grafting procedures, and the presence of deep infection. Results: Demographics were similar between included patients at each center with regard to: age; gender; rate of open fractures; open fracture classification; mechanism; and smoking (p>.05). Patients at the MA center were more often diabetic (p<.05). Cement spacers to fill segmental defects were used more often after MA debridement (35% vs 0%, p<0.006) and more patients had a plan for staged bone grafting after MA debridement (71% vs 8%, p<0.006). 
Healing after the index fixation procedure occurred more often after LA debridement (92% vs 35%, p<0.003). There was no difference in infection rate between the two protocols: 25% with the LA protocol and 18% with the MA protocol (p=0.63). All patients in both groups eventually healed and were without evidence of infection at an average of 1.8 years of follow-up. Conclusion: The degree to which bone should be debrided after a high energy, high grade, open supracondylar femur fracture is a matter of surgeon judgment and falls along a continuous spectrum. Based on the results of the current study, the theoretic tradeoff between infection risk and osseous healing potential seems to favor a less aggressive approach towards bone debridement in the initial treatment. PMID:23760177
Ricci, William M; Collinge, Cory; Streubel, Philipp N; McAndrew, Christopher M; Gardner, Michael J
2013-12-01
This study compared results of aggressive and nonaggressive debridement protocols for the treatment of high-energy, open supracondylar femur fractures after the primary procedure, with respect to the requirement for secondary bone grafting procedures, and deep infection. Retrospective review. Level I and level II trauma centers. Twenty-nine consecutive patients with high-grade, open (Gustilo types II and III) supracondylar femur fractures (OTA/AO 33A and C) treated with debridement and locked plating. Surgeons at 2 different level I trauma centers had different debridement protocols for open supracondylar femur fractures. One center used a more aggressive (MA) protocol in their patients (n = 17) that included removal of all devitalized bone and placement of antibiotic cement spacers to fill large segmental defects. The other center used a less aggressive (LA) protocol in their patients (n = 12) that included debridement of grossly contaminated bone with retention of other bone fragments and no use of antibiotic cement spacers. All other aspects of the treatment protocol at the 2 centers were similar: definitive fixation with locked plates in all cases, IV antibiotics were used until definitive wound closure, and weight bearing was advanced upon clinical and radiographic evidence of fracture healing. Healing after the primary procedure, requirement for secondary bone grafting procedures, and the presence of deep infection. Demographics were similar between included patients at each center with regard to age, gender, rate of open fractures, open fracture classification, mechanism, and smoking (P > 0.05). Patients at the MA center were more often diabetic (P < 0.05). Cement spacers to fill segmental defects were used more often after MA debridement (35% vs. 0%, P < 0.006), and more patients had a plan for staged bone grafting after MA debridement (71% vs. 8%, P < 0.006). Healing after the index fixation procedure occurred more often after LA debridement (92% vs. 
35%, P < 0.003). There was no difference in infection rate between the 2 protocols: 25% with the LA protocol and 18% with the MA protocol (P = 0.63). All patients in both groups eventually healed and were without evidence of infection at an average of 1.8 years of follow-up. The degree to which bone should be debrided after a high-energy, high-grade, open supracondylar femur fracture is a matter of surgeon judgment and falls along a continuous spectrum. Based on the results of the current study, the theoretic trade-off between infection risk and osseous healing potential seems to favor an LA approach toward bone debridement in the initial treatment. Therapeutic level III.
The future of academic publishing: what is open access?
Collins, Jannette
2005-04-01
For more than 200 years, publishers have been charging users (i.e., subscribers) for access to scientific information to make a profit. Authors have been required to grant copyright ownership to the publisher. This system was not questioned until the Internet popularized electronic publishing. The Internet allows for rapid dissemination of information to millions of readers. Some people have seen this as an opportunity to revolutionize the system of scientific publishing and to make it one that provides free, open access to all scientific information to all persons everywhere in the world. Such systems have been launched and have instigated a wave of dialogue among proponents and opponents alike. At the center of the controversy is the issue of who will pay for the costs of publishing, because an open-access system is not free, and this threatens the backbone of the traditional publishing industry. Currently, open-access publishers charge authors a fee to have their articles published. Because of this and the uncertainty of the sustainability of the open-access system, some authors are hesitant to participate in the new system. This article reviews the events that led to the creation of open-access publishing, the arguments for and against it, and the implications of open access for the future of academic publishing.
NASA Astrophysics Data System (ADS)
Zhang, Yichen; Li, Zhengyu; Zhao, Yijia; Yu, Song; Guo, Hong
2017-02-01
We analyze the security of the two-way continuous-variable quantum key distribution protocol in reverse reconciliation against general two-mode attacks, which represent all accessible attacks at fixed channel parameters. Rather than considering one specific attack model, the expressions for the secret key rates of the two-way protocol are derived against all accessible attack models. It is found that there is an optimal two-mode attack that minimizes the performance of the protocol in terms of both secret key rates and maximal transmission distances. We identify the optimal two-mode attack, give its specific attack model, and show the performance of the two-way protocol against it. Even under the optimal two-mode attack, the performance of the two-way protocol is still better than that of the corresponding one-way protocol, which shows the advantage of making double use of the quantum channel and the potential of long-distance secure communication using a two-way protocol.
Data aggregation in wireless sensor networks using the SOAP protocol
NASA Astrophysics Data System (ADS)
Al-Yasiri, A.; Sunley, A.
2007-07-01
Wireless sensor networks (WSN) offer an increasingly attractive method of data gathering in distributed system architectures and dynamic access via wireless connectivity. Wireless sensor networks have physical and resource limitations; these lead to increased complexity for application developers and often result in applications that are closely coupled with network protocols. In this paper, a data aggregation framework using SOAP (Simple Object Access Protocol) on wireless sensor networks is presented. The framework works as a middleware for aggregating data measured by a number of nodes within a network. The aim of the study is to assess the suitability of the protocol in such environments, where resources are limited compared with traditional networks.
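As a rough illustration of the transport such a framework rides on, the Python sketch below wraps one node's readings in a SOAP 1.1 envelope. The SensorReadings/Reading element names and the service namespace are assumptions for illustration, not the paper's actual schema.

```python
# Sketch: wrapping sensor readings in a SOAP 1.1 envelope, as an aggregation
# middleware might do before forwarding them. The payload element names and
# the service namespace below are illustrative assumptions.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
APP_NS = "urn:example:wsn-aggregation"  # hypothetical service namespace

def build_soap_message(node_id, readings):
    """Build a SOAP envelope carrying (sensor, value) pairs from one node."""
    ET.register_namespace("soap", SOAP_NS)
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    payload = ET.SubElement(body, f"{{{APP_NS}}}SensorReadings",
                            {"nodeId": node_id})
    for sensor, value in readings:
        r = ET.SubElement(payload, f"{{{APP_NS}}}Reading", {"sensor": sensor})
        r.text = str(value)
    return ET.tostring(envelope, encoding="unicode")

msg = build_soap_message("node-17", [("temperature", 21.4), ("humidity", 63)])
```

In a resource-limited WSN the cost of this XML framing is exactly what the study sets out to measure.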
New Version of SeismicHandler (SHX) based on ObsPy
NASA Astrophysics Data System (ADS)
Stammler, Klaus; Walther, Marcus
2016-04-01
The command line version of SeismicHandler (SH), a scientific analysis tool for seismic waveform data developed around 1990, has been redesigned in recent years, based on a project funded by the Deutsche Forschungsgemeinschaft (DFG). The aim was to address new data access techniques, simplified metadata handling and a modularized software design. As a result, the program was rewritten in Python in its main parts, taking advantage of the simplicity of this scripting language and its variety of well-developed software libraries, including ObsPy. SHX provides easy access to waveforms and metadata via the arclink and FDSN webservice protocols; access to event catalogs is also implemented. With single commands, whole networks or stations within a certain area may be read in; the metadata are retrieved from the servers and stored in a local database. For data processing, the large set of SH commands is available, as well as the SH scripting language. Via SH-language scripts or additional Python modules, the command set of SHX is easily extendable. The program is open source and tested on Linux operating systems; documentation and downloads are found at URL "https://www.seismic-handler.org/".
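The FDSN webservice access such a client relies on boils down to plain HTTP queries. A minimal Python sketch of a dataselect URL follows; the host is chosen as an example and the parameter names follow the FDSN web service specification.

```python
# Sketch: the kind of FDSN dataselect request a waveform client issues when
# reading in data for one channel and time window. The host is an example;
# parameter names (net, sta, loc, cha, start, end) follow the FDSNWS spec.
from urllib.parse import urlencode

def fdsn_dataselect_url(host, net, sta, loc, cha, start, end):
    """Build an FDSNWS dataselect query URL for one channel and time window."""
    params = {"net": net, "sta": sta, "loc": loc, "cha": cha,
              "start": start, "end": end}
    return f"https://{host}/fdsnws/dataselect/1/query?" + urlencode(params)

url = fdsn_dataselect_url("service.iris.edu", "GR", "BFO", "--", "BHZ",
                          "2016-04-01T00:00:00", "2016-04-01T00:10:00")
```

A single "read in a whole network" command then amounts to iterating such requests over the station list returned by the companion station service.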
Cool Apps: Building Cryospheric Data Applications with Standards-Based Service Oriented Architecture
NASA Astrophysics Data System (ADS)
Oldenburg, J.; Truslove, I.; Collins, J. A.; Liu, M.; Lewis, S.; Brodzik, M.
2012-12-01
The National Snow and Ice Data Center (NSIDC) holds a large collection of cryospheric data, and is involved in a number of informatics research and development projects aimed at improving the discoverability and accessibility of these data. To develop high-quality software in a timely manner, we have adopted a Service-Oriented Architecture (SOA) approach for our core technical infrastructure development. Data services at NSIDC are internally exposed to other tools and applications through standards-based service interfaces. These standards include OAI-PMH (Open Archives Initiative Protocol for Metadata Harvesting), various OGC (Open Geospatial Consortium) standards including WMS (Web Map Service) and WFS (Web Feature Service), ESIP (Federation of Earth Sciences Information Partners) OpenSearch, and NSIDC-defined service endpoints which follow a RESTful architecture. By taking a standards-based approach, we are able to use off-the-shelf tools and libraries to consume, translate, and broker these data services, and thus develop applications faster. Additionally, by exposing public interfaces to these services we provide valuable data services to technical collaborators; for example, NASA Reverb (http://reverb.echo.nasa.gov) uses NSIDC's WMS services. Our latest generation of web applications consume these data services directly. The most complete example of this is the Operation IceBridge Data Portal (http://nsidc.org/icebridge/portal), which depends on many of the aforementioned services, retrieving data in several ways. The maps it displays are obtained through the use of the WMS and WFS protocols from a MapServer instance hosted at NSIDC. Links to the scientific data collected on Operation IceBridge campaigns are obtained through ESIP OpenSearch requests to service providers that encapsulate our metadata databases. These standards-based web services are also developed at NSIDC and are designed to be used independently of the Portal. 
This poster provides a visual representation of the relationships described above, with additional details and examples, and more generally outlines the benefits and challenges of this SOA approach.
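To give a concrete flavor of the WMS usage described above, here is a minimal Python sketch composing a GetMap request URL of the kind a portal issues against a MapServer instance. The endpoint and layer name are illustrative assumptions, not NSIDC's actual service.

```python
# Sketch: composing a WMS 1.1.1 GetMap request URL. The endpoint and layer
# name are illustrative assumptions; the parameter set follows the OGC WMS
# specification (SERVICE, VERSION, REQUEST, LAYERS, SRS, BBOX, ...).
from urllib.parse import urlencode

def wms_getmap_url(endpoint, layer, bbox, width, height):
    """Build a WMS GetMap URL for one layer over a lon/lat bounding box."""
    params = {
        "SERVICE": "WMS", "VERSION": "1.1.1", "REQUEST": "GetMap",
        "LAYERS": layer, "SRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width, "HEIGHT": height, "FORMAT": "image/png",
    }
    return endpoint + "?" + urlencode(params)

url = wms_getmap_url("https://nsidc.org/cgi-bin/example_wms",  # hypothetical
                     "sea_ice_extent", (-180, -90, 180, 90), 800, 400)
```

Because the interface is standardized, the same client code works against any conforming WMS server, which is the interoperability payoff the poster describes.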
An Open Access future? Report from the eurocancercoms project
Kenney, R; Warden, R
2011-01-01
In March 2011, as part of the background research to the FP7 Eurocancercoms project, the European Association for Cancer Research (EACR) conducted an online survey of its members working in Europe to discover their experiences of and attitudes to the issues surrounding academic publishing and Open Access. This paper presents the results from this survey and compares them to the results from a much larger survey on the same topic from the Study of Open Access Publishing (SOAP). The responses from both surveys show very positive attitudes to the Open Access publishing route; perhaps the most challenging statistic from the EACR survey is that 88% of respondents believe that publicly funded research should be made available to be read and used without access barriers. As a conclusion and invitation to further discussion, this paper also contributes to the debate around subscription and Open Access publishing, supporting the case for accelerating the progress towards Open Access publishing of cancer research articles as a particularly supportive way of assisting all researchers to make unhindered progress with their work. PMID:22276063
ERIC Educational Resources Information Center
Mathuews, Katy; Pulcini, Brad
2017-01-01
For the purposes of this article, open access universities are defined as bachelor's degree-granting institutions that do not restrict admission on the basis of ACT/SAT scores, high school grade point average, and the like. Typically, the mission of an open access university is to provide all students with the opportunity to pursue a degree. The…
NASA Technical Reports Server (NTRS)
Randolph, Lynwood P.
1994-01-01
The Open Systems Interconnection Transmission Control Protocol/Internet Protocol (OSI TCP/IP) and the Government Open Systems Interconnection Profile (GOSIP) are compared and described in terms of Federal internetworking. The organization and functions of the Federal Internetworking Requirements Panel (FIRP) are discussed and the panel's conclusions and recommendations with respect to the standards and implementation of the National Information Infrastructure (NII) are presented.
Code of Federal Regulations, 2011 CFR
2011-07-01
... resolve an allegation that open and nondiscriminatory access was denied? 291.102 Section 291.102 Mineral... OPEN AND NONDISCRIMINATORY ACCESS TO OIL AND GAS PIPELINES UNDER THE OUTER CONTINENTAL SHELF LANDS ACT... allegation concerning open and nondiscriminatory access by calling the toll-free MMS Hotline at 1-888-232...
Code of Federal Regulations, 2014 CFR
2014-07-01
... 30 Mineral Resources 2 2014-07-01 2014-07-01 false May I call the BSEE Hotline to informally... the BSEE Hotline to informally resolve an allegation that open and nondiscriminatory access was denied... open and nondiscriminatory access by calling the toll-free BSEE Pipeline Open Access Hotline at 1-888...
Code of Federal Regulations, 2013 CFR
2013-07-01
... 30 Mineral Resources 2 2013-07-01 2013-07-01 false May I call the BSEE Hotline to informally... the BSEE Hotline to informally resolve an allegation that open and nondiscriminatory access was denied... open and nondiscriminatory access by calling the toll-free BSEE Pipeline Open Access Hotline at 1-888...
Code of Federal Regulations, 2012 CFR
2012-07-01
... 30 Mineral Resources 2 2012-07-01 2012-07-01 false May I call the BSEE Hotline to informally... the BSEE Hotline to informally resolve an allegation that open and nondiscriminatory access was denied... open and nondiscriminatory access by calling the toll-free BSEE Pipeline Open Access Hotline at 1-888...
Collaborative Science Using Web Services and the SciFlo Grid Dataflow Engine
NASA Astrophysics Data System (ADS)
Wilson, B. D.; Manipon, G.; Xing, Z.; Yunck, T.
2006-12-01
The General Earth Science Investigation Suite (GENESIS) project is a NASA-sponsored partnership between the Jet Propulsion Laboratory, academia, and NASA data centers to develop a new suite of Web Services tools to facilitate multi-sensor investigations in Earth System Science. The goal of GENESIS is to enable large-scale, multi-instrument atmospheric science using combined datasets from the AIRS, MODIS, MISR, and GPS sensors. Investigations include cross-comparison of spaceborne climate sensors, cloud spectral analysis, study of upper troposphere-stratosphere water transport, study of the aerosol indirect cloud effect, and global climate model validation. The challenges are to bring together very large datasets, reformat and understand the individual instrument retrievals, co-register or re-grid the retrieved physical parameters, perform computationally-intensive data fusion and data mining operations, and accumulate complex statistics over months to years of data. To meet these challenges, we have developed a Grid computing and dataflow framework, named SciFlo, in which we are deploying a set of versatile and reusable operators for data access, subsetting, registration, mining, fusion, compression, and advanced statistical analysis. SciFlo leverages remote Web Services, called via Simple Object Access Protocol (SOAP) or REST (one-line) URLs, and the Grid Computing standards (WS-* & Globus Alliance toolkits), and enables scientists to do multi-instrument Earth Science by assembling reusable Web Services and native executables into a distributed computing flow (tree of operators). The SciFlo client & server engines optimize the execution of such distributed data flows and allow the user to transparently find and use datasets and operators without worrying about the actual location of the Grid resources. 
In particular, SciFlo exploits the wealth of datasets accessible by OpenGIS Consortium (OGC) Web Mapping Servers & Web Coverage Servers (WMS/WCS), and by Open Data Access Protocol (OpenDAP) servers. The scientist injects a distributed computation into the Grid by simply filling out an HTML form or directly authoring the underlying XML dataflow document, and results are returned directly to the scientist's desktop. Once an analysis has been specified for a chunk or day of data, it can be easily repeated with different control parameters or over months of data. Recently, the Earth Science Information Partners (ESIP) Federation sponsored a collaborative activity in which several ESIP members advertised their respective WMS/WCS and SOAP services, developed some collaborative science scenarios for atmospheric and aerosol science, and then choreographed services from multiple groups into demonstration workflows using the SciFlo engine and a Business Process Execution Language (BPEL) workflow engine. For several scenarios, the same collaborative workflow was executed in three ways: using hand-coded scripts, by executing a SciFlo document, and by executing a BPEL workflow document. We will discuss the lessons learned from this activity, the need for standardized interfaces (like WMS/WCS), the difficulty in agreeing on even simple XML formats and interfaces, and further collaborations that are being pursued.
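The OpenDAP access mentioned above works through constraint-expression URLs, where array subsetting is encoded directly in the query string. A minimal Python sketch follows; the server, dataset, and variable names are invented for illustration.

```python
# Sketch: building an OPeNDAP-style constraint-expression URL to subset a
# remote dataset, as dataflow operators like SciFlo's can do. Server, dataset
# path, and variable name below are illustrative assumptions.
def opendap_ascii_url(server, dataset, var, ranges):
    """Build an OPeNDAP ASCII request subsetting `var` by [start:stop] index ranges."""
    constraint = var + "".join(f"[{a}:{b}]" for a, b in ranges)
    return f"http://{server}/opendap/{dataset}.ascii?{constraint}"

url = opendap_ascii_url("example.nasa.gov", "airs/2004/temp.nc",
                        "Temperature", [(0, 9), (0, 179), (0, 359)])
```

Only the requested hyperslab crosses the network, which is what makes month-scale, multi-instrument analyses over remote archives tractable.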
Extending the GI Brokering Suite to Support New Interoperability Specifications
NASA Astrophysics Data System (ADS)
Boldrini, E.; Papeschi, F.; Santoro, M.; Nativi, S.
2014-12-01
The GI brokering suite provides the discovery, access, and semantic Brokers (i.e. GI-cat, GI-axe, GI-sem) that empower a Brokering framework for multi-disciplinary and multi-organizational interoperability. GI suite has been successfully deployed in the framework of several programmes and initiatives, such as European Union funded projects, NSF BCube, and the intergovernmental coordinated effort Global Earth Observation System of Systems (GEOSS). Each GI suite Broker facilitates interoperability for a particular functionality (i.e. discovery, access, semantic extension) among a set of brokered resources published by autonomous providers (e.g. data repositories, web services, semantic assets) and a set of heterogeneous consumers (e.g. client applications, portals, apps). A wide set of data models, encoding formats, and service protocols are already supported by the GI suite, such as the ones defined by international standardizing organizations like OGC and ISO (e.g. WxS, CSW, SWE, GML, netCDF) and by Community specifications (e.g. THREDDS, OpenSearch, OPeNDAP, ESRI APIs). Using GI suite, resources published by a particular Community or organization through their specific technology (e.g. OPeNDAP/netCDF) can be transparently discovered, accessed, and used by different Communities utilizing their preferred tools (e.g. a GIS visualizing WMS layers). Since Information Technology is a moving target, new standards and technologies continuously emerge and are adopted in the Earth Science context too. Therefore, the GI Brokering suite was conceived to be flexible and accommodate new interoperability protocols and data models. For example, GI suite has recently added support for widely used specifications introduced to implement Linked Data, the Semantic Web, and specific community needs. Amongst others, they include: DCAT: an RDF vocabulary designed to facilitate interoperability between Web data catalogs. 
CKAN: a data management system for data distribution, particularly used by public administrations. CERIF: used by CRIS (Current Research Information System) instances. HYRAX Server: a scientific dataset publishing component. This presentation will discuss these and other recent GI suite extensions implemented to support new interoperability protocols in use by the Earth Science Communities.
Opening our science: Open science and cyanobacterial research at the US EPA
In this blog post we introduce the idea of Open Science and discuss multiple ways we are implementing these concepts in our cyanobacteria research. We give examples of our open access publications, open source code that support our research, and provide open access to our resear...
Corredor, Iván; Metola, Eduardo; Bernardos, Ana M; Tarrío, Paula; Casar, José R
2014-04-29
In the last few years, many health monitoring systems have been designed to fulfill the needs of a large range of scenarios. Although many of those systems provide good ad hoc solutions, most of them lack mechanisms that allow them to be easily reused. This paper focuses on describing an open platform, the micro Web of Things Open Platform (µWoTOP), which has been conceived to improve the connectivity and reusability of context data to deliver different kinds of health, wellness and ambient home care services. µWoTOP is based on a resource-oriented architecture which may be embedded in mobile and resource-constrained devices, enabling access to biometric, ambient or activity sensors and actuator resources through uniform interfaces defined in a RESTful fashion. Additionally, µWoTOP manages two communication modes which allow delivering user context information according to different methods, depending on the requirements of the consumer application. It also generates alert messages based on standards related to health care and risk management, such as the Common Alerting Protocol, in order to make its outputs compatible with existing systems.
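As a hedged sketch of the kind of Common Alerting Protocol message such a platform could emit, the Python code below builds a minimal CAP 1.2 alert. The field values are illustrative assumptions, and a real alert typically carries more <info> content (areas, descriptions, instructions) than shown here.

```python
# Sketch: emitting a minimal Common Alerting Protocol (CAP 1.2) message of
# the kind a health-monitoring platform could generate. Field values are
# illustrative assumptions; real alerts carry more <info> sub-elements.
import xml.etree.ElementTree as ET

CAP_NS = "urn:oasis:names:tc:emergency:cap:1.2"

def build_cap_alert(identifier, sender, sent, event, severity):
    """Build a minimal CAP alert document and return it as an XML string."""
    ET.register_namespace("", CAP_NS)
    alert = ET.Element(f"{{{CAP_NS}}}alert")
    for tag, text in [("identifier", identifier), ("sender", sender),
                      ("sent", sent), ("status", "Actual"),
                      ("msgType", "Alert"), ("scope", "Private")]:
        ET.SubElement(alert, f"{{{CAP_NS}}}{tag}").text = text
    info = ET.SubElement(alert, f"{{{CAP_NS}}}info")
    for tag, text in [("category", "Health"), ("event", event),
                      ("urgency", "Immediate"), ("severity", severity),
                      ("certainty", "Observed")]:
        ET.SubElement(info, f"{{{CAP_NS}}}{tag}").text = text
    return ET.tostring(alert, encoding="unicode")

cap = build_cap_alert("alert-001", "uwotop@example.org",  # hypothetical sender
                      "2014-04-29T10:00:00+00:00",
                      "Abnormal heart rate", "Severe")
```

Emitting a standard format like this is what lets the platform's outputs plug into existing alerting and risk-management systems.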
Secure Web-based Ground System User Interfaces over the Open Internet
NASA Technical Reports Server (NTRS)
Langston, James H.; Murray, Henry L.; Hunt, Gary R.
1998-01-01
A prototype has been developed which makes use of commercially available products in conjunction with the Java programming language to provide a secure user interface for command and control over the open Internet. This paper reports successful demonstration of: (1) Security over the Internet, including encryption and certification; (2) Integration of Java applets with a COTS command and control product; (3) Remote spacecraft commanding using the Internet. The Java-based Spacecraft Web Interface to Telemetry and Command Handling (Jswitch) ground system prototype provides these capabilities. This activity demonstrates the use and integration of current technologies to enable a spacecraft engineer or flight operator to monitor and control a spacecraft from a user interface communicating over the open Internet using standard World Wide Web (WWW) protocols and commercial off-the-shelf (COTS) products. The core command and control functions are provided by the COTS Epoch 2000 product. The standard WWW tools and browsers are used in conjunction with the Java programming technology. Security is provided with current encryption and certification technology. This system prototype is a step in the direction of giving scientists and flight operators Web-based access to instrument, payload, and spacecraft data.
Corredor, Iván; Metola, Eduardo; Bernardos, Ana M.; Tarrío, Paula; Casar, José R.
2014-01-01
In the last few years, many health monitoring systems have been designed to fulfill the needs of a large range of scenarios. Although many of those systems provide good ad hoc solutions, most of them lack mechanisms that allow them to be easily reused. This paper focuses on describing an open platform, the micro Web of Things Open Platform (µWoTOP), which has been conceived to improve the connectivity and reusability of context data to deliver different kinds of health, wellness and ambient home care services. µWoTOP is based on a resource-oriented architecture which may be embedded in mobile and resource-constrained devices, enabling access to biometric, ambient or activity sensors and actuator resources through uniform interfaces defined in a RESTful fashion. Additionally, µWoTOP manages two communication modes which allow delivering user context information according to different methods, depending on the requirements of the consumer application. It also generates alert messages based on standards related to health care and risk management, such as the Common Alerting Protocol, in order to make its outputs compatible with existing systems. PMID:24785542
Wireless Distribution Systems To Support Medical Response to Disasters
Arisoylu, Mustafa; Mishra, Rajesh; Rao, Ramesh; Lenert, Leslie A.
2005-01-01
We discuss the design of multi-hop access networks with multiple gateways that support medical response to disasters. We examine and implement protocols to ensure high-bandwidth, robust, self-healing, and secure wireless multi-hop access networks for extreme conditions. Address management, path setup, gateway discovery, and selection protocols are described. Future directions and plans are also considered. PMID:16779171
Open-access databases as unprecedented resources and drivers of cultural change in fisheries science
DOE Office of Scientific and Technical Information (OSTI.GOV)
McManamay, Ryan A; Utz, Ryan
2014-01-01
Open-access databases with utility in fisheries science have grown exponentially in quantity and scope over the past decade, with profound impacts to our discipline. The management, distillation, and sharing of an exponentially growing stream of open-access data represent several fundamental challenges in fisheries science. Many of the currently available open-access resources may not be universally known among fisheries scientists. We therefore introduce many national- and global-scale open-access databases with applications in fisheries science and provide an example of how they can be harnessed to perform valuable analyses without additional field efforts. We also discuss how the development, maintenance, and utilization of open-access data are likely to pose technical, financial, and educational challenges to fisheries scientists. Such cultural implications, which will coincide with the rapidly increasing availability of free data, should compel the American Fisheries Society to actively address these problems now to help ease the forthcoming cultural transition.
Open Access Publishing in High-Energy Physics: the SCOAP3 Initiative
NASA Astrophysics Data System (ADS)
Mele, S.
2010-10-01
Scholarly communication in High-Energy Physics (HEP) shows traits very similar to Astronomy and Astrophysics: pervasiveness of Open Access to preprints through community-based services; a culture of openness and sharing among its researchers; a compact number of yearly articles published by a relatively small number of journals which are dear to the community. These aspects have led HEP to spearhead an innovative model for the transition of its scholarly publishing to Open Access. The Sponsoring Consortium for Open Access Publishing in Particle Physics (SCOAP3) aims to be a central body to finance peer-review service rather than the purchase of access to information as in the traditional subscription model, with all articles in the discipline eventually available in Open Access. Sustainable funding to SCOAP3 would come from libraries, library consortia and HEP funding agencies, through a re-direction of funds currently spent for subscriptions to HEP journals. This paper presents the cultural and bibliometric factors at the roots of SCOAP3 and the current status of this worldwide initiative.
False gold: Safely navigating open access publishing to avoid predatory publishers and journals.
McCann, Terence V; Polacsek, Meg
2018-04-01
The aim of this study was to review and discuss predatory open access publishing in the context of nursing and midwifery and develop a set of guidelines that serve as a framework to help clinicians, educators and researchers avoid predatory publishers. Open access publishing is increasingly common across all academic disciplines. However, this publishing model is vulnerable to exploitation by predatory publishers, posing a threat to nursing and midwifery scholarship and practice. Guidelines are needed to help researchers recognize predatory journals and publishers and understand the negative consequences of publishing in them. Discussion paper. A literature search of BioMed Central, CINAHL, MEDLINE with Full Text and PubMed for terms related to predatory publishing, published in the period 2007-2017. Lack of awareness of the risks and pressure to publish in international journals may result in nursing and midwifery researchers publishing their work in dubious open access journals. Caution should be taken prior to writing and submitting a paper, to avoid predatory publishers. The advantage of open access publishing is that it provides readers with access to peer-reviewed research as soon as it is published online. However, predatory publishers use deceptive methods to exploit open access publishing for their own profit. Clear guidelines are needed to help researchers safely navigate open access publishing. A deeper understanding of the risks of predatory publishing is needed. Clear guidelines should be followed by nursing and midwifery researchers seeking to publish their work in open access journals. © 2017 John Wiley & Sons Ltd.
A Surgical Procedure for Resecting the Mouse Rib: A Model for Large-Scale Long Bone Repair
Funnell, John W.; Thein, Thu Zan Tun; Mariani, Francesca V.
2015-01-01
This protocol introduces researchers to a new model for large-scale bone repair utilizing the mouse rib. The procedure details the following: preparation of the animal for surgery, opening the thoracic body wall, exposing the desired rib from the surrounding intercostal muscles, excising the desired section of rib without inducing a pneumothorax, and closing the incisions. Compared to the bones of the appendicular skeleton, the ribs are highly accessible. In addition, no internal or external fixator is necessary since the adjacent ribs provide a natural fixation. The surgery uses commercially available supplies, is straightforward to learn, and well-tolerated by the animal. The procedure can be carried out with or without removing the surrounding periosteum, and therefore the contribution of the periosteum to repair can be assessed. Results indicate that if the periosteum is retained, robust repair occurs in 1 - 2 months. We expect that use of this protocol will stimulate research into rib repair and that the findings will facilitate the development of new ways to stimulate bone repair in other locations around the body. PMID:25651082
New Generation Sensor Web Enablement
Bröring, Arne; Echterhoff, Johannes; Jirka, Simon; Simonis, Ingo; Everding, Thomas; Stasch, Christoph; Liang, Steve; Lemmens, Rob
2011-01-01
Many sensor networks have been deployed to monitor Earth’s environment, and more will follow in the future. Environmental sensors have improved continuously by becoming smaller, cheaper, and more intelligent. Due to the large number of sensor manufacturers and differing accompanying protocols, integrating diverse sensors into observation systems is not straightforward. A coherent infrastructure is needed to treat sensors in an interoperable, platform-independent and uniform way. The concept of the Sensor Web reflects such an infrastructure for sharing, finding, and accessing sensors and their data across different applications. It hides the heterogeneous sensor hardware and communication protocols from the applications built on top of it. The Sensor Web Enablement initiative of the Open Geospatial Consortium standardizes web service interfaces and data encodings which can be used as building blocks for a Sensor Web. This article illustrates and analyzes the recent developments of the new generation of the Sensor Web Enablement specification framework. Further, we relate the Sensor Web to other emerging concepts such as the Web of Things and point out challenges and resulting future work topics for research on Sensor Web Enablement. PMID:22163760
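Among the web service interfaces standardized by the Sensor Web Enablement initiative is the Sensor Observation Service (SOS), whose key-value-pair binding is a plain HTTP GET. A minimal sketch of building such a request URL; the endpoint, offering, and observed-property identifiers below are hypothetical (real values come from a service's GetCapabilities response):

```python
from urllib.parse import urlencode

def sos_get_observation_url(endpoint, offering, observed_property,
                            version="2.0.0"):
    """Build an OGC SOS GetObservation request URL (KVP binding).

    The endpoint and identifiers are placeholders; a real client would
    discover them from the service's GetCapabilities document.
    """
    params = {
        "service": "SOS",
        "version": version,
        "request": "GetObservation",
        "offering": offering,
        "observedProperty": observed_property,
    }
    return endpoint + "?" + urlencode(params)

url = sos_get_observation_url(
    "http://example.org/sos", "station_1_observations", "temperature")
```

The same pattern extends to the other SOS operations (DescribeSensor, GetCapabilities) by swapping the `request` parameter.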
NASA Astrophysics Data System (ADS)
Bradford, Rick
2013-01-01
Your December 2012 issue contains both a short news article on the progress of the open-access movement ("UK open access gains ground", p11) and a lengthy feature about the visionary Irish physicist Edward Hutchinson Synge ("Unknown genius", pp26-29). I find the combination of these articles ironic.
Code of Federal Regulations, 2010 CFR
2010-04-01
... ENERGY REGULATIONS UNDER THE FEDERAL POWER ACT OPEN ACCESS SAME-TIME INFORMATION SYSTEMS § 37.2 Purpose. (a) The purpose of this part is to ensure that potential customers of open access transmission... Transmission Provider (or its agent) to create and operate an Open Access Same-time Information System (OASIS...
Code of Federal Regulations, 2011 CFR
2011-04-01
... ENERGY REGULATIONS UNDER THE FEDERAL POWER ACT OPEN ACCESS SAME-TIME INFORMATION SYSTEMS § 37.2 Purpose. (a) The purpose of this part is to ensure that potential customers of open access transmission... Transmission Provider (or its agent) to create and operate an Open Access Same-time Information System (OASIS...
Public Access and Open Access: Is There a Difference? | Poster
By Robin Meckley, Contributing Writer, and Tracie Frederick, Guest Writer Open access and public access—are they different concepts or are they the same? What do they mean for the researchers at NCI at Frederick? “Open-access (OA) literature is digital, online, free of charge, and free of most copyright and licensing restrictions. What makes it possible is the Internet and the
[The Open Access Initiative (OAI) in the scientific literature].
Sánchez-Martín, Francisco M; Millán Rodríguez, Félix; Villavicencio Mavrich, Humberto
2009-01-01
According to the Budapest declaration, the Open Access Initiative (OAI) is defined as an editorial model in which access to scientific journal literature and its use are free. The free flow of information allowed by the Internet has been the basis of this initiative. The Bethesda and Berlin declarations, supported by some international agencies, propose to require researchers to deposit copies of all published articles in a self-archive or an Open Access repository, and encourage researchers to publish their research papers in Open Access journals. This paper reviews the keys to the OAI, with its strengths and controversial aspects, and discusses the position of databases, search engines and repositories of biomedical information, as well as the attitudes of scientists, publishers and journals. So far, the journal Actas Urológicas Españolas (Act Urol Esp) offers its contents in Open Access online in Spanish and English.
Pencina, Michael J; Louzao, Darcy M; McCourt, Brian J; Adams, Monique R; Tayyabkhan, Rehbar H; Ronco, Peter; Peterson, Eric D
2016-02-01
There are growing calls for sponsors to increase transparency by providing access to clinical trial data. In response, Bristol-Myers Squibb and the Duke Clinical Research Institute have collaborated on a new initiative, Supporting Open Access to Researchers. The aim is to facilitate open sharing of Bristol-Myers Squibb trial data with interested researchers. Key features of the Supporting Open Access to Researchers data sharing model include an independent review committee that ensures expert consideration of each proposal, stringent data deidentification/anonymization and protection of patient privacy, requirement of prespecified statistical analysis plans, and independent review of manuscripts before submission for publication. We believe that these approaches will promote open science by allowing investigators to verify trial results as well as to pursue interesting secondary uses of trial data without compromising scientific integrity. Copyright © 2015 Elsevier Inc. All rights reserved.
Intro and Recent Advances: Remote Data Access via OPeNDAP Web Services
NASA Technical Reports Server (NTRS)
Fulker, David
2016-01-01
During the upcoming Summer 2016 meeting of the ESIP Federation (July 19-22), OPeNDAP will hold a Developers and Users Workshop. While a broad set of topics will be covered, a key focus is capitalizing on recent EOSDIS-sponsored advances in Hyrax, OPeNDAP's own software for server-side realization of the DAP2 and DAP4 protocols. These Hyrax advances are as important to data users as to data providers, and the workshop will include hands-on experiences of value to both. Specifically, a balanced set of presentations and hands-on tutorials will address advances in (1) server installation, (2) server configuration, (3) Hyrax aggregation capabilities, (4) support for data access from clients that are HTTP-based, JSON-based or OGC-compliant (especially WCS and WMS), (5) support for DAP4, (6) use and extension of server-side computational capabilities, and (7) several performance-affecting matters. Topics 2 through 7 will be relevant to data consumers, data providers and, notably, due to the open-source nature of all OPeNDAP software, to developers wishing to extend Hyrax, to build compatible clients and servers, and/or to employ Hyrax as middleware that enables interoperability across a variety of end-user and source-data contexts. A session for contributed talks will elaborate the topics listed above and embrace additional ones.
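The DAP2 protocol that Hyrax realizes is URL-based: appending a suffix to a dataset URL selects the response type (.dds for structure, .das for attributes, .dods for binary data), and an optional constraint expression after "?" subsets variables. A sketch of that convention; the server and dataset below are hypothetical:

```python
def dap2_request_url(dataset_url, response="dds", constraint=""):
    """Compose a DAP2 request URL as served by Hyrax or other DAP2 servers.

    response:   'dds' (structure), 'das' (attributes), or 'dods' (data).
    constraint: optional DAP2 constraint expression, e.g. 'sst[0:1:10]'.
    """
    url = f"{dataset_url}.{response}"
    if constraint:
        url += "?" + constraint
    return url

# Hypothetical dataset on a Hyrax server:
base = "http://example.org/opendap/data/sst.nc"
data_url = dap2_request_url(base, "dods", "sst[0:1:10]")
```

A client would issue an ordinary HTTP GET against such URLs, which is why DAP servers interoperate readily with generic HTTP tooling.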
NASA Astrophysics Data System (ADS)
Xing, Fangyuan; Wang, Honghuan; Yin, Hongxi; Li, Ming; Luo, Shenzi; Wu, Chenguang
2016-02-01
With the extensive application of cloud computing and data centres, as well as constantly emerging services, big data with burst characteristics has brought huge challenges to optical networks. Consequently, the software defined optical network (SDON), which combines optical networks with software defined networking (SDN), has attracted much attention. In this paper, an OpenFlow-enabled optical node for use in optical cross-connects (OXC) and reconfigurable optical add/drop multiplexers (ROADM) is proposed. An open-source OpenFlow controller is extended with routing strategies. In addition, an experiment platform based on the OpenFlow protocol for software defined optical networks is designed. The feasibility and availability of the OpenFlow-enabled optical nodes and the extended OpenFlow controller are validated by connectivity, protection switching and load balancing experiments on this test platform.
The Interlibrary Loan Protocol: An OSI Solution to ILL Messaging.
ERIC Educational Resources Information Center
Turner, Fay
1990-01-01
Discusses the interlibrary loan (ILL) protocol, a standard based on the principles of the Open Systems Interconnection (OSI) Reference Model. Benefits derived from protocol use are described, the status of the protocol as an international standard is reviewed, and steps taken by the National Library of Canada to facilitate migration to an ILL…
Empowering Learners with Mobile Open-Access Learning Initiatives
ERIC Educational Resources Information Center
Mills, Michael, Ed.; Wake, Donna, Ed.
2017-01-01
Education has been progressing at a rapid pace ever since educators have been able to harness the power of mobile technology. Open-access learning techniques provide more students with the opportunity to engage in educational opportunities that may have been previously restricted. "Empowering Learners with Mobile Open-Access Learning…
ERIC Educational Resources Information Center
Suber, Peter
2012-01-01
The Internet lets us share perfect copies of our work with a worldwide audience at virtually no cost. We take advantage of this revolutionary opportunity when we make our work "open access": digital, online, free of charge, and free of most copyright and licensing restrictions. Open access is made possible by the Internet and copyright-holder…
Open Access Publishing in Indian Premier Research Institutions
ERIC Educational Resources Information Center
Bhat, Mohammad Hanief
2009-01-01
Introduction: Publishing research findings in open access journals is a means of enhancing visibility and consequently increasing the impact of publications. This study provides an overview of open access publishing in premier research institutes of India. Method: The publication output of each institution from 2003 to 2007 was ascertained through…
Analysis of FERC's Final EIS for Electricity Open Access & Recovery of Stranded Costs
1996-01-01
Reviews the Final Environmental Impact Statement (FEIS) prepared by the Federal Energy Regulatory Commission for its electricity transmission system open access prepared in April 1996 and uses the National Energy Modeling System (NEMS) to analyze the open access rule (Orders 888 and 889).
Prospect for Development of Open Access in Argentina
ERIC Educational Resources Information Center
Miguel, Sandra; Bongiovani, Paola C.; Gomez, Nancy D.; Bueno-de-la-Fuente, Gema
2013-01-01
This perspective article presents an overview of the Open Access movement in Argentina, from a global and regional (Latin American) context. The article describes the evolution and current state of initiatives by examining two principal approaches to Open Access in Argentina: "golden" and "green roads". The article will then…
Open Access and Civic Scientific Information Literacy
ERIC Educational Resources Information Center
Zuccala, Alesia
2010-01-01
Introduction: We examine how residents and citizens of The Netherlands perceive open access to acquire preliminary insight into the role it might play in cultivating civic scientific literacy. Open access refers to scientific or scholarly research literature available on the Web to scholars and the general public in free online journals and…
50 CFR 660.310 - Purpose and scope.
Code of Federal Regulations, 2010 CFR
2010-10-01
..., DEPARTMENT OF COMMERCE (CONTINUED) FISHERIES OFF WEST COAST STATES West Coast Groundfish-Open Access Fisheries § 660.310 Purpose and scope. This subpart covers the Pacific Coast Groundfish open access fishery. The open access fishery, as defined at § 660.11, Subpart C, is the fishery composed of commercial...
Open Access, Education Research, and Discovery
ERIC Educational Resources Information Center
Furlough, Michael
2010-01-01
Background/Context: The open access movement has successfully drawn attention to economic and political aspects of scholarly communication through a significant body of commentary that debates the merits of open access and the potential damage it may do to scholarly publishing. Researchers within the field of education research, notably John…
Open-Access Electronic Textbooks: An Overview
ERIC Educational Resources Information Center
Ovadia, Steven
2011-01-01
Given the challenging economic climate in the United States, many academics are looking to open-access electronic textbooks as a way to provide students with traditional textbook content at a more financially advantageous price. Open access refers to "the free and widely available information throughout the World Wide Web. Once an article's…
Ricci, Natalia Aquaroni; Aratani, Mayra Cristina; Caovilla, Heloísa Helena; Ganança, Fernando Freitas
2016-04-01
The aim of this study was to compare the effects of vestibular rehabilitation protocols on balance control in elderly people with dizziness. This is a randomized clinical trial with a 3-mo follow-up period. The sample was composed of 82 older individuals with chronic dizziness from vestibular disorders. The control group was treated according to the Conventional Cawthorne & Cooksey protocol (n = 40), and the experimental group was submitted to a Multimodal Cawthorne & Cooksey protocol (n = 42). Measures included Dynamic Gait Index, fall history, hand grip strength, Timed Up-and-Go Test, sit-to-stand test, multidirectional reach, and static balance tests. With the exception of history of falls, Forward Functional Reach, Unipedal Right and Left Leg Eyes Closed, and Sensorial Romberg Eyes Open, all outcomes improved after treatment. Such results persisted at the follow-up period, with the exception of the Tandem Eyes Open and the Timed Up-and-Go manual. The between-group differences for Sensorial Romberg Eyes Closed (4.27 secs) and Unipedal Left Leg Eyes Open (4.08 secs) were significant after treatment, favoring the Multimodal protocol. Both protocols resulted in improvements in the balance control of the elderly participants, which were maintained over a short-term period. The Multimodal protocol produced better performance on specific static balance tests.
Bandwidth Management in Resource Constrained Networks
2012-03-01
Postgraduate School; OSI: Open Systems Interconnection; QoS: Quality of Service; TCP/IP: Transmission Control Protocol/Internet Protocol ...filtering. B. NORMAL TCP/IP COMMUNICATIONS: The Internet is a "complex network WAN that connects LANs and clients around the globe" (Dean, 2009) ...of the Open Systems Interconnection (OSI) model, allowing them to route traffic based on MAC address (Kurose & Ross, 2009). While switching
A study of institutional spending on open access publication fees in Germany.
Jahn, Najko; Tullney, Marco
2016-01-01
Publication fees as a revenue source for open access publishing hold a prominent place on the agendas of researchers, policy makers, and academic publishers. This study contributes to the evolving empirical basis for funding these charges and examines how much German universities and research organisations spent on open access publication fees. Using self-reported cost data from the Open APC initiative, the analysis focused on the amount that was being spent on publication fees, and compared these expenditures with data from related Austrian (FWF) and UK (Wellcome Trust, Jisc) initiatives, in terms of both size and the proportion of articles being published in fully and hybrid open access journals. We also investigated how thoroughly self-reported articles were indexed in Crossref, a DOI minting agency for scholarly literature, and analysed how the institutional spending was distributed across publishers and journal titles. According to self-reported data from 30 German universities and research organisations between 2005 and 2015, expenditures on open access publication fees increased over the years in Germany and amounted to € 9,627,537 for 7,417 open access journal articles. The average payment was € 1,298, and the median was € 1,231. A total of 94% of the total article volume included in the study was supported in accordance with the price cap of € 2,000, a limit imposed by the Deutsche Forschungsgemeinschaft (DFG) as part of its funding activities for open access funding at German universities. Expenditures varied considerably at the institutional level. There were also differences in how much the institutions spent per journal and publisher. These differences reflect, at least in part, the varying pricing schemes in place, including discounted publication fees. With an indexing coverage of 99%, Crossref thoroughly indexed the open access journal articles included in the study.
A comparison with the related openly available cost data from Austria and the UK revealed that German universities and research organisations primarily funded articles in fully open access journals. By contrast, articles in hybrid journals accounted for the largest share of spending according to the Austrian and UK data. Fees paid for hybrid journals were on average more expensive than those paid for fully open access journals. PMID:27602289
Optimizing the Use of Storage Systems Provided by Cloud Computing Environments
NASA Astrophysics Data System (ADS)
Gallagher, J. H.; Potter, N.; Byrne, D. A.; Ogata, J.; Relph, J.
2013-12-01
Cloud computing systems present a set of features that include familiar computing resources (albeit augmented to support dynamic scaling of processing power) bundled with a mix of conventional and unconventional storage systems. The Linux base on which many cloud environments (e.g., Amazon) are built makes it tempting to assume that any Unix software will run efficiently in this environment without change. OPeNDAP and NODC collaborated on a short project to explore how the S3 and Glacier storage systems provided by the Amazon cloud computing infrastructure could be used with a data server developed primarily to access data stored in a traditional Unix file system. Our work used the Amazon cloud system, but we strove for designs that could be adapted easily to other systems like OpenStack. Lastly, we evaluated different architectures from a computer security perspective. We found that there are considerable issues associated with treating S3 as if it were a traditional file system, even though doing so is conceptually simple. These issues include performance penalties: using a software tool that emulates a traditional file system to store data in S3 performs poorly compared with storing data directly in S3. We also found there are important benefits beyond performance to ensuring that data written to S3 can be accessed directly without relying on a specific software tool. To provide a hierarchical organization for the data stored in S3, we wrote 'catalog' files, using XML. These catalog files map discrete files to S3 access keys. Like a traditional file system's directories, the catalogs can also contain references to other catalogs, providing a simple but effective hierarchy overlaid on top of S3's flat storage space. An added benefit of these catalogs is that they can be viewed in a web browser; our storage scheme provides both efficient access for the data server and access via a web browser.
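The catalog scheme described above (XML documents mapping file names to S3 keys, with references to child catalogs supplying the hierarchy) can be sketched with the standard library. The element and attribute names below are our assumptions, since the abstract does not give the actual schema:

```python
import xml.etree.ElementTree as ET

def build_catalog(name, files, subcatalogs):
    """Build a hierarchical XML catalog over S3's flat key space.

    files:       dict of {file_name: s3_key}.
    subcatalogs: list of S3 keys of child catalog documents.
    Element names ('catalog', 'dataset', 'catalogRef') are hypothetical.
    """
    root = ET.Element("catalog", name=name)
    for fname, key in files.items():
        ET.SubElement(root, "dataset", name=fname, s3key=key)
    for key in subcatalogs:
        ET.SubElement(root, "catalogRef", s3key=key)
    return ET.tostring(root, encoding="unicode")

xml_doc = build_catalog(
    "sst", {"sst_2013.nc": "data/sst_2013.nc"},
    ["catalogs/sst_monthly.xml"])
```

Because the catalogs are plain XML objects stored alongside the data, both the data server and a web browser can traverse the hierarchy by fetching and following the `catalogRef` keys.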
We also looked at the Glacier storage system and found that the system's response characteristics are very different from a traditional file system or database; it behaves like a near-line storage system. To be used by a traditional data server, the underlying access protocol must support asynchronous accesses. This is because the Glacier system takes a minimum of four hours to deliver any data object, so systems built with the expectation of instant access (i.e., most web systems) must be fundamentally changed to use Glacier. Part of a related project has been to develop an asynchronous access mode for OPeNDAP, and we have developed a design using that new addition to the DAP protocol with Glacier as a near-line mass store. In summary, we found that both S3 and Glacier require special treatment to be effectively used by a data server. It is important to add (new) interfaces to data servers that enable them to use these storage devices through their native interfaces. We also found that our designs could easily map to a cloud environment based on OpenStack. Lastly, we noted that while these designs invite more liberal use of remote references for data objects, such references can expose software to new security risks.
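The asynchronous access pattern that Glacier forces on clients amounts to a three-step initiate/poll/fetch cycle. A minimal sketch against a stand-in archive object; the class and method names are hypothetical, though the real Glacier API has the same shape with multi-hour rather than millisecond latency:

```python
import time

class NearLineStore:
    """Stand-in for a near-line archive such as Glacier: a retrieval is
    a job that completes after a delay, not an immediate read."""
    def __init__(self, delay_s=0.01):
        self.delay_s = delay_s
        self.jobs = {}

    def initiate_retrieval(self, key):
        job_id = f"job-{len(self.jobs)}"
        self.jobs[job_id] = (key, time.monotonic() + self.delay_s)
        return job_id

    def job_ready(self, job_id):
        return time.monotonic() >= self.jobs[job_id][1]

    def fetch(self, job_id):
        assert self.job_ready(job_id), "job not yet complete"
        return f"contents of {self.jobs[job_id][0]}"

def retrieve(store, key, poll_s=0.005):
    """Initiate a retrieval job, poll until it completes, then fetch."""
    job_id = store.initiate_retrieval(key)
    while not store.job_ready(job_id):
        time.sleep(poll_s)   # a real client would poll every few minutes
    return store.fetch(job_id)
```

A synchronous web server cannot hold a request open for hours, which is why the DAP asynchronous access mode mentioned above hands the client a job token instead of blocking as this sketch does.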
Kevin McCluskey; Katharine B. Barker; Hazel A. Barton; Kyria Boundy-Mills; Daniel R. Brown; Jonathan A. Coddington; Kevin Cook; Philippe Desmeth; David Geiser; Jessie A. Glaeser; Stephanie Greene; Seogchan Kang; Michael W. Lomas; Ulrich Melcher; Scott E. Miller; David R. Nobles; Kristina J. Owens; Jerome H. Reichman; Manuela da Silva; John Wertz; Cale Whitworth; David Smith; Steven E. Lindow
2017-01-01
The U.S. Culture Collection Network held a meeting to share information about how culture collections are responding to the requirements of the recently enacted Nagoya Protocol on Access to Genetic Resources and the Fair and Equitable Sharing of Benefits Arising from their Utilization to the Convention on Biological Diversity (CBD). The meeting included representatives...
Development and validation of a remote home safety protocol.
Romero, Sergio; Lee, Mi Jung; Simic, Ivana; Levy, Charles; Sanford, Jon
2018-02-01
Environmental assessments and subsequent modifications conducted by healthcare professionals can enhance home safety and promote independent living. However, travel time, expense and the availability of qualified professionals can limit the broad application of this intervention. Remote technology has the potential to increase access to home safety evaluations. This study describes the development and validation of a remote home safety protocol that can be used by a caregiver of an elderly person to video-record their home environment for later viewing and evaluation by a trained professional. The protocol was developed based on literature reviews and evaluations from clinical and content experts. Cognitive interviews were conducted with a group of six caregivers to validate the protocol. The final protocol included step-by-step directions to record indoor and outdoor areas of the home. The validation process resulted in modifications related to safety, clarity of the protocol, readability, visual appearance, technical descriptions and usability. Our final protocol includes detailed instructions that a caregiver should be able to follow to record a home environment for subsequent evaluation by a home safety professional. Implications for Rehabilitation: The results of this study have several implications for rehabilitation practice. The remote home safety evaluation protocol can potentially improve access to rehabilitation services for clients in remote areas and prevent unnecessary delays for needed care. Using our protocol, a patient's caregiver can partner with therapists to quickly and efficiently evaluate a patient's home before they are released from the hospital. Caregiver narration, which reflects a caregiver's own perspective, is critical to evaluating home safety. In-home safety evaluations, currently not available to all who need them due to access barriers, can enhance a patient's independence and provide a safer home environment.
Leong, T Y; Kaiser, K; Miksch, S
2007-01-01
Guideline-based clinical decision support is an emerging paradigm to help reduce error, lower cost, and improve quality in evidence-based medicine. The free and open source (FOS) approach is a promising alternative for delivering cost-effective information technology (IT) solutions in health care. In this paper, we survey the current FOS enabling technologies for patient-centric, guideline-based care, and discuss the current trends and future directions of their role in clinical decision support. We searched PubMed, major biomedical informatics websites, and the web in general for papers and links related to FOS health care IT systems. We also relied on our background and knowledge for specific subtopics. We focused on the functionalities of guideline modeling tools, and briefly examined the supporting technologies for terminology, data exchange and electronic health record (EHR) standards. To effectively support patient-centric, guideline-based care, the computerized guidelines and protocols need to be integrated with existing clinical information systems or EHRs. Technologies that enable such integration should be accessible, interoperable, and scalable. A plethora of FOS tools and techniques for supporting different knowledge management and quality assurance tasks involved are available. Many challenges, however, remain in their implementation. There are active and growing trends of deploying FOS enabling technologies for integrating clinical guidelines, protocols, and pathways into the main care processes. The continuing development and maturation of such technologies are likely to make increasingly significant contributions to patient-centric, guideline-based clinical decision support.
Dynamic federation of grid and cloud storage
NASA Astrophysics Data System (ADS)
Furano, Fabrizio; Keeble, Oliver; Field, Laurence
2016-09-01
The Dynamic Federations project ("Dynafed") enables the deployment of scalable, distributed storage systems composed of independent storage endpoints. While the Uniform Generic Redirector at the heart of the project is protocol-agnostic, we have focused our effort on HTTP-based protocols, including S3 and WebDAV. The system has been deployed on testbeds covering the majority of the ATLAS and LHCb data, and supports geography-aware replica selection. The work done exploits the federation potential of HTTP to build systems that offer uniform, scalable, catalogue-less access to the storage and metadata ensemble and the possibility of seamless integration of other compatible resources such as those from cloud providers. Dynafed can exploit the potential of the S3 delegation scheme, effectively federating on the fly any number of S3 buckets from different providers and applying a uniform authorization to them. This feature has been used to deploy in production the BOINC Data Bridge, which uses the Uniform Generic Redirector with S3 buckets to harmonize the BOINC authorization scheme with the Grid/X509. The Data Bridge has been deployed in production with good results. We believe that the features of a loosely coupled federation of open-protocol-based storage elements open many possibilities of smoothly evolving the current computing models and of supporting new scientific computing projects that rely on massive distribution of data and that would appreciate systems that can more easily be interfaced with commercial providers and can work natively with web browsers and clients.
Ó Súilleabháin, Páraic S; Howard, Siobhán; Hughes, Brian M
2018-05-01
Underlying psychophysiological mechanisms of effect linking openness to experience to health outcomes, and particularly cardiovascular well-being, are unknown. This study examined the role of openness in the context of cardiovascular responsivity to acute psychological stress. Continuous cardiovascular response data were collected for 74 healthy young female adults across an experimental protocol, including differing counterbalanced acute stressors. Openness was measured via self-report questionnaire. Analysis of covariance revealed openness was associated with systolic blood pressure (SBP; p = .016), and diastolic blood pressure (DBP; p = .036) responsivity across the protocol. Openness was also associated with heart rate (HR) responding to the initial stress exposure (p = .044). Examination of cardiovascular adaptation revealed that higher openness was associated with significant SBP (p = .001), DBP (p = .009), and HR (p = .002) habituation in response to the second differing acute stress exposure. Taken together, the findings suggest persons higher in openness are characterized by an adaptive cardiovascular stress response profile within the context of changing acute stress exposures. This study is also the first to demonstrate individual differences in cardiovascular adaptation across a protocol consisting of differing stress exposures. More broadly, this research also suggests that future research may benefit from conceptualizing an adaptive fitness of openness within the context of change. In summary, the present study provides evidence that higher openness stimulates short-term stress responsivity, while ensuring cardiovascular habituation to change in stress across time. © 2017 Society for Psychophysiological Research.
Cyr, Mireille; Lamb, Michael E
2009-05-01
The study was designed to assess the effectiveness of the flexibly structured NICHD Investigative Interview Protocol for child sexual abuse (CSA) investigative interviews by police officers and mental health workers in Quebec. The NICHD Protocol was designed to operationalize "best practice" guidelines and to help forensic interviewers use open-ended prompts to facilitate free recall by alleged victims. A total of 83 interviews with 3- to 13-year-old alleged victims were matched with 83 interviews conducted by the same interviewers before they were trained to use the Protocol. Interviews were matched with respect to the children's ages, child-perpetrator relationships, and the types and frequency of abuse. Coders categorized each of the prompts used to elicit information about the abuse and tabulated the numbers of new forensically relevant details provided in each response. Interviewers used three times as many open-ended prompts in Protocol interviews as in non-Protocol interviews, whereas use of all other types of questions was halved, and the total number of questions asked decreased by 25%. Protocol-guided interviews yielded more details than comparison interviews. The mean number of details per prompt increased from 3 to 5 details when the Protocol was used. Even with young children, interviewers using the Protocol employed more invitations to elicit forensically relevant details. French-speaking investigators using the NICHD Protocol used open-ended prompts rather than focused questions when interviewing alleged victims. In addition, these interviewers needed fewer questions to get relevant information when using the Protocol. A French version of the NICHD Protocol is now available to police officers and social workers who investigate the alleged sexual abuse of young children in French-speaking countries. This French version allowed trained interviewers to increase the use of invitations and reduce the use of more focused and risky questions.
When the number of questions was controlled, more central details and more details in total were obtained in Protocol interviews, because the average prompt elicited more detailed answers in Protocol interviews. However, learning to use the NICHD Protocol required extended training and continued feedback sessions to maintain the high quality of interviewing.
AccessMRS: integrating OpenMRS with smart forms on Android.
Fazen, Louis E; Chemwolo, Benjamin T; Songok, Julia J; Ruhl, Laura J; Kipkoech, Carolyne; Green, James M; Ikemeri, Justus E; Christoffersen-Deb, Astrid
2013-01-01
We present a new open-source Android application, AccessMRS, for interfacing with an electronic medical record system (OpenMRS) and loading 'Smart Forms' on a mobile device. AccessMRS functions as a patient-centered interface for viewing OpenMRS data; managing patient information in reminders, task lists, and previous encounters; and launching patient-specific 'Smart Forms' for electronic data collection and dissemination of health information. We present AccessMRS in the context of related software applications we developed to serve Community Health Workers, including AccessInfo, AccessAdmin, AccessMaps, and AccessForms. The specific features and design of AccessMRS are detailed in relationship to the requirements that drove development: the workflows of the Kenyan Ministry of Health Community Health Volunteers (CHVs) supported by the AMPATH Primary Health Care Program. Specifically, AccessMRS was designed to improve the quality of community-based Maternal and Child Health services delivered by CHVs in Kosirai Division. AccessMRS is currently in use by more than 80 CHVs in Kenya and undergoing formal assessment of acceptability, effectiveness, and cost.
Analyzing the effect of routing protocols on media access control protocols in radio networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barrett, C. L.; Drozda, M.; Marathe, A.
2002-01-01
We study the effect of routing protocols on the performance of media access control (MAC) protocols in wireless radio networks. Three well known MAC protocols are considered: 802.11, CSMA, and MACA. Similarly, three recently proposed routing protocols are considered: AODV, DSR and LAR scheme 1. The experimental analysis was carried out using GloMoSim, a tool for simulating wireless networks. The main focus of our experiments was to study how the routing protocols affect the performance of the MAC protocols when the underlying network and traffic parameters are varied. The performance of the protocols was measured w.r.t. five important parameters: (i) number of received packets, (ii) average latency of each packet, (iii) throughput, (iv) long term fairness and (v) number of control packets at the MAC layer level. Our results show that combinations of routing and MAC protocols yield varying performance under varying network topology and traffic situations. The result has an important implication: no combination of routing protocol and MAC protocol is the best over all situations. Also, the performance analysis of protocols at a given level in the protocol stack needs to be studied not locally in isolation but as a part of the complete protocol stack. A novel aspect of our work is the use of the statistical technique ANOVA (analysis of variance) to characterize the effect of routing protocols on MAC protocols. This technique is of independent interest and can be utilized in several other simulation and empirical studies.
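The ANOVA technique mentioned can be illustrated with a minimal one-way F-statistic computed in pure Python. The latency figures and the single-factor design below are invented for illustration only; the study itself analyzed several factors (routing protocol, MAC protocol, topology, traffic) over GloMoSim output rather than this toy data.

```python
def one_way_anova_f(groups):
    """One-way ANOVA F statistic for a list of sample groups."""
    all_vals = [x for g in groups for x in g]
    n, k = len(all_vals), len(groups)
    grand_mean = sum(all_vals) / n
    # Between-group sum of squares: how far group means sit from the grand mean.
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares: spread of samples around their own group mean.
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Illustrative per-packet latencies (ms) under three routing protocols.
latency = {
    "AODV": [12.1, 13.4, 11.8, 12.9],
    "DSR":  [14.2, 15.1, 13.9, 14.6],
    "LAR1": [12.5, 12.9, 13.1, 12.2],
}
f_stat = one_way_anova_f(list(latency.values()))
```

A large F (relative to the F distribution with k-1 and n-k degrees of freedom) indicates that the choice of routing protocol explains a significant share of the latency variance.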
Sandia National Laboratories: Livermore Valley Open Campus (LVOC)
The Livermore Valley Open Campus (LVOC) expands opportunities for open engagement of the broader scientific community, building on the success of Sandia's Combustion Research Facility, which pioneered open collaboration over 30 years ago, and expanding access to DOE-funded capabilities.
Open access, open education resources and open data in Uganda.
Salvo, Ivana Di; Mwoka, Meggie; Kwaga, Teddy; Rukundo, Priscilla Aceng; Ernest, Dennis Ssesanga; Osaheni, Louis Aikoriogie; John, Kasibante; Shafik, Kasirye; de Sousa, Agostinho Moreira
2015-01-01
As a follow-up to OpenCon 2014, International Federation of Medical Students' Associations (IFMSA) students organized a 3-day workshop, "Open Access, Open Education Resources and Open Data", in Kampala from 15-18 December 2014. One of the aims of the workshop was to engage the Open Access movement in Uganda, which encompasses the scientific community, librarians, academia, researchers and students. The IFMSA students held the workshop with the support of: Consortium for Uganda University Libraries (CUUL), The Right to Research Coalition, Electronic Information for Libraries (EIFL), Makerere University, International Health Sciences University (IHSU), Pan African Medical Journal (PAMJ) and the Centre for Health Human Rights and Development (CEHURD). All these organizations are based or have offices in Kampala. The event culminated in a meeting with the Science and Technology Committee of the Parliament of Uganda in order to receive the support of the Ugandan Members of Parliament and to make a concrete change for Open Access in the country.
ERIC Educational Resources Information Center
Elliott, Colin; Fabbro, Elaine
2015-01-01
To address challenges that learners, course creators, librarians and academics involved with OER and MOOCs are facing when looking for scholarly materials, Athabasca University Library has initiated the development of "the Open Library at AU." This open library is a full library website that provides easy access to open and free…
Ogunbodede, E O; Rudolph, M J
2002-12-01
Human immunodeficiency virus (HIV) infection constitutes an unparalleled public health challenge. The unique nature of most oral health procedures, instrumentation and patient-care settings requires specific strategies and protocols aimed at preventing the transmission of HIV/AIDS between oral health care providers and patients, as well as between patients themselves. The present study investigated the level of information and training about protocols and policies for preventing the transmission of HIV/AIDS in oral health care settings in South Africa. The data collection techniques utilised available information, in-depth interviews and an open-ended questionnaire. The respondents were 20 purposively selected key informants who were senior officers for HIV/AIDS programmes and/or oral health organisations. Sixteen (80%) of the respondents reported that there were no existing oral health policies on HIV/AIDS in their health care institutions or organisations. None of the interviewees knew of any specific protocols on HIV/AIDS in the oral health care setting that emanated from South Africa. In addition, none of the dental professional associations had established an infection control committee or a support system for members who might become infected with HIV and develop AIDS. Territorial boundaries existed between sectors within the medical disciplines, as well as between the medical and oral health disciplines. Numerous general impediments were identified, such as prejudice, denial and fear, inadequate training and/or information about the infection, lack of representation and resources for policy planning, a lack of interest from the business sector, and approaching HIV/AIDS in the workplace as a 'one-time issue'. Other obstacles identified included unemployment, poverty, illiteracy, disempowerment of women and inadequate communication of policies to service providers.
Additional issues raised included the migrant labour system, complexities of language and culture, the large unstructured sex industry, the high prevalence of sexually transmitted infections and a lack of funding. All of these have an impact on oral health. Future policy directions identified included 'increasing access to HIV information and postexposure prophylaxis', a 'shift towards care and support for those living with HIV/AIDS with emphasis on community and home-based care', and 'improving intersectoral co-ordination and collaboration'. The study demonstrated gaps in the availability of, and access to, policies and protocols on HIV/AIDS by managers and health workers. Specific strategic recommendations are made for oral health.
The Oxford Probe: an open access five-hole probe for aerodynamic measurements
NASA Astrophysics Data System (ADS)
Hall, B. F.; Povey, T.
2017-03-01
The Oxford Probe is an open access five-hole probe designed for experimental aerodynamic measurements. The open access probe can be manufactured by the end user via additive manufacturing (metal or plastic). The probe geometry, drawings, calibration maps, and software are available under a creative commons license. The purpose is to widen access to aerodynamic measurement techniques in education and research environments. There are many situations in which the open access probe will allow results of comparable accuracy to a well-calibrated commercial probe. We discuss the applications and limitations of the probe, and compare the calibration maps for 16 probes manufactured in different materials and at different scales, but with the same geometrical design.
Strategies for Success: Open Access Policies at North American Educational Institutions
ERIC Educational Resources Information Center
Fruin, Christine; Sutton, Shan
2016-01-01
Recognizing the paucity of quantitative and qualitative data from North American educational institutions that have pursued open access policies, the authors devised a survey to collect information on the characteristics of these institutions, as well as the elements of the open access policies, the methods of promoting these policies, faculty…
Students' Experiences with Community in an Open Access Course
ERIC Educational Resources Information Center
Blackmon, Stephanie J.; Cullen, Theresa A.
2016-01-01
Online open access courses have become regular offerings of many universities. Building community and connectedness is an important part of branding and success of such offerings. Our goal was to investigate students' experiences with community in an open access course. Therefore, in this study, we explored the sense of community of 342…
EUA's Open Access Checklist for Universities: A Practical Guide on Implementation
ERIC Educational Resources Information Center
Morais, Rita; Lourenço, Joana; Smith, John H.; Borrell-Damian, Lidia
2015-01-01
Open Access (OA) to research publications has received increased attention from the academic community, scientific publishers, research funding agencies and governments. This movement has been growing exponentially in recent years, both in terms of the increasing number of Open Access journals and the proliferation of policies on this topic. The…
Perspective on Open-Access Publishing: An Interview with Peter Suber
ERIC Educational Resources Information Center
Cornwell, Reid; Suber, Peter
2008-01-01
In this edition of Perspectives, Reid Cornwell discusses open-access publishing with Peter Suber, senior researcher at the Scholarly Publishing and Academic Resources Coalition, senior research professor of philosophy at Earlham College, and currently visiting fellow at Yale Law School. Open access means that scholarly work is freely and openly…
Frame of Reference: Open Access Starts with You
ERIC Educational Resources Information Center
Goetsch, Lori A.
2010-01-01
Federal legislation now requires the deposit of some taxpayer-funded research in "open-access" repositories--that is, sites where scholarship and research are made freely available over the Internet. The institutions whose faculty produce the research have begun to see the benefit of open-access publication as well. From the perspective of faculty…
Barker, Katharine B.; Barton, Hazel A.; Boundy-Mills, Kyria; Brown, Daniel R.; Coddington, Jonathan A.; Cook, Kevin; Desmeth, Philippe; Geiser, David; Glaeser, Jessie A.; Greene, Stephanie; Kang, Seogchan; Lomas, Michael W.; Melcher, Ulrich; Miller, Scott E.; Nobles, David R.; Owens, Kristina J.; Reichman, Jerome H.; da Silva, Manuela; Wertz, John; Whitworth, Cale; Smith, David
2017-01-01
The U.S. Culture Collection Network held a meeting to share information about how culture collections are responding to the requirements of the recently enacted Nagoya Protocol on Access to Genetic Resources and the Fair and Equitable Sharing of Benefits Arising from their Utilization to the Convention on Biological Diversity (CBD). The meeting included representatives of many culture collections and other biological collections, the U.S. Department of State, U.S. Department of Agriculture, Secretariat of the CBD, interested scientific societies, and collection groups, including Scientific Collections International and the Global Genome Biodiversity Network. The participants learned about the policies of the United States and other countries regarding access to genetic resources, the definition of genetic resources, and the status of historical materials and genetic sequence information. Key topics included what constitutes access and how the CBD Access and Benefit-Sharing Clearing-House can help guide researchers through the process of obtaining Prior Informed Consent on Mutually Agreed Terms. U.S. scientists and their international collaborators are required to follow the regulations of other countries when working with microbes originally isolated outside the United States, and the local regulations required by the Nagoya Protocol vary by the country of origin of the genetic resource. Managers of diverse living collections in the United States described their holdings and their efforts to provide access to genetic resources. This meeting laid the foundation for cooperation in establishing a set of standard operating procedures for U.S. and international culture collections in response to the Nagoya Protocol. PMID:28811341
McCluskey, Kevin; Barker, Katharine B; Barton, Hazel A; Boundy-Mills, Kyria; Brown, Daniel R; Coddington, Jonathan A; Cook, Kevin; Desmeth, Philippe; Geiser, David; Glaeser, Jessie A; Greene, Stephanie; Kang, Seogchan; Lomas, Michael W; Melcher, Ulrich; Miller, Scott E; Nobles, David R; Owens, Kristina J; Reichman, Jerome H; da Silva, Manuela; Wertz, John; Whitworth, Cale; Smith, David
2017-08-15
NASA Astrophysics Data System (ADS)
Zheng, Jun; Ansari, Nirwan
2005-02-01
Call for Papers: Optical Access Networks With the wide deployment of fiber-optic technology over the past two decades, we have witnessed a tremendous growth of bandwidth capacity in the backbone networks of today's telecommunications infrastructure. However, access networks, which cover the "last-mile" areas and serve numerous residential and small business users, have not been scaled up commensurately. The local subscriber lines for telephone and cable television are still using twisted pairs and coaxial cables. Most residential connections to the Internet are still through dial-up modems operating at a low speed on twisted pairs. As the demand for access bandwidth increases with emerging high-bandwidth applications, such as distance learning, high-definition television (HDTV), and video on demand (VoD), the last-mile access networks have become a bandwidth bottleneck in today's telecommunications infrastructure. To ease this bottleneck, it is imperative to provide sufficient bandwidth capacity in the access networks to open the bottleneck and thus present more opportunities for the provisioning of multiservices. Optical access solutions promise huge bandwidth to service providers and low-cost high-bandwidth services to end users and are therefore widely considered the technology of choice for next-generation access networks. To realize the vision of optical access networks, however, many key issues still need to be addressed, such as network architectures, signaling protocols, and implementation standards. The major challenges lie in the fact that an optical solution must be not only robust, scalable, and flexible, but also implemented at a low cost comparable to that of existing access solutions in order to increase the economic viability of many potential high-bandwidth applications. In recent years, optical access networks have been receiving tremendous attention from both academia and industry. 
A large number of research activities have been carried out or are now underway in this hot area. The purpose of this feature issue is to expose the networking community to the latest research breakthroughs and progress in the area of optical access networks.
A Mobile Satellite Experiment (MSAT-X) network definition
NASA Technical Reports Server (NTRS)
Wang, Charles C.; Yan, Tsun-Yee
1990-01-01
The network architecture development of the Mobile Satellite Experiment (MSAT-X) project for the past few years is described. The results and findings of the network research activities carried out under the MSAT-X project are summarized. A framework is presented upon which the Mobile Satellite Systems (MSSs) operator can design a commercial network. A sample network configuration and its capability are also included under the projected scenario. The Communication Interconnection aspect of the MSAT-X network is discussed. In the MSAT-X network structure two basic protocols are presented: the channel access protocol, and the link connection protocol. The error-control techniques used in the MSAT-X project and the packet structure are also discussed. A description of two testbeds developed for experimentally simulating the channel access protocol and link control protocol, respectively, is presented. A sample network configuration and some future network activities of the MSAT-X project are also presented.
Distributed reservation-based code division multiple access
NASA Astrophysics Data System (ADS)
Wieselthier, J. E.; Ephremides, A.
1984-11-01
The use of spread spectrum signaling, motivated primarily by its antijamming capabilities in military applications, leads naturally to the use of Code Division Multiple Access (CDMA) techniques that permit the successful simultaneous transmission by a number of users over a wideband channel. In this paper we address some of the major issues that are associated with the design of multiple access protocols for spread spectrum networks. We then propose, analyze, and evaluate a distributed reservation-based multiple access protocol that does in fact exploit CDMA properties. Especially significant is the fact that no acknowledgment or feedback information from the destination is required (thus facilitating communication in a radio-silent mode), nor is any form of coordination among the users necessary.
CSMA/RN: A universal protocol for gigabit networks
NASA Technical Reports Server (NTRS)
Foudriat, E. C.; Maly, Kurt J.; Overstreet, C. Michael; Khanna, S.; Paterra, Frank
1990-01-01
Networks must provide intelligent access for nodes to share the communications resources. In the range of 100 Mbps to 1 Gbps, the demand access class of protocols has been studied extensively. Many use some form of slot or reservation system, and many use the concept of attempt and defer to determine the presence or absence of incoming information. The random access class of protocols, like shared channel systems (Ethernet), also uses the concept of attempt and defer, in the form of carrier sensing, to alleviate the damaging effects of collisions. In CSMA/CD, the sensing of interference is on a global basis. All the systems discussed above have one aspect in common: they examine activity on the network, either locally or globally, and react with some form of 'attempt and defer' mechanism. One 'attempt and ...' mechanism is obviously missing: attempt and truncate. Attempt and truncate was studied in a ring configuration called the Carrier Sensed Multiple Access Ring Network (CSMA/RN). The system features of CSMA/RN are described, including a discussion of the node operations for inserting and removing messages and for handling integrated traffic. The performance and operational features, based on analytical and simulation studies, indicate that CSMA/RN is a useful and adaptable protocol over a wide range of network conditions. Finally, the research and development activities necessary to demonstrate and realize the potential of CSMA/RN as a universal, gigabit network protocol are outlined.
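The 'attempt and ...' taxonomy can be made concrete with a toy decision function contrasting classic attempt-and-defer carrier sensing with a CSMA/RN-style attempt-and-truncate discipline. The state names below are illustrative and are not taken from the paper's simulator.

```python
def csma_action(channel_busy, transmitting, incoming_detected, mode):
    """Decide a node's next action under two 'attempt and ...' disciplines.

    mode = "defer":    classic CSMA -- never start while the channel is busy.
    mode = "truncate": CSMA/RN-style -- start transmitting, but cut the
                       transmission short when incoming information is
                       sensed arriving on the ring.
    """
    if not transmitting:
        # Attempt phase: both disciplines sense the carrier before starting.
        return "defer" if channel_busy else "transmit"
    if mode == "truncate" and incoming_detected:
        # CSMA/RN: stop early and retry the remainder of the message later.
        return "truncate"
    return "continue"
```

The key difference the abstract highlights is visible in the second branch: a deferring node never interrupts itself once started, whereas a truncating node yields the ring as soon as it senses incoming traffic.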
Tag Content Access Control with Identity-based Key Exchange
NASA Astrophysics Data System (ADS)
Yan, Liang; Rong, Chunming
2010-09-01
Radio Frequency Identification (RFID) technology, used to identify objects and users, has recently been applied to many applications such as retail and supply chain management. How to prevent tag content from unauthorized readout is a core problem of RFID privacy. The hash-lock access control protocol makes a tag release its content only to a reader that knows the secret key shared between them. However, to obtain the shared secret key required by this protocol, the reader needs to communicate with a back-end database. In this paper, we propose to use an identity-based secret key exchange approach to generate the secret key required for the hash-lock access control protocol. With this approach, not only is a back-end database connection no longer needed, but the tag cloning problem can also be eliminated.
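The basic hash-lock exchange that the paper builds on can be sketched as follows: the tag stores only metaID = h(key), answers queries with that metaID, and unlocks when presented with a key that hashes to it. The class names and the tag content are invented for illustration, and the paper's actual contribution, replacing the back-end database lookup with identity-based key exchange, is not shown here.

```python
import hashlib

def _h(data: bytes) -> bytes:
    # The one-way hash function underlying the lock.
    return hashlib.sha256(data).digest()

class Tag:
    """Hash-lock tag: stores metaID = h(key), never the key itself."""
    def __init__(self, key: bytes, content: str):
        self.meta_id = _h(key)
        self._content = content

    def query(self) -> bytes:
        # Step 1: the tag answers any reader query with its metaID only.
        return self.meta_id

    def unlock(self, key: bytes):
        # Step 3: the tag hashes the offered key and compares with metaID.
        return self._content if _h(key) == self.meta_id else None

key = b"shared-secret"
tag = Tag(key, "pallet #4711")

# Step 2: in the classic protocol the reader looks the key up in a
# back-end database indexed by metaID -- the dependency the paper removes.
db = {tag.query(): key}

print(tag.unlock(db[tag.query()]))  # authorized reader gets the content
print(tag.unlock(b"wrong"))         # unauthorized reader gets nothing
```

Because the tag only ever reveals metaID until the correct key is presented, an eavesdropper who has never seen the key learns nothing about the content.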
Zdrazil, B.; Neefs, J.-M.; Van Vlijmen, H.; Herhaus, C.; Caracoti, A.; Brea, J.; Roibás, B.; Loza, M. I.; Queralt-Rosinach, N.; Furlong, L. I.; Gaulton, A.; Bartek, L.; Senger, S.; Chichester, C.; Engkvist, O.; Evelo, C. T.; Franklin, N. I.; Marren, D.; Ecker, G. F.
2016-01-01
Phenotypic screening is in a renaissance phase and is expected by many academic and industry leaders to accelerate the discovery of new drugs for new biology. Given that phenotypic screening is by definition target agnostic, the emphasis of in silico and in vitro follow-up work is on the exploration of possible molecular mechanisms and efficacy targets underlying the biological processes interrogated by the phenotypic screening experiments. Herein, we present six exemplar computational protocols for the interpretation of cellular phenotypic screens based on the integration of compound, target, pathway, and disease data established by the IMI Open PHACTS project. The protocols annotate phenotypic hit lists and allow follow-up experiments and mechanistic conclusions. The annotations included are from ChEMBL, ChEBI, GO, WikiPathways and DisGeNET. Also provided are protocols which select selective compounds from the IUPHAR/BPS Guide to PHARMACOLOGY interaction file to probe potential targets, and a correlation robot which systematically aims to identify an overlap of active compounds in both the phenotypic as well as any kinase assay. The protocols are applied to a phenotypic pre-lamin A/C splicing assay selected from the ChEMBL database to illustrate the process. The computational protocols make use of the Open PHACTS API and data and are built within the Pipeline Pilot and KNIME workflow tools. PMID:27774140
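The 'correlation robot' idea, finding the compounds active in both the phenotypic assay and a kinase assay, reduces at its core to a set intersection over compound identifiers. The ChEMBL IDs below are arbitrary placeholders; the actual protocols retrieve activities through the Open PHACTS API inside Pipeline Pilot or KNIME.

```python
def active_overlap(phenotypic_hits, kinase_actives):
    """Correlate two assays by the compounds active in both.

    Returns the shared compound IDs (sorted) and the fraction of
    phenotypic hits that the kinase assay explains.
    """
    shared = set(phenotypic_hits) & set(kinase_actives)
    coverage = len(shared) / len(phenotypic_hits) if phenotypic_hits else 0.0
    return sorted(shared), coverage

# Hypothetical hit lists from a phenotypic screen and a kinase assay.
phenotypic_hits = {"CHEMBL25", "CHEMBL112", "CHEMBL521"}
kinase_actives = {"CHEMBL112", "CHEMBL521", "CHEMBL941"}
shared, frac = active_overlap(phenotypic_hits, kinase_actives)
```

A high coverage fraction for one particular kinase assay would flag that kinase as a candidate efficacy target behind the phenotype, to be confirmed experimentally.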
Clinical Investigation Program Annual Progress Report
1993-09-30
Protocol #: 93/200A; Status: Ongoing; Title: Comparison of Healing Rates of Bones Plated Following Fractures, Among Yucatan Swine Having Open and Closed Physes. A second protocol fragment gives the study objective (to compare two enteral formulas with respect to nutritional aspects) and the technical approach (the protocol will take place in the SICU).
Publish or perish, and pay--the new paradigm of open-access journals.
Tzarnas, Stephanie; Tzarnas, Chris D
2015-01-01
The new open-access journal business model is changing the publication landscape and residents and junior faculty should be aware of these changes. A national survey of surgery program directors and residents was performed. Open-access journals have been growing over the past decade, and many traditional printed journals are also sponsoring open-access options (the hybrid model) for accepted articles. Authors need to be aware of the new publishing paradigm and potential costs involved in publishing their work. Copyright © 2014 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
International Convergence on Geoscience Cyberinfrastructure
NASA Astrophysics Data System (ADS)
Allison, M. L.; Atkinson, R.; Arctur, D. K.; Cox, S.; Jackson, I.; Nativi, S.; Wyborn, L. A.
2012-04-01
There is growing international consensus on addressing the challenges to cyber(e)-infrastructure for the geosciences. These challenges include: Creating common standards and protocols; Engaging the vast number of distributed data resources; Establishing practices for recognition of and respect for intellectual property; Developing simple data and resource discovery and access systems; Building mechanisms to encourage development of web service tools and workflows for data analysis; Brokering the diverse disciplinary service buses; Creating sustainable business models for maintenance and evolution of information resources; Integrating the data management life-cycle into the practice of science. Efforts around the world are converging towards de facto creation of an integrated global digital data network for the geosciences based on common standards and protocols for data discovery and access, and a shared vision of distributed, web-based, open source interoperable data access and integration. Commonalities include use of Open Geospatial Consortium (OGC) and ISO specifications and standardized data interchange mechanisms. For multidisciplinarity, mediation, adaptation, and profiling services have been successfully introduced to leverage the geosciences standards which are commonly used by the different geoscience communities -introducing a brokering approach which extends the basic SOA archetype. Principal challenges are less technical than cultural, social, and organizational. Before we can make data interoperable, we must make people interoperable. These challenges are being met by increased coordination of development activities (technical, organizational, social) among leaders and practitioners in national and international efforts across the geosciences to foster commonalities across disparate networks. 
In doing so, we will 1) leverage and share resources and developments, 2) facilitate and enhance emerging technical and structural advances, 3) promote interoperability across scientific domains, 4) support the promulgation and institutionalization of agreed-upon standards, protocols, and practice, 5) enhance knowledge transfer not only across the community but into the domain sciences, 6) lower existing entry barriers for users and data producers, and 7) build on the existing disciplinary infrastructures, leveraging their service buses. All of these objectives are required for establishing a permanent and sustainable cyber(e)-infrastructure for the geosciences. The rationale for this approach is well articulated in the AuScope mission statement: "Many of these problems can only be solved on a national, if not global scale. No single researcher, research institution, discipline or jurisdiction can provide the solutions. We increasingly need to embrace e-Research techniques and use the internet not only to access nationally distributed datasets, instruments and compute infrastructure, but also to build online, 'virtual' communities of globally dispersed researchers." Multidisciplinary interoperability can be successfully pursued by adopting a "system of systems" or "network of networks" philosophy. This approach aims to: (a) supplement but not supplant systems mandates and governance arrangements; (b) keep the existing capacities as autonomous as possible; (c) lower entry barriers; (d) build incrementally on existing infrastructures (information systems); and (e) incorporate heterogeneous resources by introducing distribution and mediation functionalities. This approach has been adopted by the European INSPIRE (Infrastructure for Spatial Information in the European Community) initiative and by the international GEOSS (Global Earth Observation System of Systems) programme.
NASA GES DISC support of CO2 Data from OCO-2, ACOS, and AIRS
NASA Technical Reports Server (NTRS)
Wei, Jennifer C; Vollmer, Bruce E.; Savtchenko, Andrey K.; Hearty, Thomas J; Albayrak, Rustem Arif; Deshong, Barbara E.
2013-01-01
NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) is the data center assigned to archive and distribute current AIRS and ACOS data, as well as data from the upcoming OCO-2 mission. The GES DISC archives and supports data containing information on CO2, as well as other data on atmospheric composition, atmospheric dynamics, modeling, and precipitation. Along with data stewardship, an important mission of GES DISC is to facilitate access to and enhance the usability of data, as well as to broaden the user base. GES DISC strives to promote awareness of the science content and novelty of the data by working with Science Team members and releasing news articles as appropriate. Analyses of events that are of interest to the general public, and that help in understanding the goals of NASA Earth Observing missions, have been among the most popular practices. Users have unrestricted access to a user-friendly search interface, Mirador, that allows temporal, spatial, keyword and event searches, as well as an ontology-driven drill down. Variable subsetting, format conversion, quality screening, and quick browse are among the services available in Mirador. The majority of the GES DISC data are also accessible through OPeNDAP (Open-source Project for a Network Data Access Protocol) and WMS (Web Map Service). These services add more options for specialized subsetting, format conversion, and image viewing, and contribute to data interoperability.
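OPeNDAP lets clients request server-side subsets by appending a constraint expression to the dataset URL. A minimal sketch of how such a URL is assembled; the host and dataset path below are hypothetical placeholders, not real GES DISC endpoints:

```python
# Sketch: building an OPeNDAP constraint-expression URL to subset a remote
# dataset server-side. Host, dataset path, and variable name are hypothetical.

def opendap_subset_url(base_url, variable, time_range, lat_range, lon_range):
    """Request `variable` as ASCII, sliced by inclusive index ranges."""
    constraint = "{var}[{t0}:{t1}][{y0}:{y1}][{x0}:{x1}]".format(
        var=variable,
        t0=time_range[0], t1=time_range[1],
        y0=lat_range[0], y1=lat_range[1],
        x0=lon_range[0], x1=lon_range[1],
    )
    return "{0}.ascii?{1}".format(base_url, constraint)

url = opendap_subset_url(
    "https://example.gov/opendap/AIRS/sample_granule.hdf",  # hypothetical
    "CO2_column", (0, 0), (10, 20), (30, 40))
```

In practice a client library such as netCDF-Java or netCDF4-python would handle this transparently; the sketch only shows the shape of the request.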
NASA Astrophysics Data System (ADS)
Weber, J.; Domenico, B.
2004-12-01
This paper is an example of what we call data interactive publications. With a properly configured workstation, readers can click on "hotspots" in the document that launch an interactive analysis tool called the Unidata Integrated Data Viewer (IDV). The IDV enables readers to access, analyze and display datasets on remote servers, as well as documents describing them. Beyond the parameters and datasets initially configured into the paper, the analysis tool has access to all the other dataset parameters, as well as to a host of other datasets on remote servers. These data interactive publications are built on top of several data delivery, access, discovery, and visualization tools developed by Unidata and its partner organizations. To illustrate this integrative technology, we use data from Hurricane Charley over Florida from August 13-15, 2004, an event that illustrates how the components of this process fit together. The Local Data Manager (LDM), Open-source Project for a Network Data Access Protocol (OPeNDAP) and Abstract Data Distribution Environment (ADDE) services, Thematic Realtime Environmental Distributed Data Service (THREDDS) cataloging services, and the IDV are highlighted in this example of a publication with embedded pointers for accessing and interacting with remote datasets. An important objective of this paper is to illustrate how these integrated technologies foster the creation of documents that allow the reader to learn scientific concepts by direct interaction with illustrative datasets, and help build a framework for integrated Earth System science.
Securing the AliEn File Catalogue - Enforcing authorization with accountable file operations
NASA Astrophysics Data System (ADS)
Schreiner, Steffen; Bagnasco, Stefano; Sankar Banerjee, Subho; Betev, Latchezar; Carminati, Federico; Vladimirovna Datskova, Olga; Furano, Fabrizio; Grigoras, Alina; Grigoras, Costin; Mendez Lorenzo, Patricia; Peters, Andreas Joachim; Saiz, Pablo; Zhu, Jianlin
2011-12-01
The AliEn Grid Services, as operated by the ALICE Collaboration in its global physics analysis grid framework, are based on a central File Catalogue together with a distributed set of storage systems and the possibility of registering links to external data resources. This paper describes several identified vulnerabilities in the AliEn File Catalogue access protocol regarding fraud and unauthorized file alteration and presents a more secure and revised design: a new mechanism, called LFN Booking Table, is introduced in order to keep track of access authorization in the transient state of files entering or leaving the File Catalogue. Due to a simplification of the original Access Envelope mechanism for xrootd-protocol-based storage systems, fundamental computational improvements of the mechanism were achieved as well as an up to 50% reduction of the credential's size. By extending the access protocol with signed status messages from the underlying storage system, the File Catalogue receives trusted information about a file's size and checksum and the protocol is no longer dependent on client trust. Altogether, the revised design complies with atomic and consistent transactions and allows for accountable, authentic, and traceable file operations. This paper describes these changes as part and beyond the development of AliEn version 2.19.
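The idea behind signed status messages can be illustrated with a generic keyed-signature scheme: the storage element attaches a signature to its size/checksum report so the catalogue can verify it without trusting the client. This is only a sketch of the concept, not the actual AliEn/xrootd implementation; the key and message layout are assumptions:

```python
# Illustrative sketch (not the real AliEn protocol): a storage element signs
# its file-size/checksum report with a key shared with the File Catalogue,
# so a client cannot forge or alter the report in transit.
import hashlib
import hmac

SHARED_KEY = b"se-catalogue-demo-key"  # stand-in for a real SE credential

def sign_status(lfn, size, checksum):
    msg = "{0}&size={1}&md5={2}".format(lfn, size, checksum).encode()
    sig = hmac.new(SHARED_KEY, msg, hashlib.sha256).hexdigest()
    return msg.decode(), sig

def verify_status(msg, sig):
    expected = hmac.new(SHARED_KEY, msg.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, sig)

msg, sig = sign_status("/alice/data/run123/file.root", 1048576,
                       "d41d8cd98f00b204e9800998ecf8427e")
```

The real design uses the storage system's own credentials rather than a single shared secret; the sketch only shows why a signed report removes the dependence on client trust.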
OSI Upper Layers Support for Applications.
ERIC Educational Resources Information Center
Davison, Wayne
1990-01-01
Discusses how various Open Systems Interconnection (OSI) application layer protocols can be used together, along with the Presentation and Session protocols, to support the interconnection requirements of applications. Application layer protocol standards that are currently available or under development are reviewed, and the File Transfer,…
The GeoDataPortal: A Standards-based Environmental Modeling Data Access and Manipulation Toolkit
NASA Astrophysics Data System (ADS)
Blodgett, D. L.; Kunicki, T.; Booth, N.; Suftin, I.; Zoerb, R.; Walker, J.
2010-12-01
Environmental modelers from fields of study such as climatology, hydrology, geology, and ecology rely on many data sources and processing methods that are common across these disciplines. Interest in inter-disciplinary, loosely coupled modeling and data sharing is increasing among scientists from the USGS, other agencies, and academia. For example, hydrologic modelers need downscaled climate change scenarios and land cover data summarized for the watersheds they are modeling. Subsequently, ecological modelers are interested in soil moisture information for a particular habitat type as predicted by the hydrologic modeler. The USGS Center for Integrated Data Analytics Geo Data Portal (GDP) project seeks to facilitate this loosely coupled modeling and data sharing through broadly applicable open-source web processing services. These services simplify and streamline the time-consuming and resource-intensive tasks that are barriers to inter-disciplinary collaboration. The GDP framework includes a catalog describing projects, models, data, processes, and how they relate. Using newly introduced data, or sources already known to the catalog, the GDP facilitates access to sub-sets and common derivatives of data in numerous formats on disparate web servers. The GDP performs many of the critical functions needed to summarize data sources into modeling units regardless of scale or volume. A user can specify their analysis zones or modeling units as an Open Geospatial Consortium (OGC) standard Web Feature Service (WFS). Utilities to cache Shapefiles and other common GIS input formats have been developed to aid in making the geometry available for processing via WFS. Dataset access in the GDP relies primarily on the Unidata NetCDF-Java library’s common data model. Data transfer relies on methods provided by Unidata’s Thematic Real-time Environmental Data Distribution System Data Server (TDS). 
TDS services of interest include the Open-source Project for a Network Data Access Protocol (OPeNDAP) standard for gridded time series, the OGC’s Web Coverage Service for high-density static gridded data, and Unidata’s CDM-remote for point time series. OGC WFS and Sensor Observation Service (SOS) are being explored as mechanisms to serve and access static or time series data attributed to vector geometry. A set of standardized XML-based output formats allows easy transformation into a wide variety of “model-ready” formats. Interested users will have the option of submitting custom transformations to the GDP or transforming the XML output as a post-process. The GDP project aims to support simple, rapid development of thin user interfaces (like web portals) to commonly needed environmental modeling-related data access and manipulation tools. Standalone, service-oriented components of the GDP framework provide the metadata cataloging, data subset access, and spatial-statistics calculations needed to support interdisciplinary environmental modeling.
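A WFS GetFeature request of the kind the GDP would issue against a user-supplied feature service is an ordinary keyword-value HTTP query. A sketch with a hypothetical endpoint and type name:

```python
# Sketch of an OGC WFS 1.1.0 GetFeature request in KVP (keyword-value pair)
# encoding. The endpoint URL and the "gdp:watersheds" type name are
# hypothetical placeholders.
from urllib.parse import urlencode

def wfs_getfeature_url(endpoint, type_name, max_features=None):
    params = {
        "service": "WFS",
        "version": "1.1.0",
        "request": "GetFeature",
        "typeName": type_name,
    }
    if max_features is not None:
        params["maxFeatures"] = str(max_features)
    return endpoint + "?" + urlencode(params)

url = wfs_getfeature_url("https://example.usgs.gov/wfs", "gdp:watersheds", 10)
```

The response would be a GML feature collection describing the requested analysis zones, which the GDP can then use to clip and summarize gridded data.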
Public storage for the Open Science Grid
NASA Astrophysics Data System (ADS)
Levshina, T.; Guru, A.
2014-06-01
The Open Science Grid infrastructure doesn't provide efficient means to manage public storage offered by participating sites. A Virtual Organization that relies on opportunistic storage has difficulties finding appropriate storage, verifying its availability, and monitoring its utilization. The involvement of the production manager, site administrators and VO support personnel is required to allocate or rescind storage space. One of the main requirements for Public Storage implementation is that it should use SRM or GridFTP protocols to access the Storage Elements provided by the OSG Sites and not put any additional burden on sites. By policy, no new services related to Public Storage can be installed and run on OSG sites. Opportunistic users also have difficulties in accessing the OSG Storage Elements during the execution of jobs. A typical user's data management workflow includes pre-staging common data on sites before a job's execution, then storing the output data produced by the job on a worker node for subsequent download to the user's local institution. When the amount of data is significant, the only means to temporarily store the data is to upload it to one of the Storage Elements. In order to do that, a user's job should be aware of the storage location, availability, and free space. After a successful data upload, users must somehow keep track of the data's location for future access. In this presentation we propose solutions for storage management and data handling issues in the OSG. We are investigating the feasibility of using the integrated Rule-Oriented Data System developed at RENCI as a front-end service to the OSG SEs. The current architecture, state of deployment and performance test results will be discussed. We will also provide examples of current usage of the system by beta-users.
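The bookkeeping problem described above (remembering where a job's output was uploaded) can be sketched as a minimal mapping from logical file names to storage URLs. Hostnames and paths are illustrative, and real transfers would go through SRM or GridFTP clients:

```python
# Minimal sketch of the bookkeeping an opportunistic OSG user needs: a
# catalogue mapping a logical file name to the SURL (storage URL) on the
# Storage Element it was uploaded to, so a later job can locate the data.
class PublicStorageCatalog:
    def __init__(self):
        self._entries = {}

    def register(self, lfn, se_host, path):
        """Record where a logical file name was stored; return its SURL."""
        surl = "srm://{0}{1}".format(se_host, path)
        self._entries[lfn] = surl
        return surl

    def locate(self, lfn):
        """Return the SURL for a logical file name, or None if unknown."""
        return self._entries.get(lfn)

cat = PublicStorageCatalog()
cat.register("job42/output.dat", "se.example.edu", "/osg/public/job42/output.dat")
```

Systems such as iRODS provide exactly this kind of catalogue (plus rules and replication) as a managed service, which is why it is being evaluated as a front end to the OSG SEs.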
ERIC Educational Resources Information Center
Sambe, Manasseh Tyungu; Raphael, Gabriel Okplogidi
2015-01-01
This study examines the kinds of open access scholarly publication or information resources accepted and adopted by federal university libraries in South East Nigeria. The purpose was to determine the factors that affect open access scholarly publication or information resources acceptance and adoption in university libraries. The study adopted…
The Open Access Availability of Library and Information Science Literature
ERIC Educational Resources Information Center
Way, Doug
2010-01-01
To examine the open access availability of Library and Information Science (LIS) research, a study was conducted using Google Scholar to search for articles from 20 top LIS journals. The study examined whether Google Scholar was able to find any links to full text, if open access versions of the articles were available and where these articles…
Code of Federal Regulations, 2011 CFR
2011-07-01
... decision on whether a grantee or transporter has provided open and nondiscriminatory access? 291.112..., DEPARTMENT OF THE INTERIOR APPEALS OPEN AND NONDISCRIMINATORY ACCESS TO OIL AND GAS PIPELINES UNDER THE OUTER... grantee or transporter has provided open and nondiscriminatory access? MMS will begin processing a...
BMC Medicine editorial board members on open access publishing.
Carmont, Michael R; Lawn, Stephen D; Stray-Pedersen, Babill; Shoenfeld, Yehuda; Meier, Pascal
2014-01-21
In recognition of Open Access week (21st-27th October 2013), we asked some BMC Medicine Editorial Board Members to share their views and experiences on open access publishing. In this short video, they highlight the benefits of visibility and dissemination of their research, and discuss the future directions for this model of publishing.
Accessing Multi-Dimensional Images and Data Cubes in the Virtual Observatory
NASA Astrophysics Data System (ADS)
Tody, Douglas; Plante, R. L.; Berriman, G. B.; Cresitello-Dittmar, M.; Good, J.; Graham, M.; Greene, G.; Hanisch, R. J.; Jenness, T.; Lazio, J.; Norris, P.; Pevunova, O.; Rots, A. H.
2014-01-01
Telescopes across the spectrum are routinely producing multi-dimensional images and datasets, such as Doppler velocity cubes, polarization datasets, and time-resolved “movies.” Examples of current telescopes producing such multi-dimensional images include the JVLA, ALMA, and the IFU instruments on large optical and near-infrared wavelength telescopes. In the near future, both the LSST and JWST will also produce such multi-dimensional images routinely. High-energy instruments such as Chandra produce event datasets that are also a form of multi-dimensional data, in effect being a very sparse multi-dimensional image. Ensuring that the data sets produced by these telescopes can be both discovered and accessed by the community is essential and is part of the mission of the Virtual Observatory (VO). The Virtual Astronomical Observatory (VAO, http://www.usvao.org/), in conjunction with its international partners in the International Virtual Observatory Alliance (IVOA), has developed a protocol and an initial demonstration service designed for the publication, discovery, and access of arbitrarily large multi-dimensional images. The protocol describing multi-dimensional images is the Simple Image Access Protocol, version 2, which provides the minimal set of metadata required to characterize a multi-dimensional image for its discovery and access. A companion Image Data Model formally defines the semantics and structure of multi-dimensional images independently of how they are serialized, while providing capabilities such as support for sparse data that are essential to deal effectively with large cubes. A prototype data access service has been deployed and tested, using a suite of multi-dimensional images from a variety of telescopes. The prototype has demonstrated the capability to discover and remotely access multi-dimensional data via standard VO protocols. 
The prototype informs the specification of a protocol that will be submitted to the IVOA for approval, with an operational data cube service to be delivered in mid-2014. An associated user-installable VO data service framework will provide the capabilities required to publish VO-compatible multi-dimensional images or data cubes.
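A discovery query against such a service can be sketched with SIAv2's POS and BAND parameters (a cone position in degrees and a wavelength interval in metres); the service URL below is hypothetical:

```python
# Sketch of a Simple Image Access v2 discovery query. SIAv2 defines POS
# (e.g. "CIRCLE ra dec radius", degrees) and BAND (wavelength interval in
# metres) among its query parameters; the service endpoint is hypothetical.
from urllib.parse import urlencode

def sia2_query(service_url, ra, dec, radius_deg, band=None):
    params = [("POS", "CIRCLE {0} {1} {2}".format(ra, dec, radius_deg))]
    if band is not None:
        params.append(("BAND", "{0} {1}".format(*band)))
    return service_url + "?" + urlencode(params)

url = sia2_query("https://vo.example.org/sia2/query", 180.0, -30.0, 0.5,
                 band=(2.0e-7, 3.0e-6))
```

The service answers with a VOTable whose rows carry the metadata needed to characterize and retrieve each matching multi-dimensional image.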
An Ultra-low-power Medium Access Control Protocol for Body Sensor Network.
Li, Huaming; Tan, Jindong
2005-01-01
In this paper, a medium access control (MAC) protocol designed for Body Sensor Networks (BSN-MAC) is proposed. BSN-MAC is an adaptive, feedback-based and IEEE 802.15.4-compatible MAC protocol. Due to the traffic coupling and sensor diversity characteristics of BSNs, common MAC protocols cannot satisfy the unique requirements of the biomedical sensors in a BSN. BSN-MAC exploits the feedback information from the deployed sensors to form a closed-loop control of the MAC parameters. A control algorithm is proposed to enable the BSN coordinator to adjust parameters of the IEEE 802.15.4 superframe to achieve both energy efficiency and low latency on energy-critical nodes. We evaluate the performance of BSN-MAC using energy efficiency as the primary metric.
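The closed-loop idea can be sketched as a rule that nudges the IEEE 802.15.4 beacon order (BO) and superframe order (SO) in response to node feedback; the latency threshold and step sizes below are illustrative assumptions, not values from the paper:

```python
# Hedged sketch of feedback-driven superframe tuning in the spirit of
# BSN-MAC: shrink the beacon interval when latency is too high, shrink the
# active period when energy-critical nodes report stress. IEEE 802.15.4
# requires 0 <= SO <= BO <= 14; thresholds here are illustrative.
def adjust_superframe(bo, so, avg_latency_ms, energy_critical):
    LATENCY_LIMIT_MS = 50  # illustrative latency target
    if avg_latency_ms > LATENCY_LIMIT_MS and bo > so:
        bo -= 1            # shorter beacon interval -> lower latency
    elif energy_critical and so > 0:
        so -= 1            # shorter active period -> lower duty cycle
    assert 0 <= so <= bo <= 14
    return bo, so
```

The coordinator would run a rule of this shape each superframe, using the feedback fields carried in the sensors' uplink frames.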
NASA Technical Reports Server (NTRS)
Israel, David J.
2005-01-01
The NASA Space Network (SN) supports a variety of missions using the Tracking and Data Relay Satellite System (TDRSS), which includes ground stations in White Sands, New Mexico and Guam. A Space Network IP Services (SNIS) architecture is being developed to support future users with requirements for end-to-end Internet Protocol (IP) communications. This architecture will support all IP protocols, including Mobile IP, over TDRSS Single Access, Multiple Access, and Demand Access Radio Frequency (RF) links. This paper will describe this architecture and how it can enable Low Earth Orbiting IP satellite missions.
Mumford, Leslie; Lam, Rachel; Wright, Virginia; Chau, Tom
2014-08-01
This study applied response efficiency theory to create the Access Technology Delivery Protocol (ATDP), a child- and family-centred collaborative approach to the implementation of access technologies. We conducted a descriptive, mixed methods case study to demonstrate the ATDP method with a 12-year-old boy with no reliable means of access to an external device. Evaluations of response efficiency, satisfaction, goal attainment, technology use and participation were made after 8 and 16 weeks of training with a custom smile-based access technology. At the 16-week mark, the new access technology offered better response quality; teacher satisfaction was high; average technology usage was 3-4 times per week for up to 1 h each time; switch sensitivity and specificity reached 78% and 64%, respectively; and participation scores increased by 38%. This case supports further development and testing of the ATDP with additional children with multiple or severe disabilities.
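The switch metrics quoted above (sensitivity 78%, specificity 64%) are the standard ratios of correctly detected activations and correctly rejected non-activations; for reference, computed here from illustrative counts:

```python
# Standard definitions: sensitivity = TP / (TP + FN) measures how often an
# intended switch press is detected; specificity = TN / (TN + FP) measures
# how often a non-press is correctly ignored. Counts below are illustrative.
def sensitivity(tp, fn):
    return tp / (tp + fn)

def specificity(tn, fp):
    return tn / (tn + fp)

# e.g. 78 detected activations out of 100 intended presses -> 0.78
sens = round(sensitivity(78, 22), 2)
spec = round(specificity(64, 36), 2)
```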
Application of an access technology delivery protocol to two children with cerebral palsy.
Mumford, Leslie; Chau, Tom
2015-07-14
This study further delineates the merits and limitations of the Access Technology Delivery Protocol (ATDP) through its application to two children with severe disabilities. We conducted mixed methods case studies to demonstrate the ATDP with two children with no reliable means of access to an external device. Evaluations of response efficiency, satisfaction, goal attainment, technology use and participation were made after 8 and 16 weeks of training with custom access technologies. After 16 weeks, one child's switch offered improved response efficiency, high teacher satisfaction and increased participation. The other child's switch resulted in improved satisfaction and switch effectiveness but lower overall efficiency. The latter child was no longer using his switch by the end of the study. These contrasting findings indicate that changes to any contextual factors that may impact the user's switch performance should mandate a reassessment of the access pathway. Secondly, it is important to ensure that individuals who will be responsible for switch training be identified at the outset and engaged throughout the ATDP. Finally, the ATDP should continue to be tested with individuals with severe disabilities to build an evidence base for the delivery of response efficient access solutions. Implications for Rehabilitation: A data-driven, comprehensive access technology delivery protocol for children with complex communication needs could help to mitigate technology abandonment. Successful adoption of an access technology requires personalized design, training of the technology user, the teaching staff, the caregivers and other communication partners, and integration with functional activities.
NASA Astrophysics Data System (ADS)
Brumana, R.; Santana Quintero, M.; Barazzetti, L.; Previtali, M.; Banfi, F.; Oreni, D.; Roels, D.; Roncoroni, F.
2015-08-01
Landscapes are dynamic entities, stretching and transforming across space and time, and need to be safeguarded as living places for the future, with interaction of human, social and economic dimensions. A comprehensive landscape evaluation requires several open datasets, each characterized by its own protocol and service interface, which limits or impedes interoperability and integration. Indeed, the development of websites targeted at landscape assessment and touristic purposes nowadays requires many resources in terms of time, cost and IT skills, so such applications are limited to a few cases, mainly focusing on world-famous touristic sites. The capability to spread the development of web-based multimedia virtual museums built on geospatial data will rely on the possibility of discovering the needed geo-spatial data through a single point of access in a homogeneous way. The innovative approach proposed in this paper may facilitate access to open data in a homogeneous way by means of specific components (the brokers) performing the interoperability actions required to interconnect heterogeneous data sources. In the case study analysed here, an interface has been implemented to migrate a geo-swat chart based on local and regional geographic information into a user-friendly Google Earth©-based infrastructure, integrating ancient cadastres and modern cartography, accessible by professionals and tourists via the web and also via portable devices like tablets and smartphones. The general aim of this work on the case study of the Lake of Como (Tremezzina municipality) is to boost the integration of assessment methodologies with digital geo-based technologies of map correlation for the multimedia ecomuseum system accessible via the web. 
The developed WebGIS system integrates multi-scale and multi-temporal maps with different layers of information (cultural, historical, landscape) represented by thematic icons, allowing the richness of the landscape's value to be conveyed to both tourists and professionals.
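One building block of such a Google Earth-based front end is the generation of KML placemarks for the thematic icons. A minimal sketch, with an illustrative name and coordinates:

```python
# Sketch: generating a minimal KML placemark for one thematic icon of the
# ecomuseum layer, suitable for loading into a Google Earth-based viewer.
# The site name and coordinates are illustrative placeholders.
def kml_placemark(name, lon, lat, description=""):
    return (
        "<Placemark>"
        "<name>{0}</name>"
        "<description>{1}</description>"
        "<Point><coordinates>{2},{3},0</coordinates></Point>"
        "</Placemark>"
    ).format(name, description, lon, lat)

pm = kml_placemark("Villa del Lago", 9.2035, 45.9637, "historic landmark")
```

A full layer would wrap many such placemarks in a KML Document, optionally attaching icon styles per thematic category.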
Radioactive hot cell access hole decontamination machine
Simpson, William E.
1982-01-01
Radioactive hot cell access hole decontamination machine. A mobile housing has an opening large enough to encircle the access hole and has a shielding door, with a door opening and closing mechanism, for uncovering and covering the opening. The housing contains a shaft which has an apparatus for rotating the shaft and a device for independently translating the shaft from the housing through the opening and access hole into the hot cell chamber. A properly sized cylindrical pig containing wire brushes and cloth or other disks, with an arrangement for releasably attaching it to the end of the shaft, circumferentially cleans the access hole wall of radioactive contamination and thereafter detaches from the shaft to fall into the hot cell chamber.
Honest broker protocol streamlines research access to data while safeguarding patient privacy.
Silvey, Scott A; Silvey, Scott Andrew; Schulte, Janet; Smaltz, Detlev H; Smaltz, Detlev Herb; Kamal, Jyoti
2008-11-06
At Ohio State University Medical Center, The Honest Broker Protocol provides a streamlined mechanism whereby investigators can obtain de-identified clinical data for non-FDA research without having to invest the significant time and effort necessary to craft a formalized protocol for IRB approval.
28 CFR 115.221 - Evidence protocol and forensic medical examinations.
Code of Federal Regulations, 2013 CFR
2013-07-01
.... Department of Justice's Office on Violence Against Women publication, “A National Protocol for Sexual Assault... for investigating allegations of sexual abuse, the agency shall follow a uniform evidence protocol... developed after 2011. (c) The agency shall offer all victims of sexual abuse access to forensic medical...
28 CFR 115.21 - Evidence protocol and forensic medical examinations.
Code of Federal Regulations, 2014 CFR
2014-07-01
... Office on Violence Against Women publication, “A National Protocol for Sexual Assault Medical Forensic... allegations of sexual abuse, the agency shall follow a uniform evidence protocol that maximizes the potential.... (c) The agency shall offer all victims of sexual abuse access to forensic medical examinations...
28 CFR 115.221 - Evidence protocol and forensic medical examinations.
Code of Federal Regulations, 2014 CFR
2014-07-01
.... Department of Justice's Office on Violence Against Women publication, “A National Protocol for Sexual Assault... for investigating allegations of sexual abuse, the agency shall follow a uniform evidence protocol... developed after 2011. (c) The agency shall offer all victims of sexual abuse access to forensic medical...
28 CFR 115.221 - Evidence protocol and forensic medical examinations.
Code of Federal Regulations, 2012 CFR
2012-07-01
.... Department of Justice's Office on Violence Against Women publication, “A National Protocol for Sexual Assault... for investigating allegations of sexual abuse, the agency shall follow a uniform evidence protocol... developed after 2011. (c) The agency shall offer all victims of sexual abuse access to forensic medical...
28 CFR 115.21 - Evidence protocol and forensic medical examinations.
Code of Federal Regulations, 2013 CFR
2013-07-01
... Office on Violence Against Women publication, “A National Protocol for Sexual Assault Medical Forensic... allegations of sexual abuse, the agency shall follow a uniform evidence protocol that maximizes the potential.... (c) The agency shall offer all victims of sexual abuse access to forensic medical examinations...
28 CFR 115.21 - Evidence protocol and forensic medical examinations.
Code of Federal Regulations, 2012 CFR
2012-07-01
... Office on Violence Against Women publication, “A National Protocol for Sexual Assault Medical Forensic... allegations of sexual abuse, the agency shall follow a uniform evidence protocol that maximizes the potential.... (c) The agency shall offer all victims of sexual abuse access to forensic medical examinations...
[Open availability of articles and raw research data in Spanish pediatrics journals].
Aleixandre-Benavent, R; Vidal-Infer, A; Alonso-Arroyo, A; González de Dios, J; Ferrer-Sapena, A; Peset, F
2015-01-01
Open access to publications and raw research data allows their re-use and enhances the advancement of science. The aim of this paper is to identify these practices in Spanish pediatrics journals. We reviewed the author instructions of 13 Spanish pediatrics journals, identifying their open access and deposit policies. Eight journals allow open access without restriction, and 5 provide information on the ability to re-use and deposit data in repositories or websites. Most of the journals are open access, but do not promote the deposit of additional material or articles in repositories or websites. Copyright © 2013 Asociación Española de Pediatría. Published by Elsevier España. All rights reserved.
Government Open Systems Interconnection Profile (GOSIP) transition strategy
NASA Astrophysics Data System (ADS)
Laxen, Mark R.
1993-09-01
This thesis analyzes the Government Open Systems Interconnection Profile (GOSIP) and the requirements of the Federal Information Processing Standard (FIPS) Publication 146-1. It begins by examining the International Organization for Standardization (ISO) Open Systems Interconnection (OSI) architecture and protocol suites and the distinctions between GOSIP versions one and two. Additionally, it explores some of the GOSIP protocol details and discusses the process by which standards organizations have developed their recommendations. Implementation considerations from both government and vendor perspectives illustrate the barriers and requirements faced by information systems managers, as well as basic transition strategies. The thesis concludes that transition will require an extended and coordinated period of coexistence, owing to extensive legacy systems and the unavailability of GOSIP products. Recommendations for GOSIP protocol standards to include capabilities outside the OSI model are also presented.
MAC Protocol for Ad Hoc Networks Using a Genetic Algorithm
Elizarraras, Omar; Panduro, Marco; Méndez, Aldo L.
2014-01-01
The problem of obtaining the transmission rate in an ad hoc network consists in adjusting the power of each node so that the signal to interference ratio (SIR) and the energy required to transmit from one node to another are obtained at the same time. Therefore, an optimal transmission rate for each node in a medium access control (MAC) protocol based on CSMA-CDMA (carrier sense multiple access-code division multiple access) for ad hoc networks can be obtained using evolutionary optimization. This work proposes a genetic algorithm for transmission rate selection assuming perfect power control; our proposal achieves an improvement of 10% compared with the scheme that uses the handshaking phase to adjust the transmission rate. Furthermore, this paper proposes a genetic algorithm that solves the combined problem of power, interference, data rate, and energy while ensuring the signal to interference ratio in an ad hoc network. The proposed genetic algorithm performs about 15% better than the CSMA-CDMA protocol without optimization. We show by simulation the effectiveness of the proposed protocol in terms of throughput. PMID:25140339
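A toy genetic algorithm in the spirit described above can be sketched as follows: each chromosome is a vector of per-node rate indices, and fitness rewards aggregate rate while penalizing choices whose implied SIR falls below a target. The rate set, SIR figures, and fitness model are illustrative stand-ins, not the paper's:

```python
# Toy GA for per-node transmission-rate selection. RATES, SIR_TARGET, and
# the per-rate SIR table are illustrative assumptions for the sketch.
import random

RATES = [1, 2, 5.5, 11]   # candidate rates (Mbps), illustrative
SIR_TARGET = 5.0          # illustrative SIR requirement

def fitness(chromosome, sir_per_rate):
    total = 0.0
    for gene in chromosome:
        if sir_per_rate[gene] >= SIR_TARGET:
            total += RATES[gene]      # reward feasible high rates
        else:
            total -= RATES[gene]      # penalize infeasible choices
    return total

def evolve(sir_per_rate, n_nodes=4, pop=20, gens=30, seed=1):
    rng = random.Random(seed)
    population = [[rng.randrange(len(RATES)) for _ in range(n_nodes)]
                  for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=lambda c: fitness(c, sir_per_rate), reverse=True)
        parents = population[:pop // 2]       # elitist selection
        children = []
        for _ in range(pop - len(parents)):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_nodes)   # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.1:            # mutation
                child[rng.randrange(n_nodes)] = rng.randrange(len(RATES))
            children.append(child)
        population = parents + children
    return max(population, key=lambda c: fitness(c, sir_per_rate))

# SIR achievable at each rate index; the highest rate violates the target
best = evolve({0: 9.0, 1: 7.0, 2: 6.0, 3: 3.0})
```

With this table the ideal chromosome assigns every node rate index 2 (5.5 Mbps), the highest rate that still meets the SIR target.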
Potential of Wake-Up Radio-Based MAC Protocols for Implantable Body Sensor Networks (IBSN)—A Survey
Karuppiah Ramachandran, Vignesh Raja; Ayele, Eyuel D.; Meratnia, Nirvana; Havinga, Paul J. M.
2016-01-01
With the advent of nano-technology, medical sensors and devices are becoming highly miniaturized. Consequently, the number of sensors and medical devices being implanted to accurately monitor and diagnose a disease is increasing. By measuring the symptoms and controlling a medical device as close as possible to the source, these implantable devices are able to save lives. A wireless link between medical sensors and implantable medical devices is essential in the case of closed-loop medical devices, in which symptoms of the diseases are monitored by sensors that are not placed in close proximity of the therapeutic device. Medium Access Control (MAC) is crucial to make it possible for several medical devices to communicate using a shared wireless medium in such a way that minimum delay, maximum throughput, and increased network life-time are guaranteed. To guarantee this Quality of Service (QoS), the MAC protocols control the main sources of limited resource wastage, namely the idle-listening, packet collisions, over-hearing, and packet loss. Traditional MAC protocols designed for body sensor networks are not directly applicable to Implantable Body Sensor Networks (IBSN) because of the dynamic nature of the radio channel within the human body and the strict QoS requirements of IBSN applications. Although numerous MAC protocols are available in the literature, the majority of them are designed for Body Sensor Network (BSN) and Wireless Sensor Network (WSN). To the best of our knowledge, there is so far no research paper that explores the impact of these MAC protocols specifically for IBSN. MAC protocols designed for implantable devices are still in their infancy and one of their most challenging objectives is to be ultra-low-power. One of the technological solutions to achieve this objective is to integrate the concept of Wake-up radio (WuR) into the MAC design. 
In this survey, we present a taxonomy of MAC protocols based on their use of WuR technology and identify their bottlenecks to be used in IBSN applications. Furthermore, we present a number of open research challenges and requirements for designing an energy-efficient and reliable wireless communication protocol for IBSN. PMID:27916822
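A back-of-the-envelope comparison shows why a WuR can beat duty-cycled idle listening: a microwatt-class wake-up receiver runs continuously while the power-hungry main radio stays off until a wake-up event. All power figures below are illustrative placeholders, not measurements from the survey:

```python
# Illustrative energy comparison: periodic idle listening (duty cycling)
# versus an always-off main radio woken on demand by a wake-up receiver.
# All power/energy figures are placeholder assumptions.
def duty_cycle_energy(p_rx_mw, duty, hours):
    """Energy (mWh) spent by the main radio listening at a given duty cycle."""
    return p_rx_mw * duty * hours

def wur_energy(p_wur_mw, hours, wakeups, e_wakeup_mwh):
    """Energy (mWh) of an always-on WuR plus the cost of each main-radio wake-up."""
    return p_wur_mw * hours + wakeups * e_wakeup_mwh

idle = duty_cycle_energy(p_rx_mw=20.0, duty=0.01, hours=24)          # ~4.8 mWh
wur = wur_energy(p_wur_mw=0.01, hours=24, wakeups=10, e_wakeup_mwh=0.02)
```

Under these placeholder figures the WuR scheme spends roughly an order of magnitude less energy per day, which is the core argument for WuR-based MAC designs in IBSN.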
Potential of Wake-Up Radio-Based MAC Protocols for Implantable Body Sensor Networks (IBSN)-A Survey.
Karuppiah Ramachandran, Vignesh Raja; Ayele, Eyuel D; Meratnia, Nirvana; Havinga, Paul J M
2016-11-29
Positioning Your Library in an Open-Access Environment
ERIC Educational Resources Information Center
Bhatt, Anjana H.
2010-01-01
This paper is a summary of the project that the author completed at Florida Gulf Coast University (FGCU) library for providing online access to 80 open access E-journals and digital collections. Although FGCU uses SerialsSolutions products to establish online access, anyone can provide access to these collections, as they are free for all. Paper…
Is open access sufficient? A review of the quality of open-access nursing journals.
Crowe, Marie; Carlyle, Dave
2015-02-01
The present study aims to review the quality of open-access nursing journals listed in the Directory of Open Access Journals that published papers in 2013 with a nursing focus, written in English, and were freely accessible. Each journal was reviewed in relation to its publisher, year of commencement, number of papers published in 2013, fee for publication, indexing, impact factor, and evidence of requirements for ethics and disclosure statements. The quality of the journals was assessed by impact factors and the requirements for indexing in PubMed. A total of 552 papers were published in 2013 in the 19 open-access nursing journals that met the inclusion criteria. No journals had impact factors listed in Web of Knowledge, but three had low Scopus impact factors. Only five journals were indexed with PubMed. The quality of the 19 journals included in the review was evaluated as inferior to most subscription-fee journals. Mental health nursing has some responsibility to the general public, and in particular, consumers of mental health services and their families, for the quality of papers published in open-access journals. The way forward might involve dual-platform publication or a process that enables assessment of how research has improved clinical outcomes. © 2014 Australian College of Mental Health Nurses Inc.
ERIC Educational Resources Information Center
Aagaard, James S.; And Others
This two-volume document specifies a protocol that was developed using the Reference Model for Open Systems Interconnection (OSI), which provides a framework for communications within a heterogeneous network environment. The protocol implements the features necessary for bibliographic searching, record maintenance, and mail transfer between…
NASA Astrophysics Data System (ADS)
Näthe, Paul; Becker, Rolf
2014-05-01
Soil moisture and plant available water are important environmental parameters that affect plant growth and crop yield. Hence, they are significant parameters for vegetation monitoring and precision agriculture. However, ground-based soil moisture measurements are necessary to validate the soil moisture, plant canopy temperature, soil temperature and soil roughness acquired with airborne hyperspectral imaging systems in a corresponding hyperspectral imaging campaign as a part of the INTERREG IV A-Project SMART INSPECTORS. To this end, commercially available sensors for matric potential, plant available water and volumetric water content are utilized for automated measurements with smart sensor nodes, which are developed on the basis of open-source 868 MHz radio modules featuring a full-scale microcontroller unit that allows autarkic operation of the sensor nodes on batteries in the field. The data generated by each of these sensor nodes are transferred wirelessly with an open-source protocol to a central node, the so-called "gateway". This gateway collects, interprets and buffers the sensor readings and eventually pushes the time series onto a server-based database. The entire data processing chain, from the sensor reading to the final storage of the time series on a server, is realized with open-source hardware and software in such a way that the recorded data can be accessed from anywhere through the internet. We present how this open-source-based wireless sensor network is developed and specified for the application of ground truthing, and point out the system's potential with respect to usability and applicability for vegetation monitoring and precision agriculture. Regarding the corresponding hyperspectral imaging campaign, results from ground measurements are discussed in terms of their contribution to the remote sensing system. 
Finally, the significance of the wireless sensor network for the application of ground truthing is assessed.
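The collect-buffer-push role of the gateway described above can be sketched roughly as follows (class and field names are hypothetical; the project's actual software is not shown in the abstract):

```python
# Minimal sketch of a sensor-network gateway: collect readings from sensor
# nodes, buffer them per node, and flush batches toward a backend store.
# All names here are hypothetical stand-ins for the system described above.
from collections import defaultdict

class Gateway:
    def __init__(self, flush_threshold=3):
        self.buffer = defaultdict(list)   # node_id -> [(timestamp, value), ...]
        self.flushed = []                 # stands in for the server database
        self.flush_threshold = flush_threshold

    def on_reading(self, node_id, timestamp, value):
        """Called for each radio packet received from a sensor node."""
        self.buffer[node_id].append((timestamp, value))
        if len(self.buffer[node_id]) >= self.flush_threshold:
            self.flush(node_id)

    def flush(self, node_id):
        """Push the buffered time series for one node to the database."""
        self.flushed.append((node_id, self.buffer.pop(node_id)))

gw = Gateway()
for t in range(4):
    gw.on_reading("soil-07", t, 0.21 + 0.01 * t)  # e.g. volumetric water content
```

A real deployment would replace the `flushed` list with an HTTP or database client, but the buffering pattern is the same.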
Arkheia: Data Management and Communication for Open Computational Neuroscience
Antolík, Ján; Davison, Andrew P.
2018-01-01
Two trends have been unfolding in computational neuroscience during the last decade. First, a shift of focus to increasingly complex and heterogeneous neural network models, with a concomitant increase in the level of collaboration within the field (whether direct or in the form of building on top of existing tools and results). Second, a general trend in science toward more open communication, both internally, with other potential scientific collaborators, and externally, with the wider public. This multi-faceted development toward more integrative approaches and more intense communication within and outside of the field poses major new challenges for modelers, as currently there is a severe lack of tools to help with automatic communication and sharing of all aspects of a simulation workflow to the rest of the community. To address this important gap in the current computational modeling software infrastructure, here we introduce Arkheia. Arkheia is a web-based open science platform for computational models in systems neuroscience. It provides an automatic, interactive, graphical presentation of simulation results, experimental protocols, and interactive exploration of parameter searches, in a web browser-based application. Arkheia is focused on automatic presentation of these resources with minimal manual input from users. Arkheia is written in a modular fashion with a focus on future development of the platform. The platform is designed in an open manner, with a clearly defined and separated API for database access, so that any project can write its own backend translating its data into the Arkheia database format. Arkheia is not a centralized platform, but allows any user (or group of users) to set up their own repository, either for public access by the general population, or locally for internal use. 
Overall, Arkheia provides users with an automatic means to communicate information about not only their models but also individual simulation results and the entire experimental context in an approachable graphical manner, thus facilitating collaboration within the field and outreach to a wider audience. PMID:29556187
Arkheia: Data Management and Communication for Open Computational Neuroscience.
Antolík, Ján; Davison, Andrew P
2018-01-01
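The "write your own backend" idea can be sketched as a translation function from a project's native records into a common repository schema (the target schema below is hypothetical, not Arkheia's actual database format):

```python
# Sketch of a project-specific backend: translate native simulation records
# into a common document format consumed by a repository's database API.
# The target schema and all field names are hypothetical illustrations.

def to_repository_doc(native_record):
    """Map one native simulation result into the repository's schema."""
    return {
        "model": native_record["network_name"],
        "protocol": native_record["stimulus"],
        "parameters": dict(native_record["params"]),
        "figures": [f["path"] for f in native_record.get("plots", [])],
    }

native = {
    "network_name": "V1-layer4",
    "stimulus": "drifting-grating",
    "params": {"contrast": 0.8, "trials": 10},
    "plots": [{"path": "rates.png"}],
}
doc = to_repository_doc(native)
```

Keeping the translation in a thin, separate layer like this is what lets any project target the shared presentation platform without changing its own data model.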
Vierhout, Bastiaan P; Saleem, Ben R; Ott, Alewijn; van Dijl, Jan Maarten; de Kempenaer, Ties D van Andringa; Pierie, Maurice E N; Bottema, Jan T; Zeebregts, Clark J
2015-09-14
Access for endovascular repair of abdominal aortic aneurysms (EVAR) is obtained through surgical cutdown or percutaneously. The only devices suitable for percutaneous closure of the 20 French arteriotomies of the common femoral artery (CFA) are the Prostar™ and Proglide™ devices (Abbott Vascular). Positive effects of these devices seem to consist of a lower infection rate and shorter operation time and hospital stay. This conclusion was published in previous reports comparing techniques in patients in two different groups (cohort or randomized). Access techniques were never compared in one and the same patient; this research simplifies comparison because patient characteristics will be similar in both groups. Percutaneous access of the CFA is compared to surgical cutdown in a single patient; in EVAR surgery, access is necessary in both groins in each patient. Randomization is performed on the introduction site of the larger main device of the endoprosthesis. The contralateral device of the endoprosthesis is smaller. When we use this type of randomization, both groups will contain a similar number of main and contralateral devices. Preoperative nose cultures and perineal cultures are obtained, to compare colonization with postoperative wound cultures (in case of a surgical site infection). Furthermore, patient comfort will be considered, using VAS-scores (Visual analog scale). Punch biopsies of the groin will be harvested to retrospectively compare skin of patients who suffered a surgical site infection (SSI) to patients who did not have an SSI. The PiERO trial is a multicenter randomized controlled clinical trial designed to show the consequences of using percutaneous access in EVAR surgery and focuses on the occurrence of surgical site infections. NTR4257 10 November 2013, NL44578.042.13.
Web accessibility and open source software.
Obrenović, Zeljko
2009-07-01
A Web browser provides a uniform user interface to different types of information. Making this interface universally accessible and more interactive is a long-term goal still far from being achieved. Universally accessible browsers require novel interaction modalities and additional functionalities, for which existing browsers tend to provide only partial solutions. Although functionality for Web accessibility can be found as open source and free software components, their reuse and integration are complex because they were developed in diverse implementation environments, following standards and conventions incompatible with the Web. To address these problems, we have started several activities that aim at exploiting the potential of open-source software for Web accessibility. The first of these activities is the development of Adaptable Multi-Interface COmmunicator (AMICO):WEB, an infrastructure that facilitates efficient reuse and integration of open source software components into the Web environment. The main contribution of AMICO:WEB is in enabling the syntactic and semantic interoperability between Web extension mechanisms and a variety of integration mechanisms used by open source and free software components. Its design is based on our experiences in solving practical problems where we have used open source components to improve accessibility of rich media Web applications. The second of our activities involves improving education, where we have used our platform to teach students how to build advanced accessibility solutions from diverse open-source software. We are also partially involved in the recently started Eclipse projects called Accessibility Tools Framework (ACTF), the aim of which is the development of an extensible infrastructure upon which developers can build a variety of utilities that help to evaluate and enhance the accessibility of applications and content for people with disabilities. In this article we briefly report on these activities.
One Approach for Transitioning the iNET Standards into the IRIG 106 Telemetry Standards
2015-05-26
Figure 1 illustrates the Open Systems Interconnection (OSI) Model, the corresponding TCP/IP Model, and the major components of the TCP/IP Protocol Suite. Figure 2 represents the iNET-specific protocols layered onto the TCP/IP Model. [Figure 1 caption: OSI and TCP/IP Model with TCP/IP Protocol Suite major components, including IPv4 and IPv6, mapped across the OSI and TCP/IP layers.]
NASA Astrophysics Data System (ADS)
Wilson, B. D.; Manipon, G.; Xing, Z.
2007-12-01
The General Earth Science Investigation Suite (GENESIS) project is a NASA-sponsored partnership between the Jet Propulsion Laboratory, academia, and NASA data centers to develop a new suite of Web Services tools to facilitate multi-sensor investigations in Earth System Science. The goal of GENESIS is to enable large-scale, multi-instrument atmospheric science using combined datasets from the AIRS, MODIS, MISR, and GPS sensors. Investigations include cross-comparison of spaceborne climate sensors, cloud spectral analysis, study of upper troposphere-stratosphere water transport, study of the aerosol indirect cloud effect, and global climate model validation. The challenges are to bring together very large datasets, reformat and understand the individual instrument retrievals, co-register or re-grid the retrieved physical parameters, perform computationally-intensive data fusion and data mining operations, and accumulate complex statistics over months to years of data. To meet these challenges, we have developed a Grid computing and dataflow framework, named SciFlo, in which we are deploying a set of versatile and reusable operators for data access, subsetting, registration, mining, fusion, compression, and advanced statistical analysis. SciFlo leverages remote Web Services, called via Simple Object Access Protocol (SOAP) or REST (one-line) URLs, and the Grid Computing standards (WS-* & Globus Alliance toolkits), and enables scientists to do multi- instrument Earth Science by assembling reusable Web Services and native executables into a distributed computing flow (tree of operators). The SciFlo client & server engines optimize the execution of such distributed data flows and allow the user to transparently find and use datasets and operators without worrying about the actual location of the Grid resources. 
In particular, SciFlo exploits the wealth of datasets accessible by OpenGIS Consortium (OGC) Web Mapping Servers & Web Coverage Servers (WMS/WCS), and by Open Data Access Protocol (OpenDAP) servers. SciFlo also publishes its own SOAP services for space/time query and subsetting of Earth Science datasets, and automated access to large datasets via lists of (FTP, HTTP, or DAP) URLs which point to on-line HDF or netCDF files. Typical distributed workflows obtain datasets by calling standard WMS/WCS servers or discovering and fetching data granules from ftp sites; invoke remote analysis operators available as SOAP services (interface described by a WSDL document); and merge results into binary containers (netCDF or HDF files) for further analysis using local executable operators. Naming conventions (HDFEOS and CF-1.0 for netCDF) are exploited to automatically understand and read on-line datasets. More interoperable conventions, and broader adoption of existing conventions, are vital if we are to "scale up" automated choreography of Web Services beyond toy applications. Recently, the ESIP Federation sponsored a collaborative activity in which several ESIP members developed some collaborative science scenarios for atmospheric and aerosol science, and then choreographed services from multiple groups into demonstration workflows using the SciFlo engine and a Business Process Execution Language (BPEL) workflow engine. We will discuss the lessons learned from this activity, the need for standardized interfaces (like WMS/WCS), the difficulty in agreeing on even simple XML formats and interfaces, the benefits of doing collaborative science analysis at the "touch of a button" once services are connected, and further collaborations that are being pursued.
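A space/time granule query of the kind these services expose can be sketched as follows (a toy in-memory data model; the real interfaces are SOAP/REST services described by WSDL documents, and the URLs below are placeholders):

```python
# Sketch of a space/time granule query: given a catalog of granule records,
# return the URLs whose acquisition time falls in a window and whose
# bounding box overlaps a region of interest. Data model is illustrative.
from datetime import datetime

granules = [
    {"url": "ftp://example.org/AIRS.2006.07.01.hdf",
     "start": datetime(2006, 7, 1), "bbox": (-10, 30, 10, 50)},
    {"url": "http://example.org/MODIS.2006.07.02.hdf",
     "start": datetime(2006, 7, 2), "bbox": (100, -5, 120, 15)},
]

def query(granules, t0, t1, bbox):
    """Return URLs of granules inside [t0, t1] whose (west, south,
    east, north) box overlaps the requested bbox."""
    w, s, e, n = bbox
    def overlaps(b):
        gw, gs, ge, gn = b
        return not (ge < w or gw > e or gn < s or gs > n)
    return [g["url"] for g in granules
            if t0 <= g["start"] <= t1 and overlaps(g["bbox"])]

urls = query(granules, datetime(2006, 7, 1), datetime(2006, 7, 3),
             bbox=(-20, 20, 20, 60))
```

The resulting URL list is exactly the kind of artifact a workflow engine can hand to a fetch operator as the first stage of a distributed dataflow.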
Security Analysis and Improvements of Authentication and Access Control in the Internet of Things
Ndibanje, Bruce; Lee, Hoon-Jae; Lee, Sang-Gon
2014-01-01
Internet of Things is a ubiquitous concept where physical objects are connected over the internet and are provided with unique identifiers to enable their self-identification to other devices and the ability to continuously generate data and transmit it over a network. Hence, the security of the network, data and sensor devices is a paramount concern in the IoT network as it grows very fast in terms of exchanged data and interconnected sensor nodes. This paper analyses the authentication and access control method used in the Internet of Things presented by Jing et al. (Authentication and Access Control in the Internet of Things. In Proceedings of the 2012 32nd International Conference on Distributed Computing Systems Workshops, Macau, China, 18–21 June 2012, pp. 588–592). According to our analysis, Jing et al.'s protocol is costly in the message exchange and the security assessment is not strong enough for such a protocol. Therefore, we propose improvements to the protocol to fill the discovered weakness gaps. The protocol enhancements facilitate many services to the users such as user anonymity, mutual authentication, and secure session key establishment. Finally, the performance and security analysis show that the improved protocol possesses many advantages against popular attacks, and achieves better efficiency at low communication cost. PMID:25123464
Security analysis and improvements of authentication and access control in the Internet of Things.
Ndibanje, Bruce; Lee, Hoon-Jae; Lee, Sang-Gon
2014-08-13
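The services the improved protocol aims to provide, such as mutual authentication and session-key establishment, can be caricatured with a generic HMAC challenge-response exchange (illustrative only; this is not the message flow of the paper's scheme):

```python
# Toy challenge-response sketch of mutual authentication plus session-key
# derivation from a pre-shared key. Illustrative only; not the actual
# message flow of the improved protocol analysed above.
import hmac, hashlib, os

SHARED_KEY = os.urandom(32)  # pre-shared between device and gateway

def prove(key, nonce, identity):
    """Respond to a challenge nonce with an HMAC binding key and identity."""
    return hmac.new(key, nonce + identity, hashlib.sha256).digest()

# 1. Gateway challenges the device; device proves knowledge of the key.
n_g = os.urandom(16)
device_proof = prove(SHARED_KEY, n_g, b"device")

# 2. Device challenges the gateway in turn (mutual authentication).
n_d = os.urandom(16)
gateway_proof = prove(SHARED_KEY, n_d, b"gateway")

# 3. Both sides derive the same session key from the two fresh nonces.
session_key = hmac.new(SHARED_KEY, n_g + n_d, hashlib.sha256).digest()
print(len(session_key))  # prints 32: a 256-bit session key
```

Binding fresh nonces from both sides into the proofs and the derived key is what gives replay resistance and mutual assurance in schemes of this shape.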
Code of Federal Regulations, 2010 CFR
2010-04-01
... Wholesale Electric Quadrant, which are incorporated herein by reference: (1) Open Access Same-Time....13, 001-1.0, 001-9.7, 001-14.1.3, and 001-15.1.2); (2) Open Access Same-Time Information Systems... minor corrections applied May 29, 2009 and September 8, 2009); (3) Open Access Same-Time Information...
Code of Federal Regulations, 2012 CFR
2012-04-01
... Wholesale Electric Quadrant, which are incorporated herein by reference: (1) Open Access Same-Time....13, 001-1.0, 001-9.7, 001-14.1.3, and 001-15.1.2); (2) Open Access Same-Time Information Systems... minor corrections applied May 29, 2009 and September 8, 2009); (3) Open Access Same-Time Information...
"I've Never Heard of It Before": Awareness of Open Access at a Small Liberal Arts University
ERIC Educational Resources Information Center
Kocken, Gregory J.; Wical, Stephanie H.
2013-01-01
Small colleges and universities, often late adopters of institutional repositories and open access initiatives, face challenges that have not fully been explored in the professional literature. In an effort to gauge the level of awareness of open access and institutional repositories at the University of Wisconsin-Eau Claire (UWEC), the authors of…
The Impact Factor: Implications of Open Access on Quality
ERIC Educational Resources Information Center
Grozanick, Sara E.
2010-01-01
There has been debate about the extent to which open access affects the quality of scholarly work. At the same time, researchers have begun to look for ways to evaluate the quality of open access publications. Dating back to the growth of citation indexes during the 1960s and 1970s, citation analysis--examining citation statistics--has since been…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-10
... Fisheries Act Catcher Vessels in the Inshore Open Access Fishery in the Bering Sea and Aleutian Islands... allowable catch (TAC) allocated to the inshore open access fishery in the BSAI. DATES: Effective 1200 hrs... open access fishery in the BSAI is 2,762 metric tons (mt) as established by the final 2009 and 2010...
The Future of Open Access Publishing in the Netherlands: Constant Dripping Wears Away the Stone
ERIC Educational Resources Information Center
Woutersen-Windhouwer, Saskia
2013-01-01
At present, about 20% of the scientific publications worldwide are freely (open-access) available (Bjork, Welling, Laakso, Majlender, Hedlund, & Guonason, 2010) and this percentage is constantly on the rise. In the Netherlands, a similar trend is visible (see Fig. 1). Why is open-access (OA) publishing important, and why will it become even…
47 CFR 79.109 - Activating accessibility features.
Code of Federal Regulations, 2014 CFR
2014-10-01
... ACCESSIBILITY OF VIDEO PROGRAMMING Apparatus § 79.109 Activating accessibility features. (a) Requirements... video programming transmitted in digital format simultaneously with sound, including apparatus designed to receive or display video programming transmitted in digital format using Internet protocol, with...
ERIC Educational Resources Information Center
Packer, Jaclyn; Reuschel, William
2018-01-01
Introduction: Accessibility of Voice over Internet Protocol (VoIP) systems was tested with a hands-on usability study and an online survey of VoIP users who are visually impaired. The survey examined the importance of common VoIP features, and both methods assessed difficulty in using those features. Methods: The usability test included four paid…
Comparison of two MAC protocols based on LEO satellite networks
NASA Astrophysics Data System (ADS)
Guan, Mingxiang; Wang, Ruichun
2009-12-01
With the development of LEO satellite communication, providing various kinds of services is a basic requirement. Considering the weak channel collision detection ability, long propagation delay and heavy load in LEO satellite communication systems, a valid adaptive access control protocol, APRMA, is proposed. Different access probability functions for different services are obtained, and appropriate access probabilities for voice and data users are updated slot by slot based on the estimation of the voice traffic and the channel status. Finally, simulation results demonstrate that the performance of the system is improved by APRMA compared with the conventional PRMA, with an acceptable trade-off between QoS of voice and delay of data. The APRMA protocol will also be suitable for HAPS (high altitude platform stations), which share the characteristics of weak channel collision detection ability, long propagation delay and heavy load.
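The slot-by-slot probability adaptation at the heart of PRMA-style protocols can be sketched as follows (the update constants and rules are illustrative assumptions, not the paper's per-service access probability functions):

```python
# Sketch of slot-by-slot access-probability adaptation in a PRMA-style
# protocol: lower the permission probability when the channel looks busy,
# raise it when idle. Update constants are illustrative assumptions.

def update_probability(p, channel_busy, p_min=0.05, p_max=1.0,
                       down=0.5, up=0.1):
    """Multiplicative decrease on busy/collision slots, additive
    increase on idle slots, clamped to [p_min, p_max]."""
    p = p * down if channel_busy else p + up
    return max(p_min, min(p_max, p))

p = 0.6
for busy in [True, True, False, False, False]:
    p = update_probability(p, busy)
print(round(p, 3))  # prints 0.45
```

A service-aware variant would simply use different `down`/`up` schedules per traffic class, which is the direction the per-service access probability functions take.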
NASA Astrophysics Data System (ADS)
Arunachalam, S.
2010-10-01
Open access brings greater visibility and impact to the work of scientists as is evidenced in the examples discussed in this paper. Researchers are often reluctant and afraid to deposit their works in Institutional Repositories. However, as is shown here, once they do so, they do not regret it. Open access will shortly become the norm and will be accepted by the vast majority of scientists. Seen through the lens of the philosophy of Bertrand Russell, the moral, economic and philosophical imperatives for open access are indeed strong.
Revascularization of immature mandibular premolar with pulpal necrosis - a case report.
Raju, S Murali Krishna; Yadav, Sarjeev Singh; Kumar M, Sita Rama
2014-09-01
This case report describes the revascularization of a permanent immature mandibular premolar with pulp necrosis and apical periodontitis. Access opening was performed, and the canal was disinfected with copious irrigation using 2.5% NaOCl, with triple antibiotic paste (Ciprofloxacin, Metronidazole, and Minocycline) as the intracanal medicament. After the disinfection protocol was complete, the revascularization procedure followed. The apex was mechanically irritated to initiate bleeding into the canal, producing a blood clot to a level just below the cementoenamel junction. Mineral trioxide aggregate was placed over the blood clot, followed by a bonded resin restoration above it. At the one-year follow-up, the patient was asymptomatic and no sinus tract was evident. The apical periodontitis had resolved, and there was radiographic evidence of continued thickening of the dentinal walls.
A New Look at Data Usage by Using Metadata Attributes as Indicators of Data Quality
NASA Technical Reports Server (NTRS)
Won, Young-In; Wanchoo, Lalit; Behnke, Jeanne
2016-01-01
This study reviews the key metrics (users, distributed volume, and files) in multiple ways to gain an understanding of the significance of the metadata. Characterizing the usability of data by key metadata elements, such as discipline and study area, will assist in understanding how the user needs have evolved over time. The data usage pattern based on product level provides insight into the level of data quality. In addition, the data metrics by various services, such as the Open-source Project for a Network Data Access Protocol (OPeNDAP) and subsets, address how these services have extended the usage of data. Overall, this study presents the usage of data and metadata by metrics analyses, which may assist data centers in better supporting the needs of the users.
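The kind of metrics roll-up described, aggregating distributed volume and file counts by a metadata attribute such as discipline, can be sketched as follows (the records and field names are hypothetical):

```python
# Sketch of aggregating usage metrics by a metadata attribute (discipline).
# The records and field names are hypothetical illustrations.
from collections import defaultdict

records = [
    {"discipline": "atmosphere", "volume_gb": 120.0, "files": 300},
    {"discipline": "ocean",      "volume_gb":  40.0, "files": 100},
    {"discipline": "atmosphere", "volume_gb":  80.0, "files": 200},
]

totals = defaultdict(lambda: {"volume_gb": 0.0, "files": 0})
for r in records:
    t = totals[r["discipline"]]
    t["volume_gb"] += r["volume_gb"]
    t["files"] += r["files"]
```

The same grouping applied to other attributes (study area, product level, access service) yields the cross-sections of usage the study analyses.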
Human Connectome Project Informatics: quality control, database services, and data visualization
Marcus, Daniel S.; Harms, Michael P.; Snyder, Abraham Z.; Jenkinson, Mark; Wilson, J Anthony; Glasser, Matthew F.; Barch, Deanna M.; Archie, Kevin A.; Burgess, Gregory C.; Ramaratnam, Mohana; Hodge, Michael; Horton, William; Herrick, Rick; Olsen, Timothy; McKay, Michael; House, Matthew; Hileman, Michael; Reid, Erin; Harwell, John; Coalson, Timothy; Schindler, Jon; Elam, Jennifer S.; Curtiss, Sandra W.; Van Essen, David C.
2013-01-01
The Human Connectome Project (HCP) has developed protocols, standard operating and quality control procedures, and a suite of informatics tools to enable high throughput data collection, data sharing, automated data processing and analysis, and data mining and visualization. Quality control procedures include methods to maintain data collection consistency over time, to measure head motion, and to establish quantitative modality-specific overall quality assessments. Database services developed as customizations of the XNAT imaging informatics platform support both internal daily operations and open access data sharing. The Connectome Workbench visualization environment enables user interaction with HCP data and is increasingly integrated with the HCP's database services. Here we describe the current state of these procedures and tools and their application in the ongoing HCP study. PMID:23707591
Cooperative Vehicular Networking: A Survey
Ahmed, Ejaz
2018-01-01
With the remarkable progress of cooperative communication technology in recent years, its transformation to vehicular networking is gaining momentum. Such a transformation has brought a new research challenge in facing the realization of cooperative vehicular networking (CVN). This paper presents a comprehensive survey of recent advances in the field of CVN. We cover important aspects of CVN research, including physical, medium access control, and routing protocols, as well as link scheduling and security. We also classify these research efforts in a taxonomy of cooperative vehicular networks. A set of key requirements for realizing the vision of cooperative vehicular networks is then identified and discussed. We also discuss open research challenges in enabling CVN. Lastly, the paper concludes by highlighting key points of research and future directions in the domain of CVN. PMID:29881331
Lee, Jumin; Cheng, Xi; Swails, Jason M.; ...
2015-11-12
Here we report that proper treatment of nonbonded interactions is essential for the accuracy of molecular dynamics (MD) simulations, especially in studies of lipid bilayers. The use of the CHARMM36 force field (C36 FF) in different MD simulation programs can result in disagreements with published simulations performed with CHARMM due to differences in the protocols used to treat the long-range and 1-4 nonbonded interactions. In this study, we systematically test the use of the C36 lipid FF in NAMD, GROMACS, AMBER, OpenMM, and CHARMM/OpenMM. A wide range of Lennard-Jones (LJ) cutoff schemes and integrator algorithms were tested to find the optimal simulation protocol to best match bilayer properties of six lipids with varying acyl chain saturation and head groups. MD simulations of a 1,2-dipalmitoyl-sn-phosphatidylcholine (DPPC) bilayer were used to obtain the optimal protocol for each program. MD simulations with all programs were found to reasonably match the DPPC bilayer properties (surface area per lipid, chain order parameters, and area compressibility modulus) obtained using the standard protocol used in CHARMM as well as from experiments. The optimal simulation protocol was then applied to the other five lipid simulations and resulted in excellent agreement between results from most simulation programs as well as with experimental data. AMBER compared least favorably with the expected membrane properties, which appears to be due to its use of the hard-truncation in the LJ potential versus a force-based switching function used to smooth the LJ potential as it approaches the cutoff distance. The optimal simulation protocol for each program has been implemented in CHARMM-GUI. This protocol is expected to be applicable to the remainder of the additive C36 FF including the proteins, nucleic acids, carbohydrates, and small molecules.
Lee, Jumin; Cheng, Xi; Swails, Jason M; Yeom, Min Sun; Eastman, Peter K; Lemkul, Justin A; Wei, Shuai; Buckner, Joshua; Jeong, Jong Cheol; Qi, Yifei; Jo, Sunhwan; Pande, Vijay S; Case, David A; Brooks, Charles L; MacKerell, Alexander D; Klauda, Jeffery B; Im, Wonpil
2016-01-12
Proper treatment of nonbonded interactions is essential for the accuracy of molecular dynamics (MD) simulations, especially in studies of lipid bilayers. The use of the CHARMM36 force field (C36 FF) in different MD simulation programs can result in disagreements with published simulations performed with CHARMM due to differences in the protocols used to treat the long-range and 1-4 nonbonded interactions. In this study, we systematically test the use of the C36 lipid FF in NAMD, GROMACS, AMBER, OpenMM, and CHARMM/OpenMM. A wide range of Lennard-Jones (LJ) cutoff schemes and integrator algorithms were tested to find the optimal simulation protocol to best match bilayer properties of six lipids with varying acyl chain saturation and head groups. MD simulations of a 1,2-dipalmitoyl-sn-phosphatidylcholine (DPPC) bilayer were used to obtain the optimal protocol for each program. MD simulations with all programs were found to reasonably match the DPPC bilayer properties (surface area per lipid, chain order parameters, and area compressibility modulus) obtained using the standard protocol used in CHARMM as well as from experiments. The optimal simulation protocol was then applied to the other five lipid simulations and resulted in excellent agreement between results from most simulation programs as well as with experimental data. AMBER compared least favorably with the expected membrane properties, which appears to be due to its use of the hard-truncation in the LJ potential versus a force-based switching function used to smooth the LJ potential as it approaches the cutoff distance. The optimal simulation protocol for each program has been implemented in CHARMM-GUI. This protocol is expected to be applicable to the remainder of the additive C36 FF including the proteins, nucleic acids, carbohydrates, and small molecules.
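The cutoff-scheme issue that the abstract attributes AMBER's deviation to can be illustrated in a few lines. The sketch below contrasts a hard-truncated Lennard-Jones potential with a smoothly switched one; for simplicity it uses energy switching rather than the force-based switching the C36 protocol actually prescribes, and the function names (`lj`, `lj_hard_cutoff`, `lj_switched`) and parameter values are illustrative, not taken from any of the cited programs.

```python
def lj(r, eps=0.5, sigma=3.5):
    """Plain Lennard-Jones pair potential (illustrative parameters, in
    arbitrary units)."""
    sr6 = (sigma / r) ** 6
    return 4 * eps * (sr6 ** 2 - sr6)

def lj_hard_cutoff(r, rcut=12.0, **kw):
    """Hard truncation: energy (and force) jump discontinuously to zero
    at rcut, which is the behavior the abstract associates with AMBER."""
    return lj(r, **kw) if r < rcut else 0.0

def lj_switched(r, ron=10.0, rcut=12.0, **kw):
    """Switched LJ: a smooth polynomial takes the potential from its
    full value at ron down to exactly zero at rcut, removing the
    discontinuity (standard energy-switching form)."""
    if r >= rcut:
        return 0.0
    if r <= ron:
        return lj(r, **kw)
    s = ((rcut**2 - r**2) ** 2 * (rcut**2 + 2 * r**2 - 3 * ron**2)
         / (rcut**2 - ron**2) ** 3)
    return lj(r, **kw) * s
```

The switched form is continuous and differentiable across the cutoff, so the force never jumps; with hard truncation, small differences in the cutoff distance between programs translate directly into different bilayer surface areas.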
OIL SPILL DISPERSANT EFFECTIVENESS PROTOCOL. II: PERFORMANCE OF THE REVISED PROTOCOL
The current U.S. Environmental Protection Agency (EPA) protocol for testing the effectiveness of dispersants for use in treating oil spills on the open water, the swirling flask test (SFT), has been found to give widely varying results in the hands of different testing laborator...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mapas, Jose Kenneth D.; Thomay, Tim; Cartwright, Alexander N.
2016-05-05
Block copolymer (BCP) derived periodic nanostructures with domain sizes larger than 150 nm present a versatile platform for the fabrication of photonic materials. So far, the access to such materials has been limited to highly synthetically involved protocols. Herein, we report a simple, “user-friendly” method for the preparation of ultrahigh molecular weight linear poly(solketal methacrylate-b-styrene) block copolymers by a combination of Cu-wire-mediated ATRP and RAFT polymerizations. The synthesized copolymers with molecular weights up to 1.6 million g/mol and moderate dispersities readily assemble into highly ordered cylindrical or lamella microstructures with domain sizes as large as 292 nm, as determined by ultra-small-angle X-ray scattering and scanning electron microscopy analyses. Solvent cast films of the synthesized block copolymers exhibit stop bands in the visible spectrum correlated to their domain spacings. The described method opens new avenues for facilitated fabrication and the advancement of fundamental understanding of BCP-derived photonic nanomaterials for a variety of applications.
Performing quantum computing experiments in the cloud
NASA Astrophysics Data System (ADS)
Devitt, Simon J.
2016-09-01
Quantum computing technology has reached a second renaissance in the past five years. Increased interest from both the private and public sector, combined with extraordinary theoretical and experimental progress, has solidified this technology as a major advancement in the 21st century. As anticipated by many, some of the first realizations of quantum computing technology have occurred over the cloud, with users logging onto dedicated hardware over the classical internet. Recently, IBM has released the Quantum Experience, which allows users to access a five-qubit quantum processor. In this paper we take advantage of this online availability of actual quantum hardware and present four quantum information experiments. We utilize the IBM chip to realize protocols in quantum error correction, quantum arithmetic, quantum graph theory, and fault-tolerant quantum computation by accessing the device remotely through the cloud. While the results are subject to significant noise, the correct results are returned from the chip. This demonstrates the power of experimental groups opening up their technology to a wider audience and will hopefully allow for the next stage of development in quantum information technology.
Mobilising Open Access to Research Data: Recommendations from the RECODE project
NASA Astrophysics Data System (ADS)
Finn, Rachel; Sveinsdottir, Thordis
2015-04-01
This paper will introduce the findings and policy recommendations from the FP7 project RECODE (Policy RECommendations for Open Access to Research Data in Europe) which aims to leverage existing networks, communities and projects to address challenges within the open access and data dissemination and preservation sector. We will introduce the key recommendations, which provide solutions relevant to opening access to PSI. The project is built on case study research of five scientific disciplines with the aim of recognizing and working with disciplinary fragmentation associated with open access to research data. The RECODE findings revealed that the mobilisation of open access to research data requires a partnership approach for developing a coherent and flexible ecosystem that is easy and transparent to embed in research practice and process. As such, the development of open access to research data needs to be: • Informed by research practices and processes in different fields • Supported by an integrated institutional and technological data infrastructure and guided by ethical and regulatory frameworks • Underpinned by infrastructure and guiding frameworks that allow for differences in disciplinary research and data management practices • Characterised by a partnership approach involving the key stakeholders, researchers, and institutions The proposed presentation will examine each of these aspects in detail and use information and good practices from the RECODE project to consider how stakeholders within the PSI movement might action each of these points. It will also highlight areas where RECODE findings and good practice recommendations have clear relevance for the PSI sector.
30 CFR 291.110 - Who may MMS require to produce information?
Code of Federal Regulations, 2010 CFR
2010-07-01
... OPEN AND NONDISCRIMINATORY ACCESS TO OIL AND GAS PIPELINES UNDER THE OUTER CONTINENTAL SHELF LANDS ACT... make a decision on whether open access or nondiscriminatory access was denied. (b) If you are a party...
30 CFR 291.110 - Who may MMS require to produce information?
Code of Federal Regulations, 2011 CFR
2011-07-01
..., DEPARTMENT OF THE INTERIOR APPEALS OPEN AND NONDISCRIMINATORY ACCESS TO OIL AND GAS PIPELINES UNDER THE OUTER... believes is necessary to make a decision on whether open access or nondiscriminatory access was denied. (b...
30 CFR 291.111 - How does MMS treat the confidential information I provide?
Code of Federal Regulations, 2011 CFR
2011-07-01
..., AND ENFORCEMENT, DEPARTMENT OF THE INTERIOR APPEALS OPEN AND NONDISCRIMINATORY ACCESS TO OIL AND GAS... to inform a decision on whether open access or nondiscriminatory access was denied may claim that...
Moorhead, Laura L; Holzmeyer, Cheryl; Maggio, Lauren A; Steinberg, Ryan M; Willinsky, John
2015-01-01
Through funding agency and publisher policies, an increasing proportion of the health sciences literature is being made open access. Such an increase in access raises questions about the awareness and potential utilization of this literature by those working in health fields. A sample of physicians (N=336) and public health non-governmental organization (NGO) staff (N=92) were provided with relatively complete access to the research literature indexed in PubMed, as well as access to the point-of-care service UpToDate, for up to one year, with their usage monitored through the tracking of web-log data. The physicians also participated in a one-month trial of relatively complete or limited access. The study found that participants' research interests were not satisfied by article abstracts alone nor, in the case of the physicians, by a clinical summary service such as UpToDate. On average, a third of the physicians viewed research a little more frequently than once a week, while two-thirds of the public health NGO staff viewed more than three articles a week. Those articles were published since the 2008 adoption of the NIH Public Access Policy, as well as prior to 2008 and during the maximum 12-month embargo period. A portion of the articles in each period was already open access, but complete access encouraged a viewing of more research articles. Those working in health fields will utilize more research in the course of their work as a result of (a) increasing open access to research, (b) improving awareness of and preparation for this access, and (c) adjusting public and open access policies to maximize the extent of potential access, through reduction in embargo periods and access to pre-policy literature.
Hard real-time closed-loop electrophysiology with the Real-Time eXperiment Interface (RTXI)
George, Ansel; Dorval, Alan D.; Christini, David J.
2017-01-01
The ability to experimentally perturb biological systems has traditionally been limited to static pre-programmed or operator-controlled protocols. In contrast, real-time control allows dynamic probing of biological systems with perturbations that are computed on-the-fly during experimentation. Real-time control applications for biological research are available; however, these systems are costly and often restrict the flexibility and customization of experimental protocols. The Real-Time eXperiment Interface (RTXI) is an open source software platform for achieving hard real-time data acquisition and closed-loop control in biological experiments while retaining the flexibility needed for experimental settings. RTXI has enabled users to implement complex custom closed-loop protocols in single cell, cell network, animal, and human electrophysiology studies. RTXI is also used as a free and open source, customizable electrophysiology platform in open-loop studies requiring online data acquisition, processing, and visualization. RTXI is easy to install, can be used with an extensive range of external experimentation and data acquisition hardware, and includes standard modules for implementing common electrophysiology protocols. PMID:28557998
Wu, Shih-Ying; Aurup, Christian; Sanchez, Carlos Sierra; Grondin, Julien; Zheng, Wenlan; Kamimura, Hermes; Ferrera, Vincent P; Konofagou, Elisa E
2018-05-22
Brain diseases, including neurological disorders and tumors, remain undertreated due to the challenge of accessing the brain and the blood-brain barrier (BBB) restricting drug delivery, which also profoundly limits the development of pharmacological treatments. Focused ultrasound (FUS) with microbubbles is the sole method to open the BBB noninvasively, locally, and transiently to facilitate drug delivery, but translation to the clinic is challenging due to the long procedures, targeting limitations, or invasiveness of current systems. In order to provide rapid, flexible, yet precise application, we have designed a noninvasive FUS and monitoring system, with the protocol tested in monkeys (from in silico preplanning and simulation, through real-time targeting and acoustic mapping, to post-treatment assessment). With a short procedure (30 min), similar in duration to current clinical imaging or radiation therapy, the achieved targeting (of both the cerebral cortex and subcortical structures) and monitoring accuracy was close to the predicted 2-mm lower limit. This system would enable rapid clinical transcranial FUS applications outside of the MRI system without a stereotactic frame, thereby benefiting patients, especially in the elderly population.
Volkov, Alexey; Gustafson, Karl P J; Tai, Cheuk-Wai; Verho, Oscar; Bäckvall, Jan-E; Adolfsson, Hans
2015-04-20
Herein, a practical and mild method for the deoxygenation of a wide range of benzylic aldehydes and ketones is described, which utilizes heterogeneous Pd/C as the catalyst together with the green hydride source, polymethylhydrosiloxane. The developed catalytic protocol is scalable and robust, as exemplified by the deoxygenation of ethyl vanillin, which was performed on a 30 mmol scale in an open-to-air setup using only 0.085 mol % Pd/C catalyst to furnish the corresponding deoxygenated product in 93 % yield within 3 hours at room temperature. Furthermore, the Pd/C catalyst was shown to be recyclable up to 6 times without any observable decrease in efficiency and it exhibited low metal leaching under the reaction conditions. © 2015 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA. This is an open access article under the terms of the Creative Commons Attribution Non-Commercial NoDerivs License, which permits use and distribution in any medium, provided the original work is properly cited, the use is non-commercial and no modifications or adaptations are made.
A carrier sensed multiple access protocol for high data rate ring networks
NASA Technical Reports Server (NTRS)
Foudriat, E. C.; Maly, Kurt J.; Overstreet, C. Michael; Khanna, S.; Paterra, Frank
1990-01-01
The results of the study of a simple but effective media access protocol for high data rate networks are presented. The protocol is based on the facts that at high data rates networks can contain multiple messages simultaneously over their span, and that in a ring, nodes can detect the presence of a message arriving from the immediate upstream neighbor. When an incoming signal is detected, the node must either abort or truncate the message it is presently sending. Thus, the protocol, with local carrier sensing and multiple access, is designated CSMA/RN. The performance of CSMA/RN with attempt and truncate is studied using analytic and simulation models. Three performance factors are presented: wait or access time, service time, and response or end-to-end travel time. The service time is basically a function of the network rate; it changes by a factor of 1 between no load and full load. Wait time, which is zero at no load, remains small for load factors up to 70 percent of full load. Response time, which adds travel time while on the network to wait and service time, is mainly a function of network length, especially for longer-distance networks. Simulation results are shown for CSMA/RN where messages are removed at the destination. A wide range of local and metropolitan area network parameters, including variations in message size, network length, and node count, are studied. Finally, a scaling factor based upon the ratio of message to network length demonstrates that the results, and hence the CSMA/RN protocol, are applicable to wide area networks.
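The attempt-and-truncate rule at the heart of CSMA/RN can be sketched as a per-slot decision: a node transmits while its upstream link is idle and truncates its own message the moment an upstream frame arrives. The `Node` class below is a toy illustration of that rule only, not the analytic or simulation model from the paper; the names and slot-based abstraction are assumptions.

```python
class Node:
    """Toy CSMA/RN node: transmits when the upstream link is idle and
    truncates its own in-progress message as soon as an upstream frame
    is sensed (attempt and truncate)."""

    def __init__(self, name):
        self.name = name
        self.sending = None  # remaining slots of this node's own message

    def start(self, length):
        """Queue a message of `length` slots for transmission."""
        self.sending = length

    def tick(self, upstream_busy):
        """Advance one slot; return what the node puts on its downstream link."""
        if upstream_busy:
            if self.sending:          # truncate: yield to the passing frame
                self.sending = None
            return "relay"            # forward the upstream frame
        if self.sending:
            self.sending -= 1
            if self.sending == 0:
                self.sending = None
            return "own"
        return "idle"
```

For example, a node one slot into a three-slot message that then senses an upstream frame drops the remainder of its message and relays instead, which is exactly why service time stays close to the network-rate bound under load.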
Comparing the biocidal properties of non-thermal plasma sources by reference protocol
NASA Astrophysics Data System (ADS)
Khun, Josef; Jirešová, Jana; Kujalová, Lucie; Hozák, Pavel; Scholtz, Vladimír
2017-10-01
The previously proposed reference protocol enabling easy comparison of the biocidal properties of different non-thermal plasma sources has been followed and discussed. For inactivation tests, the reference protocol uses spores of the Gram-positive bacterium Bacillus subtilis (ATCC 6633) deposited on a polycarbonate membrane as the reference sample. In this work, the biocidal properties of a negative glow corona, positive streamer corona, positive transient spark, and cometary discharges are compared in both open-air and closed apparatus. While the total number of bacteria surviving 1 h of exposure decreased by up to 7 orders of magnitude in the closed apparatus, only a weak bactericidal inhibition effect was observed in the open one.
HTTP as a Data Access Protocol: Trials with XrootD in CMS’s AAA Project
NASA Astrophysics Data System (ADS)
Balcas, J.; Bockelman, B. P.; Kcira, D.; Newman, H.; Vlimant, J.; Hendricks, T. W.;
2017-10-01
The main goal of the project is to demonstrate the ability to use HTTP data federations in a manner analogous to the existing AAA infrastructure of the CMS experiment. An initial testbed at Caltech has been built, and changes in the CMS software (CMSSW) are being implemented in order to improve HTTP support. The testbed consists of a set of machines at the Caltech Tier2 that improve the support infrastructure for data federations at CMS. As a first step, we are building systems that produce and ingest network data transfers of up to 80 Gbps. In collaboration with AAA, HTTP support is enabled at the US redirector and the Caltech testbed. A plugin for CMSSW is being developed for HTTP access based on the DaviX software; it will replace the present fork/exec of curl for HTTP access. In addition, extensions to the XRootD HTTP implementation are being developed to add functionality to it, such as client-based monitoring identifiers. In the future, patches will be developed to better integrate HTTP-over-XRootD with the Open Science Grid (OSG) distribution. First results of the transfer tests using HTTP are presented in this paper, together with details about the initial setup.
Rugemalila, Joas B; Ogundahunsi, Olumide A T; Stedman, Timothy T; Kilama, Wen L
2007-12-01
Malaria is a major public health problem; about half of the world's population lives under exposure. The problem is increasing in magnitude and complexity because it is entwined with low socio-economic status, which makes African women and children particularly vulnerable. Combating malaria therefore requires concerted international efforts with an emphasis on Africa. The Multilateral Initiative on Malaria (MIM) was founded in 1997 to meet that need through strengthening research capacity in Africa, increasing international cooperation and communication, and utilization of research findings to inform malaria prevention, treatment, and control. The review undertaken in 2002 showed that through improved communication and science-focused institutional networks, MIM had brought African scientists together, opened up communication among malaria stakeholders, and provided Internet access to literature. The achievements were made through four autonomous constituents, including the coordinating Secretariat being hosted for the first time in Africa by the African Malaria Network Trust (AMANET) for the period 2006-2010. The other constituents are the MIM TDR, providing funding for peer-reviewed research; MIMCom, facilitating Internet connectivity, access to medical literature, and communication between scientists inside and outside of Africa; and MR4, providing scientists access to research tools, standardized reagents, and protocols. Future plans will mostly consolidate the gains made under the MIM Strategic Plan for the period 2003-2005.
Tsay, Ming-Yueh; Wu, Tai-Luan; Tseng, Ling-Li
2017-01-01
This study examines the completeness and overlap of coverage in physics of six open access scholarly communication systems, including two search engines (Google Scholar and Microsoft Academic), two aggregate institutional repositories (OAIster and OpenDOAR), and two physics-related open sources (arXiv.org and Astrophysics Data System). The 2001-2013 Nobel Laureates in Physics served as the sample. Bibliographic records of their publications were retrieved and downloaded from each system, and a computer program was developed to perform the analytical tasks of sorting, comparison, elimination, aggregation and statistical calculations. Quantitative analyses and cross-referencing were performed to determine the completeness and overlap of the system coverage of the six open access systems. The results may enable scholars to select an appropriate open access system as an efficient scholarly communication channel, and academic institutions may build institutional repositories or independently create citation index systems in the future. Suggestions on indicators and tools for academic assessment are presented based on the comprehensiveness assessment of each system.
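The "sorting, comparison, elimination, aggregation and statistical calculations" the authors describe amount to set operations over normalized record keys. A minimal sketch of completeness (share of the union covered by each system) and pairwise Jaccard overlap might look like the following; the `coverage_stats` name and the dict-of-sets input format are assumptions for illustration, not the authors' actual program.

```python
from itertools import combinations

def coverage_stats(systems):
    """systems: dict mapping system name -> set of normalized record keys
    (e.g. deduplicated title/year strings).

    Returns (completeness, overlap):
      completeness[name]  = fraction of the union of all records the system holds
      overlap[(a, b)]     = Jaccard similarity of each pair of systems
    """
    union = set().union(*systems.values())
    completeness = {name: len(recs) / len(union)
                    for name, recs in systems.items()}
    overlap = {
        (a, b): len(systems[a] & systems[b]) / len(systems[a] | systems[b])
        for a, b in combinations(systems, 2)
    }
    return completeness, overlap
```

With two toy systems holding records {1, 2, 3} and {2, 3, 4}, each covers 75% of the union and their pairwise overlap is 0.5, which is the kind of cross-referencing result the study reports per pair of the six systems.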
ERIC Educational Resources Information Center
Buchanan, Larry
1998-01-01
Addresses the use of e-mail for communication and collaborative projects in schools. Discusses the effectiveness of an e-mail system based on a UNIX host; problems with POP (post office protocol) client programs; and the new Internet Mail Access Protocol (IMAP) which addresses most of the shortcomings of the POP protocol while keeping advantages…
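IMAP's main advantage over POP, as the abstract notes, is that message state lives on the server, so a client can query flags and headers without downloading mailboxes wholesale. The sketch below uses Python's standard-library `imaplib` to list unread subjects; the `unseen_subjects` and `subject_from_header` helpers, and any host/credential values passed to them, are hypothetical placeholders for illustration.

```python
import imaplib

def subject_from_header(raw: bytes) -> str:
    """Pull the subject text out of a raw 'Subject: ...' header fragment."""
    text = raw.decode().strip()
    return text.removeprefix("Subject:").strip()

def unseen_subjects(host, user, password, mailbox="INBOX"):
    """List subjects of unread messages over IMAP. Unlike POP, the server
    tracks per-message flags, so the search runs server-side and nothing
    is deleted or downloaded in full."""
    with imaplib.IMAP4_SSL(host) as conn:
        conn.login(user, password)
        conn.select(mailbox, readonly=True)    # messages stay on the server
        _, data = conn.search(None, "UNSEEN")  # server-side flag search
        subjects = []
        for msg_id in data[0].split():
            # PEEK avoids marking the message as read
            _, parts = conn.fetch(msg_id, "(BODY.PEEK[HEADER.FIELDS (SUBJECT)])")
            subjects.append(subject_from_header(parts[0][1]))
        return subjects
```

A POP client would instead have to download each message (or at least its full header block) and track read/unread state locally on every machine, which is the shortcoming the IMAP protocol was designed to address.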
The Use of Enhanced Appointment Access Strategies by Medical Practices.
Rodriguez, Hector P; Knox, Margae; Hurley, Vanessa; Rittenhouse, Diane R; Shortell, Stephen M
2016-06-01
Strategies to enhance appointment access are being adopted by medical practices as part of patient-centered medical home (PCMH) implementation, but little is known about the use of these strategies nationally. We examine practice use of open access scheduling and after-hours care. Data were analyzed from the Third National Study of Physician Organizations (NSPO3) to examine which enhanced appointment access strategies are more likely to be used by practices with more robust PCMH capabilities and with greater external incentives. Logistic regression estimated the effect of PCMH capabilities and external incentives on practice use of open access scheduling and after-hours care. Physician organizations with >20% primary care physicians (n=1106). PCMH capabilities included team-based care, health information technology capabilities, quality improvement orientation, and patient experience orientation. External incentives included public reporting, pay-for-performance (P4P), and accountable care organization participation. A low percentage of practices (19.8%) used same-day open access scheduling, while after-hours care (56.1%) was more common. In adjusted analyses, system-owned practices and practices with greater use of team-based care, health information technology capabilities, and public reporting were more likely to use open access scheduling. Accountable care organization-affiliated practices and practices with greater use of public reporting and P4P were more likely to provide after-hours care. Open access scheduling may be most effectively implemented by practices with robust PCMH capabilities. External incentives appear to influence practice adoption of after-hours care. Expanding open access scheduling and after-hours care will require distinct policies and supports.
Sefuba, Maria; Walingo, Tom; Takawira, Fambirai
2015-09-18
This paper presents an Energy Efficient Medium Access Control (MAC) protocol for clustered wireless sensor networks that aims to improve energy efficiency and delay performance. The proposed protocol employs adaptive cross-layer intra-cluster scheduling and inter-cluster relay selection diversity. The scheduling is based on the available data packets and the remaining energy level of the source node (SN). This helps to minimize idle listening on nodes without data to transmit, as well as to reduce control packet overhead. The relay selection diversity is carried out between clusters, by the cluster head (CH), and the base station (BS). The diversity helps to improve network reliability and prolong the network lifetime. Relay selection is determined based on the communication distance, the remaining energy, and the channel quality indicator (CQI) of the relay cluster head (RCH). An analytical framework for energy consumption and transmission delay for the proposed MAC protocol is presented in this work. The performance of the proposed MAC protocol is evaluated based on transmission delay, energy consumption, and network lifetime. The results obtained indicate that the proposed MAC protocol provides improved performance compared with traditional cluster-based MAC protocols.
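Relay selection on distance, residual energy, and CQI, as described above, can be sketched as a weighted score over candidate relay cluster heads. The `select_relay` function and its weights below are illustrative assumptions, not the metric from the paper: it rewards remaining energy and channel quality and penalizes distance, then picks the highest-scoring candidate.

```python
def select_relay(candidates, w_energy=0.4, w_cqi=0.4, w_dist=0.2):
    """candidates: list of dicts with 'id', 'energy' (residual, J),
    'cqi' (normalized 0-1), and 'distance' (m). Returns the id of the
    best relay cluster head under an illustrative weighted score:
    more energy and better channel are rewarded, distance is penalized."""
    max_d = max(c["distance"] for c in candidates) or 1.0
    max_e = max(c["energy"] for c in candidates) or 1.0

    def score(c):
        return (w_energy * c["energy"] / max_e
                + w_cqi * c["cqi"]
                - w_dist * c["distance"] / max_d)

    return max(candidates, key=score)["id"]
```

Normalizing energy and distance against the candidate pool keeps the three terms comparable; in a real protocol the weights would be tuned against the delay/lifetime trade-off the analytical framework quantifies.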